Repairing the Common Ground between Conversation Analysis and Conversational Technologies
Abstract
Computational linguistics and dialogue systems research share many terms and concepts with conversation analysis, but there are some irreconcilable differences in how key conversational phenomena are understood and operationalised across these fields. This leads to misunderstandings (at best) and fully fledged category errors (at worst) when we attempt to collaborate across disciplines that have much to gain from closer cooperation. In this talk, I will use examples from a recent special issue of Discourse & Communication (Stokoe, Albert, Buschmeier & Stommel, 2024) to identify opportunities for reconciliation and targets for future cross-disciplinary work.
References
Albert, S., Housley, W., Sikveland, R. O., & Stokoe, E. (forthcoming). The conversational action test: Detecting the artificial sociality of AI. New Media & Society.
Albert, S., & Hall, L. (2024). Distributed agency in smart homecare interactions: A conversation analytic case study. Discourse & Communication, 18(6), 892–904. https://doi.org/10.1177/17504813241267059
Albert, S., Hamann, M., & Stokoe, E. (2023). Conversational user interfaces in smart homecare interactions: A conversation analytic case study. In Proceedings of the 5th International Conference on Conversational User Interfaces (pp. 1–12). ACM. https://doi.org/10.1145/3571884.3597140
Alač, M., Gluzman, Y., Aflatoun, T., Bari, A., Jing, B., & Mozqueda, G. (2020). How everyday interactions with digital voice assistants resist a return to the individual. Evental Aesthetics, 9(1), 51.
Antaki, C., & Crompton, R. J. (2015). Conversational practices promoting a discourse of agency for adults with intellectual disabilities. Discourse & Society, 26(6), 645–661. https://doi.org/10.1177/0957926515592774
Antaki, C., & Kent, A. (2012). Telling people what to do (and, sometimes, why): Contingency, entitlement and explanation in staff requests to adults with intellectual impairments. Journal of Pragmatics, 44(6), 876–889. https://doi.org/10.1016/j.pragma.2012.03.014
Brooker, P., Dutton, W., & Mair, M. (2019). The new ghosts in the machine: ‘Pragmatist’ AI and the conceptual perils of anthropomorphic description. Ethnographic Studies, 16, 272–298. https://doi.org/10.5281/zenodo.3459327
Button, G. (Ed.). (1995). Computers, minds and conduct. Polity Press.
Cooper, S., Di Fava, A., Vivas, C., Marchionni, L., & Ferro, F. (2020). ARI: The social assistive robot and companion. In 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) (pp. 745–751). IEEE. https://doi.org/10.1109/RO-MAN47096.2020.9223470
Craven, A., & Potter, J. (2010). Directives: Entitlement and contingency in action. Discourse Studies, 12(4), 419–442. https://doi.org/10.1177/1461445610370126
Curl, T. S., & Drew, P. (2008). Contingency and action: A comparison of two forms of requesting. Research on Language and Social Interaction, 41(2), 129–153. https://doi.org/10.1080/08351810802028613
Dingemanse, M. (2020). Recruiting assistance and collaboration: A West-African corpus study. In S. Floyd, G. Rossi, & N. J. Enfield (Eds.), Getting others to do things: A pragmatic typology of recruitments (pp. 369–421). Language Science Press. https://doi.org/10.5281/zenodo.4018388
Dreyfus, H. L. (1972). What computers can’t do. MIT Press.
Edwards, D. (1994). Imitation and artifice in apes, humans, and machines. American Behavioral Scientist, 37(6), 754–771. https://doi.org/10.1177/0002764294037006004
Floyd, S., Rossi, G., & Enfield, N. J. (Eds.). (2020). Getting others to do things: A pragmatic typology of recruitments. Language Science Press. https://doi.org/10.5281/zenodo.4017493
Garfinkel, H. (2021). Ethnomethodological misreading of Aron Gurwitsch on the phenomenal field. Human Studies, 44(1), 19–42. https://doi.org/10.1007/s10746-020-09566-z
Goodwin, C. (1984). Notes on story structure and the organization of participation. In J. M. Atkinson & J. Heritage (Eds.), Structures of social action: Studies in conversation analysis (pp. 225–246). Cambridge University Press.
Goodwin, C. (2007). Interactive footing. In E. Holt & R. Clift (Eds.), Reporting talk (pp. 16–46). Cambridge University Press. https://doi.org/10.1017/CBO9780511486654.003
Goodwin, C. (2017). Co-operative action. Cambridge University Press. https://doi.org/10.1017/9781139016735
Hall, L., Albert, S., & Peel, E. (2024). Doing virtual companionship with Alexa. Social Interaction. Video-Based Studies of Human Sociality, 7(3), Article 3. https://doi.org/10.7146/si.v7i3.150089
Heinemann, T. (2006). ‘Will you or can’t you?’: Displaying entitlement in interrogative requests. Journal of Pragmatics, 38(7), 1081–1104. https://doi.org/10.1016/j.pragma.2005.09.013
Ivarsson, J., & Lindwall, O. (2023). Suspicious minds: The problem of trust and conversational agents. Computer Supported Cooperative Work (CSCW). https://doi.org/10.1007/s10606-023-09465-8
Jackson, L., Haagaard, A., & Williams, R. (2022). Disability dongle. Platypus: The CASTAC Blog. https://blog.castac.org/2022/04/disability-dongle/
Jaton, F., & Sormani, P. (2023). Enabling ‘AI’? The situated production of commensurabilities. Social Studies of Science, 53(5), 625–634. https://doi.org/10.1177/03063127231194591
Jefferson, G. (1989). Letter to the editor re: Anita Pomerantz’ epilogue to the special issue on sequential organization of conversational activities, Spring 1989. Western Journal of Speech Communication, 53(Fall), 427–429.
Kendrick, K. H., & Drew, P. (2016). Recruitment: Offers, requests, and the organization of assistance in interaction. Research on Language and Social Interaction, 49(1), 1–19. https://doi.org/10.1080/08351813.2016.1126436
Liesenfeld, A., & Dingemanse, M. (2024). Interactive probes: Towards action-level evaluation for dialogue systems. Discourse & Communication. Advance online publication. https://doi.org/10.1177/17504813241267071
Mlynář, J., de Rijk, L., Liesenfeld, A., Stommel, W., & Albert, S. (2024). AI in situated action: A scoping review of ethnomethodological and conversation analytic studies. AI & SOCIETY. https://doi.org/10.1007/s00146-024-01919-x
Pino, M., & Land, V. (2022). How companions speak on patients’ behalf without undermining their autonomy: Findings from a conversation analytic study of palliative care consultations. Sociology of Health & Illness, 44(2), 395–415. https://doi.org/10.1111/1467-9566.13427
Porcheron, M., Fischer, J. E., Reeves, S., & Sharples, S. (2018). Voice interfaces in everyday life. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (pp. 1–12). ACM. https://doi.org/10.1145/3173574.3174214
Rudaz, D., & Licoppe, C. (2024). ‘Playing the robot’s advocate’: Bystanders’ descriptions of a robot’s conduct in public settings. Discourse & Communication. Advance online publication. https://doi.org/10.1177/17504813241271481
Schütz, A. (2007). The phenomenology of the social world. In Contemporary sociological theory (2nd ed., p. 32). (Original work published 1932)
Stokoe, E., Sikveland, R. O., Albert, S., Hamann, M., & Housley, W. (2020). Can humans simulate talking like other humans? Comparing simulated clients to real customers in service inquiries. Discourse Studies, 22(1), 87–109. https://doi.org/10.1177/1461445619887537
Suchman, L. (2023). The uncontroversial ‘thingness’ of AI. Big Data & Society, 10(2), Article 20539517231206794. https://doi.org/10.1177/20539517231206794