From tools to social agents
Abstract
Up to now, our understanding of sociality has been closely tied to living beings. However, recent developments in Artificial Intelligence make it conceivable that we will engage in social interactions with artificial systems in the near future. Drawing on minimal approaches to socio-cognitive abilities, this paper presents a strategy for capturing social interactions between humans and artificial agents. Taking joint actions as a paradigmatic example, it elaborates minimal necessary conditions that artificial agents must meet. To this end, it is first argued that multiple realizations of socio-cognitive abilities can give rise to asymmetric cases of joint action. In a second step, the minimal conditions of agency and coordination that artificial agents must satisfy in order to qualify as social agents in joint actions are discussed.
Copyright (c) 2020 Anna Strasser
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.