Abstract
This study explored whether artificial emotions displayed by a robot can enhance communication and increase people's willingness to assist when the robot faces a task it cannot accomplish. Using a process-oriented approach, emotions were viewed as an integral part of the complex dynamics between individuals and their environment, providing social cues for coordinated action. In the first study, participants watched videos of a robot displaying no emotion, sadness, or anger after failing a task. Participants accurately identified the artificial emotions, and the results indicated that displaying emotions improved overall understanding of the robot's situation; however, it had no significant effect on participants' willingness to help. The second study focused on the robot's role as a collaborator. Participants watched the same videos as in the first study. The results revealed that, on the whole, participants preferred a neutral robot as their collaborator and showed a particularly strong aversion to the angry robot. While the sad robot increased participants' willingness to help, the study suggests that careful selection of artificial emotions is crucial, taking into account situational appropriateness and the emotional impact on human collaborators. This acknowledges the existence of an affective loop between the robot's artificial emotion and its human counterpart. Overall, this research highlights the potential importance of artificial emotions in human–robot interaction, emphasizes the need for careful consideration when incorporating such emotions, and recognizes the complex interplay between a robot's emotional expression and its impact on human collaborators.





Data availability
The datasets generated during and/or analyzed during the current study are available from the corresponding author on reasonable request.
Acknowledgements
Many thanks to Sujitra Sutthithatip, who helped with the preparation and conduct of this research. No grants were received for this research.
Funding
No funding was received to assist with the preparation of this manuscript. The authors have no relevant financial or non-financial interests to disclose.
Author information
Contributions
All authors contributed to the study conception and design. Material preparation, data collection and analysis were performed by all authors. The first draft of the manuscript was written by Jacqueline Urakami and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.
Ethics declarations
Conflict of interest
The author confirms that there are no potential conflicts of interest.
Ethical approval
The study was approved by the Ethics Review Board at Tokyo University of Technology (approval number 2018135).
Human and animal rights statement
This research involves human participants.
Informed consent
Informed consent was obtained from all participants.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Urakami, J. Do Emotional Robots Get More Help? How a Robot's Emotions Affect Collaborators' Willingness to Help. Int J Soc Robotics 15, 1457–1471 (2023). https://doi.org/10.1007/s12369-023-01058-1