Abstract
When viewing and interacting with robots in social settings, users attribute character traits to these systems. This attribution often arises incidentally from users' past experiences rather than from intentional design. This paper presents a flexible, expressive prototype that augments an existing mobile robot platform, applying a previously developed design methodology to create intentional attribution and thereby alter perception of the non-anthropomorphic robotic system. The prototype supports customization through five modalities: customizable eyes, a simulated breath motion, movement, color, and form. Initial results with human-subject audience members show that, while participants found the robot likable, they did not consider it anthropomorphic. Moreover, individual viewers' perceptions shifted according to performer interactions. Future work will leverage this prototype to modulate viewers' reactions to a mobile robot in a variety of environments.
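For illustration only, the five expressive modalities named above could be represented as a small configuration structure that a controller varies to project different characters. This is a minimal sketch under assumed names (ExpressiveLayerConfig, describe, and all field names are hypothetical) and does not reflect the authors' actual implementation or hardware interface.

# Illustrative only: a hypothetical configuration for the five expressive
# modalities in the abstract (eyes, breath, movement, color, form).
# Names and defaults are assumptions, not the authors' API.
from dataclasses import dataclass
from typing import Tuple


@dataclass
class ExpressiveLayerConfig:
    """Hypothetical settings for an expressive layer on a mobile robot."""
    eye_shape: str = "round"             # customizable eyes (e.g., "round", "narrow")
    breath_rate_hz: float = 0.25         # simulated breath motion frequency
    movement_style: str = "sustained"    # qualitative movement descriptor
    shell_color_rgb: Tuple[int, int, int] = (255, 255, 255)  # display/shell color
    form_attachment: str = "soft_cover"  # physical form or costume variant


def describe(config: ExpressiveLayerConfig) -> str:
    """Summarize a configuration; stands in for commands to real hardware."""
    return (
        f"eyes={config.eye_shape}, breath={config.breath_rate_hz} Hz, "
        f"movement={config.movement_style}, color={config.shell_color_rgb}, "
        f"form={config.form_attachment}"
    )


if __name__ == "__main__":
    # Two contrasting "characters" built by varying the same five modalities.
    calm = ExpressiveLayerConfig()
    agitated = ExpressiveLayerConfig(
        eye_shape="narrow",
        breath_rate_hz=1.0,
        movement_style="sudden",
        shell_color_rgb=(255, 0, 0),
        form_attachment="angular_cover",
    )
    print(describe(calm))
    print(describe(agitated))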
Acknowledgement
This work was conducted under IRB #17427 and supported by National Science Foundation (NSF) grant #1528036. The Conference on Research for Choreographic Interfaces (CRCI) provided support for Time to Compile, created by Catie Cuan in collaboration with the RAD Lab.
Copyright information
© 2018 Springer Nature Switzerland AG
About this paper
Cite this paper
Pakrasi, I., Chakraborty, N., Cuan, C., Berl, E., Rizvi, W., LaViers, A. (2018). Dancing Droids: An Expressive Layer for Mobile Robots Developed Within Choreographic Practice. In: Ge, S., et al. (eds) Social Robotics. ICSR 2018. Lecture Notes in Computer Science, vol. 11357. Springer, Cham. https://doi.org/10.1007/978-3-030-05204-1_40
DOI: https://doi.org/10.1007/978-3-030-05204-1_40
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-05203-4
Online ISBN: 978-3-030-05204-1