Abstract
Shimon is an interactive robotic marimba player, developed as part of our ongoing research in Robotic Musicianship. The robot listens to a human musician and continuously adapts its improvisation and choreography while playing simultaneously with the human. We discuss the robot's mechanism and motion control, which uses physics simulation and animation principles to achieve both expressivity and safety. We then present an interactive improvisation system based on the notion of physical gestures for both musical and visual expression. The system also uses anticipatory action to enable real-time improvised synchronization with the human player.
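To make the anticipatory-action idea concrete, the sketch below shows one simple way such anticipation can work: estimate the human's current inter-onset interval from recent notes, then issue the strike command early enough to absorb the mallet's travel time so the impact lands on the predicted beat. This is an illustrative sketch, not the paper's implementation; the function names and the latency constant are hypothetical.

```python
import time

# Hypothetical actuator latency: time from strike command to mallet impact,
# in seconds. A real value would be measured on the robot; 0.1 s is a guess.
STRIKE_LATENCY = 0.10

def predict_next_onset(onsets, history=4):
    """Predict the next onset time from recent inter-onset intervals.

    onsets: ascending list of note-onset timestamps (seconds) heard so far.
    Returns a predicted timestamp, or None if there is too little history.
    """
    if len(onsets) < 2:
        return None
    recent = onsets[-(history + 1):]
    intervals = sorted(b - a for a, b in zip(recent, recent[1:]))
    ioi = intervals[len(intervals) // 2]  # median: robust to one rushed note
    return onsets[-1] + ioi

def schedule_strike(onsets, send_strike_command):
    """Issue the strike command early so the mallet lands on the predicted beat."""
    target = predict_next_onset(onsets)
    if target is None:
        return
    delay = target - STRIKE_LATENCY - time.monotonic()
    if delay > 0:
        time.sleep(delay)
    send_strike_command()  # impact occurs ~STRIKE_LATENCY later, on the beat
```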
We describe a study evaluating the effect of embodiment on one of our improvisation modules: antiphony, a call-and-response musical synchronization task. We conducted a 3×2 within-subjects study manipulating the level of embodiment and the accuracy of the robot's response. Our findings indicate that synchronization is aided by visual contact when uncertainty is high, but that pianists can resort to internal rhythmic coordination in more predictable settings. We also find that visual coordination is more effective for synchronization in slow sequences, and that occluded physical presence may be less effective than audio-only note generation.
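For a concrete notion of how synchronization in such a call-and-response task can be quantified, the sketch below computes the mean absolute onset asynchrony between target beats and response notes. It is our own illustrative sketch, not the study's analysis code, and it assumes onsets are already paired one-to-one; a real analysis would also handle missed and inserted notes.

```python
def mean_abs_asynchrony(target_onsets, response_onsets):
    """Mean absolute timing error (seconds) between paired onsets."""
    if len(target_onsets) != len(response_onsets):
        raise ValueError("onset lists must be paired one-to-one")
    errors = [abs(r - t) for t, r in zip(target_onsets, response_onsets)]
    return sum(errors) / len(errors)

# Example: a response that is consistently about 30 ms late.
targets = [0.0, 0.5, 1.0, 1.5]
responses = [0.031, 0.528, 1.033, 1.529]
print(mean_abs_asynchrony(targets, responses))  # ~0.030
```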
Finally, we test the effects of visual contact and embodiment on audience appreciation. We find that visual contact in joint jazz improvisation makes for a performance in which audiences rate the robot as playing better, as more human-like, as more responsive, and as more inspired by the human. They also rate the duo as better synchronized, more coherent, more communicative, and better coordinated, and the human as more inspired and more responsive.
Cite this article
Hoffman, G., & Weinberg, G. (2011). Interactive improvisation with a robotic marimba player. Autonomous Robots, 31, 133–153. https://doi.org/10.1007/s10514-011-9237-0