From expressive gesture to sound

The development of an embodied mapping trajectory inside a musical interface

  • Original Article
Journal on Multimodal User Interfaces

Abstract

This paper contributes to the development of a multimodal musical tool that extends the natural action range of the human body to communicate expressiveness into the virtual music domain. The core of this musical tool is a low-cost, highly functional computational model developed on the Max/MSP platform that (1) captures real-time movement of the human body in a 3D coordinate system on the basis of the orientation output of any OSC-compatible inertial sensor system, (2) extracts low-level movement features that specify the amount of contraction/expansion as a measure of how a subject uses the surrounding space, (3) recognizes these movement features as expressive gestures, and (4) creates a mapping trajectory between these expressive gestures and a sound synthesis process that adds harmonically related voices to an originally monophonic voice. A user-oriented and intuitive mapping strategy was of central importance; it was achieved by conducting an empirical experiment based on theoretical concepts from the embodied music cognition paradigm. Based on this empirical evidence, the paper proposes a mapping trajectory that facilitates the interaction between a musician and his or her instrument, the artistic collaboration between (multimedia) artists, and the communication of expressiveness in a social, musical context.
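The pipeline described above turns tracked 3D body positions into a contraction/expansion measure before mapping it to synthesis parameters. As a minimal sketch of step (2), assuming the feature is something like the mean distance of tracked joints from their centroid (the abstract does not give the exact definition, so this formulation is hypothetical):

```python
import math

def contraction_index(joints):
    """Rough contraction/expansion measure: mean Euclidean distance of
    tracked joint positions from their centroid. Larger values indicate
    a more expanded posture, i.e. the subject occupies more of the
    surrounding space. (Hypothetical feature definition for illustration.)"""
    n = len(joints)
    cx = sum(p[0] for p in joints) / n
    cy = sum(p[1] for p in joints) / n
    cz = sum(p[2] for p in joints) / n
    centroid = (cx, cy, cz)
    return sum(math.dist(p, centroid) for p in joints) / n

# Example: arms held close to the torso vs. stretched outward
closed = [(0, 0, 0), (0.1, 0, 0), (-0.1, 0, 0), (0, 0.2, 0)]
spread = [(0, 0, 0), (0.8, 0, 0), (-0.8, 0, 0), (0, 1.0, 0)]
assert contraction_index(spread) > contraction_index(closed)
```

In a Max/MSP patch this computation would typically live in a `js` or external object fed by OSC messages from the sensor system; a scalar like this can then be scaled and routed to the voice-adding synthesis process.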



Author information

Corresponding author

Correspondence to Pieter-Jan Maes.

About this article

Cite this article

Maes, PJ., Leman, M., Lesaffre, M. et al. From expressive gesture to sound. J Multimodal User Interfaces 3, 67–78 (2010). https://doi.org/10.1007/s12193-009-0027-3
