This paper explores the use of motion capture data to provide intuitive input to a spatialization system that extends the resonating capabilities of the piano. A camera is placed inside of an amplified piano, which tracks the motion of internal mechanisms such as hammer movement. This data is processed so that singular pitches, chords, and other motions inside the piano can produce sonic results. The data is subsequently sent to an ambisonics software module, which spatializes the piano sounds in an effort to provide a meaningful connection between the sound location and the perceived vertical frequency space produced when creating sound on the piano. This provides an audible link between the pitches performed and their location, and an effective option for integrating spatialization of the piano during performance.
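The paper does not specify its pitch-to-location mapping, but the idea of linking performed pitch to a perceived vertical frequency space can be sketched as a simple linear map from the piano's key range to an elevation angle. The function name, angle range, and linear interpolation below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: map a MIDI pitch to an ambisonic source elevation,
# mirroring the piano's low-to-high "vertical frequency space".
# Assumes the 88-key range (MIDI 21..108) and a linear mapping; the paper's
# actual mapping is not described in the abstract.

def pitch_to_elevation(midi_pitch, low=21, high=108,
                       min_elev=-45.0, max_elev=45.0):
    """Return an elevation angle in degrees for a given MIDI pitch."""
    clamped = max(low, min(high, midi_pitch))
    t = (clamped - low) / (high - low)   # 0.0 at lowest key, 1.0 at highest
    return min_elev + t * (max_elev - min_elev)

print(pitch_to_elevation(21))    # lowest key  -> -45.0
print(pitch_to_elevation(108))   # highest key -> 45.0
```

A real system would feed angles like these, along with an azimuth, to an ambisonic encoder rather than printing them.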
The Black Swan is a networked performance system for two groups of non-specific performers. The work derives its title and inspiration from Nassim Nicholas Taleb’s description of extreme and catastrophic events. These “black swan” events are characterized as being outliers, unpredictable, and yet completely explainable when viewed in retrospect. The Black Swan uses this concept in performance; throughout the piece a group of instrumentalists is solely responsible for interpreting the score while a group of motion-tracked performers advance the score. However, when the “Black Swan” occurs, the motion-tracked group begins to generate sound, an event that the instrumentalists could not have anticipated. A third party is responsible for distributing instructions to each performance group over the network during the performance. Therefore, The Black Swan explores the way networked performers communicate with each other as well as the dramaturgy between ensemble members in a networked setting.
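The abstract does not describe the piece's network layer, but the third party's role of distributing instructions to both groups can be sketched as a one-to-many UDP send. The addresses, port numbers, and plain-text message format below are invented for illustration.

```python
# Hypothetical sketch of the third party broadcasting a text instruction to
# both performance groups over UDP. The work's real transport and message
# format are not described in the abstract; addresses here are placeholders.

import socket

GROUPS = [("127.0.0.1", 9001),   # instrumentalists (placeholder address)
          ("127.0.0.1", 9002)]   # motion-tracked performers (placeholder)

def distribute(instruction, groups=GROUPS):
    """Send one UTF-8 instruction string to every performance group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for addr in groups:
            sock.sendto(instruction.encode("utf-8"), addr)
    finally:
        sock.close()

distribute("section B: sustain, ppp")
```

In practice a performance system would more likely use OSC messages over UDP, but the fan-out pattern is the same.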
This paper examines the use of motion capture to control ambisonic spatialization and sound diffusion parameters in real time. The authors use several software programs, developed in Max, to facilitate the gestural control of spatialization. Motion tracking systems using cameras and peripheral devices such as the Leap Motion are explored as viable and expressive means to provide sound localization. The performer can thus use movement through personal space to control the placement of the sound in a larger performance environment. Three works are discussed, each using a different method: an approach derived from sound diffusion practices, an approach using sonification, and an approach in which the gestures controlling the spatialization are part of the drama of the work. These approaches marry two of the most important research trajectories of the performance practice of electroacoustic and computer music: the geographical dislocation between the sound source and the actual, perceived sound, and the dislocation of physical causality to the sound.
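A core step in gestural control of spatialization is converting a tracked hand position into the direction an ambisonic encoder expects. The coordinate convention and function below are a sketch, not the authors' Max patches; they assume a right-handed frame with x to the performer's right, y up, and z toward the performer's back, in metres.

```python
# Sketch (not the authors' code): convert a tracked hand offset from the body
# into the (azimuth, elevation) pair an ambisonic encoder typically expects.

import math

def hand_to_direction(x, y, z):
    """Return (azimuth_deg, elevation_deg); 0 deg azimuth = straight ahead."""
    azimuth = math.degrees(math.atan2(x, -z))       # positive = to the right
    horiz = math.hypot(x, z)                        # horizontal distance
    elevation = math.degrees(math.atan2(y, horiz))  # positive = above ear level
    return azimuth, elevation

az, el = hand_to_direction(0.0, 0.5, -0.5)  # hand straight ahead and raised
print(round(az, 1), round(el, 1))           # 0.0 45.0
```

Smoothing (e.g. a one-pole filter on the angles) would normally be added before driving the encoder, since raw tracking data jitters.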
Music facilitated by technology has led to an unprecedented development in performance practice: the ability to generate sound without the physical gesture required when performing on an acoustic instrument. Response to this development has resulted in divergent performance aesthetic preferences, ranging from the emphasis of acousmatically based listening practices to the development of electronic instruments that replicate the type of human gestural interaction present when using acoustic instruments. This paper examines the incorporation of both previously described aesthetics on a continuum as compositional devices that provide dramatic and narrative elements to electroacoustic works with live performers. Two recent electroacoustic works composed by the author are discussed: Memento Mori (2014) for saxophone and live electronics, and Ecclesiastical Echoes (2015) for piano trio. Analytical focus is placed on how the geographical displacement, or lack thereof, between the agent creating the sound and the sound itself affects the narrative of the composition. Future directions of these compositional methods and analysis strategies to examine them are explored.
This paper explores the impact of creatively informed testing on the development of gestural control interfaces. Two interfaces are explored: a motion tracking system contributed as part of the Integrated Multimodal Score Following Environment (IMuSE), and MRleap for use with the Leap Motion. Both systems have thus far been implemented primarily in conjunction with live, acoustic instruments. The IMuSE research project was initiated to design a visual tracking system that, coupled with frequency tracking and amplitude tracking, could follow a performer and determine their temporal location in a score. However, the first use of the software in performance exploited its creative potential as an expressive musical interface. Prior to this creative use, all software testing was done in a controlled environment. Testing in performance, where many factors are unforeseen, allowed for significant advancements in software design. Additionally, experimentation with motion tracking software during performance revealed a lack of fine motor tracking devices, especially for performers on instruments where fine gestures are required, such as the piano. MRleap was subsequently developed to track these fine and detailed gestures. Creatively informed research has therefore served a highly useful purpose in the development of these expressive systems and in determining their effectiveness, as well as informing future research methodologies on the part of the authors.
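The multimodal idea behind score following (combining pitch and amplitude evidence to locate the performer in a score) can be illustrated with a deliberately simplified sketch. IMuSE's actual algorithm is not described in the abstract; the event list, threshold, and matching rule below are invented for illustration.

```python
# Illustrative toy score follower (not IMuSE's algorithm): advance the
# expected-event index only when both the tracked pitch and a minimum
# amplitude agree with the next event in the score.

SCORE = [60, 62, 64, 65]   # expected MIDI pitches, in order (invented score)

def follow(events, score=SCORE, amp_threshold=0.1):
    """Return how many score events were confirmed by (pitch, amp) input."""
    pos = 0
    for pitch, amp in events:
        if pos < len(score) and pitch == score[pos] and amp >= amp_threshold:
            pos += 1   # performer has reached the next score event
    return pos

# Tracker output: (detected pitch, detected amplitude). The quiet 64 is
# rejected; the louder repetition confirms it.
print(follow([(60, 0.5), (61, 0.4), (62, 0.3), (64, 0.05), (64, 0.2)]))  # 3
```

A production follower would tolerate detection errors and tempo variation (e.g. with dynamic time warping or a probabilistic model) rather than requiring exact matches.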
This paper examines the performance practice of two recent works by the author, Memento Mori (2014) for saxophone and electronics, and Ecclesiastic Utterances (2014) for piano trio and electronics. The combination of acoustic instruments and electronics within a piece has consistently provided a challenge for composers, especially with regard to obtaining organic performances. Scholars and composers alike have experimented with many approaches to this genre, including substituting tape for live electronics, gesture-based instruments, and human-machine interaction systems. Memento Mori and Ecclesiastic Utterances are primarily concerned with the performance practice of the electroacoustic components and how they integrate with the live instruments, and use the differences between performance methods as a compositional and narrative device.
Motion-capture is a popular tool used in musically expressive performance systems. Several motion-tracking systems have been used in laptop orchestras, alongside acoustic instruments, and for other musically related purposes such as score following. Some examples of such systems include camera tracking, the Kinect for Xbox, and computer vision algorithms. However, these systems lack the ability to track at high resolutions and primarily track larger body motions. The Leap Motion, released in 2013, allows for high resolution tracking of very fine and specific finger and hand gestures, thus presenting an alternative option for composers, performers, and programmers seeking tracking of finer, more specialized movements. Current third party externals for the device are non-customizable; MRLeap is an external, programmed by one of the authors of this paper, that allows for diverse and customizable interfacing between the Leap Motion and Max/MSP, enabling a user of the software to select and apply data streams to any musical, visual, or other parameters. This customization, coupled with the specific motion-tracking capabilities of the Leap, makes the object an ideal environment for designers of gestural controllers or performance systems.
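MRLeap itself is a Max/MSP external, so its real interface consists of Max messages rather than code like the following. This sketch only illustrates the kind of range-scaling a user applies when routing a raw Leap data stream (here, palm height, which the device reports in millimetres, roughly 80-400 mm above the sensor) to an arbitrary musical parameter; the ranges and parameter choice are assumptions.

```python
# Illustrative only: scale a raw Leap Motion palm-height reading to a
# musical parameter range (here a hypothetical filter cutoff in Hz).
# MRLeap users would do this routing inside Max, not in Python.

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))          # clamp so stray readings stay in range
    return out_lo + t * (out_hi - out_lo)

# Palm at 240 mm -> midpoint of a 200-8000 Hz cutoff range.
print(scale(240, 80, 400, 200.0, 8000.0))   # 4100.0
```

Clamping matters in practice because tracking devices occasionally report values outside their nominal range.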
Working draft of a survey textbook on the subject. Excerpt: Chapter 6, Terminology and Concepts in Music for Interactive Media. (The rest of the textbook can be found on my website.)