EXPLORING UNCHARTED SOUNDSCAPES: INNOVATIVE INTERACTION AND MAPPING TECHNIQUES WITH BODYHARP AND THE TEINOPHON

Leo FOGADIĆ, Doga Buse CAVDIR, and Dan OVERHOLT
Aalborg University, Copenhagen, Denmark

ABSTRACT

The evolution of Digital Musical Instruments (DMIs) has pushed the boundaries of musical expression and interaction. These instruments leverage advanced technologies to offer new dimensions of creative possibilities. This paper investigates the musical possibilities of Bodyharp [1] and the Teinophon [2], each designed to go beyond the limits of traditional instruments. We aim to uncover new perspectives on performer interaction and mapping, using the capabilities of these instruments to surpass traditional stringed-instrument boundaries through their infusion of sensor-based technologies. The study focuses on crafting and analyzing inventive mappings that connect the performer’s gestures to sound, extending digital musical interactions beyond traditional instrument techniques dictated by physical and acoustical limitations. We explore uncharted sonic territories, drawing inspiration from the instruments’ familiarities yet extending them to create unique musical interactions. Through qualitative and performance-based examination of the interaction between body movements and sonic outcomes, we identify mappings that offer the most compelling and engaging results. This research recognizes the personal nature of musical preferences while encouraging creative exploration through a workshop-style evaluation method. We collected data from eight participants based on their experience of exploring both instruments’ affordances and of solo and collaborative performance practices. Our work contributes to the broader discussion on sensor-based instruments, providing insights into their potential for expanding musical expression beyond established norms.
By sparking further innovation and exploration, we hope to deepen our understanding of the creative possibilities embedded in Bodyharp and the Teinophon, thus paving the way for new dimensions in contemporary musical expression.

Copyright: © 2024. This is an open-access article distributed under the terms of the Creative Commons Attribution 3.0 Unported License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

1. INTRODUCTION

In recent years, the intersection of musical instruments and sensor-based technologies has given rise to a new realm of possibilities in musical expression. This study explores two similar yet distinctive instruments, Bodyharp and the Teinophon (see Figure 1), both designed to exceed the conventional boundaries of traditional string instruments through innovative interaction and mapping techniques¹. The real-time sensor technology in these instruments opens avenues for performers to engage with sound in unprecedented ways, breaking away from established practices. As sensor technologies advance, so does the potential for creating instruments that respond intimately to the performer’s gestures. This builds upon existing research in the field of Digital Musical Instruments (DMI), emphasizing the tools we can borrow from Human-Computer Interaction (HCI) [3]. Furthermore, our study aligns with the broader discourse on musical gestures and their correlation with sound, as explored by Godøy and Leman [4], and the conceptual framework of 4E cognition and dynamical systems theory proposed by van der Schyff [5].

Figure 1. Bodyharp and Teinophon co-performance

Through this investigation, we seek to answer the question: "How can Bodyharp and the Teinophon facilitate novel interaction and mapping techniques, transcending the constraints of traditional string instruments, and which mappings yield the most musically compelling outcomes?"
By addressing these questions, we aim to uncover fresh perspectives on performer interaction, offering insights into the expressive possibilities embedded in these instruments. The objectives of the research are:

• Exploring inventive mappings connecting performer gestures to sonic outputs on Bodyharp and the Teinophon.

• Identifying mappings that yield the most compelling and engaging musical outcomes.

• Contributing to the broader discussion on sensor-based instruments, expanding the understanding of musical expression with DMIs.

¹ A performance excerpt of the two instruments’ collaboration can be accessed here: https://youtu.be/lDolhm11xfg

2. BACKGROUND AND RELATED WORK

In his research on action-sound couplings and relationships, Jensenius draws attention to how artificial action-sound relationships will never be as solid as action-sound couplings [6]. However, when designing DMIs and testing them with various types of sound engines, Jensenius describes this as a potentially powerful way of exploring action-sound relationships in practice. This approach may provide valuable insights into some of the underlying features of our perception of action-sound relationships. In addition, we take inspiration from Overholt’s discussions concerning how best to leverage certain properties of acoustic instruments and synthesis algorithms, introducing them into the development of DMIs [7]. Finding mappings that excel at translating a performer’s gestures into sound in ways that evoke the intended affective qualities, while maintaining each instrument’s musical identity, is a complex process. The many types of effort involved are nicely outlined in Baalman’s book on mapping, Composing Interactions [8].

3. INSTRUMENTS OVERVIEW

Figure 2. Bodyharp

Bodyharp, initially developed in 2018 [1], is a semi-wearable instrument consisting of an enclosure where the strings are placed and a wearable arm piece [1]. The iteration used in this study, shown in Figure 2, features a redesigned 3D-printed enclosure and a wearable hand controller [9]. The hand controller serves as an interface with digital sensors, such as buttons, force-sensing resistors, and an accelerometer, allowing the musician to have more nuanced control over the sound parameters. Additionally, the controller houses the main microcontroller, a Teensy board², which handles all the input data and sends it to a computer for further processing.

Bodyharp’s sound mapping follows a gesture-based mapping model [10] on two scales of gestures. The larger-scale gestures contribute to sound production (through plucking or stretching the strings), while small-scale, nuanced gestures control the sound effects with the following mapping strategy: (1) the push buttons on the hand controller change the chord progression in three scales; (2) the pressure (force-sensitive resistor, FSR) sensor, positioned under the thumb, controls the quality of the filter, creating a tremolo effect with the dabbing gesture; (3) one slider controls the gain, and the other controls the note duration by changing the time constant of the string model; (4) the square pressure sensor, positioned on the back of the hand controller, increases the drive of the filter; with a delay coupled to this effect, touch interaction creates an echo effect.

² https://www.pjrc.com/teensy/

Figure 3. The Teinophon

The Teinophon, first constructed in 2021 [2], is a tabletop instrument with a simple interface consisting of seven horizontally laid strings parallel to one another. The iteration used in this study, shown in Figure 3, is an improved version of the Teinophon. It is built with a more durable, custom-made wooden enclosure, and the spring mechanism uses a different, more robust design. Lastly, this latest iteration of the Teinophon features a Bela board³ for processing input signals and synthesizing sound outputs.
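Sensor-to-parameter chains like those described above (for instance, an FSR on the hand controller feeding a filter parameter) typically involve normalizing, shaping, and smoothing the raw readings before they reach the synthesis engine. The following is a minimal, hedged C++ sketch of such a chain; all function names, ranges, and constants are illustrative assumptions, not the instruments' actual firmware.

```cpp
#include <algorithm>

// Hypothetical conditioning chain for a pressure sensor: normalize a raw
// 10-bit ADC value, ease it non-linearly for finer control at light
// pressure, smooth out jitter, and scale into an assumed parameter range.

// Normalize a raw 10-bit ADC reading (0..1023) to 0..1, clamping outliers.
inline float normalizeAdc(int raw) {
    return std::clamp(raw, 0, 1023) / 1023.0f;
}

// Squaring the normalized value gives finer resolution at light pressure.
inline float easePressure(float x) { return x * x; }

// One-pole low-pass smoother to suppress sensor noise (alpha in (0, 1]).
struct Smoother {
    float alpha;
    float state = 0.0f;
    explicit Smoother(float a) : alpha(a) {}
    float process(float in) {
        state += alpha * (in - state);  // move a fraction toward the input
        return state;
    }
};

// Scale a 0..1 control value into an assumed filter-drive range.
inline float toDrive(float x, float lo = 1.0f, float hi = 10.0f) {
    return lo + x * (hi - lo);
}
```

On an embedded platform such as a Teensy or Bela, a chain like this would run once per control frame, with the smoothed, scaled value forwarded to the synthesis engine.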
Teinophon’s sound mapping also follows a gesture-based mapping model, focusing on the interaction between the performer and the strings. One mapping involves the detection of plucking events, where the velocity and intensity of the sound are directly influenced by the displacement and release speed of the strings. Furthermore, the instrument maps the lateral movements and pressure exerted on the strings to the bow velocity and bow position parameters. Additionally, low-pass filters are applied, and their cutoff frequency is mapped to the height of each pulled string. The interactions are processed in real time, allowing for dynamic and expressive performances. These intuitive mappings enable performers to produce a wide range of sounds by varying their gestures and interactions with the strings.

³ https://bela.io

4. METHODOLOGY

4.1 Participants

Eight participants attended the workshop, ranging in age from 26 to 52 years. Although participants had different levels of musical experience, a background in music was not required for this study. All participants were informed about the nature of the study and signed our consent form before participating.

4.2 Instruments

The instruments were introduced to participants at the beginning of each session. Detailed explanations of each instrument’s components, playing techniques, and sonic possibilities were provided to ensure participants had a foundational understanding before engaging in the evaluation sessions.

4.3 Evaluation Sessions

The study employed a mixed-methods approach, combining quantitative data collection through Likert-style questionnaires with qualitative insights gathered during post-performance discussions. The participants’ interactions were observed by the researchers in both solo and co-performance sessions.

4.3.1 Solo Performances

Each participant was given dedicated time for solo exploration with both Bodyharp and the Teinophon.
During this phase, participants were encouraged to experiment with various playing techniques, such as plucking, pulling, and transverse string displacement. The goal was to allow participants to familiarize themselves with the instruments and uncover potential nuances in sonic expression.

4.3.2 Impressions and Questions

Participants were given a chance to provide short feedback about the two instruments and to ask for more detailed information about playing techniques.

4.3.3 Co-Performance

Following the solo sessions, participants engaged in collaborative performances in pairs and improvised Bodyharp/Teinophon duets. This segment of the mapping workshop aimed to investigate how the instruments interacted in a shared musical space, exploring potential synergies and challenges in combining Bodyharp and the Teinophon. The co-performance sessions were recorded for further analysis, and all participants were given a post-workshop interview as a chance to provide any final reflections.

4.4 Data Collection

Data collection occurred through a two-phase process:

4.4.1 Questionnaires

Structured questionnaires were designed to capture participants’ subjective experiences, preferences, and challenges with each instrument. The questions assessed factors such as ease of use, expressiveness, and overall satisfaction [11]. Open-ended questions allowed participants to provide detailed qualitative feedback.

4.4.2 Post-Performance Discussions

The post-performance discussions were conducted in pairs. This qualitative phase aimed to delve deeper into participants’ experiences, uncovering insights that might not be captured by quantitative measures. Participants were encouraged to share preferences and any notable challenges encountered during the performances.

5. MAPPING TECHNIQUES

Besides the sound synthesis engine and the control interface, mapping is one of the vital aspects of designing digital musical instruments.
With endless possibilities, implementing mappings that make an instrument expressive and intuitive to use can be a fun but challenging task. Common mapping techniques include parameter scaling, non-linear functions, and convergent mappings. In a study of a mapping design process, West et al. point out that effective mappings should consider the balance of musical agency between the player and the instrument, primarily empowering the player to perform specific sounds as they intend to, but perhaps sometimes allowing the instrument to behave unexpectedly [12]. The practice of mapping dates back as far as the inception of acoustic instruments themselves; however, it is only with the development of real-time electronic instruments that designers have actively integrated flexible mappings into each instrument. Hunt and Wanderley have shown that mappings elicit better performances from a human player when the performer is confronted with multiparametric tasks rather than a series of one-to-one mappings [13].

The mappings implemented for the workshop were both loosely inspired by traditional string instruments, slightly more directly on the Teinophon than on Bodyharp. Synthesis algorithms used on the Teinophon include both plucked (Karplus-Strong) and bowed (waveguide) string physical models, written in C++ on Bela; the mappings involve event detection for plucks, the amount of string pull for the bow pressure and velocity parameters, and detection of transverse string displacement for determining the bowing position of each string. For Bodyharp, more complex mapping strategies were used, based on Faust physical models embedded in the ChucK audio programming language.
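The three generic techniques named above can be made concrete in a short C++ sketch: linear parameter scaling, a non-linear (exponential) transfer function like the string-height-to-cutoff mapping described earlier, and a convergent (many-to-one) combination of two sensor streams. All ranges and weights here are illustrative assumptions, not the instruments' actual code.

```cpp
#include <algorithm>
#include <cmath>

// Illustrative mapping utilities under assumed ranges; not the firmware
// of either instrument.

// Linear parameter scaling from an input range to an output range.
inline float scaleLin(float x, float inLo, float inHi,
                      float outLo, float outHi) {
    float t = (x - inLo) / (inHi - inLo);
    return outLo + t * (outHi - outLo);
}

// Non-linear (exponential) scaling, e.g. mapping a normalized string
// height to a low-pass cutoff so equal height changes correspond to
// equal musical intervals of brightness.
inline float heightToCutoffHz(float h01, float lo = 200.0f,
                              float hi = 8000.0f) {
    return lo * std::pow(hi / lo, std::clamp(h01, 0.0f, 1.0f));
}

// Convergent mapping: two sensor streams (e.g. pressure and
// acceleration magnitude) feed a single synthesis parameter; the
// weights are a design choice.
inline float converge(float a01, float b01,
                      float wA = 0.7f, float wB = 0.3f) {
    return std::clamp(wA * a01 + wB * b01, 0.0f, 1.0f);
}
```

A divergent (one-to-many) mapping is simply the inverse design choice: routing one control value, suitably scaled, into several such parameters at once.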
The mappings range from layers of one-to-one mappings, such as mapping string height to chords with higher pitch classes, to many-to-many mappings, such as mapping force-sensitive resistor (FSR) and accelerometer data to sound synthesis parameters (for a detailed review of the sound design, please see [9]).

6. EVALUATION RESULTS

6.1 Participant Experience

6.1.1 Instrument Familiarity and Musicality

Participants’ evaluations shed light on their perceptions of the instruments’ familiarity and musicality, crucial aspects in understanding their engagement with Bodyharp and the Teinophon. Bodyharp’s more complex mappings presented challenges in translating gestural input into expressive musical output. On the other hand, participants deemed the Teinophon more intuitive to play, attributing it higher musicality ratings. This distinction in perceived familiarity could influence participants’ comfort levels and creativity with each instrument.

6.1.2 Control and Intuitive Interaction

Control dynamics and the perceived naturalness of interactions played pivotal roles in participants’ evaluations of the instruments. Participants reported a moderate sense of control over Bodyharp, and minor concerns were raised regarding the control of the produced sounds through gestural interaction. The more traditional-looking string interface of the Teinophon resonated positively with participants, contributing to a more transparent experience. However, no conclusions can be drawn from this due to the lack of any longitudinal evaluation.

6.2 Sound Preferences and Creativity

Positive feedback was expressed on the sound qualities of the instruments; participants evaluated the physical models as sounding good and detailed. However, some expressed that they would enjoy hearing completely different and unexpected sounds. Feedback on Bodyharp indicated positive responses to the produced sounds, with participants reporting feeling creatively inspired and exploring unconventional sonic outcomes, though there were suggestions for expanding the variety of sounds. Similar positive sentiments were expressed regarding the Teinophon, with participants appreciating the instrument’s capacity to evoke creative exploration; here too, participants suggested more diverse sonic outputs.

6.2.1 Collaborative Performance Feedback

Participants’ reflections on collaborative performances provided insights into the instruments’ synergy and their impact on shared musical spaces. Collaborative performances yielded generally positive feedback, with participants navigating the challenge of combining the unique sonic characteristics of both instruments. Furthermore, participants acknowledged the complementarity of the instruments and expressed enjoying how the two quickly moved from sounding harmonious to sounding dissonant. These moments evoked distinct emotions in the participants, who described feelings ranging from peaceful to spooky during the performances. Additionally, participants commented positively on the joint movement efforts; many noted observing the other performer and trying to coordinate their movements. One participant described the experience as "...showing people sound ... almost like a dance". Lastly, instances of sonic overlap were noted during collaborative sessions, signaling potential areas for adjusting mappings to maintain clarity and coherence in joint performances.

6.3 Mapping Feedback

Participants provided nuanced insights into their experiences with the instrument mappings, addressing specific aspects of the gestural mappings employed for sonic expression.

6.3.1 Bodyharp Mappings

Overall, positive feedback was received for the mappings. Participants expressed their appreciation of using their whole bodies to play the instrument; one participant even compared their movements to the martial art Tai Chi. However, participants suggested clearer representations of gesture-to-sound relationships, emphasizing the importance of understanding which sounds were being controlled in order to enhance their expressive capabilities.

6.3.2 Teinophon Mappings

Generally, positive feedback was received for the mappings, and participants appreciated the coherence between gestural input and sonic output. Specific suggestions were made for the Teinophon: refining plucking control by adding a mute feature when the strings are touched in their resting position, and enabling plucking on one side of the strings while they are raised. Other suggestions included adding higher-level mappings, for example detecting the acceleration and jerk of movements to control the sound output, as well as adding effects such as vibrato.

7. DISCUSSION

7.1 Familiarity and Musicality

7.1.1 Bodyharp’s Unique Affordances

Bodyharp’s innovative gestural interactions presented a distinctive way for performers to engage with music. When the performer is embodied with the instrument, it opens up new possibilities for creative exploration. Future improvements should focus even more on these unique affordances, providing clearer mappings to enhance performers’ understanding of the instrument and its sonic capabilities.

7.1.2 Teinophon’s Intuitive Design

The Teinophon’s more traditional-looking string interface was perceived as familiar, reminiscent of a harp or a guitar, contributing to an intuitive playing experience. This feedback suggests that incorporating familiar elements into novel digital musical instruments can facilitate a smoother learning experience. Future iterations of the Teinophon should maintain this balance of familiarity and innovative interaction.

7.2 Sonic Preferences and Creativity

7.2.1 Diverse Sonic Palette

Even though participants appreciated both instruments’ somewhat traditional sound qualities, they would have liked a more diverse sonic palette and the chance to hear unexpected sounds. Future enhancements could facilitate more sonic variety through different synthesis algorithms.

7.2.2 Collaborative Performance Dynamics

In the collaborative performances, the instruments demonstrated a capability to induce emotional responses. This positive feedback highlights the instruments’ suitability for collaborative contexts. Playing these instruments together, the performers complemented each other and remained aware of their joint performance through sonic and visual feedback.

7.3 Mapping Feedback

7.3.1 Bodyharp’s Expressive Mappings

Participants expressed appreciation for Bodyharp’s mappings and praised the engagement of the whole body when playing. The comparison to Tai Chi underscores the instrument’s holistic and embodied experience. Future iterations should focus on refining the mappings to provide clearer representations, empowering performers to fully harness the expressive capabilities inherent in Bodyharp.

7.3.2 Teinophon’s Successful Translation

The instrument’s well-received mappings demonstrated a successful translation of gestural input to sonic output, and participants valued the coherence between actions and sounds. Improvements such as the suggested mute feature and higher-level mappings open paths for further innovation while maintaining the instrument’s positive trajectory in delivering intuitive and expressive mappings.

7.4 Practice and Time Investment

Since each session lasted around one hour and thirty minutes per pair, participants expressed a shared desire for extended practice time, which would have allowed them to become more comfortable with the instruments. This was a collective opinion that emphasized the need for future studies to consider longer practice times, as practice time impacts performers’ comfort levels and creative exploration.

8. CONCLUSION

Our study emphasizes the positive aspects of both Bodyharp and the Teinophon while providing constructive feedback for future development. Bodyharp, with its unique affordances and gestural interaction, presents an excellent example of embodied interaction and a holistic approach to performance. Through our study, participants highlighted the instrument’s capacity to induce creative exploration. Despite the suggestion for clearer mappings, the overall opinion leaned towards a positive experience in both solo and collaborative performance. Turning the focus to the Teinophon, our study has shed light on its intuitive interface, which provides performers with a familiar yet technologically enhanced string interface. Participants complimented the transparency it offers, emphasizing its ease of play and instant musicality. Suggestions for more refined plucking mappings and the incorporation of new mappings open the door for future developments.

Acknowledgments

We would like to thank all the participants who attended the workshop and contributed to our research. We are also thankful for their wonderful performances.

9. REFERENCES

[1] D. B. Cavdir, R. Michon, and G. Wang, “The bodyharp: Designing the intersection between the instrument and the body,” in Proceedings of the 15th Sound and Music Computing Conference, 2018, pp. 498–503.

[2] L. Fogadić and D. Overholt, “The Teinophon,” in Proceedings of the 19th Sound and Music Computing Conference, 2022, pp. 442–445.

[3] N. Orio, N. Schnell, and M. Wanderley, “Input Devices for Musical Expression: Borrowing Tools from HCI,” 2001.

[4] R. Godøy and M. Leman, “Musical Gestures: Sound, Movement, and Meaning,” 2009.

[5] D. van der Schyff, A. Schiavio, A. Walton, V. Velardo, and A. Chemero, “Musical creativity and the embodied mind: Exploring the possibilities of 4E cognition and dynamical systems theory,” vol. 1, pp. 1–18, Sep. 2018.

[6] A. Jensenius, “Action–Sound: Developing Methods and Tools to Study Music-Related Body Movement,” Ph.D. dissertation, 2007.

[7] D. Overholt, “Designing Interactive Musical Interfaces,” in Designing Interactions for Music and Sound, M. Filimowicz, Ed. Focal Press, 2022, pp. 1–29.

[8] M. Baalman, Composing Interactions: An Artist’s Guide to Building Expressive Interactive Systems. V2 Publishing, 2022.

[9] D. Cavdir and G. Wang, “Designing felt experiences with movement-based, wearable musical instruments: From inclusive practices toward participatory design,” Wearable Technologies, vol. 3, p. e19, 2022.

[10] D. Cavdir and G. Wang, “Borrowed Gestures: The Body as an Extension of the Musical Instrument,” Computer Music Journal, vol. 45, no. 3, pp. 58–80, Sep. 2021. [Online]. Available: https://doi.org/10.1162/comj_a_00617

[11] G.-M. Schmid, Evaluating the Experiential Quality of Musical Instruments. Springer, 2017.

[12] T. West, B. Caramiaux, S. Huot, and M. Wanderley, “Making Mappings: Design Criteria for Live Performance,” in International Conference on New Interfaces for Musical Expression, Jun. 2021.

[13] A. Hunt and M. Wanderley, “Mapping performer parameters to synthesis engines,” Organised Sound, vol. 7, pp. 97–108, Aug. 2002.