Exploring Audience Response in Performing Arts with a Brain-Adaptive Digital Performance System

Published: 04 December 2017

Abstract

Audience response is an important indicator of the quality of performing arts. Psychophysiological measurements enable researchers to perceive and understand audience response by collecting bio-signals during a live performance. However, capturing how the audience responds and adapting the performance to those responses in real time remain difficult to implement. To address this issue, we designed a brain-computer interactive system called Brain-Adaptive Digital Performance (BADP) that measures and analyzes audience engagement levels in an interactive three-dimensional virtual theater. The BADP system monitors audience engagement in real time using electroencephalography (EEG) and attempts to improve it by applying content-related performing cues when the engagement level decreases.
In this article, we compute an EEG-based engagement level and define thresholds that identify the moments when engagement drops and when it is regained. In the experiment, we simulated two types of theatre performance in the BADP system to provide participants with a high-fidelity virtual environment, and we created content-related performing cues for each performance under three different conditions. The results of these evaluations show that our algorithm accurately detects engagement status and that the performing cues have a positive impact on regaining audience engagement across different performance types. Our findings open new perspectives in audience-based theatre performance design.
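As a rough illustration of the kind of engagement computation the abstract describes, the sketch below derives the widely used beta / (alpha + theta) EEG engagement index from band powers and applies a simple baseline-relative threshold to decide when a performing cue should fire. This is a minimal sketch only: the sampling rate, band edges, window handling, threshold value, and all function names are illustrative assumptions, not parameters taken from the paper.

# Minimal sketch of an EEG engagement index with threshold-based cue triggering.
# All parameters (sampling rate, band edges, threshold) are illustrative
# assumptions, not the authors' published settings.
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

FS = 128  # assumed EEG sampling rate in Hz


def band_power(window: np.ndarray, low: float, high: float) -> float:
    """Power of a single-channel EEG window within the [low, high] Hz band."""
    freqs, psd = welch(window, fs=FS, nperseg=min(len(window), 2 * FS))
    mask = (freqs >= low) & (freqs <= high)
    return float(trapezoid(psd[mask], freqs[mask]))


def engagement_index(window: np.ndarray) -> float:
    """Classic beta / (alpha + theta) task-engagement index."""
    theta = band_power(window, 4.0, 8.0)
    alpha = band_power(window, 8.0, 13.0)
    beta = band_power(window, 13.0, 30.0)
    return beta / (alpha + theta + 1e-12)  # epsilon avoids division by zero


def should_trigger_cue(index_history: list, drop_ratio: float = 0.8) -> bool:
    """Fire a performing cue when the latest index falls below a fraction of
    the running baseline (hypothetical rule, not the paper's exact criterion)."""
    if len(index_history) < 10:  # wait until a baseline has accumulated
        return False
    baseline = float(np.mean(index_history[:-1]))
    return index_history[-1] < drop_ratio * baseline

In a real-time loop, each new EEG window would append a value to index_history; a True result would signal the virtual theater to apply a content-related performing cue, in the spirit of the system the abstract describes.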




Published In

ACM Transactions on Interactive Intelligent Systems, Volume 7, Issue 4
Special Issue on IUI 2016 Highlights
December 2017
134 pages
ISSN:2160-6455
EISSN:2160-6463
DOI:10.1145/3166060
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 04 December 2017
Accepted: 01 May 2017
Revised: 01 May 2017
Received: 01 July 2016
Published in TIIS Volume 7, Issue 4


Author Tags

  1. Audience engagement
  2. adaptive user interface
  3. brain-computer interface (BCI)
  4. digital performance

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • National Natural Science Foundation of China


Cited By

  • (2024) Heart and Soul: The Ethics of Biometric Capture in Immersive Artistic Performance. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-23. DOI: 10.1145/3613904.3642309. Online publication date: 11-May-2024.
  • (2022) Review of latest noninvasive EEG-based robotic devices. 2022 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), 599-606. DOI: 10.1109/AIM52237.2022.9863374. Online publication date: 11-Jul-2022.
  • (2022) An Overview on Technologies for the Distribution and Participation in Live Events. Extended Reality, 312-323. DOI: 10.1007/978-3-031-15546-8_26. Online publication date: 6-Jul-2022.
  • (2021) Adapting Software with Affective Computing: A Systematic Review. IEEE Transactions on Affective Computing 12, 4, 883-899. DOI: 10.1109/TAFFC.2019.2902379. Online publication date: 1-Oct-2021.
  • (2019) Moderate Recursion: A Digital Artifact of Interactive Dance. 48-57. DOI: 10.1007/978-3-030-06134-0_6. Online publication date: 31-Jan-2019.
