5th MOCO 2018: Genoa, Italy
- Proceedings of the 5th International Conference on Movement and Computing, MOCO 2018, Genoa, Italy, June 28-30, 2018. ACM 2018
Full and Short Papers: Conceptual frameworks and methodologies
- Jan C. Schacher: What Quality?: Performing Research on Movement and Computing. 1:1-1:9
- Katerina El Raheb, Sarah Whatley, Antonio Camurri: A Conceptual Framework for Creating and Analyzing Dance Learning Digital Content. 2:1-2:8
- Ian Jarvis, Doug Van Nort: Posthuman Gesture. 3:1-3:8
Full and Short Papers: Movement perception and learning
- Andrea Giomi, Federica Fratagnoli: Listening Touch: A Case Study about Multimodal Awareness in Movement Analysis with Interactive Sound Feedback. 4:1-4:8
- Isabella Poggi, Alessandro Ansani: The lexicon of the Conductor's gaze. 5:1-5:8
- Jean-Philippe Rivière, Sarah Fdili Alaoui, Baptiste Caramiaux, Wendy E. Mackay: How Do Dancers Learn To Dance?: A first-person perspective of dance acquisition by expert contemporary dancers. 6:1-6:7
- Jeehyun Joung, Jeounghoon Kim: Interactive Effect of Tempo and Rhythm on the Emotional Perception of Dance Movements. 7:1-7:4
Full and Short Papers: Annotation, transcription, and visualization
- Katerina El Raheb, Aristotelis Kasomoulis, Akrivi Katifori, Marianna Rezkalla, Yannis E. Ioannidis: A Web-based system for annotation of dance multimodal recordings by dance practitioners and experts. 8:1-8:8
- Claudia S. Bianchini, Léa Chèvrefils, Claire Danet, Patrick Doan, Morgane Rébulard, Adrien Contesse, Dominique Boutet: Coding Movement in Sign Languages: the Typannot Approach. 9:1-9:8
- Rafael Kuffner dos Anjos, Cláudia Sofia Ribeiro, Carla Fernandes: Three-Dimensional Visualization of Movement Qualities in Contemporary Dance. 10:1-10:7
- Ewelina Bakala, Yaying Zhang, Philippe Pasquier: MAVi: A Movement Aesthetic Visualization Tool for dance video making and prototyping. 11:1-11:5
Full and Short Papers: Movement analysis
- Augusto Dias Pereira dos Santos, Lie Ming Tang, Lian Loke, Roberto Martínez Maldonado: You Are Off The Beat!: Is Accelerometer Data Enough for Measuring Dance Rhythm? 12:1-12:8
- Sam Amin, Jeff Burke: OpenMoves: A System for Interpreting Person-Tracking Data. 13:1-13:4
- Luke Dahl, Federico Visi: Modosc: A Library of Real-Time Movement Descriptors for Marker-Based Motion Capture. 14:1-14:4
- Olga Perepelkina, Galina Arina: Kinematic predictors for the moving hand illusion. 15:1-15:3
- Erica Volta, Maurizio Mancini, Giovanna Varni, Gualtiero Volpe: Automatically measuring biomechanical skills of violin performance: an exploratory study. 16:1-16:4
- Alexandra Bacula, Amy LaViers: Character Recognition on a Humanoid Robotic Platform via a Laban Movement Analysis. 17:1-17:8
- William Li, Omid Alemi, Jianyu Fan, Philippe Pasquier: Ranking-Based Affect Estimation of Motion Capture Data in the Valence-Arousal Space. 18:1-18:8
Full and Short Papers: Movement and interaction
- Roshni Kaushik, Ilya Vidrin, Amy LaViers: Quantifying Coordination in Human Dyads via a Measure of Verticality. 19:1-19:8
- Radoslaw Niewiadomski, Lea Chauvigne, Maurizio Mancini, Antonio Camurri: Towards a model of nonverbal leadership in unstructured joint physical activity. 20:1-20:8
- Behzad Momahed Heravi, Jenny L. Gibson, Stephen Hailes, David Skuse: Playground Social Interaction Analysis using Bespoke Wearable Sensors for Tracking and Motion Capture. 21:1-21:8
Full and Short Papers: Virtual and robotic characters
- Marco Gillies: Creating Virtual Characters. 22:1-22:8
- Soumia Dermouche, Catherine Pelachaud: Attitude Modeling for Virtual Character Based on Temporal Sequence Mining: Extraction and Evaluation. 23:1-23:8
- Ishaan Pakrasi, Novoneel Chakraborty, Amy LaViers: A Design Methodology for Abstracting Character Archetypes onto Robotic Systems. 24:1-24:8
- Lamtharn Hantrakul, Zachary Kondak, Gil Weinberg: Practice Makes Perfect: Towards Learned Path Planning for Robotic Musicians using Deep Reinforcement Learning. 25:1-25:4
Full and Short Papers: Case studies and applications
- John Sullivan, Alexandra Tibbitts, Brice Gatinet, Marcelo M. Wanderley: Gestural Control of Augmented Instrumental Performance: A Case Study of the Concert Harp. 26:1-26:8
- Dom Brown, Chris Nash, Tom Mitchell: Understanding User-Defined Mapping Design in Mid-Air Musical Performance. 27:1-27:8
- Valerio Lorenzoni, Pieter-Jan Maes, Pieter Van den Berghe, Dirk De Clercq, Tijl De Bie, Marc Leman: A biofeedback music-sonification system for gait retraining. 28:1-28:5
- Rosella P. Galindo Esparza, Patrick G. T. Healey, Lois Weaver, Matthew Delbridge: Augmented Embodiment: Developing Interactive Technology for Stroke Survivors. 29:1-29:4
Doctoral Consortium
- Erica Volta, Gualtiero Volpe: Exploiting multimodal integration in adaptive interactive systems and game-based learning interfaces. 30:1-30:4
- Melanie Irrgang, Jochen Steffens, Hauke Egermann: Smartphone-Assessed Movement Predicts Music Properties: Towards Integrating Embodied Music Cognition into Music Recommender Services via Accelerometer Motion Data. 31:1-31:4
- Kensho Miyoshi: Where Kinesthetic Empathy meets Kinetic Design. 32:1-32:4
- Annamaria Minafra: Exploring the effect of simulated movements on body self-awareness and performance. 33:1-33:4
- Eleonora Ceccaldi, Gualtiero Volpe: The role of emotion in movement segmentation. 34:1-34:4
- Stefania Moretti, Alberto Greco: Assessing with the head: a motor compatibility effect. 35:1-35:4
- Thembi Rosa, Carlos Falci: Registers as inventions: body, dance, memory and digital medias. 36:1-36:3
- Garrett Laroy Johnson, Britta Joy Peterson, Todd Ingalls, Sha Xin Wei: Lanterns: An Enacted and Material Approach to Ensemble Group Activity with Responsive Media. 37:1-37:4
Posters and Practice Works
- Sofia Dahl, George Sioros: Rhythmic recurrency in dance to music with ambiguous meter. 38:1-38:6
- Catherine Massie-Laberge, Marcelo M. Wanderley, Isabelle Cossette: Kinetic Analysis of Hip Motion During Piano Playing. 39:1-39:6
- Michaela Honauer: Designing a Remote-Controlled Interactive Dance Costume. 40:1-40:6
- Joseph W. Newbold, Nicolas E. Gold, Nadia Bianchi-Berthouze: Visual cues effect on the impact of sonification on movement. 41:1-41:6
- Frédéric Bevilacqua, Maël Segalen, Véronique Marchand-Pauvert, Iseline Peyre, Pascale Pradat-Diehl, Agnès Roby-Brami: Exploring different movement sonification strategies for rehabilitation in clinical settings. 42:1-42:6
- Alexandra Q. Nilles, Mattox Beckman, Chase Gladish, Amy LaViers: Improv: Live Coding for Robot Motion Design. 43:1-43:6
- Georges Gagneré, Andy Lavender, Cédric Plessiet, Tim White: Challenges of movement quality using motion capture in theatre. 44:1-44:6
- Cumhur Erkut, Sofia Dahl: Incorporating Virtual Reality in an Embodied Interaction Course. 45:1-45:6
- Ana Tajadura-Jiménez, Francisco Cuadrado, Patricia Rick, Nadia Bianchi-Berthouze, Aneesha Singh, Aleksander Väljamäe, Frédéric Bevilacqua: Designing a gesture-sound wearable system to motivate physical activity by altering body perception. 46:1-46:6
- Kristi Kuusk, Aleksander Väljamäe, Ana Tajadura-Jiménez: Magic lining: an exploration of smart textiles altering people's self-perception. 47:1-47:6
- Antonio Camurri, Gualtiero Volpe, Stefano Piana, Maurizio Mancini, Paolo Alborno, Simone Ghisio: The Energy Lift: automated measurement of postural tension and energy transmission. 48:1-48:3
- Rafael Ramírez, Corrado Canepa, Simone Ghisio, Ksenia Kolykhalova, Maurizio Mancini, Erica Volta, Gualtiero Volpe, Sergio I. Giraldo, Oscar Mayor, Alfonso Pérez, George Waddell, Aaron Williamon: Enhancing Music Learning with Smart Technologies. 49:1-49:4
- Erica Volta, Paolo Alborno, Monica Gori, Simone Ghisio, Stefano Piana, Gualtiero Volpe: Enhancing children understanding of mathematics with multisensory technology. 50:1-50:4
- Federico Visi: SloMo study #2. 51:1-51:2
- Lauren Hayes: Live Electronic Music Performance: Embodied and Enactive Approaches. 52:1-52:3
- Catie Cuan, Ishaan Pakrasi, Amy LaViers: Time to Compile. 53:1-53:4
- Carina Karner, Chiara Cardelli, Adrián Artacho: Embodied Simulations of physical phenomena. 54:1-54:7
- Georges Gagneré, Cédric Plessiet: Experiencing avatar direction in low cost theatrical mixed reality setup. 55:1-55:6