Ranking-Based Affect Estimation of Motion Capture Data in the Valence-Arousal Space

Published: 28 June 2018

Abstract

Affect estimation is the task of building a predictive model of the affect that observers perceive in a given stimulus. In this study, we examine the perceived affect of full-body motion capture data spanning a variety of movements. The study has two parts. In the first, we gather ground-truth affective labels for motion capture sequences through a survey hosted on a crowdsourcing platform, in which participants from around the world ranked the relative valence and arousal of one motion capture sequence against another. In the second, we present our experiments on training a machine learning model, RankNet, for pairwise ranking of motion capture data. Our analysis shows reasonably strong inter-rater agreement among the participants. Our evaluation demonstrates that RankNet can learn to rank the motion capture data, with higher confidence in the arousal dimension than in the valence dimension.
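For context, RankNet (Burges et al., 2005) learns from pairwise preferences: a shared scoring network maps each item to a scalar score, the probability that one item outranks the other is the sigmoid of the score difference, and the model is trained with cross-entropy against the observed preference. The sketch below illustrates this setup only; the feature dimension, network architecture, hyperparameters, and random stand-in data are assumptions for illustration, not the authors' implementation or features.

```python
# Minimal RankNet sketch (pairwise ranking with a shared scorer).
# Assumption: each motion capture sequence has already been reduced to a
# fixed-length feature vector; architecture and data here are illustrative.
import torch
import torch.nn as nn

class RankNet(nn.Module):
    def __init__(self, n_features: int, n_hidden: int = 64):
        super().__init__()
        # f(x): shared scoring network mapping one sequence's features to a scalar.
        self.scorer = nn.Sequential(
            nn.Linear(n_features, n_hidden),
            nn.ReLU(),
            nn.Linear(n_hidden, 1),
        )

    def forward(self, x_a, x_b):
        # P(a ranked above b) = sigmoid(f(x_a) - f(x_b))
        return torch.sigmoid(self.scorer(x_a) - self.scorer(x_b))

# Toy training loop; random tensors stand in for per-sequence motion features
# and crowdsourced pairwise labels (1 if raters ranked a above b, else 0).
model = RankNet(n_features=32)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()  # cross-entropy over pairwise preference probabilities

x_a = torch.randn(128, 32)
x_b = torch.randn(128, 32)
y = torch.randint(0, 2, (128, 1)).float()

for _ in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x_a, x_b), y)
    loss.backward()
    optimizer.step()
```

Consistent with the abstract's two target dimensions, one such ranker would presumably be trained per affect dimension (valence and arousal), with the learned scalar scores inducing a ranking over all sequences.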

Published In

MOCO '18: Proceedings of the 5th International Conference on Movement and Computing
June 2018, 329 pages
ISBN: 9781450365048
DOI: 10.1145/3212721

Publisher: Association for Computing Machinery, New York, NY, United States

Author Tags

1. affective computing
2. machine learning
3. motion capture

Acceptance Rate

MOCO '18 overall acceptance rate: 85 of 185 submissions, 46%
