Facial Emotion Recognition Using Context Based Multimodal Approach
DOI: https://doi.org/10.9781/ijimai.2011.142

Keywords: Emotion recognition, Body posture recognition system, Multimodal, Facial recognition

Abstract
Emotions play a crucial role in person-to-person interaction. In recent years, there has been growing interest in improving all aspects of interaction between humans and computers, and the ability to understand human emotions, especially by observing facial expressions, is desirable for the computer in several applications. This paper explores a way of making human-computer interaction more aware of the user's emotional expressions. We present an approach for recognizing emotion from facial expression, hand gesture, and body posture. Our multimodal emotion recognition system uses two separate models, one for facial expression recognition and one for hand and body posture recognition, and then combines the results of both classifiers using a third classifier that outputs the resulting emotion. The multimodal system gives more accurate results than a unimodal or bimodal system.
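The decision-level fusion the abstract describes can be sketched as follows. The paper does not specify the form of the third (fusion) classifier, so this minimal illustration stands in a weighted combination of the two modality classifiers' per-emotion probability outputs; the emotion labels, function names, and weights are all illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of decision-level fusion of two modality classifiers.
# The emotion set, weights, and the weighted-sum fusion rule are assumptions;
# the paper's actual third classifier could be any learned combiner.

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def fuse(face_probs, body_probs, weights=(0.6, 0.4)):
    """Combine per-emotion probabilities from the facial-expression
    classifier and the hand/body-posture classifier, then pick the
    emotion with the highest fused score."""
    w_face, w_body = weights
    combined = [w_face * f + w_body * b
                for f, b in zip(face_probs, body_probs)]
    return EMOTIONS[combined.index(max(combined))]

# Example: the face model strongly favors happiness, the posture model
# mildly favors sadness; the fused decision follows the stronger evidence.
face = [0.05, 0.05, 0.05, 0.70, 0.10, 0.05]
body = [0.10, 0.10, 0.10, 0.20, 0.40, 0.10]
print(fuse(face, body))
```

In practice the fusion stage would itself be trained (e.g., a classifier taking the concatenated modality outputs as features), which is what lets a multimodal system outperform either modality alone.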
References
[1] I. A. Essa and A. P. Pentland, "Coding, Analysis, Interpretation, and Recognition of Facial Expressions," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 7, July 1997.
[2] H. Gunes and M. Piccardi, "A Bimodal Face and Body Gesture Database for Automatic Analysis of Human Nonverbal Affective Behavior," Proceedings of the 18th International Conference on Pattern Recognition (ICPR'06), IEEE, 2006.
[3] L. Kessous, G. Castellano, and G. Caridakis, "Multimodal emotion recognition in speech-based interaction using facial expression, body gesture and acoustic analysis," Journal on Multimodal User Interfaces, Springer, 2009. DOI 10.1007/s12193-009-0025-5.
[4] M.-H. Yang, D. J. Kriegman, and N. Ahuja, "Detecting Faces in Images: A Survey," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 1, January 2002.
[5] P. Ekman and W. V. Friesen, "Facial Signs of Emotional Experience," Journal of Personality and Social Psychology, vol. 19, no. 6, pp. 1123-1134, 1980.
[6] N. Sebe, I. Cohen, T. Gevers, and T. S. Huang, "Multimodal Approaches for Emotion Recognition: A Survey."
[7] M. Dammak, M. Ben Ammar, and A. M. Alimi, "A New Approach to Emotion Recognition," 2011 International Conference on Innovations in Information Technology.
[8] C. Shan, S. Gong, and P. W. McOwan, "Beyond Facial Expressions: Learning Human Emotion from Body Gestures," Department of Computer Science, Queen Mary, University of London.
[9] A. Metallinou, A. Katsamanis, Y. Wang, and S. Narayanan, "Tracking Changes in Continuous Emotion States Using Body Language and Prosodic Cues," ICASSP 2011, IEEE, 2011.
[10] K. Schindler, L. Van Gool, and B. de Gelder, "Recognizing Emotions Expressed by Body Pose: A Biologically Inspired Neural Model," Neural Networks, 2008.
[11] G. R. Bradski, "Computer Vision Face Tracking for Use in a Perceptual User Interface," Intel Technology Journal, Q2 1998.
[12] P. Ekman, D. Matsumoto, and W. V. Friesen, "Facial Expression in Affective Disorders," New York: Oxford University Press, 1997.