Abstract
In this paper we describe a way to enhance human-computer interaction using facial electromyographic (EMG) sensors. Knowing the emotional state of the user enables interaction that adapts to the user's mood, so Human Computer Interaction (HCI) gains in ergonomics and ecological validity. While expression recognition systems based on video need exaggerated facial expressions to reach high recognition rates, the technique we developed using electrophysiological data enables faster detection of facial expressions, even in the presence of subtle movements. Features were extracted from 8 EMG sensors located around the face. Gaussian models for six basic facial expressions - anger, surprise, disgust, happiness, sadness and neutral - were learned from these features and provide a mean recognition rate of 92%. Finally, a prototype of one possible application of this system was developed, wherein the output of the recognizer was sent to the expression module of a 3D avatar that then mimicked the expression.
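The abstract does not specify which features are extracted from the 8 EMG channels or the exact form of the Gaussian models, but one common reading is a per-expression multivariate Gaussian fitted to feature vectors (e.g. one amplitude-derived feature per channel), with classification by maximum likelihood. The sketch below illustrates that interpretation only; the class name, feature layout and regularisation term are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical label set matching the six expressions in the abstract
EXPRESSIONS = ["anger", "surprise", "disgust", "happiness", "sadness", "neutral"]


class GaussianExpressionClassifier:
    """Per-class multivariate Gaussian over EMG feature vectors (assumed model)."""

    def fit(self, X, y):
        # X: (n_samples, n_features) feature vectors, e.g. one feature per EMG channel
        # y: (n_samples,) expression labels
        self.models = {}
        for label in np.unique(y):
            Xc = X[y == label]
            mean = Xc.mean(axis=0)
            # Small diagonal term keeps the covariance invertible (assumption)
            cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])
            self.models[label] = multivariate_normal(mean=mean, cov=cov)
        return self

    def predict(self, X):
        # Assign each sample to the expression whose Gaussian gives the highest log-likelihood
        labels = sorted(self.models)
        scores = np.column_stack([self.models[l].logpdf(X) for l in labels])
        return np.array([labels[i] for i in scores.argmax(axis=1)])
```

In a setup like the one described, the predicted label could then be forwarded to the avatar's expression module so that it mimics the detected expression.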
| Original language | English |
| --- | --- |
| Title of host publication | OZCHI 2009: Conference Proceedings, 23-27 November, Melbourne |
| Publisher | ACM Press |
| Pages | 421-424 |
| Number of pages | 4 |
| ISBN (Print) | 9781605588544 |
| DOIs | |
| Publication status | Published - 2009 |
| Event | Australasian Computer Human Interaction Conference - Duration: 23 Nov 2009 → … |
Conference
| Conference | Australasian Computer Human Interaction Conference |
| --- | --- |
| Period | 23/11/09 → … |