Abstract
To examine the stability of visible speech articulation (a potentially useful biometric), we measured the degree of similarity of a speaker's mouth movements when uttering the same sentence on six different occasions. We tested four speakers of differing language backgrounds and compared within- and across-speaker variability. We obtained mouth motion data using an inexpensive close-range 3D sensor and commercial face motion capture software. These data were exported as C3D files, and the analysis was based on guided principal components derived from custom MATLAB scripts. We showed that within-speaker repetitions were more similar than between-speaker ones, that language background did not affect the stability of the utterances, and that the patterns of articulation from different speakers were relatively distinctive.
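The custom MATLAB scripts are not reproduced here, but the core comparison the abstract describes — projecting repeated utterances onto principal components and checking that within-speaker distances are smaller than between-speaker distances — can be sketched as follows. This is a minimal illustration on synthetic data, assuming each utterance has already been reduced to a fixed-length feature vector of mouth-marker trajectories; the speaker counts match the study (4 speakers, 6 repetitions), but all other names and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_scores(X, k=3):
    """Project rows of X onto the first k principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

# Synthetic stand-in for the mouth motion data: 4 speakers x 6 repetitions,
# each utterance a 50-dimensional feature vector. Each speaker has a
# distinctive articulation pattern plus small repetition-to-repetition noise.
n_speakers, n_reps, n_feats = 4, 6, 50
prototypes = rng.normal(size=(n_speakers, n_feats))
X = np.vstack([p + 0.2 * rng.normal(size=(n_reps, n_feats))
               for p in prototypes])
labels = np.repeat(np.arange(n_speakers), n_reps)

scores = pca_scores(X, k=3)

# Mean pairwise Euclidean distance in PC space, within vs. between speakers.
d = np.linalg.norm(scores[:, None] - scores[None, :], axis=-1)
same = labels[:, None] == labels[None, :]
off_diag = ~np.eye(len(labels), dtype=bool)
within = d[same & off_diag].mean()
between = d[~same].mean()
print(within < between)  # repetitions cluster by speaker
```

With this setup, repetitions of the same (synthetic) speaker land close together in principal-component space, mirroring the paper's finding that within-speaker repetitions are more similar than between-speaker ones.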
Original language | English |
---|---|
Title of host publication | Proceedings of the 1st Joint Conference on Facial Analysis, Animation and Auditory-Visual Speech Processing, FAAVSP 2015, 11–13 September 2015, Vienna, Austria |
Publisher | International Speech Communication Association |
Pages | 99-102 |
Number of pages | 4 |
Publication status | Published - 2015 |
Event | Joint Conference on Facial Analysis, Animation and Auditory-Visual Speech Processing - Duration: 11 Sept 2015 → … |
Conference
Conference | Joint Conference on Facial Analysis, Animation and Auditory-Visual Speech Processing |
---|---|
Period | 11/09/15 → … |
Keywords
- articulation (speech)
- motion
- mouth