The stability of mouth movements for multiple talkers over multiple sessions

Chris Davis, Jeesun Kim, Vincent Aubanel, Greg Zelic, Yatin Mahajan

    Research output: Chapter in Book / Conference Paper › Conference Paper › peer-review

    Abstract

    To examine the stability of visible speech articulation (a potentially useful biometric), we assessed the degree of similarity of a speaker's mouth movements when uttering the same sentence on six different occasions. We tested four speakers of differing language backgrounds and compared within- and across-speaker variability. Mouth motion data were obtained using an inexpensive 3D close-range sensor and commercial face motion capture software, exported as C3D files, and analysed using guided principal components derived from custom MATLAB scripts. We showed that within-speaker repetitions were more similar than between-speaker ones, that language background did not affect the stability of the utterances, and that the patterns of articulation from different speakers were relatively distinctive.
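
    The abstract only sketches the analysis pipeline (3D mouth-motion capture, C3D export, guided principal components, within- versus between-speaker comparison). The fragment below is a minimal illustrative sketch of that idea in Python/NumPy rather than the authors' custom MATLAB scripts: it substitutes ordinary PCA for guided PCA, uses a simple frame-wise distance in place of whatever similarity measure the paper employed, and the function names, array shapes, and synthetic speakers are all hypothetical.

    import numpy as np

    def fit_pca_basis(pooled_frames, n_components=3):
        """Fit a shared PCA basis on mouth-marker frames pooled across
        utterances. pooled_frames: (total_frames, n_features) array of
        flattened 3D marker coordinates."""
        mean = pooled_frames.mean(axis=0)
        _, _, vt = np.linalg.svd(pooled_frames - mean, full_matrices=False)
        return mean, vt[:n_components]          # principal axes as rows

    def project(utterance, mean, basis):
        """Project one utterance (n_frames, n_features) onto the shared
        basis, giving a low-dimensional articulation trajectory."""
        return (utterance - mean) @ basis.T     # (n_frames, n_components)

    def trajectory_distance(a, b):
        """Mean frame-wise Euclidean distance between two trajectories,
        truncated to the shorter one (a crude stand-in for proper time
        alignment such as DTW)."""
        n = min(len(a), len(b))
        return float(np.mean(np.linalg.norm(a[:n] - b[:n], axis=1)))

    # Toy comparison: a second repetition by the same (synthetic) speaker
    # should sit closer to the first than an utterance by another speaker.
    rng = np.random.default_rng(0)
    speaker_a_rep1 = rng.normal(size=(120, 30))
    speaker_a_rep2 = speaker_a_rep1 + rng.normal(scale=0.1, size=(120, 30))
    speaker_b = rng.normal(size=(120, 30))

    mean, basis = fit_pca_basis(np.vstack([speaker_a_rep1, speaker_a_rep2, speaker_b]))
    within = trajectory_distance(project(speaker_a_rep1, mean, basis),
                                 project(speaker_a_rep2, mean, basis))
    between = trajectory_distance(project(speaker_a_rep1, mean, basis),
                                  project(speaker_b, mean, basis))
    print(f"within-speaker: {within:.3f}  between-speaker: {between:.3f}")

    With real marker data, the paper's finding would correspond to within-speaker distances being consistently smaller than between-speaker distances across the six repetitions.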
    Original language: English
    Title of host publication: Proceedings of the 1st Joint Conference on Facial Analysis, Animation and Auditory-Visual Speech Processing, FAAVSP 2015, 11–13 September 2015, Vienna, Austria
    Publisher: International Speech Communication Association
    Pages: 99-102
    Number of pages: 4
    Publication status: Published - 2015
    Event: Joint Conference on Facial Analysis, Animation and Auditory-Visual Speech Processing
    Duration: 11 Sept 2015 → …

    Conference

    Conference: Joint Conference on Facial Analysis, Animation and Auditory-Visual Speech Processing
    Period: 11/09/15 → …

    Keywords

    • articulation (speech)
    • motion
    • mouth
