Exploring nonlinear relationships between speech face motion and tongue movements using Mutual Information

Christian Kroos, Rikke L. Bundgaard-Nielsen, Catherine T. Best

    Research output: Chapter in Book / Conference Paper

    1 Citation (Scopus)

    Abstract

    In the current study, we used Mutual Information (MI) to determine the amount of shared information between face motion and tongue movements during speech. We tracked face motion using a passive marker-based motion capture system and measured tongue motion using Electromagnetic Articulography. The results show widespread associations between the two types of motion, albeit with predominantly low MI values. More importantly for practical applications, pronounced speaker variability was observed. We further investigated the temporal frequency distribution of the shared information using a wavelet decomposition yielding one-octave band-limited subbands. The lowest frequency subband was found to contain substantially more shared information than the others.
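    The paper does not specify which MI estimator was used; as an illustration only, the quantity described in the abstract can be sketched with a simple histogram-based estimator between two 1-D motion signals (all names and the synthetic data below are hypothetical):

    ```python
    import numpy as np

    def mutual_information(x, y, bins=16):
        """Histogram-based estimate of MI (in bits) between two 1-D signals."""
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = joint / joint.sum()                      # joint probability estimate
        px = pxy.sum(axis=1, keepdims=True)            # marginal of x
        py = pxy.sum(axis=0, keepdims=True)            # marginal of y
        nz = pxy > 0                                   # avoid log(0)
        return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

    # Synthetic check: a nonlinearly related signal pair shares far more
    # information than an independent pair, which linear correlation alone
    # would understate for non-monotonic relationships.
    rng = np.random.default_rng(0)
    t = rng.normal(size=5000)
    related = np.tanh(t) + 0.1 * rng.normal(size=5000)   # nonlinear function of t
    independent = rng.normal(size=5000)

    mi_related = mutual_information(t, related)
    mi_independent = mutual_information(t, independent)
    ```

    A histogram estimator like this carries a positive bias that grows with the number of bins, which is one reason MI values for independent signals come out slightly above zero; kernel- or nearest-neighbour-based estimators are common alternatives.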
    Original language: English
    Title of host publication: Proceedings of the 10th International Seminar on Speech Production (ISSP): 5-8 May 2014, Cologne, Germany
    Publisher: International Seminar on Speech Production
    Pages: 237-240
    Number of pages: 4
    Publication status: Published - 2014
    Event: International Seminar on Speech Production
    Duration: 5 May 2014 → …

    Conference

    Conference: International Seminar on Speech Production
    Period: 5/05/14 → …

