Infants match auditory and visual speech in schematic point-light displays

Christine Kitamura, Jeesun Kim

    Research output: Chapter in Book / Conference Paper › Conference Paper

    Abstract

    Infants' sensitivity to visual prosodic motion in infant-directed speech was examined by testing whether 8-month-olds can match an audio-only sentence with its visual-only schematic point-light display. The visual stimuli were sentence pairs of equal duration but unequal syllable number, recorded using Optotrak. Twelve of the fourteen 8-month-olds tested looked longer at the visual speech motion that matched the audio version of a sentence. This result suggests that infants can perceive the underlying speech gestures signalled by schematic point-light displays and, more importantly, that they are sensitive to, and able to extract, the syllable structure of speech from the talker's moving face and head.
    Original language: English
    Title of host publication: Proceedings: AVSP 2010: International Conference on Audio-Visual Speech Processing: Hakone, Kanagawa, Japan, September 30-October 3, 2010
    Publisher: International Speech Communication Association
    Pages: 121-124
    Number of pages: 4
    Publication status: Published - 2010
    Event: International Conference on Auditory-Visual Speech Processing
    Duration: 29 Aug 2013 → …


    Keywords

    • speech perception in infants
    • infant-directed speech

