Visual speech speeds up auditory identification responses

    Research output: Chapter in Book / Conference Paper › Conference Paper › peer-review

    4 Citations (Scopus)

    Abstract

    Auditory speech perception is more accurate when combined with visual speech. Recent ERP studies suggest that visual speech helps 'predict' which phoneme will be heard via feedback from visual to auditory areas, with more visually salient articulations associated with greater facilitation. Two experiments tested this hypothesis with a speeded auditory identification measure. Stimuli consisted of the sounds 'apa', 'aka' and 'ata', paired with matched and mismatched videos that showed the talker's whole face or upper face (control). The percentage of matched AV videos was set at 85% in Experiment 1 and 15% in Experiment 2. Results showed that responses to matched whole-face stimuli were faster than to both upper-face and mismatched videos in both experiments. Furthermore, salient phonemes (aPa) showed a greater reduction in reaction times than ambiguous ones (aKa). The current study supports the proposal that visual speech speeds up the processing of auditory speech.
    Original language: English
    Title of host publication: Proceedings of the 12th Annual Conference of the International Speech Communication Association (INTERSPEECH 2011), Florence, Italy, 27 - 31 August 2011
    Publisher: Causal Productions
    Pages: 2469-2472
    Number of pages: 4
    Publication status: Published - 2011
    Event: International Speech Communication Association Conference
    Duration: 27 Aug 2011 → 31 Aug 2011

    Publication series

    ISSN (Print): 1990-9772

    Conference

    Conference: International Speech Communication Association Conference
    Period: 27/08/11 → 31/08/11
