Visual speech influences speeded auditory identification

Research output: Chapter in Book / Conference Paper › Conference paper › peer-review

1 Citation (Scopus)

Abstract

Auditory speech perception is faster and more accurate when combined with visual speech. We attempted to replicate previous findings suggesting that visual speech facilitates auditory processing when speech is paired with a matching video and interferes with processing when paired with a mismatching video. Crucially, we employed button presses instead of a vocal response to determine whether previous results could be attributed to the specific nature of the task. Stimuli consisted of the sounds 'apa', 'aka' and 'ata', paired with matched and mismatched videos that showed the talker's whole face or upper face (control). The percentage of matched AV videos was set at 85% in the congruent condition and 15% in the incongruent condition. The results show that visual speech influences speeded auditory identification decisions. Furthermore, this influence is moderated by (a) visual speech acting as a temporal cue to the acoustic signal and (b) the resolution of perceived differences between the visual and auditory modalities. The current study builds on previous results suggesting that visual speech plays a role in the temporal processing of auditory speech.
Original language: English
Title of host publication: Proceedings of the International Conference on Audio-Visual Speech Processing (AVSP2011), Aug 31 - Sep 3, 2011, Volterra, Italy
Publisher: KTH, Computer Science and Communication
Pages: 5-8
Number of pages: 4
ISBN (Print): 9789175010809
Publication status: Published - 2011
Event: International Conference on Audio-Visual Speech Processing
Duration: 31 Aug 2011 → …

Publication series

ISSN (Print): 1680-8908

Conference

Conference: International Conference on Audio-Visual Speech Processing
Period: 31/08/11 → …
