Abstract
Visual information from a speaker's face can enhance or interfere with accurate auditory perception. This integration of information across auditory and visual streams has been observed in functional imaging studies, and has typically been attributed to the frequency and robustness with which perceivers jointly encounter event-specific information from these two modalities. Adding the tactile modality has long been considered a crucial next step in understanding multisensory integration. However, previous studies have found an influence of tactile input on speech perception only under limited circumstances, either where perceivers were aware of the task or where they had received training to establish a cross-modal mapping. Here we show that perceivers integrate naturalistic tactile information during auditory speech perception without previous training. Drawing on the observation that some speech sounds produce tiny bursts of aspiration (such as English 'p'), we applied slight, inaudible air puffs to participants' skin at one of two locations: the right hand or the neck. Syllables heard simultaneously with cutaneous air puffs were more likely to be heard as aspirated (for example, causing participants to mishear 'b' as 'p'). These results demonstrate that perceivers integrate event-relevant tactile information in auditory perception in much the same way as they do visual information.
| Original language | English |
| --- | --- |
| Pages (from-to) | 502-504 |
| Number of pages | 3 |
| Journal | Nature |
| Volume | 462 |
| Issue number | 7272 |
| DOIs | |
| Publication status | Published - 2009 |
Keywords
- auditory perception
- speech perception
- visual perception