Temporal relationship between auditory and visual prosodic cues

Erin Cvejic, Jeesun Kim, Chris Davis

    Research output: Chapter in Book / Conference Paper › Conference Paper › peer-review

    3 Citations (Scopus)

    Abstract

    It has been reported that non-articulatory visual cues to prosody tend to align with auditory cues, emphasizing auditory events that are in close alignment (the visual alignment hypothesis). We investigated the temporal relationship between visual and auditory prosodic cues in a large corpus of utterances to determine the extent to which non-articulatory visual prosodic cues align with auditory ones. Six speakers saying 30 sentences in three prosodic conditions (x2 repetitions) were recorded in a dialogue exchange task, to measure how often eyebrow movements and rigid head tilts aligned with auditory prosodic cues, the temporal distribution of such movements, and the variation across prosodic conditions. The timing of brow raises and head tilts was not aligned with auditory cues, and the occurrence of visual cues was inconsistent, lending little support for the visual alignment hypothesis. Different types of visual cues may combine with auditory cues in different ways to signal prosody.
    Original language: English
    Title of host publication: Proceedings of the 12th Annual Conference of the International Speech Communication Association (INTERSPEECH 2011), Florence, Italy, 27 - 31 August 2011
    Publisher: Causal Productions
    Pages: 981-984
    Number of pages: 4
    Publication status: Published - 2011
    Event: International Speech Communication Association. Conference
    Duration: 9 Sept 2012 → …

    Publication series

    ISSN (Print): 1990-9772

    Conference

    Conference: International Speech Communication Association. Conference
    Period: 9/09/12 → …
