TY - JOUR
T1 - Articulatory constraints on spontaneous entrainment between speech and manual gesture
AU - Zelic, Gregory
AU - Kim, Jeesun
AU - Davis, Chris
PY - 2015
Y1 - 2015
N2 - The present study examined the extent to which speech and manual gestures spontaneously entrain in a non-communicative task. Participants repeatedly uttered nonsense /CV/ syllables while continuously moving the right index finger in flexion/extension; no instructions to coordinate were given. We manipulated the type of syllable uttered (/ba/ vs. /sa/) and vocalization (phonated vs. silent speech). Based on principles of coordination dynamics, stronger entrainment between finger oscillations and jaw motion was predicted (1) for /ba/, due to the expected larger amplitude of jaw motion, and (2) in phonated speech, due to auditory feedback. Fifteen out of twenty participants showed simple ratios of speech to finger cycles (1:1, 1:2 or 2:1). Contrary to our predictions, speech–gesture entrainment was stronger when vocalizing /sa/ than /ba/ and was more widely distributed around an in-phase mode. Furthermore, the results revealed spatial anchoring and increased temporal variability in jaw motion when producing /sa/. We suggest that this indicates greater control of the speech articulators for /sa/, making speech performance more receptive to environmental forces and resulting in the greater entrainment observed with gesture oscillations. Speech–gesture coordination was maintained in silent speech, suggesting a somatosensory basis for their endogenous coupling.
KW - auditory perception
KW - rhythm
KW - speech perception
UR - http://handle.uws.edu.au:8081/1959.7/uws:30356
DO - 10.1016/j.humov.2015.05.009
M3 - Article
SN - 0167-9457
VL - 42
SP - 232
EP - 245
JO - Human Movement Science
JF - Human Movement Science
ER -