Time series analysis of real-time music perception: approaches to the assessment of individual and expertise differences in perception of expressed affect

Roger T. Dean, Freya Bailes, William T. M. Dunsmuir

    Research output: Contribution to journal › Article › peer-review

    14 Citations (Scopus)

    Abstract

    We use time series analysis methods to detect differences between individuals and expertise groups in continuous perceptions of the arousal expressed by Wishart's electroacoustic piece Red Bird. The study is part of a project in which we characterise dynamic perception of the structure and affective expression of music. We find that individual series of perceptions of expressed arousal often show considerable periods of stasis. This may challenge conventional time series methodologies, so we test their validity by applying a generalized linear autoregressive moving average (GLARMA) approach, which supports their validity. Acoustic intensity is a dominant predictor of perceived arousal in this piece. We show that responses are time-variant and that animate sounds influence the conditional variance of perceived arousal. Using vector autoregression and cross-sectional time series analysis (which preserves the integrity of each individual response series), we find differences between musical expertise groups (non-musicians, musicians, and electroacoustic musicians). Individual differences within each group are greater than those between expertise groups. The companion paper applies the developed methods to all four pieces in our overall project (Dean, R.T., F. Bailes, and W.T.M. Dunsmuir. 2014. "Shared and Distinct Mechanisms of Individual and Expertise-Group Perception of Expressed Arousal in Four Works.").
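
    For illustration only, the sketch below shows one way a bivariate relationship of the kind described in the abstract (a continuous arousal-rating series driven by acoustic intensity) might be modelled with a vector autoregression in Python using statsmodels. The series are simulated, and the GLARMA and cross-sectional time series models used in the original study are not reproduced here; this is a hedged sketch of the VAR step only, not the authors' analysis pipeline.

# Hypothetical sketch: a vector autoregression (VAR) relating a simulated
# continuous arousal-rating series to a simulated acoustic-intensity series.
# Not the authors' method; it only illustrates the general VAR technique.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 300  # e.g. 300 samples of a continuous response profile (assumed length)

# Simulated acoustic intensity: slowly varying AR(1)-like profile.
intensity = np.zeros(n)
for t in range(1, n):
    intensity[t] = 0.95 * intensity[t - 1] + rng.normal(scale=0.5)

# Simulated perceived arousal: depends on its own past and recent intensity.
arousal = np.zeros(n)
for t in range(1, n):
    arousal[t] = (0.6 * arousal[t - 1]
                  + 0.3 * intensity[t - 1]
                  + rng.normal(scale=0.3))

data = pd.DataFrame({"arousal": arousal, "intensity": intensity})

# Fit a VAR, selecting the lag order by AIC.
model = VAR(data)
results = model.fit(maxlags=8, ic="aic")
print(results.summary())

# Granger-causality test: does intensity help predict arousal?
print(results.test_causality("arousal", ["intensity"], kind="f").summary())

    A real analysis would replace the simulated series with the participants' continuous arousal ratings and the piece's measured acoustic intensity, and would additionally need models that handle the periods of stasis and the individual response series discussed in the abstract.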
    Original language: English
    Pages (from-to): 183-205
    Number of pages: 23
    Journal: Journal of Mathematics and Music
    Volume: 8
    Issue number: 3
    DOIs
    Publication status: Published - 2014
