Abstract
**Background in computing/performance analysis.** Past analyses of performance have utilised audio [Lindstedt, 2004] or MIDI recordings alone, but research into the importance of bodily movements and physical gestures [Davidson, 2005] has since exposed a wealth of untapped information. The use of multi-dimensional recordings has only recently become prominent [Camurri, 2003], transcending the shortcomings of each individual data stream. Performance information can be used to emphasise higher-level compositional issues that may not be obvious through traditional analysis methods.

**Background in music theory.** The finale of Chopin's B flat minor piano sonata Op. 35 has long proved problematic for traditional musical analysts and, more recently, for computer-assisted analysis [Lindstedt, 2004]. In this composition, immanent structural elements such as the initiation of new musical themes, arrival points and climax points are not immediately apparent from the score alone. The quantitative methods and multi-modal view of the proposed performance analysis provide tools to view the music beyond the score and so enable an extended analysis of the piece.

**Aims.** We aim to analyse a number of live professional piano performances, first of Chopin's Prelude in A major Op. 28 No. 7 and secondly of the finale of the B flat minor sonata Op. 35; the former provides control results that enable accurate analysis of the latter. These performances are recorded as audio, MIDI and video, making use of data storage technologies developed at Glasgow University's Centre for Music Technology, to investigate issues of thematic and harmonic structure in an otherwise structurally ambiguous composition.

**Main contribution.** Method: each performance is recorded using microphones, an optical motion-capture system and a Moog PianoBar. The bar rests on top of any 88-key piano and, using infra-red beams, converts key movements into MIDI. Timing information is retrieved from the MIDI using a note-matching algorithm [McGilvray, 2008] (sketched below), dynamics from the spectrum of the audio, and movements from image-processing software. The exact duration of each note is calculated from the combination of audio and MIDI. Performance data are stored in PML [McGilvray, 2008], an extension of MusicXML that accommodates performance information, and movement data in the GMS (Gesture, Motion, Signal) format [Luciani, 2006]. These files are stored alongside the score in a PostgreSQL database [Pullinger, 2007], within which queries locate points of agreement and change across the performances (illustrated below). The results are then displayed over the score in a format readable by musical analysts.

**Implications.** This work will provide further insight into the correlation of performance parameters with structural form and enable more extensive musical analysis of structurally ambiguous pieces. The intricate measurement of each performance requires expertise in areas such as signal processing, image processing and physical modelling, whilst storing and manipulating the gathered data requires extensive database and computer-systems knowledge. Expertise in music theory and performance is crucial to the design of all the aforementioned systems and to the analysis of the results, making interdisciplinary collaboration essential to the project.
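As a rough illustration of the note-matching step, the sketch below pairs recorded MIDI events with score notes by pitch and event order. It is a minimal stand-in, not the McGilvray (2008) algorithm; the `Note` type, the look-ahead `window` and all names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Note:
    pitch: int    # MIDI pitch number (60 = middle C)
    onset: float  # score position in beats, or performance time in seconds

def match_notes(score, played, window=8):
    """Greedily pair each score note with the nearest unmatched played
    note of the same pitch, staying close in event order. Score notes
    left unpaired can be reported as omissions, played notes left
    unpaired as insertions; matched pairs yield per-note timing data."""
    matches, used = [], set()
    for i, s in enumerate(score):
        for j, p in enumerate(played):
            if j not in used and p.pitch == s.pitch and abs(j - i) <= window:
                matches.append((i, j))
                used.add(j)
                break
    return matches

score = [Note(69, 0.0), Note(73, 1.0), Note(76, 2.0)]      # A, C#, E
played = [Note(69, 0.02), Note(76, 2.11), Note(73, 1.05)]  # one pair swapped in time
print(match_notes(score, played))  # [(0, 0), (1, 2), (2, 1)]
```

Comparing the onsets of each matched pair gives the timing information that the audio and MIDI streams are combined to refine.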
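Similarly, once per-note timing has been extracted for several performances, points of agreement and change can be located by comparing deviations at each score position. The sketch below is one plausible reading of such a query, written as plain Python rather than against the actual Pullinger (2007) database schema; the thresholds and names are illustrative only.

```python
from statistics import mean, stdev

def agreement_points(deviations, spread_max=0.03, shared_min=0.05):
    """deviations maps a score position to a list of onset deviations in
    seconds, one per performance. A position counts as a point of
    agreement when all performers deviate in the same direction by a
    similar amount, e.g. a shared lengthening at an arrival point."""
    points = []
    for pos, devs in sorted(deviations.items()):
        if len(devs) > 1 and stdev(devs) < spread_max and abs(mean(devs)) > shared_min:
            points.append((pos, mean(devs)))
    return points

devs = {0.0: [0.01, -0.02, 0.00],   # performers disagree: not structural
        8.0: [0.12, 0.15, 0.11]}    # shared lengthening across performances
print(agreement_points(devs))       # -> [(8.0, ~0.127)]
```

Positions returned by a query of this kind are what would then be rendered over the score for the musical analyst.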
Original language | English |
---|---|
Title of host publication | Musical Structure: Proceedings of the 4th Conference on Interdisciplinary Musicology (CIM08), 2-6 July, 2008, Thessaloniki, Greece |
Publisher | Aristotle University of Thessaloniki |
Number of pages | 9 |
Publication status | Published - 2008 |
Event | Conference on Interdisciplinary Musicology, 2 Jul 2008 → … |