Emotion control system for MIDI excerpts: MOR2ART

  • Authors:
  • Noritaka Moriguchi, Emi Wada, Masanobu Miura

  • Affiliations:
  • Ryukoku University, Seta, Otsu, Japan (all authors)

  • Venue:
  • Proceedings of the 3rd International Conference on Fun and Games
  • Year:
  • 2010

Abstract

Emotional expression when performing music (singing or playing an instrument) requires skill, and such skill is generally difficult to learn. Computer systems that make it easy for non-musicians to express emotions have been proposed [1]. These systems can express five or six emotions during a musical performance, but they cannot control the degree of an emotion, for example distinguishing savage anger from calm anger. Users, musicians and non-musicians alike, need to be able to manipulate emotions continuously, with immediate results for the audience. We therefore propose a system for controlling degrees of emotion in MIDI files. We call the proposed system Mood Operator Realized as an Application of Affective Rendering Techniques (MOR2ART); it is designed to control the emotion expressed during a musical performance using excerpts in the standard MIDI file (SMF) format. In musical performances, an emotion is expressed through several performance profiles [2]. Our system uses an emotion plane, defined in a previous study, on which the user manipulates a pointer to continuously change several performance profiles of a given excerpt, such as timbre, tempo, number of performance tracks, and loudness. Users can thus easily control the emotional expression of an excerpt, and the expressed emotion is conveyed when the music is played back: listeners can readily identify it from the playback. In an experimental evaluation, we confirmed that MOR2ART enables a non-musician to express emotion through his or her performance.
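The abstract does not specify how pointer positions on the emotion plane are translated into performance parameters. The sketch below is a hypothetical illustration of the general idea, assuming a two-dimensional valence–arousal plane and an illustrative linear mapping onto tempo, loudness, and track count; none of the constants or names come from the paper.

```python
from dataclasses import dataclass

@dataclass
class PerformanceProfile:
    """Performance parameters applied to a MIDI excerpt (illustrative)."""
    tempo_scale: float     # multiplier on the excerpt's base tempo
    velocity_scale: float  # multiplier on note velocities (loudness)
    track_count: int       # number of active performance tracks

def profile_from_plane(valence: float, arousal: float,
                       max_tracks: int = 8) -> PerformanceProfile:
    """Map a pointer position on a 2-D emotion plane (both axes in
    [-1, 1]) to performance parameters.

    The mapping here is an assumption for illustration only:
    higher arousal -> faster and louder; higher valence -> more tracks.
    """
    if not (-1.0 <= valence <= 1.0 and -1.0 <= arousal <= 1.0):
        raise ValueError("plane coordinates must lie in [-1, 1]")
    tempo_scale = 1.0 + 0.4 * arousal       # ranges over 0.6x .. 1.4x
    velocity_scale = 1.0 + 0.3 * arousal    # ranges over 0.7x .. 1.3x
    track_count = max(1, round(max_tracks * (valence + 1.0) / 2.0))
    return PerformanceProfile(tempo_scale, velocity_scale, track_count)
```

Because the mapping is continuous in the pointer coordinates, dragging the pointer smoothly varies the degree of the expressed emotion, which is the behavior the abstract describes.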