Moodtrack

Moodtrack is an adaptive music arrangement system that assembles programmatic music from temporal mood cues. The research explores links between language describing mood, action, and setting, and musical features that can be extracted from audio files. A film's visual, setting, dialogue, and mood narratives are described over time. In parallel, film scores are analyzed to codify mood domains and to partition the music according to formal musical structure. Music features, contexts, and genres are represented in a musical semantic network, whose relationships are based on causal, temporal, emotive, and both musical and non-musical concepts. The network provides a means of relating a mood description to appropriate music features. Using data gathered from numerous film score analyses, a selector/compiler is constructed; its choice of segments in the resulting soundtrack can then be user-evaluated for appropriateness and continuity.
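To make the idea concrete, here is a minimal sketch (not the actual Moodtrack implementation) of how a mood-to-feature mapping and a segment selector might fit together. The names (`MOOD_FEATURES`, `Segment`, `select_segments`) and the feature vocabulary are invented for illustration; the real system uses a richer semantic network and film-score-derived data.

```python
from dataclasses import dataclass

# Hypothetical mood-to-feature table standing in for the semantic network.
MOOD_FEATURES = {
    "tense":  {"minor_mode", "fast_tempo", "low_register"},
    "joyful": {"major_mode", "fast_tempo", "high_register"},
    "calm":   {"major_mode", "slow_tempo"},
}

@dataclass
class Segment:
    """A pre-analyzed piece of music with its extracted features."""
    name: str
    features: set

def select_segments(mood_cues, library):
    """For each (time, mood) cue, greedily pick the library segment whose
    feature set overlaps most with the features linked to that mood."""
    chosen = []
    for time, mood in mood_cues:
        wanted = MOOD_FEATURES.get(mood, set())
        best = max(library, key=lambda s: len(s.features & wanted))
        chosen.append((time, best.name))
    return chosen

library = [
    Segment("ostinato_A", {"minor_mode", "fast_tempo", "low_register"}),
    Segment("pastorale_B", {"major_mode", "slow_tempo"}),
    Segment("fanfare_C", {"major_mode", "fast_tempo", "high_register"}),
]

cues = [(0, "calm"), (30, "tense"), (60, "joyful")]
print(select_segments(cues, library))
# [(0, 'pastorale_B'), (30, 'ostinato_A'), (60, 'fanfare_C')]
```

A real selector would also score continuity between adjacent segments rather than treating each cue independently.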

Project page: http://web.media.mit.edu/~scottyv/emo/