Abstract
We present MIRToolbox, an integrated set of functions written in Matlab dedicated to the extraction of musical features from audio files, related, among others, to timbre, tonality, rhythm, and form. The objective is to offer an overview of state-of-the-art computational approaches in the area of Music Information Retrieval (MIR). The design is based on a modular framework: the different algorithms are decomposed into stages, formalized using a minimal set of elementary mechanisms, and integrate different variants proposed by alternative approaches (including new strategies we have developed) that users can select and parametrize. These functions can accept a wide range of object types as input.
This paper offers an overview of the set of features that can be extracted with MIRToolbox, illustrated with the description of three particular musical features. The toolbox also includes functions for statistical analysis, segmentation and clustering.
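To make the workflow concrete, here is a minimal usage sketch in Matlab. It assumes MIRToolbox is installed and on the Matlab path, and `'example.wav'` stands in for any audio file; the extractors shown (`miraudio`, `mirtempo`, `mirkey`, `mirbrightness`, `mirsegment`) are part of the toolbox's documented interface, each operating on a common audio object:

```matlab
% Load an audio file into a MIRToolbox audio object
a = miraudio('example.wav');

% Extract features from different musical dimensions
t = mirtempo(a);        % rhythm: tempo estimation (BPM)
k = mirkey(a);          % tonality: key estimation
b = mirbrightness(a);   % timbre: spectral brightness

% Segment the audio into homogeneous sections
s = mirsegment(a);
```

Because every extractor consumes and produces objects of the same framework, the output of one stage (such as the segmentation `s`) can be passed directly to another extractor, which is how the modular design described above is exposed to the user.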
One of our main motivations for the development of the toolbox is to facilitate investigation of the relation between musical features and music-induced emotion. Preliminary results show that the variance in emotion ratings can be explained by a small set of acoustic features.