This paper describes a musical instrument identification method that takes into consideration the pitch dependency of the timbres of musical instruments. The difficulty in musical instrument identification resides in the pitch dependency of musical instrument sounds; that is, the acoustic features of most musical instruments vary according to the pitch (fundamental frequency, F0). To cope with this difficulty, we propose an F0-dependent multivariate normal distribution, where each element of the mean vector is represented by a function of F0. Our method first extracts 129 features (e.g., the spectral centroid and the gradient of the straight line approximating the power envelope) from a musical instrument sound and then reduces the dimensionality of the feature space to 18 dimensions. In the 18-dimensional feature space, it calculates an F0-dependent mean function and an F0-normalized covariance, and finally applies the Bayes decision rule. Experimental results on identifying 6,247 solo tones of 19 musical instruments show that the proposed method improved the recognition rate from 75.73% to 79.73%.
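The core idea — a Gaussian classifier whose mean vector is a function of F0, with a covariance estimated from F0-normalized residuals — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name, the choice of polynomial mean functions, the equal-prior assumption, and the two-dimensional toy features are all assumptions made for the example.

```python
import numpy as np

class F0DependentGaussianClassifier:
    """Sketch: Bayes decision rule with an F0-dependent mean vector.

    Per instrument, each feature mean is fit as a polynomial in F0
    (a hypothetical choice of mean function), and the covariance is
    estimated from the F0-normalized residuals X - mu(F0).
    """

    def __init__(self, degree=3):
        self.degree = degree
        self.models = {}  # instrument -> (per-dim coeffs, inv cov, log|cov|)

    def fit(self, instrument, f0, X):
        # Fit the mean function mu_d(F0) for each feature dimension d.
        coeffs = [np.polyfit(f0, X[:, d], self.degree)
                  for d in range(X.shape[1])]
        mu = np.column_stack([np.polyval(c, f0) for c in coeffs])
        resid = X - mu  # F0-normalized residuals
        cov = np.cov(resid, rowvar=False)
        self.models[instrument] = (coeffs,
                                   np.linalg.inv(cov),
                                   np.linalg.slogdet(cov)[1])

    def predict(self, f0, x):
        # Equal-prior Bayes rule: pick the instrument minimizing the
        # Mahalanobis distance to mu(F0) plus the log-determinant term.
        best, best_score = None, np.inf
        for name, (coeffs, icov, logdet) in self.models.items():
            mu = np.array([np.polyval(c, f0) for c in coeffs])
            d = x - mu
            score = d @ icov @ d + logdet
            if score < best_score:
                best, best_score = name, score
        return best
```

On synthetic data where one feature drifts linearly with F0, fitting the mean as a function of F0 removes that drift from the covariance estimate, which is the effect the F0-dependent model is designed to achieve.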