Multi-label automatic indexing of music by cascade classifiers
Web Intelligence and Agent Systems
With the rapid growth of online music repositories, there is a need for content-based automatic indexing that helps users find their favorite music objects in real time. Recently, numerous successful approaches to feature extraction and selection from musical data have been proposed for instrument recognition in monophonic sounds. Unfortunately, none of these methods can be applied successfully to polyphonic sounds. Identification of music instruments in polyphonic sounds remains difficult and challenging, especially when harmonic partials overlap. This has stimulated research on music sound separation and on the development of new features for content-based automatic music information retrieval. Our goal is to build a cooperative query answering system (QAS) for a musical database that retrieves all objects satisfying queries such as "find all musical pieces in pentatonic scale with a viola and piano where the viola plays for at least 20 seconds and the piano for at least 10 seconds". We use a database of almost 4000 musical sounds taken from the MUMs (McGill University Master Samples) as a vehicle for constructing several classifiers for automatic instrument recognition. The classifiers showing the best performance are adopted for automatic indexing of musical pieces by instruments. Our musical database is represented as an FS-tree (Frame Segment Tree). The cooperativeness of the QAS is driven by several hierarchical structures used for classifying musical instruments.
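The idea of hierarchy-driven cooperative answering can be sketched as follows: when a strict query returns no results, the queried instrument label is relaxed one level up the instrument hierarchy so that near-matches are returned instead of an empty answer. The hierarchy, index contents, and function names below are illustrative assumptions, not taken from the paper.

```python
# Hypothetical instrument hierarchy: child label -> parent label.
# (Illustrative only; the paper's actual hierarchies are not reproduced here.)
HIERARCHY = {
    "viola": "bowed_string",
    "violin": "bowed_string",
    "cello": "bowed_string",
    "bowed_string": "chordophone",
    "piano": "struck_string",
    "struck_string": "chordophone",
}

# Toy index: piece -> set of instrument labels assigned by the classifiers.
INDEX = {
    "piece_a": {"violin", "piano"},
    "piece_b": {"cello"},
}

def generalize(label):
    """Climb one level up the instrument hierarchy; None at the root."""
    return HIERARCHY.get(label)

def ancestors(label):
    """All labels on the path from `label` up to the hierarchy root."""
    out = [label]
    while (label := HIERARCHY.get(label)) is not None:
        out.append(label)
    return out

def cooperative_search(wanted):
    """Return exact matches; if there are none, relax the query label
    up the hierarchy one level at a time instead of answering 'empty'."""
    query = wanted
    while query is not None:
        hits = [piece for piece, labels in INDEX.items()
                if any(query in ancestors(lab) for lab in labels)]
        if hits:
            return query, hits
        query = generalize(query)
    return None, []

# A strict query for "viola" matches no piece, so the query is relaxed
# to "bowed_string" and pieces with a violin or cello are returned.
print(cooperative_search("viola"))
```

In this sketch, relaxation always broadens the query upward; a fuller system could also rank near-matches by how far up the hierarchy the relaxation had to go.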