This paper presents a query-by-tapping system, which represents a new paradigm for CBMRAI (content-based music retrieval via acoustic input) systems. Most CBMRAI systems take the user's acoustic input in the form of singing or humming, and the timing or beat information (the durations of notes) is sometimes discarded during retrieval to save computation. Our query-by-tapping mechanism, on the other hand, takes the user's input as taps on the microphone; the extracted note durations are then used to retrieve the intended song from the database. Since there is no singing or humming, no pitch information is used in the retrieval process at all. One might expect music retrieval via beat information alone to be difficult. However, our experiments demonstrate that beat information is an effective feature: it can be used to retrieve the intended song from a large music database with a satisfactory recognition rate.
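The retrieval idea described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's actual algorithm: it converts tap timestamps into inter-onset intervals (note durations), normalizes them to remove tempo dependence, and ranks candidate songs by a plain dynamic-time-warping distance. The song titles, database layout, and DTW cost function are all hypothetical choices for the example.

```python
def inter_onset_intervals(tap_times):
    """Convert tap timestamps (in seconds) into note durations (IOIs)."""
    return [b - a for a, b in zip(tap_times, tap_times[1:])]

def normalize(durations):
    """Scale durations so they sum to 1, removing overall tempo."""
    total = sum(durations)
    return [d / total for d in durations]

def dtw_distance(q, r):
    """Plain O(len(q) * len(r)) dynamic time warping distance."""
    inf = float("inf")
    n, m = len(q), len(r)
    dist = [[inf] * (m + 1) for _ in range(n + 1)]
    dist[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(q[i - 1] - r[j - 1])
            dist[i][j] = cost + min(dist[i - 1][j],      # insertion
                                    dist[i][j - 1],      # deletion
                                    dist[i - 1][j - 1])  # match
    return dist[n][m]

def rank_songs(tap_times, database):
    """Rank songs (title -> duration sequence) by distance to the query."""
    query = normalize(inter_onset_intervals(tap_times))
    scored = sorted((dtw_distance(query, normalize(durs)), title)
                    for title, durs in database.items())
    return [title for _, title in scored]
```

For example, five evenly spaced taps match a song whose notes all have equal duration more closely than one with a long-short rhythm, so `rank_songs` lists the evenly timed song first.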