A novel interaction style is presented, allowing in-pocket music selection by tapping a song's rhythm on a device's touchscreen or body. We introduce the use of rhythmic queries for music retrieval, employing a trained generative model to improve query recognition. We identify rhythm as a fundamental feature of music which can be reproduced easily by listeners, making it an effective and simple interaction technique for retrieving music. We observe that users vary in which instruments they entrain with and our work is the first to model such variability. An experiment was performed, showing that after training the generative model, retrieval performance improved two-fold. All rhythmic queries returned a highly ranked result with the trained generative model, compared with 47% using existing methods. We conclude that generative models of subjective user queries can yield significant performance gains for music retrieval and enable novel interaction techniques such as rhythmic filtering.
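The retrieval idea described above can be sketched in a few lines: treat a tapped query as the song's rhythm corrupted by Gaussian timing noise, and rank candidate songs by the likelihood of generating the observed taps. This is a minimal illustration only; the names (`ioi`, `rank_songs`), the fixed noise width `sigma`, and the assumption of aligned inter-onset intervals are all ours, and the paper's actual trained model (which additionally captures which instrument a user entrains with) is more elaborate.

```python
import math

def ioi(taps):
    """Inter-onset intervals: gaps between successive tap times (seconds)."""
    return [b - a for a, b in zip(taps, taps[1:])]

def log_likelihood(query_iois, song_iois, sigma=0.05):
    """Score a query under a toy Gaussian generative model: each tapped
    interval is assumed to be the song's interval plus zero-mean Gaussian
    timing noise with std dev sigma (which a trained model would fit
    per user rather than fix)."""
    ll = 0.0
    for q, s in zip(query_iois, song_iois):
        ll += -0.5 * ((q - s) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))
    return ll

def rank_songs(tap_times, songs, sigma=0.05):
    """Rank candidate songs (name -> onset times) by query likelihood."""
    q = ioi(tap_times)
    scored = [(name, log_likelihood(q, ioi(onsets), sigma))
              for name, onsets in songs.items()]
    return sorted(scored, key=lambda t: t[1], reverse=True)
```

A user tapping an even half-second pulse would rank a song with 0.5 s onsets above one with an uneven rhythm, since the even song's intervals sit closer to the taps under the noise model.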