Finding my beat: personalised rhythmic filtering for mobile music interaction

  • Authors:
  • Daniel Boland; Roderick Murray-Smith

  • Affiliations:
  • University of Glasgow, Glasgow, United Kingdom (both authors)

  • Venue:
  • Proceedings of the 15th international conference on Human-computer interaction with mobile devices and services
  • Year:
  • 2013

Abstract

A novel interaction style is presented, allowing in-pocket music selection by tapping a song's rhythm on a device's touchscreen or body. We introduce the use of rhythmic queries for music retrieval, employing a trained generative model to improve query recognition. We identify rhythm as a fundamental feature of music which can be reproduced easily by listeners, making it an effective and simple interaction technique for retrieving music. We observe that users vary in which instruments they entrain with and our work is the first to model such variability. An experiment was performed, showing that after training the generative model, retrieval performance improved two-fold. All rhythmic queries returned a highly ranked result with the trained generative model, compared with 47% using existing methods. We conclude that generative models of subjective user queries can yield significant performance gains for music retrieval and enable novel interaction techniques such as rhythmic filtering.
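The abstract describes matching a tapped rhythmic query against a music library using a trained generative model. As an illustrative sketch only (not the authors' actual method), the core idea of tempo-invariant rhythm matching can be shown with inter-onset intervals and a simple independent-Gaussian timing model; the function names, the `sigma` parameter, and the library format are all assumptions for this example:

```python
def inter_onset_intervals(taps):
    """Convert absolute tap times into inter-onset intervals (IOIs)."""
    return [b - a for a, b in zip(taps, taps[1:])]

def normalise(iois):
    """Divide IOIs by their mean, making the query tempo-invariant."""
    mean = sum(iois) / len(iois)
    return [i / mean for i in iois]

def log_likelihood(query, pattern, sigma=0.1):
    """Score a query against a song's rhythm pattern under an
    independent-Gaussian model of timing deviation (a crude stand-in
    for the paper's trained generative model)."""
    if len(query) != len(pattern):
        return float("-inf")
    return sum(-((q - p) ** 2) / (2 * sigma ** 2)
               for q, p in zip(query, pattern))

def rank_songs(taps, library):
    """Rank library songs by how well the tapped rhythm matches.

    `library` maps song names to reference IOI patterns (hypothetical
    representation chosen for this sketch)."""
    query = normalise(inter_onset_intervals(taps))
    scored = [(log_likelihood(query, normalise(pattern)), name)
              for name, pattern in library.items()]
    return [name for _, name in sorted(scored, reverse=True)]
```

For example, tapping at times `[0, 0.5, 1.0, 2.0, 2.5]` yields the normalised pattern short-short-long-short, which would rank a song with reference IOIs `[1, 1, 2, 1]` above one with `[1, 2, 1, 2]`. The paper's contribution goes further, modelling per-user variability in which instrument listeners entrain with, which this sketch does not capture.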