Finding my beat: personalised rhythmic filtering for mobile music interaction
Proceedings of the 15th international conference on Human-computer interaction with mobile devices and services
This paper describes a mobile implementation of song filtering using rhythmic interaction: a user taps the screen or shakes the device (sensed via an accelerometer) at the tempo of a particular song in order to listen to it. Variability in the tapped beat frequency is used to display ambiguity, allowing users to adjust their actions in response to the feedback. In a pilot study of a simple object-selection task, participants could select a given song with both tapping and shaking, although the tapping interface supported a larger range of comfortable tempos. Finally, the effects of variability in a rhythmic style of interaction are discussed.
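The core tempo-matching idea can be illustrated with a minimal sketch. The song names, tolerance threshold, and variability measure below are illustrative assumptions, not the paper's actual implementation: tap timestamps are converted to an estimated tempo in beats per minute, a coefficient of variation stands in for the variability the paper uses to display ambiguity, and the candidate song whose known tempo lies closest to the tapped tempo is selected if it falls within a tolerance band.

```python
import statistics

# Hypothetical song library mapping titles to known tempi in BPM (illustrative values).
SONGS = {"Song A": 90.0, "Song B": 120.0, "Song C": 150.0}

def tempo_from_taps(tap_times):
    """Estimate tapped tempo (BPM) and its variability from tap timestamps in seconds."""
    intervals = [b - a for a, b in zip(tap_times, tap_times[1:])]
    mean_ioi = statistics.mean(intervals)
    bpm = 60.0 / mean_ioi
    # Coefficient of variation of inter-onset intervals: one simple proxy
    # for the beat-frequency variability used to convey ambiguity.
    cv = statistics.stdev(intervals) / mean_ioi if len(intervals) > 1 else 0.0
    return bpm, cv

def match_song(tap_times, songs=SONGS, tolerance_bpm=10.0):
    """Return the song whose tempo is closest to the tapped tempo,
    or None when no song lies within the tolerance band."""
    bpm, _cv = tempo_from_taps(tap_times)
    best = min(songs, key=lambda title: abs(songs[title] - bpm))
    if abs(songs[best] - bpm) > tolerance_bpm:
        return None
    return best
```

For example, taps spaced 0.5 s apart correspond to 120 BPM and would select "Song B", while very slow taps far from every candidate tempo would return no match. A fuller system would also surface the variability score to the user as feedback, so they can tighten their tapping when the selection is ambiguous.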