We describe the BodySpace system, which uses inertial sensing and pattern recognition to enable gestural control of a music player: the user issues commands by placing the device at different parts of the body. We demonstrate a new approach to the segmentation and recognition of gestures for this kind of application, and show how techniques based on simulated physical models can shape gestural interaction.
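To make the idea concrete, the sketch below shows one simple way a resting device's body placement could be classified from a 3-axis accelerometer: compare the sensed gravity direction against stored per-location templates. This is a minimal illustration under assumed values, not the paper's actual segmentation or recognition method; the template vectors, labels, and threshold are all hypothetical.

```python
import math

# Hypothetical gravity-direction templates for two body locations,
# captured in an imagined calibration phase (illustrative values only).
TEMPLATES = {
    "ear": (0.0, 0.7, 0.7),
    "hip": (0.0, -1.0, 0.0),
}

def _unit(v):
    """Normalise a 3-vector to unit length."""
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def classify_placement(accel, threshold=0.8):
    """Guess where on the body a resting device is held by matching the
    sensed gravity direction against stored templates (cosine similarity)."""
    g = _unit(accel)
    best, score = None, -1.0
    for label, template in TEMPLATES.items():
        s = sum(a * b for a, b in zip(g, _unit(template)))
        if s > score:
            best, score = label, s
    # Reject ambiguous poses rather than forcing a match.
    return best if score >= threshold else None
```

A reading close to a template, e.g. `classify_placement((0.1, 0.65, 0.75))`, matches `"ear"`, while an orientation far from every template returns `None`. A real system would additionally need gesture segmentation (detecting when the device comes to rest) before any such classification step.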