Visually demanding interfaces on a mobile phone can diminish the user experience by monopolizing the user's attention while they focus on another task, and can impede accessibility for visually impaired users. Because mobile devices often sit in a pocket while users are on the move, explicit foot movements can serve as eyes-free and hands-free input gestures for interacting with the device. In this work, we study the human capability to perform foot-based interactions that involve lifting and rotating the foot while pivoting on the toe or heel. Building on these results, we developed a system that learns and recognizes foot gestures using a single commodity mobile phone placed in the user's pocket or in a holster on the hip. The system applies a machine learning approach to acceleration data recorded by the device's built-in accelerometer. In a lab study, our system classified ten different foot gestures with approximately 86% accuracy.
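The pipeline described above — windowing accelerometer samples, extracting features, and classifying them into gestures — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the gesture names, the mean/standard-deviation features, the synthetic data, and the nearest-centroid classifier are all assumptions chosen for brevity (the actual system may use a richer feature set and a different learner).

```python
import math
import random

def features(window):
    # window: list of (ax, ay, az) accelerometer samples.
    # Per-axis mean and standard deviation, a common baseline feature set.
    feats = []
    for axis in range(3):
        vals = [s[axis] for s in window]
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / len(vals)
        feats.extend([mean, math.sqrt(var)])
    return feats

class NearestCentroid:
    """Toy classifier: label of the closest per-class mean feature vector."""
    def fit(self, X, y):
        groups = {}
        for f, label in zip(X, y):
            groups.setdefault(label, []).append(f)
        self.centroids = {
            label: [sum(col) / len(col) for col in zip(*rows)]
            for label, rows in groups.items()
        }
        return self

    def predict(self, f):
        return min(
            self.centroids,
            key=lambda lbl: sum((a - b) ** 2 for a, b in zip(f, self.centroids[lbl])),
        )

rng = random.Random(42)

def synth_window(base, length=30, noise=0.05):
    # Synthetic stand-in for a recorded gesture window: samples scattered
    # around a nominal device orientation `base` (hypothetical values).
    return [tuple(b + rng.gauss(0, noise) for b in base) for _ in range(length)]

# Hypothetical gesture classes and their nominal accelerometer signatures.
gestures = {
    "heel_rotation": (0.0, 0.3, 0.95),
    "toe_rotation": (0.2, -0.1, 0.98),
    "foot_lift": (-0.3, 0.0, 0.90),
}

X, y = [], []
for name, base in gestures.items():
    for _ in range(20):
        X.append(features(synth_window(base)))
        y.append(name)

clf = NearestCentroid().fit(X, y)
pred = clf.predict(features(synth_window(gestures["toe_rotation"])))
print(pred)
```

In the real system, windows would be segmented from the live accelerometer stream rather than synthesized, and a trained classifier would map each window to one of the ten foot gestures.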