Sterility restrictions in surgical settings make touchless interaction an attractive way for surgeons to interact directly with digital images. The HCI community has already explored several touchless interaction methods, including camera-based gesture tracking and voice control. In this paper, we present a system for gesture-based interaction with medical images that uses a single wristband sensor and capacitive floor sensors, enabling hand and foot gesture input. A first, limited evaluation showed an acceptable level of accuracy across 12 hand and foot gestures, and users found the combined hand and foot gestures intuitive for providing input.
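To illustrate the kind of input fusion such a system implies, the sketch below maps pairs of recognized hand and foot gesture events to image-viewer commands. All gesture names, command names, and mappings here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: fusing wristband (hand) and floor-sensor (foot)
# gesture events into medical-image-viewer commands. The gesture and
# command names below are assumptions for illustration only.

# Map (hand_gesture, foot_gesture) pairs to viewer commands; a foot
# gesture of None means the hand gesture is used on its own.
GESTURE_MAP = {
    ("swipe_left", None): "previous_image",
    ("swipe_right", None): "next_image",
    ("rotate_cw", "foot_tap"): "zoom_in",
    ("rotate_ccw", "foot_tap"): "zoom_out",
}

def dispatch(hand_gesture, foot_gesture=None):
    """Return the viewer command for a recognized gesture pair, if any."""
    return GESTURE_MAP.get((hand_gesture, foot_gesture))
```

Keeping the mapping in a table rather than in conditional logic makes it easy to retarget gestures to different commands per user or per task, which matters when gesture sets must stay small and memorable.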