We propose a demo and poster about a tool designed to assist people who are temporarily or permanently unable to reliably operate the buttons of a physical pointing device, for example because of tenosynovitis (TSV). The tool monitors a dedicated muscle of the user and emulates a click event at the current position of the mouse pointer in response to a contraction of that muscle (one as small as raising an eyebrow). The type of the click, or ClickType (left, right, single, double, or drag), is selected by the user, who also remains responsible for moving the mouse pointer, and stays valid until a new one is selected.
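The mechanism described above can be sketched as a small state machine: the user selects a ClickType that remains in effect, and each detected muscle contraction emits one click of that type at the current pointer position. The sketch below is illustrative only; the class names, the `emit` callback, and the contraction `threshold` are assumptions, not the authors' implementation.

```python
from enum import Enum, auto

class ClickType(Enum):
    # Hypothetical enumeration of the click types named in the abstract.
    LEFT_SINGLE = auto()
    LEFT_DOUBLE = auto()
    RIGHT_SINGLE = auto()
    DRAG = auto()

class ClickEmulator:
    """Emits a click of the currently selected type whenever the
    monitored muscle contracts past a threshold (a sketch, not the
    authors' actual signal-processing pipeline)."""

    def __init__(self, emit, threshold=0.5):
        self.emit = emit                          # callback: (ClickType, x, y) -> None
        self.threshold = threshold                # contraction level that counts as a click
        self.click_type = ClickType.LEFT_SINGLE   # default until the user picks another
        self._active = False                      # debounce: one click per contraction

    def select(self, click_type):
        # The selected type stays valid until a new one is chosen.
        self.click_type = click_type

    def on_muscle_sample(self, level, pointer_pos):
        # The rising edge of a contraction triggers exactly one click event
        # at the current pointer position; the user moves the pointer.
        if level >= self.threshold and not self._active:
            self._active = True
            self.emit(self.click_type, *pointer_pos)
        elif level < self.threshold:
            self._active = False
```

A sustained contraction produces a single click rather than a stream of them, which matches the one-contraction-one-click behaviour the abstract implies; the selected ClickType persists across contractions until explicitly changed.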