Computerized medical systems play a vital role in the operating room; however, sterility requirements and the interventional workflow often make direct interaction with these devices challenging for surgeons. Typical workarounds, such as delegating physical control of the keyboard and mouse to an assistant, add an undesirable level of indirection. We present a touchless, gesture-based interaction framework for the operating room that lets surgeons define a personalized set of gestures for controlling arbitrary computerized medical systems. Instead of using cameras to capture gestures, we rely on a few wireless inertial sensors placed on the surgeon's arms, eliminating any dependence on illumination and line of sight. A discriminative gesture recognition approach based on kernel regression allows us to simultaneously classify performed gestures and track the relative spatial pose within each gesture, giving surgeons fine-grained control over continuous parameters. An extensible software architecture enables the dynamic association of learned gestures with arbitrary intraoperative computerized systems. Our experiments illustrate the performance of the approach and support its practical applicability.
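The kernel-regression idea behind the recognizer can be illustrated with a minimal Nadaraya-Watson sketch. This is an assumption-laden toy, not the paper's implementation: the feature design, the Gaussian kernel, and the interpretation of the relative pose as a normalized progress value in [0, 1] are all illustrative choices.

```python
import numpy as np

def kernel_regress(x, X_train, labels, phases, bandwidth=1.0):
    """Hypothetical sketch: classify a gesture and estimate its relative
    phase from labeled inertial feature vectors via Nadaraya-Watson
    kernel regression. All names here are illustrative assumptions."""
    d2 = np.sum((X_train - x) ** 2, axis=1)       # squared distances to query
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))      # Gaussian kernel weights
    w /= w.sum()
    # Class decision: accumulate kernel mass per gesture label.
    classes = np.unique(labels)
    scores = np.array([w[labels == c].sum() for c in classes])
    gesture = classes[np.argmax(scores)]
    # Continuous output: kernel-weighted average phase over the winner's samples.
    mask = labels == gesture
    phase = float(np.average(phases[mask], weights=w[mask]))
    return gesture, phase

# Toy usage: two synthetic gestures whose 2-D features vary linearly over time.
t = np.linspace(0.0, 1.0, 50)
X = np.vstack([np.column_stack([t, t]),           # gesture 0 trajectory
               np.column_stack([t, 1.0 - t])])    # gesture 1 trajectory
y = np.repeat([0, 1], 50)
p = np.concatenate([t, t])                        # relative phase labels
g, ph = kernel_regress(np.array([0.3, 0.3]), X, y, p, bandwidth=0.1)
# g identifies the closer trajectory; ph is the estimated progress along it.
```

Because the phase estimate is a weighted average rather than a hard lookup, it varies smoothly as the query moves along a gesture trajectory, which is what makes the continuous-parameter control (e.g. scrolling through image slices) possible.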