Over the last few years, touchless technologies have seen widespread adoption in the context of surgical procedures. Touchless interfaces are advantageous in that they preserve sterility around the patient, allowing surgeons to visualize medical images without physically touching any control or relying on a proxy. Such interfaces have been tailored to interaction with 2D medical images, but not with 3D reconstructions of anatomical data, since the latter requires at least three degrees of freedom. In this paper, we discuss the results of a user study in which a mouse-based interface was compared with two Kinect-based touchless interfaces that allow users to manipulate 3D data with up to nine degrees of freedom. The experimental results show a significant relation between the number of degrees of freedom a user controls simultaneously and the number of degrees of freedom required to perform an accurate touchless manipulation task.
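To make the notion of simultaneously controlled degrees of freedom concrete, the following is a minimal sketch (not the paper's implementation) of a two-hand, handle-bar style mapping: from the tracked left/right hand positions of two consecutive frames it derives a translation (3 DOF), a rotation axis and angle (2 DOF), and a uniform scale factor (1 DOF). The function name and the tuple-based representation are illustrative assumptions; a full interface would add further cues (e.g., roll about the bar or per-axis scaling) to reach higher DOF counts.

```python
import math

def handlebar_update(l0, r0, l1, r1):
    """Illustrative handle-bar style mapping (hypothetical helper).

    l0, r0: left/right hand positions (x, y, z) at the previous frame
    l1, r1: the same hands at the current frame
    Returns (translation, rotation_axis, rotation_angle, scale).
    """
    # Small vector helpers on (x, y, z) tuples.
    def sub(a, b):  return tuple(x - y for x, y in zip(a, b))
    def dot(a, b):  return sum(x * y for x, y in zip(a, b))
    def norm(v):    return math.sqrt(dot(v, v))
    def unit(v):
        n = norm(v)
        return tuple(x / n for x in v)
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    # Translation: displacement of the midpoint between the hands (3 DOF).
    mid0 = tuple((a + b) / 2 for a, b in zip(l0, r0))
    mid1 = tuple((a + b) / 2 for a, b in zip(l1, r1))
    translation = sub(mid1, mid0)

    # The "bar" is the segment joining the two hands.
    bar0, bar1 = sub(r0, l0), sub(r1, l1)

    # Uniform scale: ratio of bar lengths (1 DOF).
    scale = norm(bar1) / norm(bar0)

    # Rotation: axis and angle taking the old bar direction to the new
    # one (2 DOF; roll about the bar is not observable from positions alone).
    u0, u1 = unit(bar0), unit(bar1)
    axis = cross(u0, u1)
    angle = math.atan2(norm(axis), dot(u0, u1))

    return translation, axis, angle, scale
```

For example, both hands moving one unit along x yields a pure translation: the rotation angle is zero and the scale factor is one, so only three of the available degrees of freedom are actually exercised.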