3D virtual environments are intended to be intuitive and easy to use. When defining interaction in such environments, however, suitable paradigms for accessing objects and user-interface elements are often difficult to determine. Several solutions exist, each with its own strengths and weaknesses, but owing to the complexity of the human senses and to technical and financial constraints, none is ideal. In this paper, we describe a first step in our research into improving 3D interaction: a technique that combines proprioception with realistic force feedback to make objects and widgets in 3D space easier to access. In a user experiment, we validate the proposed technique and compare it with our earlier work.