Touch-sensitive screens enable natural interaction without any instrumentation and provide tangible feedback on the touch surface. Multi-touch interaction in particular has proven usable for 2D tasks, but the challenges of exploiting these technologies in virtual reality (VR) setups have rarely been studied. In this paper we address the challenge of allowing users to interact with stereoscopically displayed virtual environments when input is constrained to a 2D touch surface. During interaction with a large-scale touch display, a user transitions between three states: (1) beyond arm-reach distance from the surface, (2) at arm-reach distance, and (3) interaction. We analyzed the user's ability to discriminate stereoscopic display parallaxes while moving through these states, i.e., whether objects can be imperceptibly shifted onto the interactive surface and thus become accessible for natural touch interaction. Our results show that the detection thresholds for such manipulations depend on both user motion and stereoscopic parallax, and that users have difficulty discriminating whether or not they have touched an object when tangible feedback is expected.
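The manipulation described above can be sketched as a simple schedule: while the user walks toward the display, the object's stereoscopic parallax is reduced step by step, with each per-step shift kept below a motion-dependent detection threshold until the object lies in the screen plane. The following is a minimal illustrative sketch only; the function name, the threshold-per-meter value, and the linear threshold model are assumptions for demonstration, not parameters reported by the paper.

```python
def shift_schedule(initial_parallax, walk_distance, threshold_per_meter, step=0.1):
    """Compute cumulative depth shifts (meters) applied while the user walks
    toward the screen.

    Assumption (hypothetical model, not from the paper): the imperceptible
    shift is proportional to the distance the user has just moved, so each
    step of `step` meters permits at most `threshold_per_meter * step` of
    depth manipulation. Shifts are clamped so the object stops exactly at
    the screen plane (zero parallax).
    """
    shifted = 0.0
    schedule = []
    for _ in range(int(walk_distance / step)):
        # Largest shift that stays below the detection threshold for this step,
        # clamped to the parallax still remaining.
        delta = min(threshold_per_meter * step, initial_parallax - shifted)
        shifted += delta
        schedule.append(shifted)
    return schedule

# Example: an object 5 cm behind the screen, user walks 2 m toward the display.
schedule = shift_schedule(0.05, 2.0, threshold_per_meter=0.03)
```

With these (assumed) numbers the object reaches the screen plane before the user arrives, at which point it becomes touchable without any visible depth jump.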