Recent developments in touch and display technologies suggest combining multi-touch systems with stereoscopic visualization. Stereoscopic perception requires each eye to see a slightly different perspective of the same scene, which results in two distinct projections on the display. Thus, if the user wants to select a 3D stereoscopic object in such a setup, the question arises where she would touch the 2D surface to indicate the selection. A user may apply different strategies, for instance touching the midpoint between the two projections, or touching one of them. In this paper we analyze the relation between the 3D positions of stereoscopically rendered objects and the points where users touch the surface. We performed an experiment in which we determined the positions of the users' touches for objects displayed with positive, negative, or zero parallax. We found that users tend to touch between the projections for the two eyes, with an offset towards the projection for the dominant eye. Our results have implications for the development of future touch-enabled interfaces that support stereoscopic 3D visualization.
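The geometry behind this question can be sketched as follows: each eye's view of a 3D object yields a separate projection on the display plane, and a predicted touch point can be modeled as an interpolation between the two projections weighted toward the dominant eye. This is a minimal illustrative sketch, assuming a display plane at z = 0 and eye positions in display-space coordinates; the dominance weight here is a placeholder parameter, not the model fitted in the paper.

```python
import numpy as np

def project_to_surface(eye, point):
    """Project a 3D point onto the display surface (the z = 0 plane)
    along the ray from the eye through the point."""
    eye = np.asarray(eye, dtype=float)
    point = np.asarray(point, dtype=float)
    # Ray parameter t where eye + t * (point - eye) reaches z = 0.
    t = eye[2] / (eye[2] - point[2])
    return (eye + t * (point - eye))[:2]

def predicted_touch(left_eye, right_eye, obj, dominance=0.5):
    """Estimate where a user would touch the 2D surface when selecting a
    stereoscopically rendered object.

    dominance in [0, 1]: 0.5 gives the midpoint between the two eye
    projections; values above 0.5 shift the estimate toward the right
    eye's projection (assumed dominant). This linear weighting is an
    assumption for illustration, not the paper's measured offset.
    """
    p_left = project_to_surface(left_eye, obj)
    p_right = project_to_surface(right_eye, obj)
    return (1.0 - dominance) * p_left + dominance * p_right
```

For an object rendered with zero parallax (on the display plane itself), both projections coincide, so the predicted touch point is simply the object's on-surface position; for positive or negative parallax the two projections separate, and the dominance weight determines where between them the estimate falls.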