In handheld Augmented Reality (AR), the magic-lens paradigm is typically implemented by rendering the video stream captured by the back-facing camera onto the device's screen. Unfortunately, such implementations show the real world from the device's perspective rather than the user's. This perspective mismatch produces misaligned and incorrectly scaled imagery, the predominant cause of the dual-view problem, which can distort users' spatial perception. This paper presents a user study that analyzes users' expectations, spatial perception, and ability to cope with the dual-view problem by comparing device-perspective and fixed point-of-view (POV) user-perspective rendering. The results confirm the existence of the dual-view perceptual issue and show that most participants expect user-perspective rendering regardless of prior AR experience. Participants also demonstrated significantly better spatial perception with, and a preference for, the user-perspective view.