A user study on the Snap-To-Feature interaction method
ISMAR '11 Proceedings of the 2011 10th IEEE International Symposium on Mixed and Augmented Reality
Recent advances in mobile computing and augmented reality (AR) technology have led to the popularization of mobile AR applications. Touch screens are the standard interface on mobile devices and are therefore widely used in AR applications running on smartphones and similar handhelds. However, because the camera view moves unsteadily in a handheld AR environment, precise interactions such as drawing are difficult, especially when tracing physical objects. In this paper, we investigate two interaction techniques, Freeze-Set-Go and Snap-To-Feature, that help users perform more accurate touch-screen-based AR interactions. We compare the two techniques in a user experiment involving tracing physical objects, a task that arises when annotating or modeling physical objects within an AR scene. The results show that combining the two techniques significantly improves both the accuracy and the usability of touch-screen-based AR interaction.
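The core idea behind Snap-To-Feature can be illustrated with a minimal sketch: a raw 2D touch point is moved onto the nearest detected image feature (e.g. an edge or corner of the traced physical object) whenever one lies within a small pixel radius. The function name, the feature list, and the radius threshold below are assumptions for illustration, not the paper's actual implementation.

```python
import math

def snap_to_feature(touch, features, radius=20.0):
    """Hypothetical sketch of the Snap-To-Feature idea.

    If a detected feature point lies within `radius` pixels of the raw
    touch position, return that feature point; otherwise keep the raw
    touch position unchanged.
    """
    best, best_d = None, radius
    for fx, fy in features:
        d = math.hypot(fx - touch[0], fy - touch[1])
        if d <= best_d:
            best, best_d = (fx, fy), d
    return best if best is not None else touch

# Example: feature points detected along a physical object's edge
features = [(100, 100), (120, 100), (140, 100)]
print(snap_to_feature((118, 108), features))  # snaps to (120, 100)
print(snap_to_feature((300, 300), features))  # too far: raw touch kept
```

In a handheld AR setting, the feature list would come from a natural-feature tracker running on the live camera frame, so the snapped point follows the physical edge even as the camera jitters.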