A new typology of augmented reality applications
AH '12 Proceedings of the 3rd Augmented Human International Conference
We present a new classification framework for describing augmented reality (AR) applications based on where the mixing of real and computer-generated stimuli takes place. In addition to "classical" visual AR techniques, such as optical see-through and video see-through AR, our framework encompasses AR directed at the other senses as well. This "axis of mixing location" is a continuum ranging from the physical environment to the human brain. There are advantages and disadvantages to mixing at different points along the continuum, and while there is no single "best" location, we present sample usage scenarios that illustrate the expressiveness of this classification approach.