Diverse sensors are available in ubiquitous computing environments, where computing resources are embedded in the surroundings. Among them, the image sensor acquires the necessary data through a camera without any extra devices, which distinguishes it from other sensors. It can support additional services and applications by locating image codes and their IDs in real-time video. Building on this, we propose an intuitive interface operating method for ubiquitous computing environments rich in image codes. We designed a GUI driven by an image sensor that supports real-time interaction between the user and the GUI without any additional buttons or devices. The method recognizes the user's hand in real time by learning its appearance at a starting point, sets an interaction point, and operates the GUI through predefined hand gestures. We expect this study to be applicable to augmented reality and to real-time interfaces driven by the user's hand.
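The abstract names three steps: learn the hand's appearance at startup, segment the hand in later frames, and derive an interaction point from it. The paper does not specify the recognition algorithm, so the following is only a minimal sketch under assumed choices: a per-channel colour model learned from a startup sample, thresholded segmentation, and the topmost hand pixel (e.g. an extended fingertip) as the interaction point. All function names (`learn_skin_model`, `segment_hand`, `interaction_point`) are hypothetical, not from the paper.

```python
import numpy as np

def learn_skin_model(roi):
    """Learn a per-channel colour model (mean, std) from a startup
    sample of the user's hand (roi: H x W x 3 array)."""
    pixels = roi.reshape(-1, 3).astype(float)
    return pixels.mean(axis=0), pixels.std(axis=0) + 1e-6

def segment_hand(frame, mean, std, k=2.5):
    """Mark pixels whose colour lies within k standard deviations of
    the learned model in every channel."""
    dist = np.abs(frame.astype(float) - mean) / std
    return (dist < k).all(axis=2)

def interaction_point(mask):
    """Take the topmost hand pixel as the interaction point; returns
    (x, y) or None if no hand pixels were found."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    i = ys.argmin()
    return int(xs[i]), int(ys[i])

# Usage on a synthetic frame: a uniform "hand" blob on a dark background.
frame = np.zeros((10, 10, 3), dtype=np.uint8)
frame[2:8, 3:6] = (200, 150, 120)          # synthetic hand region
mean, std = learn_skin_model(frame[4:6, 3:6])  # startup learning sample
mask = segment_hand(frame, mean, std)
print(interaction_point(mask))              # topmost blob pixel
```

A real system would replace the colour model with the paper's learned hand classifier and map the interaction point into GUI coordinates, but the control flow (learn at start, then segment and point per frame) follows the abstract.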