Bare hand interface for interaction in the video see-through HMD based wearable AR environment

  • Authors:
  • Taejin Ha; Woontack Woo

  • Affiliations:
  • GIST U-VR Lab., Gwangju, South Korea; GIST U-VR Lab., Gwangju, South Korea

  • Venue:
  • ICEC '06: Proceedings of the 5th International Conference on Entertainment Computing
  • Year:
  • 2006


Abstract

In this paper, we propose a natural and intuitive bare-hand interface for a wearable augmented reality environment using a video see-through HMD. The proposed method automatically learns the color distribution of the hand through template matching and tracks the hand with the Meanshift algorithm under a dynamic background and a moving camera. Furthermore, even when users are not wearing gloves, the hand region can be separated from the arm by applying a distance transform and using the radius of the palm. Fingertip points are then extracted by convex-hull processing with a constraint based on the palm radius, so users do not need to attach fiducial markers to their fingertips. We also implemented several applications to demonstrate the usefulness of the proposed algorithm. For example, "AR-Memo" lets the user write memos in the real environment with a virtual pen augmented on the user's finger, and the saved memo can later be viewed again, augmented on the user's palm, while moving around. Finally, we evaluated the performance of the system and conducted usability studies.
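As a rough illustration of the palm and fingertip steps described in the abstract (distance transform for the palm center and radius, convex hull with a palm-radius constraint for fingertips), the following is a minimal OpenCV sketch. It is not the authors' implementation; the function name `detect_fingertips`, the `radius_factor` threshold, and the assumption that a binary hand/arm mask is already available (i.e., the color-learning and Meanshift tracking stages have run) are all hypothetical.

```python
import cv2
import numpy as np

def detect_fingertips(hand_mask, radius_factor=1.6):
    """Sketch of palm/fingertip extraction from a binary hand mask.

    hand_mask: 8-bit image, 255 inside the hand/arm region, 0 elsewhere.
    radius_factor: assumed multiple of the palm radius beyond which a
                   convex-hull point is treated as a fingertip.
    """
    # The distance transform peaks at the pixel farthest from the region
    # boundary, which we take as the palm center; the peak value is the
    # palm radius.
    dist = cv2.distanceTransform(hand_mask, cv2.DIST_L2, 5)
    _, palm_radius, _, palm_center = cv2.minMaxLoc(dist)

    # Assume the largest external contour is the hand.
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return palm_center, palm_radius, []
    hand = max(contours, key=cv2.contourArea)

    # Convex-hull vertices are fingertip candidates; keep only those
    # lying well outside the palm circle (the palm-radius constraint).
    hull = cv2.convexHull(hand)
    cx, cy = palm_center
    fingertips = [tuple(int(v) for v in p[0]) for p in hull
                  if np.hypot(p[0][0] - cx, p[0][1] - cy)
                  > radius_factor * palm_radius]
    return palm_center, palm_radius, fingertips
```

In this reading, the palm-radius constraint both suppresses hull points on the wrist/forearm side and removes the need for fiducial markers on the fingertips, since fingertip candidates are identified purely from the hand silhouette.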