Epipolar Constrained User Pushbutton Selection in Projected Interfaces

  • Authors:
  • Amit Kale, Kenneth Kwan, Christopher Jaynes

  • Affiliation:
  • UK Center for Visualization and Virtual Reality, Lexington, KY (all authors)

  • Venue:
  • CVPRW '04: Proceedings of the 2004 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW'04), Volume 10
  • Year:
  • 2004

Abstract

A nearly ubiquitous interaction in HCI applications is the task of selecting one out of a given list of options. For example, in common desktop environments, the user moves the mouse pointer to the desired option and clicks it. The analog of this action in projector-camera HCI environments involves the user raising her finger to touch one of several virtual buttons projected on a display surface. In this paper, we discuss some of the challenges involved in tracking and recognizing this task in a projected immersive environment and present a hierarchical vision-based approach to detecting intuitive gesture-based "mouse clicks" in a front-projected virtual interface. Given the difficulty of tracking user gestures directly in a projected environment, our approach first tracks shadows cast on the display by the user and exploits the multi-view geometry of the camera-projector pair to constrain a subsequent search for the user's hand position in the scene. The method requires only a simple setup step in which the projector's epipole in the camera's frame is estimated. We demonstrate how this approach is capable of detecting a contact event as a user interacts with a virtual pushbutton display. Results demonstrate that camera-based monitoring of user gestures is feasible even under difficult conditions in which the user is illuminated by changing and saturated colors.
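
The geometric idea the abstract describes can be made concrete: the user's fingertip and its shadow on the display lie on a single ray through the projector's center of projection, so in the camera image both must fall on the epipolar line joining the projector's epipole and the detected shadow point. The hand search is therefore reduced to a one-dimensional band, and a "button press" is declared when the hand tip and shadow tip converge. The following Python/NumPy sketch illustrates that constraint under stated assumptions; the function names, the skin-color mask used as a hand detector, and the pixel thresholds `max_dist` and `tol` are illustrative choices, not the authors' implementation.

```python
import numpy as np

def epipolar_line(epipole_xy, shadow_tip_xy):
    """Homogeneous line through the projector's epipole and the shadow
    fingertip in the camera image. The 3D fingertip and its shadow lie
    on one projector ray, so the hand's image must fall on this line."""
    e = np.array([epipole_xy[0], epipole_xy[1], 1.0])
    s = np.array([shadow_tip_xy[0], shadow_tip_xy[1], 1.0])
    l = np.cross(e, s)                  # line l = e x s
    return l / np.linalg.norm(l[:2])    # normalize so a*x + b*y + c is pixel distance

def candidates_on_line(skin_mask, line, max_dist=2.0):
    """Restrict hand-pixel candidates (here from an assumed skin-color
    mask) to a band of max_dist pixels around the epipolar line."""
    ys, xs = np.nonzero(skin_mask)
    d = np.abs(line[0] * xs + line[1] * ys + line[2])
    keep = d <= max_dist                # hypothetical tolerance in pixels
    return np.stack([xs[keep], ys[keep]], axis=1)

def is_contact(hand_tip_xy, shadow_tip_xy, tol=4.0):
    """Declare a contact event when fingertip and shadow converge on
    the display surface (tol is an assumed threshold)."""
    return np.hypot(hand_tip_xy[0] - shadow_tip_xy[0],
                    hand_tip_xy[1] - shadow_tip_xy[1]) <= tol
```

Framed this way, the hierarchical structure of the approach is explicit: the shadow, which is comparatively easy to segment under projector illumination, is found first, and the epipolar constraint then narrows the much harder hand search from the full image to a thin band, which is what keeps detection feasible when the user is lit by changing, saturated projected colors.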