We present a novel camera-based adaptable user interface system that uses hotspot components for 2D gesture-based interaction. A camera points at the desktop, and the image it captures appears on the user's screen. A hotspot is activated when the user's hand passes through the rectangle that defines it. For example, a right_left hotspot activates when the user moves a hand from right to left, entering and exiting the rectangle. Our system is highly flexible because it allows the user to customize the interface as follows: (1) hotspot areas can be created anywhere within the camera-captured image; (2) new commands can be assigned to individual hotspots or to composite hotspot sequences (e.g., right_left for previous webpage; up+right+down for webpage reload); (3) a physical workspace on the desktop can be defined by pointing the camera at any location; (4) different hotspot layouts can be created and saved for different applications. The system runs in real time with an inexpensive webcam and uses machine learning to automatically detect skin areas for robust gesture recognition.
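The directional-hotspot mechanism described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes the system already tracks a single hand centroid (x, y) per frame, and the class and method names are hypothetical. A hotspot records the edge through which the hand entered its rectangle and emits a gesture string such as "right_left" when the hand exits through another edge.

```python
class Hotspot:
    """A rectangular screen region that fires a directional gesture when a
    tracked hand enters through one edge and exits through another.
    Illustrative sketch only; names are not from the original system."""

    def __init__(self, name, x, y, w, h):
        self.name = name
        self.x, self.y, self.w, self.h = x, y, w, h
        self.entry_side = None  # edge through which the hand entered, if inside

    def _contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

    def _nearest_side(self, px, py):
        # Approximate the crossing edge as the rectangle edge closest to
        # the hand position at the moment of entry or exit.
        dists = {
            "left": abs(px - self.x),
            "right": abs(self.x + self.w - px),
            "top": abs(py - self.y),
            "bottom": abs(self.y + self.h - py),
        }
        return min(dists, key=dists.get)

    def update(self, px, py):
        """Feed one tracked hand position per frame; return a gesture
        string like 'right_left' when the hand exits, else None."""
        inside = self._contains(px, py)
        if inside and self.entry_side is None:
            self.entry_side = self._nearest_side(px, py)   # hand entered
        elif not inside and self.entry_side is not None:
            gesture = f"{self.entry_side}_{self._nearest_side(px, py)}"
            self.entry_side = None                         # hand exited
            return gesture
        return None


# Example: a hand crossing a 100x100 hotspot from right to left,
# which in the system above could be bound to "previous webpage".
hs = Hotspot("back", x=0, y=0, w=100, h=100)
events = [hs.update(x, 50) for x in (120, 90, 50, 10, -20)]
gestures = [g for g in events if g]   # -> ["right_left"]
```

Composite sequences such as up+right+down could then be matched by concatenating the gesture strings emitted by several hotspots and comparing the sequence against a user-defined command table.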