One trend in computing environments today is toward more natural interaction; another is to make hardware invisible to the user. Both ideas converge in ubiquitous computing, of which the DigitalDesk is an example. In this paper we concentrate on an input device for the DigitalDesk, namely the user's fingertip, which is made to act like a mouse. Tracking such an input device is common to a number of augmented reality environments and involves vision and motion analysis. However, previous attempts have focused more on the vision aspect of tracking general objects than on exploiting what is already known about the user's hand, which is the approach taken here. Our goal was to track the user's fingertip in real time, as fast as possible, so that the system could be compared with other input devices using models such as Fitts' law. Our system is shown to comply with the law adequately.
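Fitts' law, the model mentioned above, predicts the time to acquire a target from its distance and width. A minimal sketch of the Shannon formulation is below; the regression coefficients `a` and `b` are illustrative placeholders, since real values must be fitted from pointing experiments with the device under test.

```python
import math

def fitts_mt(distance, width, a=0.0, b=0.1):
    """Predicted movement time in seconds under the Shannon
    formulation of Fitts' law: MT = a + b * log2(D/W + 1).

    a, b: device-specific regression coefficients (placeholder
    values here, not measured from any real input device).
    """
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# Example: a target 200 px away and 20 px wide has an index of
# difficulty of log2(11) ~= 3.46 bits.
print(round(fitts_mt(200, 20), 3))
```

Fitting `a` and `b` to movement-time data from the fingertip tracker and comparing the resulting throughput against a mouse is the kind of evaluation the abstract refers to.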