Many people suffer from conditions that cause deterioration of motor control, making access to a computer with traditional input devices difficult. In particular, they may lose control of hand movement to the extent that the standard mouse cannot be used as a pointing device. Most current alternatives use markers or specialized hardware, such as wearable devices, to track a user's movement and translate it into pointer movement. These approaches may be perceived as intrusive. Camera-based assistive systems that visually track features on the user's body often require cumbersome manual adjustment. This paper introduces an enhanced computer-vision-based strategy in which features, for example on a user's face, viewed through an inexpensive USB camera, are tracked and translated into pointer movement. The main contributions of this paper are (1) enhancing a video-based interface with a mechanism for mapping feature movement to pointer movement that allows users to navigate to all areas of the screen even with very limited physical movement, and (2) providing a customizable, hierarchical navigation framework for human-computer interaction (HCI). This framework enables effective use of the vision-based interface for accessing multiple applications in an autonomous setting. Experiments with several users demonstrate the effectiveness of the mapping strategy and its use within the application framework as a practical tool for desktop users with disabilities.
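To illustrate how a feature-to-pointer mapping can reach every screen position despite very limited physical movement, the sketch below uses a relative, "joystick-style" update: the pointer keeps moving as long as the tracked feature is displaced from its rest position, so a small sustained head motion can carry the pointer across the full screen. This is a minimal illustration only; the paper's actual mapping, gain values, and dead-zone handling are not specified here, and all parameter names are invented.

```python
# Hypothetical sketch of a relative mapping from tracked-feature
# displacement to pointer movement. Gain and dead-zone values are
# illustrative assumptions, not the paper's actual parameters.

def update_pointer(pointer, feature, rest, screen, gain=8.0, dead_zone=2.0):
    """Advance the pointer based on the feature's offset from rest.

    Because the pointer moves for as long as the feature stays
    displaced, even a small sustained movement can reach any area
    of the screen -- the feature itself never needs to travel far.
    """
    dx = feature[0] - rest[0]
    dy = feature[1] - rest[1]
    # Suppress tracker jitter near the rest position.
    if abs(dx) < dead_zone:
        dx = 0.0
    if abs(dy) < dead_zone:
        dy = 0.0
    # Scale the displacement and clamp the pointer to the screen.
    x = min(max(pointer[0] + gain * dx, 0), screen[0] - 1)
    y = min(max(pointer[1] + gain * dy, 0), screen[1] - 1)
    return (x, y)
```

Calling this once per video frame yields a pointer velocity proportional to the feature's displacement; a dwell-time or similar mechanism would typically supply the click event.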