We are developing a brain-computer interface (BCI) [1] for control of an upper-limb prosthesis. Our approach exploits electrical neural activity for motor-intent estimation and eye-gaze direction for target selection. These data streams are augmented by computer vision (CV) for 3D scene reconstruction and are integrated with a hierarchical controller to achieve semi-autonomous control. User interfaces for effective control of the many degrees of freedom (DOF) of advanced prosthetic arms are not yet available [2]. Ideally, the combined arm and interface technology provides the user with reliable and dexterous capability for reaching, grasping, and fine-scale manipulation. Technologies that improve arm embodiment, i.e., the impression by the amputee that the arm is a natural part of their body concept, present an important and difficult challenge to the human-robot interaction research community. Such embodiment is clearly predicated on cross-disciplinary advances, including accurate intent estimation and an algorithmic basis for natural arm control.
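The control architecture described above can be sketched in code. The following is a minimal, hypothetical illustration (all names and the intent vocabulary are assumptions, not the authors' implementation): decoded neural intent supplies the high-level command, eye gaze combined with CV scene reconstruction supplies the target, and the hierarchical controller expands the command into a low-level motion sequence autonomously.

```python
# Hypothetical sketch of the semi-autonomous hierarchy: the user supplies only
# a coarse intent and a gaze point; the controller handles the rest.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SceneObject:
    """An object recovered by the CV 3D scene reconstruction (assumed form)."""
    name: str
    position: Tuple[float, float, float]

def select_target(gaze_point: Tuple[float, float, float],
                  objects: List[SceneObject]) -> SceneObject:
    """Pick the reconstructed object nearest the 3D gaze point."""
    def dist2(obj: SceneObject) -> float:
        return sum((a - b) ** 2 for a, b in zip(obj.position, gaze_point))
    return min(objects, key=dist2)

def hierarchical_step(intent: str,
                      gaze_point: Tuple[float, float, float],
                      objects: List[SceneObject]) -> List[str]:
    """Expand a decoded high-level intent into an autonomous action sequence."""
    target = select_target(gaze_point, objects)
    if intent == "reach":
        return [f"plan_path_to {target.name}", "execute_reach"]
    if intent == "grasp":
        return [f"plan_path_to {target.name}", "preshape_hand", "close_grip"]
    return ["hold_pose"]  # unrecognized intent: fail safe

if __name__ == "__main__":
    scene = [SceneObject("mug", (0.4, 0.1, 0.2)),
             SceneObject("phone", (0.1, 0.5, 0.2))]
    # Gaze near the mug plus a decoded "grasp" intent yields a grasp sequence.
    print(hierarchical_step("grasp", (0.38, 0.12, 0.21), scene))
    # → ['plan_path_to mug', 'preshape_hand', 'close_grip']
```

The split mirrors the semi-autonomous design goal: the fragile, high-bandwidth part of the task (path planning, hand preshaping) is delegated to the controller, so the BCI only needs to decode a small discrete intent vocabulary.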