“I want that”: Human-in-the-loop control of a wheelchair-mounted robotic arm
Applied Bionics and Biomechanics - Assistive and Rehabilitation Robotics
El-E ("Ellie") is a prototype assistive robot designed to help people with severe motor impairments manipulate everyday objects. When given a 3D location, El-E can autonomously approach the location and pick up a nearby object. Based on interviews with patients with amyotrophic lateral sclerosis (ALS), we have developed and tested three distinct interfaces that enable a user to provide a 3D location to El-E and thereby select an object to be manipulated: an ear-mounted laser pointer, a hand-held laser pointer, and a touch screen interface. In this paper, we present the results of a user study comparing these three interfaces across a total of 134 trials involving eight patients with varying levels of impairment recruited from the Emory ALS Clinic. During this study, participants used the three interfaces to select everyday objects to be approached, grasped, and lifted off the ground. The three interfaces enabled motor-impaired users to command a robot to pick up an object with a 94.8% overall success rate after less than 10 minutes of learning to use each interface. On average, users selected objects 69% more quickly with the laser pointer interfaces than with the touch screen interface. We also found substantial variation in user preference. With respect to the Revised ALS Functional Rating Scale (ALSFRS-R), users with greater upper-limb mobility tended to prefer the hand-held laser pointer, while those with less upper-limb mobility tended to prefer the ear-mounted laser pointer. Despite the greater efficiency of the laser pointer interfaces, three patients preferred the touch screen interface, which has unique potential for manipulating remote objects out of the user's line of sight. In summary, these results indicate that robots can enhance accessibility by supporting multiple interfaces.
Furthermore, this work demonstrates that the communication of 3D locations during human-robot interaction can serve as a powerful abstraction barrier that supports distinct interfaces to assistive robots while using identical, underlying robotic functionality.
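The abstract's headline figures follow from simple per-trial arithmetic. As a minimal sketch: the per-interface outcome counts are not given in the abstract, so the 127-out-of-134 split below is a hypothetical breakdown chosen only because it is consistent with the reported 94.8% overall success rate.

```python
# Sketch of the arithmetic behind the reported success rate.
# NOTE: 127 successes out of 134 trials is an ASSUMED split; the
# abstract reports only the total trial count (134) and the
# overall success rate (94.8%).

def success_rate(successes: int, trials: int) -> float:
    """Overall success rate as a percentage, rounded to one decimal place."""
    return round(100.0 * successes / trials, 1)

print(success_rate(127, 134))  # → 94.8
```

Any success count other than 127 would round to a different one-decimal percentage for 134 trials, which is why this split is the natural reading of the reported figure.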