With the rising concern about the needs of people with physical disabilities and the aging of the population, there is growing interest in creating electronic devices that improve the lives of physically handicapped and elderly people. One such solution is the adaptation of electric wheelchairs to give them environmental perception, more intelligent capabilities, and more adequate human–machine interaction. This paper focuses on the development of a user-friendly multimodal interface, integrated into the IntellWheels project. This simple multimodal human-robot interface allows the connection of several input modules, enabling wheelchair control through flexible sequences of distinct input types (voice, facial expressions, head movements, keyboard, and joystick). The system can store user-defined associations between input sequences and their corresponding output commands. The tests performed demonstrate the efficiency of the system and the capabilities of this multimodal interface.