Modern mobile devices provide many functionalities, and new ones are added at a breakneck pace. Unfortunately, browsing the menus and accessing the functions of a mobile phone is not a trivial task for visually impaired users. People with low vision typically rely on screen readers and voice commands. However, depending on the situation, screen readers are not ideal because blind people may need their hearing for safety, and automatic recognition of voice commands is challenging in noisy environments. Novel smart watch technologies provide an interesting opportunity to design new forms of user interaction with mobile phones. We present our first work towards the realization of a system, based on the combination of a mobile phone and a smart watch for gesture control, for assisting people with low vision in daily life activities. More specifically, we propose a novel approach for gesture recognition based on global alignment kernels, which is shown to be effective in the challenging scenario of user-independent recognition. This method is used to build a gesture-based user interaction module that is embedded into a system targeted at visually impaired users, which will also integrate several other modules. We present two of them: one for identifying wet floor signs, the other for automatic recognition of predefined logos.
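The gesture-recognition approach mentioned above builds on global alignment kernels, which score the similarity of two variable-length accelerometer sequences by summing a Gaussian local kernel over all monotone alignments via dynamic programming. The sketch below is a minimal, illustrative implementation of that standard recursion (not the authors' code); the function names, the bandwidth parameter `sigma`, and the toy 3-axis samples are assumptions for illustration only.

```python
import math


def gaussian_k(a, b, sigma=1.0):
    # Local Gaussian kernel between two 3-axis accelerometer samples.
    d2 = sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return math.exp(-d2 / (2.0 * sigma ** 2))


def global_alignment_kernel(x, y, sigma=1.0):
    # Dynamic program over all monotone alignments of x and y:
    # M[i][j] accumulates the kernel mass of alignments ending at
    # sample pair (i, j); the three predecessors correspond to a
    # step in x, a step in y, or a simultaneous step in both.
    n, m = len(x), len(y)
    M = [[0.0] * (m + 1) for _ in range(n + 1)]
    M[0][0] = 1.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            k = gaussian_k(x[i - 1], y[j - 1], sigma)
            M[i][j] = k * (M[i - 1][j] + M[i][j - 1] + M[i - 1][j - 1])
    return M[n][m]


# Toy usage: a sequence compared to itself scores higher than
# against a very different sequence of the same length.
gesture_a = [(0.0, 0.0, 1.0), (0.1, 0.0, 1.0), (0.0, 0.1, 1.0)]
gesture_b = [(5.0, 5.0, 5.0), (5.0, 5.0, 5.0), (5.0, 5.0, 5.0)]
print(global_alignment_kernel(gesture_a, gesture_a) >
      global_alignment_kernel(gesture_a, gesture_b))
```

In a classification pipeline such as the one the abstract describes, the pairwise kernel values would typically be assembled into a Gram matrix and fed to a kernel classifier (e.g. an SVM with a precomputed kernel), enabling user-independent recognition from labelled gesture examples.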