The MobileASL project aims to increase accessibility by enabling Deaf people to communicate over video cell phones in their native language, American Sign Language (ASL). Real-time video over cell phones is a computationally intensive task that quickly drains the battery, rendering the cell phone useless. Properties of conversational sign language allow us to save power and bits: because of conversational turn-taking, lower frame rates are possible when one person is not signing, and signing can potentially employ a lower frame rate than fingerspelling. We conduct a user study with native signers to examine the intelligibility of varying the frame rate based on activity in the video. We then describe several methods for automatically determining, in real time, whether the person in the video stream is signing or not. Our results show that varying the frame rate during turn-taking is an effective way to save power without sacrificing intelligibility, and that automatic activity analysis is feasible.
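To make the idea concrete, here is a minimal sketch of activity-driven frame-rate selection. The abstract does not specify the detection method, so this uses simple frame differencing as an illustrative stand-in; the function names, thresholds, and frame rates (`activity_score`, `choose_frame_rate`, 10 fps while signing, 1 fps while idle) are assumptions for the example, not values from the paper.

```python
import numpy as np


def activity_score(prev_frame, curr_frame, pixel_threshold=15):
    """Illustrative activity measure (not the paper's method): the
    fraction of pixels whose grayscale intensity changed by more than
    `pixel_threshold` between consecutive frames."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float((diff > pixel_threshold).mean())


def choose_frame_rate(score, signing_fps=10, idle_fps=1, activity_cutoff=0.01):
    """Send video at the full rate while activity suggests signing;
    drop to a low rate during the quiet half of a turn-taking exchange.
    All rates and the cutoff are assumed example values."""
    return signing_fps if score > activity_cutoff else idle_fps


# Usage with synthetic 240x320 grayscale frames.
prev = np.zeros((240, 320), dtype=np.uint8)
still = prev.copy()                 # no motion: idle side of the conversation
moving = prev.copy()
moving[100:140, 100:140] = 200      # a moving region, e.g. the signer's hands

print(choose_frame_rate(activity_score(prev, still)))   # low rate when idle
print(choose_frame_rate(activity_score(prev, moving)))  # full rate when signing
```

In a real encoder this decision would feed into the capture or encoding loop; the paper's user study addresses the key question this sketch sidesteps, namely which reduced frame rates remain intelligible to native signers.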