The reading assistant: eye gaze triggered auditory prompting for reading remediation
UIST '00 Proceedings of the 13th annual ACM symposium on User interface software and technology
Gaze and Speech in Attentive User Interfaces
ICMI '00 Proceedings of the Third International Conference on Advances in Multimodal Interfaces
Ultraconservative online algorithms for multiclass problems
The Journal of Machine Learning Research
A robust algorithm for reading detection
Proceedings of the 2001 workshop on Perceptive user interfaces
Full-time wearable headphone-type gaze detector
CHI '06 Extended Abstracts on Human Factors in Computing Systems
Activity Recognition of Assembly Tasks Using Body-Worn Microphones and Accelerometers
IEEE Transactions on Pattern Analysis and Machine Intelligence
Eye Tracking Methodology: Theory and Practice
Recognizing context for annotating a live life recording
Personal and Ubiquitous Computing - Memory and Sharing of Experiences
A gaze-based study for investigating the perception of visual realism in simulated scenes
ACM Transactions on Applied Perception (TAP)
Rapid Prototyping of Activity Recognition Applications
IEEE Pervasive Computing
Discovery of activity patterns using topic models
UbiComp '08 Proceedings of the 10th international conference on Ubiquitous computing
Automated eye-movement protocol analysis
Human-Computer Interaction
Real-world vision: Selective perception and task
ACM Transactions on Applied Perception (TAP)
Robust Recognition of Reading Activity in Transit Using Wearable Electrooculography
Pervasive '08 Proceedings of the 6th International Conference on Pervasive Computing
Wearable EOG goggles: Seamless sensing and context-awareness in everyday environments
Journal of Ambient Intelligence and Smart Environments
A long-term evaluation of sensing modalities for activity recognition
UbiComp '07 Proceedings of the 9th international conference on Ubiquitous computing
Performance metrics for activity recognition
ACM Transactions on Intelligent Systems and Technology (TIST)
Eye Movement Analysis for Activity Recognition Using Electrooculography
IEEE Transactions on Pattern Analysis and Machine Intelligence
What's in the Eyes for Context-Awareness?
IEEE Pervasive Computing
Machine Recognition of Human Activities: A Survey
IEEE Transactions on Circuits and Systems for Video Technology
Recognition of hearing needs from body and eye movements to improve hearing instruments
Pervasive'11 Proceedings of the 9th international conference on Pervasive computing
Recognition of visual memory recall processes using eye movement analysis
Proceedings of the 13th international conference on Ubiquitous computing
EyeContext: recognition of high-level contextual cues from human visual behaviour
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
My reading life: towards utilizing eyetracking on unmodified tablets and phones
Proceedings of the 2013 ACM conference on Pervasive and ubiquitous computing adjunct publication
I know what you are reading: recognition of document types using mobile eye tracking
Proceedings of the 2013 International Symposium on Wearable Computers
A tutorial on human activity recognition using body-worn inertial sensors
ACM Computing Surveys (CSUR)
Tracking how we read: activity recognition for cognitive tasks
XRDS: Crossroads, The ACM Magazine for Students - Wearable Computing: Getting Dressed in Tech
Reading is one of the most well-studied visual activities. Vision research has traditionally focused on understanding the perceptual and cognitive processes involved in reading. In this work, we recognize reading activity by jointly analyzing the eye and head movements of people in an everyday environment. Eye movements are recorded using an electrooculography (EOG) system; body movements are recorded using body-worn inertial measurement units. We compare two approaches for continuous recognition of reading: string matching (STR), which explicitly models the characteristic horizontal saccades of reading, and a support vector machine (SVM), which relies on 90 features extracted from the eye movement data. We evaluate both methods in a study with eight participants reading while sitting at a desk, standing, walking indoors and outdoors, and riding a tram. We introduce a method to segment reading activity by exploiting the sensorimotor coordination of eye and head movements during reading. Using person-independent training, we obtain an average precision for recognizing reading of 88.9% (recall 72.3%) using STR and of 87.7% (recall 87.9%) using SVM over all participants. We show that the proposed segmentation scheme improves the performance of recognizing reading events by more than 24%. Our work demonstrates that the joint analysis of eye and body movements is beneficial for reading recognition, and it opens up a discussion of the wider applicability of a multimodal recognition approach to other visual and physical activities.
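The abstract's STR approach detects reading by matching the characteristic saccade pattern of reading: runs of small rightward saccades (word to word), occasional small leftward regressions, and a large leftward return sweep at each line end. The paper's actual encoding, thresholds, and template are not reproduced here; the following is a minimal illustrative sketch that assumes signed horizontal saccade amplitudes have already been extracted from the EOG signal, with hypothetical amplitude thresholds and a hypothetical regular-expression template.

```python
import re

def encode_saccades(saccades):
    """Map signed saccade amplitudes (degrees; rightward positive) to characters:
    'r' = small rightward saccade (word-to-word movement),
    'l' = small leftward saccade (regression),
    'L' = large leftward saccade (return sweep to the next line),
    'R' = large rightward saccade (typically non-reading).
    The 2-degree threshold is illustrative, not from the paper."""
    chars = []
    for amp in saccades:
        if 0 < amp <= 2.0:
            chars.append('r')
        elif -2.0 <= amp < 0:
            chars.append('l')
        elif amp < -2.0:
            chars.append('L')
        else:
            chars.append('R')
    return ''.join(chars)

def reading_score(encoded):
    """Fraction of the saccade string covered by reading-like lines:
    a run of rightward saccades (with optional single regressions)
    ending in a large leftward return sweep."""
    pattern = re.compile(r'(?:r|rl){3,}L')
    matched = sum(len(m) for m in pattern.findall(encoded))
    return matched / max(len(encoded), 1)

# Two lines of "reading", each ending in a return sweep:
seq = encode_saccades([1.0, 1.2, -0.5, 1.1, 0.9, 1.0, -4.0,
                       1.0, 1.1, 1.2, 0.8, -4.2])
print(seq)                        # prints "rrlrrrLrrrrL"
print(reading_score(seq) > 0.5)   # prints True
```

A sliding-window version of such a score, thresholded per window, would yield the kind of continuous reading/non-reading labeling that the abstract's precision and recall figures are computed over.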