In this work, we investigate eye movement analysis as a new sensing modality for activity recognition. Eye movement data were recorded using an electrooculography (EOG) system. We first describe and evaluate algorithms for detecting three eye movement characteristics from EOG signals (saccades, fixations, and blinks) and propose a method for assessing repetitive patterns of eye movements. We then devise 90 different features based on these characteristics and select a subset of them using minimum redundancy maximum relevance (mRMR) feature selection. We validate the method in an eight-participant study in an office environment with an example set of five activity classes: copying a text, reading a printed paper, taking handwritten notes, watching a video, and browsing the Web. We also include periods with no specific activity (the NULL class). Using a support vector machine (SVM) classifier and person-independent (leave-one-person-out) training, we obtain an average precision of 76.1 percent and recall of 70.5 percent over all classes and participants. This work demonstrates the promise of eye-based activity recognition (EAR) and opens up discussion on the wider applicability of EAR to other activities that are difficult, or even impossible, to detect using common sensing modalities.
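To illustrate the first step of such a pipeline, the sketch below detects candidate saccades in a one-dimensional EOG trace with a simple velocity threshold. This is a minimal illustration, not the detection algorithm evaluated in the paper; the function name, sampling rate, and threshold values are assumptions chosen for the example.

```python
# Hypothetical sketch: velocity-threshold saccade detection on a 1-D EOG trace.
# The sampling rate and threshold are illustrative, not taken from the paper.

def detect_saccades(eog, fs=128, vel_thresh=50.0):
    """Return (start, end) sample indices of candidate saccades.

    eog: sequence of EOG amplitudes (arbitrary units)
    fs: sampling rate in Hz (assumed)
    vel_thresh: velocity threshold in units per second (assumed)
    """
    saccades = []
    start = None
    for i in range(1, len(eog)):
        # Approximate eye velocity by the first difference scaled to units/s.
        vel = abs(eog[i] - eog[i - 1]) * fs
        if vel > vel_thresh:
            if start is None:
                start = i - 1  # saccade onset
        elif start is not None:
            saccades.append((start, i))  # saccade offset
            start = None
    if start is not None:
        saccades.append((start, len(eog) - 1))
    return saccades

# A flat signal with one step change yields a single detected saccade.
signal = [0.0] * 10 + [5.0] * 10
print(detect_saccades(signal))
```

Intervals between detected saccades would then be treated as fixation candidates, from which features for the mRMR selection stage could be computed.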