In this work we present EyeContext, a system to infer high-level contextual cues from human visual behaviour. We conducted a user study to record the eye movements of four participants over a full day of their daily life, totalling 42.5 hours of eye movement data. Participants were asked to self-annotate four non-mutually exclusive cues: social (interacting with somebody vs. no interaction), cognitive (concentrated work vs. leisure), physical (physically active vs. not active), and spatial (inside vs. outside a building). We evaluate a proof-of-concept EyeContext system that combines an encoding of eye movements into strings with a spectrum string kernel support vector machine (SVM) classifier. Our results demonstrate the large information content available in long-term human visual behaviour and open up new avenues for research on eye-based behavioural monitoring and life logging.
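To make the classification pipeline concrete, the following is a minimal Python sketch of a spectrum string kernel SVM, under stated assumptions: the saccade alphabet ('L', 'R', 'U', 'D' for directions, 'F' for fixation), the toy sequences, the cue labels, and the k-mer length are all illustrative placeholders, not the encoding or data used in the actual EyeContext study.

import numpy as np
from collections import Counter
from sklearn.svm import SVC

def spectrum_kernel(a, b, k=3):
    # k-spectrum kernel: inner product of the k-mer count vectors of two strings.
    ca = Counter(a[i:i + k] for i in range(len(a) - k + 1))
    cb = Counter(b[i:i + k] for i in range(len(b) - k + 1))
    return float(sum(ca[m] * cb[m] for m in ca if m in cb))

def gram_matrix(X, Y, k=3):
    # Pairwise kernel values between two lists of eye movement strings.
    return np.array([[spectrum_kernel(x, y, k) for y in Y] for x in X])

# Toy sequences: each character is a hypothetical eye movement event,
# e.g. 'L'/'R'/'U'/'D' saccade directions and 'F' a fixation.
train = ["LRLRFFLR", "FFFFLFFF", "UDUDLRUD", "FFLFFFRF"]
labels = [1, 0, 1, 0]  # e.g. 1 = concentrated work, 0 = leisure (cognitive cue)

clf = SVC(kernel="precomputed")
clf.fit(gram_matrix(train, train), labels)

test = ["LRLRLRFF"]
print(clf.predict(gram_matrix(test, train)))  # predicted cue label for the unseen sequence

Passing kernel="precomputed" lets the SVM operate directly on the string Gram matrix, so variable-length eye movement sequences never need to be flattened into fixed-length feature vectors.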