Hearing instruments (HIs) have emerged as true pervasive computers: they continuously adapt the hearing program to the user's context. However, current HIs cannot distinguish different hearing needs within the same acoustic environment. In this work, we explore how information derived from body and eye movements can improve the recognition of such hearing needs. We conduct an experiment that provokes an acoustic environment in which two different hearing needs arise: actively conversing, and working while colleagues hold a conversation nearby in a noisy office. For eleven participants, we record body movements at nine body locations, eye movements using electrooculography (EOG), and sound using commercial HIs. Using a support vector machine (SVM) classifier with person-independent training, we improve the recognition accuracy from 77% based on sound alone to 92% using body movements. With a view to future integration into an HI, we then analyze in detail the sensors attached to the head, achieving a best accuracy of 86% with eye movements compared to 84% with head movements. Our work demonstrates the potential of additional sensor modalities for future HIs and motivates investigating the wider applicability of this approach to further hearing situations and needs.
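To make the "person-independent training" protocol concrete, the following is a minimal sketch, not the authors' implementation: a leave-one-subject-out evaluation of an SVM, assuming feature vectors per analysis window (e.g. from body movement, EOG, or sound) have already been computed. The function name, feature layout, and label encoding are illustrative assumptions.

```python
# Hedged sketch of person-independent SVM evaluation via
# leave-one-subject-out cross-validation (illustrative only;
# not the original implementation from the paper).
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def person_independent_accuracy(X, y, subjects):
    """X: (n_windows, n_features) precomputed features per window.
    y: hearing-need label per window (e.g. 0 = working while others
       converse, 1 = actively conversing).
    subjects: participant id per window; each participant is held out
    once, so the classifier is never trained on the test person."""
    logo = LeaveOneGroupOut()
    accuracies = []
    for train_idx, test_idx in logo.split(X, y, groups=subjects):
        # Scaling is fit on the training folds only, so no statistics
        # from the held-out person leak into the model.
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
        clf.fit(X[train_idx], y[train_idx])
        accuracies.append(clf.score(X[test_idx], y[test_idx]))
    return float(np.mean(accuracies))
```

Averaging per-subject fold accuracies, as above, is one common way to report a single person-independent figure such as the 77%, 84%, 86%, and 92% values quoted in the abstract; the paper's exact aggregation may differ.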