Introduction to statistical pattern recognition (2nd ed.)
Specifying gestures by example
Proceedings of the 18th annual conference on Computer graphics and interactive techniques
Activity theory as a potential framework for human-computer interaction research
Context and consciousness
Studying context: a comparison of activity theory, situated action models, and distributed cognition
Context and consciousness
Activity and Location Recognition Using Wearable Sensors
IEEE Pervasive Computing
Hand Tension as a Gesture Segmentation Cue
Proceedings of Gesture Workshop on Progress in Gestural Interaction
Towards a Better Understanding of Context and Context-Awareness
HUC '99 Proceedings of the 1st international symposium on Handheld and Ubiquitous Computing
Probabilistic Motion Parameter Models for Human Activity Recognition
ICPR '02 Proceedings of the 16th International Conference on Pattern Recognition (ICPR'02) - Volume 1
The Conference Assistant: Combining Context-Awareness with Wearable Computing
ISWC '99 Proceedings of the 3rd IEEE International Symposium on Wearable Computers
Adding Generic Contextual Capabilities to Wearable Computers
ISWC '98 Proceedings of the 2nd IEEE International Symposium on Wearable Computers
Inferring Activities from Interactions with Objects
IEEE Pervasive Computing
Layered representations for learning and inferring office activity from multiple sensory channels
Computer Vision and Image Understanding - Special issue on event detection in video
Fine-Grained Activity Recognition by Aggregating Abstract Object Usage
ISWC '05 Proceedings of the Ninth IEEE International Symposium on Wearable Computers
Wearable Hand Activity Recognition for Event Summarization
ISWC '05 Proceedings of the Ninth IEEE International Symposium on Wearable Computers
A similarity measure for motion stream segmentation and recognition
MDM '05 Proceedings of the 6th international workshop on Multimedia data mining: mining integrated media and complex data
Activity Recognition of Assembly Tasks Using Body-Worn Microphones and Accelerometers
IEEE Transactions on Pattern Analysis and Machine Intelligence
Free-sketch recognition: putting the chi in sketching
CHI '08 Extended Abstracts on Human Factors in Computing Systems
Context-Aware Computing Applications
WMCSA '94 Proceedings of the 1994 First Workshop on Mobile Computing Systems and Applications
Office activity recognition using hand posture cues
BCS-HCI '08 Proceedings of the 22nd British HCI Group Annual Conference on People and Computers: Culture, Creativity, Interaction - Volume 2
A hybrid discriminative/generative approach for modeling human activities
IJCAI'05 Proceedings of the 19th international joint conference on Artificial intelligence
Activity recognition based on intra and extra manipulation of everyday objects
UCS'07 Proceedings of the 4th international conference on Ubiquitous computing systems
IEEE Transactions on Robotics
Activity recognition using an egocentric perspective of everyday objects
UIC'07 Proceedings of the 4th international conference on Ubiquitous Intelligence and Computing
Nomadic gestures: A technique for reusing gesture commands for frequent ambient interactions
Journal of Ambient Intelligence and Smart Environments
Automatic recognition of object size and shape via user-dependent measurements of the grasping hand
International Journal of Human-Computer Studies
Activity recognition plays a key role in providing information to context-aware applications. When attempting to model activities, some researchers have looked to Activity Theory, which holds that activities have objectives and are accomplished through interactions with tools and objects. The goal of this paper is to determine whether hand posture can serve as a cue to the types of interactions a user has with objects in a desk/office environment. Furthermore, we wish to determine whether hand posture is user-independent when different users interact with the same objects in a natural manner. Our experiments indicate that (a) hand posture can be used to determine object interaction, with accuracy rates around 97%, and (b) hand posture is dependent upon the individual user when users are allowed to interact with objects as they naturally would.
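The classification task the abstract describes could be sketched as follows. This is a minimal, hypothetical illustration only: the feature vectors (three invented joint angles), the object labels, and the nearest-neighbor classifier are all assumptions for demonstration, not the paper's actual features or method.

```python
import numpy as np

def nearest_neighbor_classify(train_X, train_y, sample):
    """Label a posture sample with the class of its closest training vector."""
    dists = np.linalg.norm(train_X - sample, axis=1)
    return train_y[int(np.argmin(dists))]

# Synthetic joint-angle vectors (degrees) standing in for hand-posture
# features; the real system would use measurements such as data-glove
# sensor readings. Labels name the object being interacted with.
train_X = np.array([
    [10.0, 15.0, 20.0],   # relaxed posture, e.g. hand resting on a mouse
    [60.0, 70.0, 65.0],   # precision grip, e.g. holding a pen
    [40.0, 45.0, 90.0],   # power grip, e.g. gripping a phone handset
])
train_y = ["mouse", "pen", "phone"]

query = np.array([58.0, 72.0, 63.0])
print(nearest_neighbor_classify(train_X, train_y, query))  # prints "pen"
```

A user-dependence test like the one the abstract reports could then be run by training on one user's posture vectors and evaluating on another's, comparing accuracy against the within-user case.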