There are many ways to capture human gestures. This paper extends the growing trend of using sensors to capture movements and interpret them as gestures. However, rather than placing sensors on people, the focus is on attaching sensors (i.e., strain gauges and accelerometers) to the tools that people use. By instrumenting a set of handles that can be fitted with a variety of effectors (e.g., knives, forks, spoons, screwdrivers, spanners, saws, etc.), it is possible to capture both the variation in grip force applied to the handle as the tool is used and the movements made with the handle. These data are sent wirelessly (using ZigBee) to a computer, where distinct patterns of movement can be classified. Different approaches to the classification of activity are considered. This provides a way of combining the use of real tools in physical space with the representation of actions on a computer. The approach could be used to capture actions during manual tasks, say in maintenance work, or to support the development of movements, say in rehabilitation.
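As a rough illustration of the classification step, the sketch below reduces a window of (grip force, acceleration) samples from an instrumented handle to a small feature vector and matches it against per-activity centroids. The feature choice, class names, and centroid values are all hypothetical; the paper considers richer classification approaches than this nearest-centroid example.

```python
# Sketch (hypothetical data and labels): classify a window of handle-sensor
# samples by nearest-centroid matching on simple summary features.
import math

def features(samples):
    """Reduce a window of (grip, accel) samples to (mean grip, mean |accel|)."""
    n = len(samples)
    mean_grip = sum(g for g, _ in samples) / n
    mean_accel = sum(abs(a) for _, a in samples) / n
    return (mean_grip, mean_accel)

def nearest_centroid(feat, centroids):
    """Return the label whose centroid is closest (Euclidean distance)."""
    return min(centroids, key=lambda label: math.dist(feat, centroids[label]))

# Hypothetical centroids, as if learned from labelled tool-use recordings:
CENTROIDS = {
    "sawing":       (8.0, 3.0),   # firm grip, large oscillating movement
    "screwdriving": (5.0, 1.0),   # moderate grip, small movement
    "idle":         (0.5, 0.1),   # tool held loosely, nearly at rest
}

window = [(7.8, 2.9), (8.3, -3.2), (7.9, 3.1), (8.1, -2.8)]
print(nearest_centroid(features(window), CENTROIDS))  # prints "sawing"
```

In a real system the windows would stream in over the wireless link, and the summary statistics would typically be extended (variance, frequency content, per-axis acceleration) before classification.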