In this paper, we propose a body-mounted system that captures user experience as audio/visual information. The proposed system consists of two cameras (a head-detection camera and a wide-angle color camera) and a microphone. The head-detection camera captures the user's head motions, while the wide-angle color camera captures the user's frontal view. An image region approximately corresponding to the user's field of view is then synthesized from the wide-angle image based on the estimated head motions. The synthesized images and head-motion data are stored, together with the audio data, in a storage device. Compared with head-mounted cameras, this system is easier to put on and take off, and it is less visually obtrusive to bystanders. Using the proposed system, we can simultaneously record audio, images in the user's field of view, and head gestures (nodding, shaking, etc.). These data carry significant information for recording and analyzing human activities and can be used in a wide range of application domains, such as digital diaries and interaction analysis. Experimental results demonstrate the effectiveness of the proposed system.
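The view-synthesis step described above can be sketched as follows: given a wide-angle frame and estimated head yaw/pitch angles, a sub-image approximating the user's field of view is cropped around a center shifted in proportion to the head rotation. This is a minimal illustration under an idealized projection in which pixel offset is proportional to angle; the function name, the field-of-view parameters, and their default values are assumptions for illustration, not details from the paper.

```python
import numpy as np

def crop_view_region(wide_frame, yaw_deg, pitch_deg,
                     wide_fov_deg=120.0, view_fov_deg=50.0):
    """Extract a sub-image approximating the user's view from a
    wide-angle frame, given estimated head yaw and pitch.

    Assumes an idealized projection (pixel offset proportional to
    angle); all parameters here are illustrative placeholders.
    """
    h, w = wide_frame.shape[:2]
    px_per_deg_x = w / wide_fov_deg
    px_per_deg_y = h / wide_fov_deg
    # Crop size corresponding to the narrower human view FOV.
    cw = int(view_fov_deg * px_per_deg_x)
    ch = int(view_fov_deg * px_per_deg_y)
    # The crop center shifts with head rotation (pitch up = image up).
    cx = w / 2 + yaw_deg * px_per_deg_x
    cy = h / 2 - pitch_deg * px_per_deg_y
    # Clamp so the window stays inside the wide-angle frame.
    x0 = int(np.clip(cx - cw / 2, 0, w - cw))
    y0 = int(np.clip(cy - ch / 2, 0, h - ch))
    return wide_frame[y0:y0 + ch, x0:x0 + cw]
```

In a full pipeline, the yaw/pitch inputs would come from the head-detection camera's motion estimate, and the cropped frames would be written to storage alongside the audio and head-motion streams.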