Modeling focus of attention for meeting indexing
MULTIMEDIA '99 Proceedings of the seventh ACM international conference on Multimedia (Part 1)
Meeting Capture in a Media Enriched Conference Room
CoBuild '99 Proceedings of the Second International Workshop on Cooperative Buildings, Integrating Information, Organization, and Architecture
The Aware Home: A Living Laboratory for Ubiquitous Computing Research
CoBuild '99 Proceedings of the Second International Workshop on Cooperative Buildings, Integrating Information, Organization, and Architecture
CT '97 Proceedings of the 2nd International Conference on Cognitive Technology (CT '97)
Wearable Interfaces for a Video Diary: Towards Memory Retrieval, Exchange, and Transportation
ISWC '02 Proceedings of the 6th IEEE International Symposium on Wearable Computers
The KidsRoom: A Perceptually-Based Interactive and Immersive Story Environment
Presence: Teleoperators and Virtual Environments
Ubigraphy: a third-person viewpoint life log
CHI '08 Extended Abstracts on Human Factors in Computing Systems
Inferring Human Interactions in Meetings: A Multimodal Approach
UIC '09 Proceedings of the 6th International Conference on Ubiquitous Intelligence and Computing
Multimodal sensing, recognizing and browsing group social dynamics
Personal and Ubiquitous Computing
Social life logging: can we describe our own personal experience by using collective intelligence?
Proceedings of the 10th asia pacific conference on Computer human interaction
This paper proposes the notion of an interaction corpus: a captured collection of human behaviors and of interactions among humans and artifacts. Digital multimedia and ubiquitous sensor technologies make it possible to capture, store, and automatically annotate such interactions. A very large-scale accumulated corpus provides an important infrastructure for a future digital society, enabling both humans and computers to understand the verbal and non-verbal mechanisms of human interaction. The interaction corpus can also serve as a well-structured stored experience that can be shared with other people for communication and for the creation of further experiences. Our approach employs wearable and ubiquitous sensors, such as video cameras, microphones, and tracking tags, to capture all events from multiple viewpoints simultaneously. We demonstrate an application that generates a video-based experience summary, reconfigured automatically from the interaction corpus.
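The capture-and-summarize pipeline described in the abstract could be sketched, very roughly, as a store of timestamped, annotated sensor events queried per participant. This is only an illustrative sketch; every class, field, and sensor name below is hypothetical and not taken from the paper.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an interaction corpus: each captured event carries
# its source sensor, time span, tracked participants, and an automatic label.

@dataclass
class InteractionEvent:
    sensor_id: str          # e.g. a wearable camera or a room microphone
    start: float            # seconds from the start of the session
    end: float
    participants: frozenset # IDs of tracked people present during the event
    annotation: str         # automatically generated label

@dataclass
class InteractionCorpus:
    events: list = field(default_factory=list)

    def add(self, event: InteractionEvent) -> None:
        self.events.append(event)

    def summary_clips(self, person_id: str) -> list:
        """Naive video-summary playlist: this person's events in time order."""
        clips = [e for e in self.events if person_id in e.participants]
        return sorted(clips, key=lambda e: e.start)

corpus = InteractionCorpus()
corpus.add(InteractionEvent("cam_room", 12.0, 18.5,
                            frozenset({"A", "B"}), "conversation"))
corpus.add(InteractionEvent("cam_wearable_A", 3.0, 7.0,
                            frozenset({"A"}), "gaze at exhibit"))

playlist = corpus.summary_clips("A")
print([e.annotation for e in playlist])  # ['gaze at exhibit', 'conversation']
```

A real system would of course store video segments rather than labels alone and would use far richer annotations (gaze, speech, co-location), but the reconfiguration step, selecting and ordering annotated clips per person, has this general shape.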