We present a new system for the creation and efficient retrieval of personal life log media (P-LLM) in a networked environment. Personal life log media comprise audiovisual recordings of the user's experiences together with additional data from intelligent gadgets equipped with multimodal sensors, such as GPS, 3D accelerometers, physiological reaction sensors, and environmental sensors. We built the system as a web-based application with a spatiotemporal graphical user interface and a tree-based activity search environment, so that users can access it easily and pose queries intuitively. Our learning-based activity classification technique simplifies classifying the user's activity from the multimodal sensor data. Finally, the proposed system provides a user-centered service through individual activity registration and classification for each user.
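The abstract does not specify which learning algorithm performs the activity classification. As an illustration only, the sketch below assumes windowed 3D-accelerometer readings and a hypothetical nearest-centroid classifier over simple magnitude statistics (mean and standard deviation per window); the function names and feature choice are the author's illustrative assumptions, not the paper's method.

```python
import math
from statistics import mean, stdev

def window_features(samples):
    """Summarize one time window of (x, y, z) accelerometer readings.

    Returns (mean magnitude, magnitude standard deviation) -- a toy
    feature pair; the actual system may use richer multimodal features.
    """
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    return (mean(mags), stdev(mags))

def train_centroids(labelled_windows):
    """Compute one feature centroid per registered activity.

    labelled_windows: {activity_name: [window, ...]} -- this models the
    per-user activity registration step described in the abstract.
    """
    centroids = {}
    for activity, windows in labelled_windows.items():
        feats = [window_features(w) for w in windows]
        centroids[activity] = tuple(mean(f[i] for f in feats) for i in range(2))
    return centroids

def classify(window, centroids):
    """Assign a new window to the nearest activity centroid."""
    f = window_features(window)
    return min(centroids, key=lambda activity: math.dist(f, centroids[activity]))
```

For example, windows with a near-constant gravity magnitude would fall near a "sitting" centroid, while windows with strongly oscillating magnitude would fall near a "walking" centroid, mirroring how registered activities could be matched against incoming sensor streams.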