Multimedia data are now created at a macro, public scale as well as an individual, personal scale. Distributed multimedia streams (e.g., images, microblogs, and sensor readings) have recently been combined to understand spatio-temporal phenomena such as epidemic spread, seasonal patterns, and political situations, while personal data (via mobile sensors and quantified-self technologies) are now being used to identify user behavior, intent, affect, social connections, health, gaze, and interest level in real time. An effective combination of these two types of data can transform applications ranging from healthcare and mobility to product recommendation and content delivery. Systems built at this intersection can lead to better orchestrated media experiences that may also improve users' social, emotional, and physical well-being. For example, users trapped in a risky hurricane situation could receive personalized evacuation instructions based on their health, mobility parameters, and distance to the nearest shelter. This workshop brings together researchers interested in novel techniques that combine multiple streams at different scales (macro and micro) to understand and react to each user's needs.
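To make the hurricane scenario concrete, here is a minimal sketch of how macro-scale data (public shelter locations) might be combined with micro-scale personal data (a user's mobility range and health needs) to personalize evacuation guidance. All class names, fields, and thresholds below are hypothetical illustrations, not an implementation from any cited system.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Shelter:
    # Macro-scale data, e.g. from a public emergency feed
    name: str
    x: float
    y: float
    has_medical: bool

@dataclass
class User:
    # Micro-scale personal data, e.g. from mobile and health sensors
    x: float
    y: float
    max_km: float        # personal mobility constraint
    needs_medical: bool  # inferred from personal health data

def rank_shelters(user: User, shelters: list[Shelter]) -> list[str]:
    """Return names of reachable, suitable shelters, nearest first."""
    candidates = []
    for s in shelters:
        d = hypot(s.x - user.x, s.y - user.y)  # straight-line distance (toy metric)
        if d <= user.max_km and (s.has_medical or not user.needs_medical):
            candidates.append((d, s.name))
    return [name for _, name in sorted(candidates)]

shelters = [Shelter("North High School", 3.0, 4.0, has_medical=False),
            Shelter("City Hospital", 6.0, 8.0, has_medical=True)]
user = User(0.0, 0.0, max_km=12.0, needs_medical=True)
print(rank_shelters(user, shelters))  # only the medically equipped shelter qualifies
```

A real system would of course replace the toy distance metric with road-network routing and fuse many more macro streams (weather, traffic, social media reports), but the structure of the decision is the same: filter macro-scale options through micro-scale personal constraints.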