Social interaction and embodiment are key issues for future User Centric Media. Social networks and games are increasingly characterized by the active, physical participation of their users. The integration into mobile devices of a growing number of sensors capturing users' physical activity (e.g., accelerometers, cameras) and context information (e.g., GPS, location) supports novel systems capable of connecting audiovisual content processing and communication to users' social behavior, including joint movement and physical engagement. This paper presents a system enabling a novel paradigm for the social, active experience of sound and music content. An instance of such a system, named Sync'n'Move, is introduced: it allows two users to explore a multi-channel pre-recorded music piece as the result of their social interaction, and in particular of their synchronization. This research was developed in the framework of the EU-ICT Project SAME ( www.sameproject.eu ) and was presented at the Agora Festival (IRCAM, Centre Pompidou, Paris, June 2009). On that occasion, Sync'n'Move was evaluated by both expert and non-expert users, and the results are briefly presented. Perspectives on the impact of this novel paradigm and system on future User Centric Media are finally discussed, with a specific focus on the social, active experience of audiovisual content.
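To make the paradigm concrete, the sketch below shows one plausible way to turn two users' motion signals into a multi-channel mix: a zero-lag normalized cross-correlation serves as a synchronization index, and higher synchronization unmutes more channels of the pre-recorded piece. This is an illustrative assumption only; the function names (`sync_index`, `channel_gains`) and the gain-mapping rule are hypothetical and do not reproduce Sync'n'Move's actual analysis.

```python
import math

def sync_index(a, b):
    """Zero-lag normalized cross-correlation between two equal-length
    motion-energy signals, in [-1, 1]. Hypothetical measure: the
    paper does not specify its actual synchronization analysis."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    da = [x - ma for x in a]
    db = [y - mb for y in b]
    denom = (math.sqrt(sum(x * x for x in da))
             * math.sqrt(sum(y * y for y in db)))
    if denom == 0.0:
        return 0.0
    return sum(x * y for x, y in zip(da, db)) / denom

def channel_gains(sync, n_channels=4):
    """Illustrative mapping: the more synchronized the users, the more
    channels of the multi-channel piece become audible."""
    s = max(0.0, sync)  # treat anti-phase motion as no synchronization
    audible = 1 + int(s * (n_channels - 1))
    return [1.0 if i < audible else 0.0 for i in range(n_channels)]
```

In use, each user's accelerometer stream would be reduced to a windowed motion-energy signal, `sync_index` computed over a sliding window, and the resulting gains applied to the audio channels in real time.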