Embodied cooperation "arises when two co-present individuals in motion coordinate their goal-directed actions". Adopting the embodied cooperation paradigm in the development of embodied and social multimedia systems opens new perspectives for future User Centric Media. Systems for embodied music listening, which let users influence music in real time through movement and gesture, can benefit greatly from this paradigm. This paper presents the design and evaluation of Sync4All, an application based on this paradigm that allows users to experience social embodied music listening. Each user rhythmically and freely moves a mobile phone, trying to synchronise her movements with those of the other users. The level of synchronisation achieved influences the music experience. The evaluation of Sync4All aimed to determine the users' overall attitude towards the application and how participants perceived embodied cooperation and music embodiment.
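The synchronisation level driving the music could, for instance, be a phase-coherence measure over the users' movement phases. Below is a minimal sketch, assuming each phone's rhythmic motion has been reduced to a phase in radians (e.g. from accelerometer peaks); the function `sync_level` is a hypothetical illustration, not the metric actually used in Sync4All:

```python
import cmath
import math

def sync_level(phases):
    """Kuramoto-style order parameter over the users' movement phases.

    `phases`: one phase per user, in radians (hypothetical input,
    e.g. estimated from accelerometer peak timing on each phone).
    Returns a value in [0, 1]: 1.0 when all users move perfectly in
    phase, near 0 when their phases are evenly spread out.
    """
    n = len(phases)
    # Average the unit phasors; the magnitude of the mean measures coherence.
    return abs(sum(cmath.exp(1j * p) for p in phases)) / n

# Perfectly synchronised users yield full coherence.
assert abs(sync_level([0.3, 0.3, 0.3]) - 1.0) < 1e-9
# Evenly spread phases cancel out, yielding near-zero coherence.
assert sync_level([0.0, 2 * math.pi / 3, 4 * math.pi / 3]) < 1e-9
```

Such a scalar could then be mapped directly onto a musical parameter (e.g. volume or texture density), so that tighter group synchronisation audibly enriches the shared listening experience.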