Multimodal Environments (MEs) are systems capable of establishing creative, multimodal user interaction by exhibiting real-time adaptive behaviour. In a typical scenario, one or more users are immersed in an environment allowing them to communicate by means of full-body movement, singing or playing. Users get feedback from the environment in real time in terms of sound, music, visual media, and actuators, i.e. movement of semi-autonomous mobile systems including mobile scenography and on-stage robots behaving as actors or players, possibly equipped with music and multimedia output. MEs are therefore a sort of extension of augmented reality environments. From another viewpoint, an ME can be seen as a sort of prolongation of the human mind and senses. From an artificial intelligence perspective, an ME consists of a population of physical and software agents capable of changing their reactions and their social interaction over time. For example, a gesture of the user(s) can mean different things in different situations, and can produce changes in the agents populating the ME. The paradigm adopted for movement recognition is that of a human observer of the dance, where the focus of attention changes according to the evolution of the dance itself and of the music produced. MEs are therefore agents able to observe the user, extract "gesture gestalts", and change their state, including artificial emotions, over time. MEs open new niches of application, many still to be discovered, including music, dance, theatre, interactive arts, entertainment, interactive exhibitions and museum installations, information ateliers, edutainment, training, industrial applications and cognitive rehabilitation (e.g. for autism). The environment can be a theatre, a museum, a discotheque, a school classroom, a rehabilitation centre for patients with a variety of sensory/motor and cognitive impairments, etc. The ME concept generalizes biofeedback methods, which have already found widespread application.
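The agent behaviour described above (observing a gesture, interpreting it differently depending on context, and letting it shift an artificial emotional state) can be illustrated with a minimal sketch. This is a hypothetical toy model, not the paper's architecture: the class name, the single-dimension `arousal` state, and the update rule are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class MEAgent:
    """Toy sketch of a software agent in a Multimodal Environment:
    it observes user gestures, assigns them a context-dependent
    meaning, and updates an artificial emotional state that could
    drive sound, visual, or actuator output."""
    arousal: float = 0.0  # artificial emotion, in [0, 1] (illustrative)

    def observe(self, gesture_energy: float, context: str) -> str:
        # The same gesture can mean different things in different
        # situations, as the abstract notes.
        if context == "dance" and gesture_energy > 0.5:
            meaning = "emphatic"
        else:
            meaning = "neutral"
        # Emotional state drifts toward the observed gesture energy,
        # so the agent's reactions change over time.
        self.arousal += 0.3 * (gesture_energy - self.arousal)
        return meaning

agent = MEAgent()
print(agent.observe(0.8, "dance"))  # energetic gesture during dance
print(round(agent.arousal, 2))
```

The point of the sketch is only the control loop: perception is context-sensitive, and each observation nudges an internal affective state that later observations are filtered through.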
The paper introduces MEs and then a flexible ME architecture, with special focus on modeling the emotional component of the agents forming an ME. A description of four applications we recently developed, currently in use in several real testbeds, concludes the paper.