International Journal of Human-Computer Studies - Application of affective computing in human-computer interaction
Using music to interact with a virtual character
NIME '05 Proceedings of the 2005 conference on New interfaces for musical expression
A learning-based jam session system that imitates a player's personality model
IJCAI'03 Proceedings of the 18th international joint conference on Artificial intelligence
From acoustic cues to an expressive agent
GW'05 Proceedings of the 6th international conference on Gesture in Human-Computer Interaction and Simulation
Interacting with a virtual rap dancer
INTETAIN'05 Proceedings of the First international conference on Intelligent Technologies for Interactive Entertainment
Visualizing emotion in musical performance using a virtual character
SG'05 Proceedings of the 5th international conference on Smart Graphics
Experiencing-in-the-world: using pragmatist philosophy to design for aesthetic experience
Proceedings of the 2007 conference on Designing for User eXperiences
Dancing the night away: controlling a virtual karaoke dancer by multimodal expressive cues
Proceedings of the 7th international joint conference on Autonomous agents and multiagent systems - Volume 3
Gesture-Based Human-Computer Interaction and Simulation
The use of a digital dance mat for training kindergarten children in a magnitude comparison task
ICLS '10 Proceedings of the 9th International Conference of the Learning Sciences - Volume 1
Proceedings of the 5th International Conference on Ubiquitous Information Management and Communication
ACII'11 Proceedings of the 4th international conference on Affective computing and intelligent interaction - Volume Part I
Towards a reactive virtual trainer
IVA'06 Proceedings of the 6th international conference on Intelligent Virtual Agents
Transactions on Edutainment IX
Towards affect sensitive and socially perceptive companions
Your Virtual Butler
This paper presents a virtual rap dancer that can dance to beats coming from several sources: music recordings, music, voice, or other input captured through a microphone, motion beats detected in the video stream of a human dancer, or motions detected on a dance mat. The rap dancer's moves are generated from a lexicon that was derived manually from an analysis of video clips of rap songs performed by various rappers. The system allows the moves in the lexicon to be adapted on the basis of style parameters. The rap dancer invites the user to dance along with the music.
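The pipeline described in the abstract, beats detected from some input modality driving move selection from a style-parameterized lexicon, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the move names, the single `energy` style parameter, and the closeness-based weighting are all assumptions made for the example.

```python
import random

# Hypothetical move lexicon; each move carries a style attribute ("energy")
# standing in for the paper's style parameters. All names are illustrative.
MOVE_LEXICON = [
    {"name": "arm_wave", "energy": 0.3},
    {"name": "head_nod", "energy": 0.2},
    {"name": "body_pop", "energy": 0.8},
    {"name": "spin",     "energy": 0.9},
]

def choose_moves(beat_times, style_energy, rng=None):
    """For each detected beat time, pick a move from the lexicon,
    biased toward moves whose energy is close to the requested
    style setting (0 = calm, 1 = energetic)."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    schedule = []
    for t in beat_times:
        # Weight each move by how close its energy is to the target style.
        weights = [1.0 - abs(m["energy"] - style_energy) for m in MOVE_LEXICON]
        move = rng.choices(MOVE_LEXICON, weights=weights, k=1)[0]
        schedule.append((t, move["name"]))
    return schedule

# Beat times (in seconds) would come from audio onset detection, a video
# stream, or a dance mat; here they are hard-coded for the sketch.
schedule = choose_moves([0.0, 0.5, 1.0, 1.5], style_energy=0.9)
print(schedule)
```

Swapping the `style_energy` argument changes which part of the lexicon the dancer favors, which is one simple way to realize the "adaptation of the moves on the basis of style parameters" the abstract mentions.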