The role of emotion in believable agents. Communications of the ACM.
The media equation: how people treat computers, television, and new media like real people and places.
Personality-rich believable agents that use language. AGENTS '97: Proceedings of the First International Conference on Autonomous Agents.
Panel on affect and emotion in the user interface. IUI '98: Proceedings of the 3rd International Conference on Intelligent User Interfaces.
Alternative essences of intelligence. AAAI '98/IAAI '98: Proceedings of the Fifteenth National/Tenth Conference on Artificial Intelligence/Innovative Applications of Artificial Intelligence.
Template-based recognition of pose and motion gestures on a mobile robot. AAAI '98/IAAI '98: Proceedings of the Fifteenth National/Tenth Conference on Artificial Intelligence/Innovative Applications of Artificial Intelligence.
Embodied conversational interface agents. Communications of the ACM.
ELIZA—a computer program for the study of natural language communication between man and machine. Communications of the ACM.
Proceedings of the Seventeenth National Conference on Artificial Intelligence and Twelfth Conference on Innovative Applications of Artificial Intelligence.
Exploiting auditory fovea in humanoid-human interaction. Eighteenth National Conference on Artificial Intelligence.
A context-dependent attention system for a social robot. IJCAI '99: Proceedings of the 16th International Joint Conference on Artificial Intelligence, Volume 2.
Real-time auditory and visual multiple-object tracking for humanoids. IJCAI '01: Proceedings of the 17th International Joint Conference on Artificial Intelligence, Volume 2.
Robotic etiquette: results from user studies involving a fetch and carry task. Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction.
A review of methods and frameworks for sonic interaction design: exploring existing approaches. CMMR/ICAD '09: Proceedings of the 6th International Conference on Auditory Display.
We are studying how to create social physical agents, i.e., humanoids, whose actions are empowered by real-time audio-visual tracking of multiple talkers. Social skills require complex perceptual and motor capabilities as well as communication capabilities. Because the performance of social interaction is usually evaluated for the system as a whole rather than for each component, it is critical to identify the primary features when designing building blocks for social skills. We investigate the minimum functionalities needed for social interaction, assuming that a humanoid is equipped with auditory and visual perception and simple motor control, but not with sound output. A real-time audio-visual multiple-talker tracking system is implemented on the humanoid SIG using sound source localization, stereo vision, face recognition, and motor control. It extracts auditory and visual streams and associates audio streams with visual streams by their proximity in localization. Socially oriented attention control makes the best use of personality variations classified by the Interpersonal Theory of psychology. It also provides task-oriented functions with a decaying belief factor for each stream. We demonstrate that the resulting behavior of SIG invites users' participation in the interaction and encourages them to explore SIG's behaviors. These demonstrations show that SIG behaves like a physical, non-verbal Eliza.
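The two mechanisms the abstract names, associating audio and visual streams by proximity in localization and maintaining a decaying belief factor per stream, can be illustrated with a minimal sketch. All class names, thresholds, and the decay constant below are illustrative assumptions, not details of the SIG implementation.

```python
# Hypothetical sketch: each stream carries a direction estimate (azimuth, in
# degrees). An audio and a visual stream are associated when their estimated
# directions are close enough, and each stream's belief decays over time
# unless refreshed by a new observation. Threshold and decay are assumed.

ASSOCIATION_THRESHOLD_DEG = 10.0  # assumed proximity threshold
DECAY = 0.9                       # assumed per-step belief decay factor


class Stream:
    def __init__(self, modality, azimuth_deg):
        self.modality = modality    # "audio" or "visual"
        self.azimuth = azimuth_deg  # estimated direction of the talker
        self.belief = 1.0           # confidence; decays without updates

    def step(self, observation=None):
        """Decay belief, or refresh it when a new observation arrives."""
        if observation is not None:
            self.azimuth = observation
            self.belief = 1.0
        else:
            self.belief *= DECAY


def associate(audio, visual):
    """Associate an audio and a visual stream by proximity in localization."""
    return abs(audio.azimuth - visual.azimuth) <= ASSOCIATION_THRESHOLD_DEG


a = Stream("audio", azimuth_deg=32.0)
v = Stream("visual", azimuth_deg=35.0)
print(associate(a, v))  # directions 3 degrees apart -> associated

for _ in range(5):      # no new observations: belief decays each step
    a.step()
print(round(a.belief, 3))
```

The decaying belief gives attention control a natural way to drop stale streams: once a stream's belief falls below some threshold, the tracker can discard it and redirect attention to fresher audio or visual evidence.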