A special education and rehabilitation system employs real-time interactive computer graphics and photorealistic virtual humans that use gaze and facial gestures to guide the learner's attention. By incorporating appropriate emotional responses into both verbal and nonverbal feedback, the system implements an emotional modulation technique that increases learning efficiency. The model is based on a closed-loop interaction paradigm: sensors continuously monitor the learner's internal state, and the responses and strategies of the animated tutoring system are adjusted accordingly. The sensors include a video camera for facial tracking and facial expression analysis, an eye tracker for measuring gaze, a biofeedback device for gauging stress levels, and other VR input/output devices. The system has served as the basis for several practical applications; case studies include work with autistic children, cybertherapy, and cognitive rehabilitation.
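The closed-loop paradigm described above could be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the state fields, thresholds, and action names (`gaze_cue`, `calming_feedback`, and so on) are all hypothetical stand-ins for the real sensor fusion and tutoring policy.

```python
from dataclasses import dataclass

@dataclass
class LearnerState:
    """Snapshot of the learner's sensed internal state (hypothetical fields)."""
    gaze_on_target: bool   # from the eye tracker
    stress: float          # from the biofeedback device, normalized to 0.0-1.0
    expression: str        # label from facial expression analysis

def choose_tutor_action(state: LearnerState) -> str:
    """Map the sensed state to the virtual human's next behavior.

    Illustrative policy only; a real system would adapt both the
    immediate response and the longer-term tutoring strategy.
    """
    if not state.gaze_on_target:
        return "gaze_cue"          # redirect attention via gaze/facial gesture
    if state.stress > 0.7:         # threshold chosen arbitrarily here
        return "calming_feedback"  # slow down, give reassuring verbal feedback
    if state.expression == "confused":
        return "rephrase_task"
    return "continue_lesson"

# One iteration of the sense-decide-act loop:
state = LearnerState(gaze_on_target=False, stress=0.3, expression="neutral")
print(choose_tutor_action(state))  # -> gaze_cue
```

In the actual system this decision step would run continuously, with the chosen action driving the avatar's speech, gaze, and facial animation.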