Many first-hand accounts from individuals diagnosed with autism spectrum disorders (ASD) highlight the challenges of processing high-speed, complex, and unpredictable social information, such as facial expressions, in real time. In this paper, we describe a new technology that helps people capture, analyze, and reflect on the social-emotional signals communicated by facial and head movements during live interaction with their everyday social companions. The system combines new hardware, a miniature camera connected to an ultramobile PC, with custom software that tracks, captures, and interprets facial and head movements and presents the interpretations intuitively (e.g., indicating a high probability that the person looks "confused"). We present this technology together with results from a series of pilot studies in which adolescents diagnosed with ASD used the system in their peer-group setting and contributed to its development through their feedback.
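The abstract describes a final presentation step that turns per-expression probabilities into an intuitive label such as "confused". A minimal sketch of that step, assuming upstream tracking software supplies a probability per expression; the function name, label set, and threshold here are hypothetical, not the paper's actual implementation:

```python
def summarize_expression(probs: dict[str, float], threshold: float = 0.6) -> str:
    """Map per-expression probabilities to an intuitive label.

    probs: mapping from expression name (e.g., "confused") to a
    probability in [0, 1], as might come from an upstream classifier.
    Returns a human-readable summary, or "uncertain" when no single
    expression is confident enough to display.
    """
    label, p = max(probs.items(), key=lambda kv: kv[1])
    if p >= threshold:
        return f"likely {label} ({p:.0%})"
    return "uncertain"
```

For example, `summarize_expression({"confused": 0.82, "agreeing": 0.10})` would display "likely confused (82%)", while a flat distribution falls back to "uncertain" rather than showing a misleading label.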