A novel virtual reality driving environment for autism intervention

  • Authors:
  • Dayi Bian; Joshua W. Wade; Lian Zhang; Esubalew Bekele; Amy Swanson; Julie Ana Crittendon; Medha Sarkar; Zachary Warren; Nilanjan Sarkar

  • Affiliations:
  • Electrical Engineering and Computer Science Department, Vanderbilt University, Nashville, TN; Computer Science Department, Middle Tennessee State University, Murfreesboro, TN; Electrical Engineering and Computer Science Department, Vanderbilt University, Nashville, TN; Electrical Engineering and Computer Science Department, Vanderbilt University, Nashville, TN; Pediatrics and Psychiatry Department, Vanderbilt University, Nashville, TN; Pediatrics and Psychiatry Department, Vanderbilt University, Nashville, TN and Treatment and Research in Autism Spectrum Disorder (TRIAD), Vanderbilt University, Nashville, TN; Computer Science Department, Middle Tennessee State University, Murfreesboro, TN; Pediatrics and Psychiatry Department, Vanderbilt University, Nashville, TN and Treatment and Research in Autism Spectrum Disorder (TRIAD), Vanderbilt University, Nashville, TN; Mechanical Engineering Department, Vanderbilt University, Nashville, TN and Electrical Engineering and Computer Science Department, Vanderbilt University, Nashville, TN

  • Venue:
  • UAHCI'13 Proceedings of the 7th international conference on Universal Access in Human-Computer Interaction: user and context diversity - Volume 2
  • Year:
  • 2013


Abstract

Individuals with autism spectrum disorders (ASD) often have difficulty functioning independently and display impairments in important tasks related to adaptive independence, such as driving. The ability to drive is believed to be an important factor in quality of life for individuals with ASD. The presented work describes a novel driving simulator based on a virtual city environment that will be used in future interventions to impart driving skills to teenagers with ASD. A physiological data acquisition system, used to acquire and process the participant's physiological signals, and an eye tracker, used to detect eye gaze, were both integrated into the driving simulator. The resulting physiological and eye gaze indices were recorded and processed to infer the participant's affective states in real time while he/she was driving. Based on the participant's affective states together with his/her performance, the driving simulator adaptively changes the difficulty level of the task. This VR-based driving simulator will thus be capable of manipulating the driving task difficulty in response to the physiological and eye gaze indices recorded during the task. The design of this novel driving simulator system and testing data validating its functionality are presented in this paper.
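
As context for how such an adaptive loop might operate, the sketch below illustrates one plausible control cycle: coarse affective-state inference from physiological and gaze indices, combined with task performance, used to step the difficulty level up or down. All names, thresholds, and the rule-based logic here are hypothetical assumptions for illustration; the paper does not publish its implementation.

# Illustrative sketch only: SensorSample, infer_affective_state, and
# adjust_difficulty are hypothetical names; the thresholds and rules are
# placeholders, not the authors' method.

from dataclasses import dataclass
import random

@dataclass
class SensorSample:
    heart_rate: float        # beats per minute (physiological channel)
    skin_conductance: float  # microsiemens (physiological channel)
    gaze_on_road: float      # fraction of gaze samples on the roadway (eye tracker)

def infer_affective_state(sample: SensorSample) -> str:
    """Map raw indices to a coarse affective label (placeholder thresholds)."""
    if sample.heart_rate > 100 or sample.skin_conductance > 8.0:
        return "high_arousal"
    if sample.gaze_on_road < 0.5:
        return "disengaged"
    return "engaged"

def adjust_difficulty(level: int, affect: str, success_rate: float) -> int:
    """Raise difficulty when the driver is engaged and succeeding;
    lower it when aroused/anxious or struggling."""
    if affect == "engaged" and success_rate > 0.8:
        return min(level + 1, 5)
    if affect == "high_arousal" or success_rate < 0.4:
        return max(level - 1, 1)
    return level

if __name__ == "__main__":
    level = 2
    for trial in range(5):
        # Stand-in for one driving trial's sensor stream and performance score.
        sample = SensorSample(
            heart_rate=random.uniform(70, 120),
            skin_conductance=random.uniform(2, 10),
            gaze_on_road=random.uniform(0.3, 1.0),
        )
        success_rate = random.uniform(0.2, 1.0)
        affect = infer_affective_state(sample)
        level = adjust_difficulty(level, affect, success_rate)
        print(f"trial {trial}: affect={affect}, success={success_rate:.2f}, next level={level}")

In the actual system, affect inference would presumably rely on models trained over many physiological and gaze features rather than fixed thresholds; the rules above only stand in for that component to show where the performance and affect signals enter the difficulty-adaptation loop.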