Feature selection for gaze, pupillary, and EEG signals evoked in a 3D environment

  • Authors:
  • David C. Jangraw; Paul Sajda

  • Affiliations:
  • Columbia University, New York, NY, USA; Columbia University, New York, NY, USA

  • Venue:
  • Proceedings of the 6th workshop on Eye gaze in intelligent human machine interaction: gaze in multimodal interaction
  • Year:
  • 2013

Abstract

As we navigate our environment, we are constantly assessing the objects we encounter and deciding on their subjective interest to us. In this study, we investigate the neural and ocular correlates of this assessment as a step towards their potential use in a mobile human-computer interface (HCI). Past research has shown that multiple physiological signals are evoked by objects of interest during visual search in the laboratory, including gaze, pupil dilation, and neural activity; these have been exploited for use in various HCIs. We use a virtual environment to explore which of these signals are also evoked during exploration of a dynamic, free-viewing 3D environment. Using a hierarchical classifier and sequential forward floating selection (SFFS), we identify a small, robust set of features across multiple modalities that can be used to distinguish targets from distractors in the virtual environment. The identification of these features may serve as an important factor in the design of mobile HCIs.
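
The feature selection method named in the abstract, sequential forward floating selection (SFFS), alternates greedy forward additions with conditional backward removals of features. Below is a minimal sketch of SFFS in Python, assuming a scikit-learn-style classifier and cross-validated accuracy as the selection criterion; the LogisticRegression stand-in and the synthetic data are illustrative assumptions, not the paper's hierarchical classifier or its actual gaze, pupillary, and EEG features.

```python
# Minimal SFFS sketch (assumes scikit-learn is available; criterion and
# classifier choices here are illustrative, not the authors' pipeline).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def criterion(X, y, features):
    """Cross-validated accuracy using only the selected feature columns."""
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, features], y, cv=5).mean()

def sffs(X, y, k):
    """Select up to k feature indices by sequential forward floating selection."""
    selected = []
    best_at_size = {}  # best criterion value seen at each subset size
    while len(selected) < k:
        # Forward step: add the single feature that most improves the criterion.
        remaining = [f for f in range(X.shape[1]) if f not in selected]
        best_f = max(remaining, key=lambda f: criterion(X, y, selected + [f]))
        selected.append(best_f)
        best_at_size[len(selected)] = criterion(X, y, selected)
        # Floating (backward) step: drop a feature as long as doing so beats
        # the best score previously recorded for the smaller subset size.
        while len(selected) > 2:
            worst = max(selected,
                        key=lambda f: criterion(X, y, [g for g in selected if g != f]))
            trial = [g for g in selected if g != worst]
            score = criterion(X, y, trial)
            if score > best_at_size.get(len(trial), -np.inf):
                selected, best_at_size[len(trial)] = trial, score
            else:
                break
    return selected

if __name__ == "__main__":
    # Toy example: 200 trials x 20 candidate features, labels driven by
    # features 3 and 7 (synthetic data, for illustration only).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 20))
    y = (X[:, 3] + X[:, 7] > 0).astype(int)
    print(sffs(X, y, 5))
```

The floating backward pass is what distinguishes SFFS from plain forward selection: a feature added early can later be discarded if a smaller subset scores better, which helps when multimodal features (e.g., gaze dwell time and an EEG component) carry redundant information.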