Multimodal data analysis and visualization to study the usage of electronic health records

  • Authors:
  • Nadir Weibel; Shazia Ashfaq; Alan Calvitti; James D. Hollan; Zia Agha

  • Affiliations:
  • University of California San Diego, La Jolla, CA; HSRD VA San Diego Healthcare System, San Diego, CA; HSRD VA San Diego Healthcare System, San Diego, CA; University of California San Diego, La Jolla, CA; HSRD VA San Diego Healthcare System, San Diego, CA

  • Venue:
  • Proceedings of the 7th International Conference on Pervasive Computing Technologies for Healthcare
  • Year:
  • 2013

Abstract

Understanding interaction with Electronic Health Records (EHR) often means understanding the multimodal nature of the physician-patient interaction, as well as the interaction with other materials (e.g. paper charts), in addition to analyzing the tasks the physician performs on the computer system. Recent approaches have started to analyze and quantify speech, gaze, body movements, etc., and represent a very promising way to complement classic software usability analysis. However, characterizing multimodal activity is hard, since it often requires manually coding hours of video data. We present our approach: using automatic tracking of body movements, audio signals, and gaze in the medical office to achieve multimodal analysis of EHR use.
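
To illustrate the kind of analysis the abstract describes, the following is a minimal sketch (not the authors' code) of aligning independently recorded multimodal streams on a shared session timeline, so that co-occurring activity can be queried automatically instead of hand-coded from video. All stream names, labels, and timestamps here are hypothetical.

```python
# Illustrative sketch: query overlaps between timestamped multimodal event
# streams (gaze, audio, body tracking) recorded in a medical office session.
from dataclasses import dataclass

@dataclass
class Event:
    stream: str   # hypothetical stream names: "gaze", "audio", "body"
    label: str    # hypothetical labels: "EHR_screen", "physician_speech", ...
    start: float  # seconds from session start
    end: float

def overlaps(a: Event, b: Event) -> bool:
    """True if the two events share any time interval."""
    return a.start < b.end and b.start < a.end

def co_occurrences(events, stream_a, stream_b):
    """Yield (a, b) pairs where an event in stream_a overlaps one in stream_b."""
    xs = sorted((e for e in events if e.stream == stream_a), key=lambda e: e.start)
    ys = sorted((e for e in events if e.stream == stream_b), key=lambda e: e.start)
    for a in xs:
        for b in ys:
            if overlaps(a, b):
                yield a, b

if __name__ == "__main__":
    session = [
        Event("gaze", "EHR_screen", 0.0, 12.5),
        Event("gaze", "patient", 12.5, 20.0),
        Event("audio", "physician_speech", 10.0, 18.0),
        Event("body", "typing", 2.0, 11.0),
    ]
    # Example query: moments when the physician speaks while gazing at the EHR.
    for gaze, speech in co_occurrences(session, "gaze", "audio"):
        if gaze.label == "EHR_screen":
            print(f"speech while on EHR screen: "
                  f"{max(gaze.start, speech.start):.1f}-"
                  f"{min(gaze.end, speech.end):.1f}s")
```

Once every tracker's output is reduced to such timestamped intervals, questions that would otherwise require frame-by-frame video coding (e.g. how much consultation time the physician spends talking while facing the screen) become simple interval queries over the merged timeline.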