Understanding interaction with Electronic Health Records (EHR) often means understanding the multimodal nature of the physician-patient interaction and the interaction with other materials (e.g. paper charts), in addition to analyzing the tasks the doctor performs on the computerized system. Recent approaches have begun to analyze and quantify speech, gaze, body movements, etc., and represent a promising complement to classic software usability analysis. However, multimodal activity is hard to characterize, since it often requires manual coding of hours of video data. We present our approach, which uses automatic tracking of body movement, audio signals, and gaze in the medical office to achieve multimodal analysis of EHR use.
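A core step in this kind of automatic multimodal analysis is aligning time-stamped event streams from the different trackers onto a single timeline. The following is a minimal sketch of that idea, not the authors' actual pipeline: it assumes each modality (gaze, audio, body) yields a chronologically sorted list of `(timestamp, modality, label)` events, and merges them; the event names and values are hypothetical placeholders for real sensor output.

```python
import heapq

def merge_streams(streams):
    """Merge time-stamped event streams from several modalities
    (e.g. gaze, audio, body tracking) into one chronological timeline.

    Each stream is a list of (timestamp_seconds, modality, label) tuples,
    assumed to be sorted by timestamp within its own stream.
    """
    return list(heapq.merge(*streams, key=lambda event: event[0]))

# Hypothetical tracker output; in practice these would come from
# gaze trackers, audio segmentation, and body-pose estimation.
gaze  = [(0.2, "gaze", "screen"), (1.5, "gaze", "patient")]
audio = [(0.0, "audio", "doctor_speaking"), (1.2, "audio", "silence")]
body  = [(0.8, "body", "turn_to_screen")]

timeline = merge_streams([gaze, audio, body])
for t, modality, label in timeline:
    print(f"{t:4.1f}s  {modality:5s}  {label}")
```

Once the streams are merged, annotations such as "doctor speaks while gazing at the screen" can be derived by scanning the unified timeline instead of hand-coding the video frame by frame.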