Modeling instrumental activities of daily living in egocentric vision as sequences of active objects and context for Alzheimer disease research

  • Authors:
  • Iván González Díaz;Vincent Buso;Jenny Benois-Pineau;Guillaume Bourmaud;Rémi Megret

  • Affiliations:
University of Bordeaux, LaBRI, UMR 5800, Bordeaux, France (Iván González Díaz, Vincent Buso, Jenny Benois-Pineau); University of Bordeaux, IMS Laboratory, UMR 5218, Bordeaux, France (Guillaume Bourmaud, Rémi Megret)

  • Venue:
  • Proceedings of the 1st ACM international workshop on Multimedia indexing and information retrieval for healthcare
  • Year:
  • 2013

Abstract

In this paper we study the problem of recognizing Instrumental Activities of Daily Living (IADL) from an egocentric camera view. The target application of this research is the indexing of videos of patients with Alzheimer disease, providing medical staff with fast access and easy navigation through the video content and helping them assess patients' abilities to perform IADL. Driven by the observation that an activity in egocentric video can be defined as a sequence of objects interacted with inside different rooms, we present a novel representation based on the output of object and room detectors over temporal segments. In addition, our object detection approach is extended with automatic detection of visually salient regions, since distinguishing active objects from context has been shown to dramatically improve performance in egocentric ADL recognition. We evaluate our proposal on a publicly available egocentric dataset and present extensive experimental results demonstrating that our approach outperforms the current state of the art in unconstrained scenarios, in which training and testing environments may differ notably.
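
The abstract's core idea, describing an activity as a sequence of temporal segments, each summarized by object and room detector outputs with saliency used to single out active objects, can be illustrated with a minimal sketch. The pooling choices, array shapes, vocabulary sizes, and function names below are illustrative assumptions, not the authors' exact formulation.

```python
# Hypothetical sketch: pool per-frame object and room detector scores over a
# temporal segment, weighting object scores by saliency so that "active"
# (manipulated) objects dominate the context. All names and choices here are
# assumptions for illustration only.
from typing import List
import numpy as np

N_OBJECTS = 20   # assumed size of the object vocabulary
N_ROOMS = 5      # assumed number of room categories


def segment_descriptor(object_scores: np.ndarray,
                       room_scores: np.ndarray,
                       saliency: np.ndarray) -> np.ndarray:
    """Build one descriptor for a temporal segment.

    object_scores: (T, N_OBJECTS) per-frame object detector confidences
    room_scores:   (T, N_ROOMS)   per-frame room classifier confidences
    saliency:      (T, N_OBJECTS) per-frame saliency weight of each detection
    """
    # Emphasize detections that fall in visually salient (likely active) regions.
    weighted = object_scores * saliency
    # Max-pool over frames so a briefly manipulated object still marks the segment.
    obj_part = weighted.max(axis=0)
    # Average room scores: the room/context is assumed stable within a segment.
    room_part = room_scores.mean(axis=0)
    return np.concatenate([obj_part, room_part])


def activity_sequence(segments: List[dict]) -> np.ndarray:
    """Stack segment descriptors into the sequence that describes an activity."""
    return np.stack([
        segment_descriptor(s["objects"], s["rooms"], s["saliency"])
        for s in segments
    ])
```

In such a setup, the resulting sequence of segment descriptors could then be fed to any sequence classifier to predict the IADL label; the specific classifier and training protocol are described in the paper itself.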