Automatic person annotation of family photo album

  • Authors:
  • Ming Zhao;Yong Wei Teo;Siliang Liu;Tat-Seng Chua;Ramesh Jain

  • Affiliations:
  • Department of Computer Science, National University of Singapore, Singapore;Department of Computer Science, National University of Singapore, Singapore;Department of Computer Science, National University of Singapore, Singapore;Department of Computer Science, National University of Singapore, Singapore;Donald Bren Professor in Information & Computer Sciences, Department of Computer Science, Bren School of Information and Computer Sciences, University of California, Irvine, CA

  • Venue:
  • CIVR'06 Proceedings of the 5th international conference on Image and Video Retrieval
  • Year:
  • 2006

Abstract

Digital photographs are replacing traditional film in our daily life, and their quantity is exploding. This creates a strong need for efficient management tools, for which annotating "who" appears in each photo is essential. In this paper, we propose an automated method to annotate family photos using evidence from face, body, and context information. Face recognition is the first consideration, but its performance is limited by the uncontrolled conditions of family photos. In a family album, the same groups of people tend to appear in similar events, and within a short time span and at nearby places they tend to wear the same clothes. We can therefore use social context information and body information to estimate the probability of a person's presence and to identify other instances of the same recognized person. In our approach, we first use social context information to cluster photos into events. Within each event, the body information is clustered and then combined with face recognition results using a graphical model. Finally, clusters with high face recognition confidence and context probabilities are identified as belonging to specific persons. Experiments on a photo album containing over 1500 photos demonstrate that our approach is effective.
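The paper fuses face and body evidence with a graphical model; the sketch below is only a simplified, hypothetical illustration of the pipeline's stages described in the abstract (splitting photos into events by time gaps, clustering clothing descriptors within each event, and propagating confident face labels to whole clusters). All class and function names, thresholds, and the greedy clustering heuristic are our own assumptions, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

# Hypothetical record for one detected person in one photo.
@dataclass
class Detection:
    photo_time: float                    # capture timestamp in seconds
    body_feature: np.ndarray             # e.g. clothing colour/texture descriptor
    face_scores: Optional[dict] = None   # person_id -> face-recognition confidence (if a face was found)
    event_id: int = -1
    body_cluster: int = -1

def cluster_into_events(detections, max_gap_hours=6.0):
    """Group photos into events by splitting at large capture-time gaps (illustrative heuristic)."""
    order = sorted(range(len(detections)), key=lambda i: detections[i].photo_time)
    event, prev_t = 0, None
    for i in order:
        t = detections[i].photo_time
        if prev_t is not None and (t - prev_t) > max_gap_hours * 3600:
            event += 1
        detections[i].event_id = event
        prev_t = t
    return detections

def cluster_bodies_within_event(dets, threshold=0.5):
    """Greedily cluster clothing descriptors: within one event, the same outfit suggests the same person."""
    centroids = []
    for d in dets:
        dists = [np.linalg.norm(d.body_feature - c) for c in centroids]
        if dists and min(dists) < threshold:
            d.body_cluster = int(np.argmin(dists))
        else:
            d.body_cluster = len(centroids)
            centroids.append(d.body_feature.copy())
    return dets

def label_clusters(dets, min_confidence=0.7):
    """Propagate identities from confidently recognised faces to their entire body cluster."""
    votes = {}
    for d in dets:
        if d.face_scores:
            pid, score = max(d.face_scores.items(), key=lambda kv: kv[1])
            if score >= min_confidence:
                votes.setdefault((d.event_id, d.body_cluster), []).append((pid, score))
    # Assign each cluster the identity with the strongest single piece of face evidence.
    return {key: max(v, key=lambda kv: kv[1])[0] for key, v in votes.items()}
```

In this toy version, a simple confidence vote stands in for the paper's graphical-model fusion of face and body evidence; it is meant only to make the sequence of steps in the abstract concrete.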