Visualization and User-Modeling for Browsing Personal Photo Libraries

  • Authors:
  • Baback Moghaddam, Qi Tian, Neal Lesh, Chia Shen, Thomas S. Huang

  • Affiliations:
  • Baback Moghaddam, Mitsubishi Electric Research Laboratories, Cambridge, MA 02139, USA. baback@merl.com
  • Qi Tian, Beckman Institute, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA. qitian@ifp.uiuc.edu
  • Neal Lesh, Mitsubishi Electric Research Laboratories, Cambridge, MA 02139, USA. lesh@merl.com
  • Chia Shen, Mitsubishi Electric Research Laboratories, Cambridge, MA 02139, USA. shen@merl.com
  • Thomas S. Huang, Beckman Institute, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA. huang@ifp.uiuc.edu

  • Venue:
  • International Journal of Computer Vision - Special Issue on Content-Based Image Retrieval
  • Year:
  • 2004


Abstract

We present a user-centric system for visualization and layout for content-based image retrieval. Image features (visual and/or semantic) are used to display retrievals as thumbnails in a 2-D spatial layout or “configuration” which conveys all pair-wise mutual similarities. A graphical optimization technique is used to provide maximally uncluttered and informative layouts. Moreover, a novel subspace feature weighting technique can be used to modify 2-D layouts in a variety of context-dependent ways. An efficient computational technique for subspace weighting and re-estimation leads to a simple user-modeling framework whereby the system can learn to display query results based on layout examples (or relevance feedback) provided by the user. The resulting retrieval, browsing and visualization can adapt to the user's (time-varying) notions of content, context and preferences in style and interactive navigation. Monte Carlo simulations with machine-generated layouts as well as pilot user studies have demonstrated the ability of this framework to model or “mimic” users, by automatically generating layouts according to their preferences.
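The paper's layouts place thumbnails in 2-D so that distances on screen reflect all pairwise similarities in feature space. As a minimal illustrative sketch only (not the authors' implementation, which uses their own subspace weighting and optimization), the snippet below shows one standard way to obtain such a similarity-preserving 2-D embedding, classical multidimensional scaling; the function name, feature dimensionality, and data are hypothetical.

```python
# Illustrative sketch: classical MDS embedding of image feature vectors into 2-D,
# so that on-screen distances approximate pairwise feature-space distances.
import numpy as np

def layout_2d(features: np.ndarray) -> np.ndarray:
    """Embed n feature vectors (n x d) into 2-D via classical MDS."""
    # Pairwise squared Euclidean distances in feature space.
    sq_norms = (features ** 2).sum(axis=1)
    d2 = sq_norms[:, None] + sq_norms[None, :] - 2.0 * features @ features.T
    d2 = np.maximum(d2, 0.0)

    # Double-center the squared-distance matrix to obtain a Gram matrix.
    n = d2.shape[0]
    j = np.eye(n) - np.full((n, n), 1.0 / n)
    gram = -0.5 * j @ d2 @ j

    # The top-2 eigenvectors (scaled by sqrt of eigenvalues) give 2-D coordinates.
    vals, vecs = np.linalg.eigh(gram)
    top = np.argsort(vals)[::-1][:2]
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))  # n x 2 positions

# Hypothetical usage: 100 retrieved images with 37-dimensional visual features.
rng = np.random.default_rng(0)
positions = layout_2d(rng.normal(size=(100, 37)))
```

In the paper, the raw layout is further adjusted: a graphical optimization de-clutters overlapping thumbnails, and learned subspace feature weights reshape the configuration to match user-provided layout examples or relevance feedback.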