3D content-based search using sketches

  • Authors:
  • Konstantinos Moustakas, Georgios Nikolakis, Dimitrios Tzovaras, Sebastien Carbini, Olivier Bernier, Jean Emmanuel Viallet

  • Affiliations:
  • Informatics and Telematics Institute, Thermi-Thessaloniki, Greece 57001; Electrical and Computer Engineering Department, Aristotle University of Thessaloniki, Thessaloniki, Greece 54006; France Telecom R&D, Lannion, France 22307

  • Venue:
  • Personal and Ubiquitous Computing
  • Year:
  • 2009


Abstract

This paper presents a novel interactive framework for 3D content-based search and retrieval that uses, as the query model, an object dynamically sketched by the user. Two approaches are presented for generating the query model. The first uses 2D sketching and a symbolic representation of the resulting gestures. The second models, with superquadrics, the 3D point cloud generated by 3D tracking of the user's hands, using non-linear least squares minimization. In the context of the proposed framework, three interfaces were integrated into the sketch-based 3D search system: (a) an unobtrusive interface that uses pointing-gesture recognition to let the user manipulate objects in 3D, (b) a haptic/VR interface comprising 3D data gloves and a force-feedback device, and (c) a simple air mouse. These interfaces were tested and compared according to usability and efficiency criteria.
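The second approach in the abstract, fitting a superquadric to a tracked 3D point cloud via non-linear least squares, can be sketched as follows. This is a minimal, hypothetical illustration using the standard superquadric inside-outside function and SciPy's `least_squares`, not the authors' exact formulation (which would also have to recover the superquadric's pose from hand-tracking data); all function and variable names here are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def superquadric_residuals(params, pts):
    """Residuals of the superquadric inside-outside function.

    For an axis-aligned superquadric with semi-axes (a1, a2, a3) and
    shape exponents (e1, e2), surface points satisfy F(x, y, z) = 1 where
    F = (|x/a1|^(2/e2) + |y/a2|^(2/e2))^(e2/e1) + |z/a3|^(2/e1).
    F**e1 - 1 is a commonly used residual for least-squares recovery.
    """
    a1, a2, a3, e1, e2 = params
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    f = ((np.abs(x / a1) ** (2.0 / e2) + np.abs(y / a2) ** (2.0 / e2)) ** (e2 / e1)
         + np.abs(z / a3) ** (2.0 / e1))
    return f ** e1 - 1.0

def fit_superquadric(pts):
    # Start from a unit sphere; bound the exponents to a numerically
    # stable range so the fractional powers stay well behaved.
    x0 = np.array([1.0, 1.0, 1.0, 1.0, 1.0])
    lower = [1e-3, 1e-3, 1e-3, 0.1, 0.1]
    upper = [10.0, 10.0, 10.0, 2.0, 2.0]
    result = least_squares(superquadric_residuals, x0,
                           args=(pts,), bounds=(lower, upper))
    return result.x

# Synthetic point cloud: an ellipsoid (e1 = e2 = 1) with semi-axes (2, 1, 0.5),
# standing in for the cloud produced by tracking the user's hands.
rng = np.random.default_rng(0)
u = rng.uniform(-np.pi / 2, np.pi / 2, 500)
v = rng.uniform(-np.pi, np.pi, 500)
pts = np.stack([2.0 * np.cos(u) * np.cos(v),
                1.0 * np.cos(u) * np.sin(v),
                0.5 * np.sin(u)], axis=1)

a1, a2, a3, e1, e2 = fit_superquadric(pts)
```

The recovered parameters fully describe a query shape, so a retrieval system can match them (or the implied surface) against a database of 3D models without the user ever supplying an explicit mesh.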