3D Imaging for hand gesture recognition: Exploring the software-hardware interaction of current technologies

  • Authors:
  • Frol Periverzov; Horea T. Ilieş

  • Affiliations:
  • Department of Mechanical Engineering, University of Connecticut, Connecticut, USA (both authors)

  • Venue:
  • 3D Research
  • Year:
  • 2012


Abstract

Interaction with 3D information is one of the fundamental and most familiar tasks in virtually all areas of engineering and science. Several recent technological advances pave the way for making hand gesture recognition capabilities available to all, which will lead to more intuitive and efficient 3D user interfaces (3DUI). These developments can unlock new levels of expression and productivity in all activities concerned with the creation and manipulation of virtual 3D shapes and, specifically, in engineering design. Building fully automated systems for tracking and interpreting hand gestures requires robust and efficient 3D imaging techniques as well as potent shape classifiers. We survey and explore current and emerging 3D imaging technologies, focusing in particular on those that can be used to build interfaces between the users' hands and the machine. The purpose of this paper is to categorize and highlight the relevant differences between these existing 3D imaging approaches in terms of the nature of the information provided, the output data format, and the specific conditions under which these approaches yield reliable data. Furthermore, we explore the impact of each of these approaches on the computational cost and reliability of the required image processing algorithms. Finally, we highlight the main challenges and opportunities in developing natural user interfaces based on hand gestures, and conclude with some promising directions for future research.