PEYE: toward a visual motion based perceptual interface for mobile devices

  • Authors:
  • Gang Hua; Ting-Yi Yang; Srinath Vasireddy

  • Affiliations:
  • Microsoft Live Labs Research, Redmond, WA; Microsoft Live Labs Engineering, Redmond, WA; Microsoft Live Labs Engineering, Redmond, WA

  • Venue:
  • HCI'07: Proceedings of the 2007 IEEE international conference on Human-computer interaction
  • Year:
  • 2007


Abstract

We present the architecture and algorithm design of a visual motion based perceptual interface for mobile devices with cameras. Beyond simple motion vectors, we use the term "visual motion" to mean any dynamic change across consecutive image frames. In the lower architectural hierarchy, visual motion events are defined by identifying distinctive motion patterns. In the higher hierarchy, these visual events are used for interacting with user applications. We present an approach to context-aware motion vector estimation that better trades off speed against accuracy. It switches among a set of motion estimation algorithms of different speeds and precisions based on system context such as computation load and battery level. For example, when the CPU is heavily loaded or the battery level is low, we switch to a fast but less accurate algorithm, and vice versa. Moreover, to obtain more accurate motion vectors, we propose to adapt the search center of fast block matching methods based on previous motion vectors. Both a quantitative evaluation of the algorithms and a subjective usability study were conducted, demonstrating that the proposed approach is robust yet efficient.
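The two algorithmic ideas in the abstract — context-aware switching among estimators, and centering the block-matching search at the position predicted by the previous motion vector — can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names, placeholder estimators, and thresholds (`cpu_high`, `battery_low`) are all assumptions.

```python
# Placeholder estimators standing in for real block-matching routines
# of different speed/accuracy (names are illustrative, not from the paper).
def full_search(prev_frame, curr_frame):
    return "full_search"        # accurate but slow

def three_step_search(prev_frame, curr_frame):
    return "three_step_search"  # fast but less accurate

def select_estimator(cpu_load, battery_level,
                     cpu_high=0.8, battery_low=0.2):
    """Context-aware switching: pick a motion-estimation routine
    from system context (CPU load in [0,1], battery level in [0,1]).
    Thresholds are hypothetical."""
    if cpu_load > cpu_high or battery_level < battery_low:
        return three_step_search  # conserve cycles and power
    return full_search            # spend cycles for accuracy

def adaptive_search_center(block_pos, prev_mv):
    """Adapt the search center of fast block matching: start the
    search at the location predicted by the previous motion vector,
    rather than at the block's own position."""
    x, y = block_pos
    dx, dy = prev_mv
    return (x + dx, y + dy)
```

For instance, under heavy CPU load `select_estimator(0.9, 0.7)` would fall back to the fast routine, while on an idle, fully charged device `select_estimator(0.1, 1.0)` would choose the accurate one; `adaptive_search_center((16, 16), (3, -2))` starts the next search at `(19, 14)`.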