CyberCode: designing augmented reality environments with visual tags
DARE '00 Proceedings of DARE 2000 on Designing augmented reality environments
Visual panel: virtual mouse, keyboard and 3D controller with an ordinary piece of paper
Proceedings of the 2001 workshop on Perceptive user interfaces
Sweep and point and shoot: phonecam-based interactions for large public displays
CHI '05 Extended Abstracts on Human Factors in Computing Systems
A Vision-Based Approach for Controlling User Interfaces of Mobile Devices
CVPR '05 Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05) - Workshops - Volume 03
TinyMotion: camera phone based interaction methods
CHI '06 Extended Abstracts on Human Factors in Computing Systems
Camera phone based motion sensing: interaction techniques, applications and performance study
UIST '06 Proceedings of the 19th annual ACM symposium on User interface software and technology
Robust computer vision-based detection of pinching for one and two-handed gesture input
UIST '06 Proceedings of the 19th annual ACM symposium on User interface software and technology
A conceptual framework for camera phone-based interaction techniques
PERVASIVE'05 Proceedings of the Third international conference on Pervasive Computing
A new diamond search algorithm for fast block-matching motion estimation
IEEE Transactions on Image Processing
A novel four-step search algorithm for fast block motion estimation
IEEE Transactions on Circuits and Systems for Video Technology
Hexagon-based search pattern for fast block motion estimation
IEEE Transactions on Circuits and Systems for Video Technology
Human-computer intelligent interaction: a survey
HCI'07 Proceedings of the 2007 IEEE international conference on Human-computer interaction
Introduction to the Special Issue on Mobile Vision
International Journal of Computer Vision
Looking at you: fused gyro and face tracking for viewing large imagery on mobile devices
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
We present the architecture and algorithm design of a visual-motion-based perceptual interface for camera-equipped mobile devices. We use the term "visual motion" to cover not only motion vectors but any dynamic change across consecutive image frames. At the lower level of the architecture, visual motion events are defined by identifying distinctive motion patterns; at the higher level, these events drive interaction with user applications. We present a context-aware approach to motion vector estimation that better trades off speed against accuracy: it switches among a set of motion estimation algorithms of differing speed and precision according to system context such as computational load and battery level. For example, when the CPU is heavily loaded or the battery is low, it switches to a fast but less accurate algorithm; when resources are plentiful, it switches back to a more accurate one. Moreover, to obtain more accurate motion vectors, we propose adapting the search center of fast block-matching methods based on previous motion vectors. We conducted both a quantitative evaluation of the algorithms and a subjective usability study, which demonstrate that the proposed approach is robust and efficient.
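The two ideas in the abstract — switching estimators based on system context, and seeding the block-matching search at the previous motion vector — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the sum-of-absolute-differences cost, the full search, the three-step search, and the load/battery thresholds are all assumptions chosen to make the idea concrete.

```python
# Hypothetical sketch (names and thresholds are illustrative, not the
# paper's API): context-aware estimator switching plus a predicted
# search center for fast block matching.

def sad(prev, curr, bx, by, dx, dy, block=4):
    """Sum of absolute differences between the block at (bx, by) in
    curr and the block displaced by (dx, dy) in prev."""
    total = 0
    for y in range(block):
        for x in range(block):
            total += abs(curr[by + y][bx + x] - prev[by + y + dy][bx + x + dx])
    return total

def full_search(prev, curr, bx, by, center=(0, 0), radius=2):
    """Exhaustive block matching around `center`: accurate but slow."""
    cx, cy = center
    best = min(((sad(prev, curr, bx, by, cx + dx, cy + dy), (cx + dx, cy + dy))
                for dy in range(-radius, radius + 1)
                for dx in range(-radius, radius + 1)),
               key=lambda t: t[0])
    return best[1]

def three_step_search(prev, curr, bx, by, center=(0, 0)):
    """Coarse-to-fine search: cheaper, may miss the global best match."""
    cx, cy = center
    for step in (2, 1):
        best = min(((sad(prev, curr, bx, by, cx + dx, cy + dy), (dx, dy))
                    for dy in (-step, 0, step)
                    for dx in (-step, 0, step)),
                   key=lambda t: t[0])
        cx, cy = cx + best[1][0], cy + best[1][1]
    return (cx, cy)

def estimate_motion(prev, curr, bx, by, prev_mv=(0, 0),
                    cpu_load=0.0, battery=1.0):
    """Context-aware dispatch: use a cheap estimator when resources are
    scarce, and seed the search at the previous motion vector."""
    if cpu_load > 0.8 or battery < 0.2:  # thresholds are illustrative
        search = three_step_search       # fast, less accurate
    else:
        search = full_search             # slower, more accurate
    return search(prev, curr, bx, by, center=prev_mv)
```

Passing the previous frame's vector as `prev_mv` centers the search where the motion is likely to continue, so a small search radius still finds the true displacement; the dispatch itself is what the abstract calls switching "based on system context".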