User interaction with applications and data has traditionally relied on menu-driven, mouse-based, or pen-based user interfaces. As an increasing number of mobile devices are equipped with cameras, these devices can become a key tool for local and remote visual interaction and communication. In this paper we present our Mobile Visual Interaction system, which enables pointing with camera-equipped mobile devices at large displays, and report key findings on using mobile cameras for human-computer interaction. The mobile pointing software identifies and interprets camera-detectable data elements embedded in data surfaces such as computer displays or projected screens. The user interacts with the system by pointing the mobile camera device at the surface and performing gestures. We share the main challenges and results encountered while developing the system.
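To make the pointing idea concrete, here is a minimal sketch of one common way such a system can map a camera view onto display coordinates: four camera-detectable marker corners with known on-screen positions are detected in the camera frame, a homography is estimated from the four correspondences, and the camera's image center is projected through it to obtain the pointed-at screen location. All function names, the marker layout, and the 4-point homography approach are illustrative assumptions, not the paper's actual algorithm.

```python
# Hypothetical sketch: estimate which screen point the camera is aimed at,
# given four screen markers detected in the camera image. Pure Python,
# no external dependencies; assumes a non-degenerate marker configuration.

def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for an n x n system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography(src, dst):
    """Direct linear transform from four point pairs (camera image -> screen)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    return solve_linear(A, b) + [1.0]  # fix h9 = 1

def pointed_at(H, x, y):
    """Project the camera image point (x, y) onto screen coordinates."""
    w = H[6] * x + H[7] * y + H[8]
    return ((H[0] * x + H[1] * y + H[2]) / w,
            (H[3] * x + H[4] * y + H[5]) / w)

# Marker corners as detected in a 640x480 camera frame (example values) ...
detected = [(200, 150), (440, 160), (430, 330), (210, 320)]
# ... and their known positions on a 1920x1080 display.
on_screen = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]

H = homography(detected, on_screen)
cursor = pointed_at(H, 320, 240)  # camera's image center -> screen cursor
```

In a real deployment the detected corners would come from a marker detector running on the phone's camera frames, and the resulting cursor position would be sent to the display host to drive pointing and gesture interaction.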