We present a three-degree-of-freedom (3DOF) control designed for viewing large documents and images on a camera-equipped mobile device. By tracking natural features detected in the camera's field of view, we roughly estimate the motion of the device and use the result to scroll and zoom the current document. Central to our implementation is the way we amplify the estimated motion, allowing the user to scroll through large portions of the document with minimal hand movement. A Hidden Markov Model then determines whether the user is scrolling, zooming, or doing some combination of the two, providing smoother, more fluid control. We demonstrate a prototype of our 3DOF control that can easily navigate documents many times larger than the display area, and we show how it might be incorporated into a larger document retrieval application.
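The motion-amplification idea described above can be illustrated with a minimal sketch. The gain and clamp values below are illustrative assumptions, not figures from the paper; `amplified_scroll` is a hypothetical helper that maps a small estimated per-frame device motion `(dx, dy)` to a large, bounded change in the viewport origin.

```python
# Hypothetical sketch of amplified scrolling: a small estimated camera
# motion (dx, dy) per frame moves the viewport by gain * (dx, dy),
# clamped to the document bounds. GAIN and MAX_STEP are assumptions.

GAIN = 8.0        # amplification: small hand motion -> large scroll
MAX_STEP = 200.0  # per-frame clamp, to suppress sudden jumps

def amplified_scroll(view_x, view_y, dx, dy,
                     doc_w, doc_h, view_w, view_h,
                     gain=GAIN, max_step=MAX_STEP):
    """Return the new viewport origin after applying amplified,
    clamped motion, kept inside the document rectangle."""
    step_x = max(-max_step, min(max_step, gain * dx))
    step_y = max(-max_step, min(max_step, gain * dy))
    new_x = min(max(view_x + step_x, 0.0), doc_w - view_w)
    new_y = min(max(view_y + step_y, 0.0), doc_h - view_h)
    return new_x, new_y

# Example: a 10 px horizontal device motion scrolls the view 80 px.
print(amplified_scroll(0.0, 0.0, 10.0, 5.0, 2000.0, 2000.0, 320.0, 240.0))
```

In a real system, `(dx, dy)` would come from the natural-feature tracker, and the scroll/zoom decision would be gated by the mode inferred from the Hidden Markov Model.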