In 3D object manipulation or virtual-space navigation tasks, an efficient zoom operation is essential. The common approach uses a combination of mouse and keyboard, which requires users to be familiar with the controls and takes considerable time to learn. This paper presents two methods for recognizing the zoom operation by sensing the user's pull and push movements. Users only need to hold a camera in hand; when they pull or push the hand, our approach senses the change in proximity and translates it into a zoom operation in the task. Through user studies, we compared the recognition accuracy of the different methods and analyzed the factors that affect the approach's performance. The results show that our methods run in real time with high accuracy.
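The abstract does not spell out the recognition algorithm, but the core idea of mapping a pull/push camera movement to zoom can be illustrated with a minimal sketch: estimate the apparent scale change between consecutive frames of the handheld camera (features grow when the camera is pushed toward the scene, shrink when it is pulled back) and accumulate it into a zoom factor. The sketch below is an illustrative assumption, not the authors' implementation; it uses OpenCV ORB features, brute-force matching, and a RANSAC similarity fit, and all function and parameter choices are hypothetical.

```python
# Hypothetical sketch: infer pull/push (proximity change) of a handheld camera
# from the apparent scale change between consecutive frames, and map it to zoom.
import cv2
import numpy as np

def frame_scale_change(prev_gray, curr_gray, orb, matcher):
    """Return the apparent scale ratio between two frames (>1 means camera pushed closer)."""
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return 1.0
    matches = matcher.match(des1, des2)
    if len(matches) < 8:
        return 1.0
    src = np.float32([kp1[m.queryIdx].pt for m in matches])
    dst = np.float32([kp2[m.trainIdx].pt for m in matches])
    # A similarity transform captures translation, rotation, and uniform scale;
    # the square root of the determinant of its 2x2 part is the scale factor.
    M, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    if M is None:
        return 1.0
    return float(np.sqrt(abs(np.linalg.det(M[:, :2]))))

def main():
    cap = cv2.VideoCapture(0)          # handheld camera
    orb = cv2.ORB_create(500)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    zoom = 1.0
    ok, prev = cap.read()
    if not ok:
        return
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        s = frame_scale_change(prev_gray, gray, orb, matcher)
        # Pushing toward the scene enlarges features (s > 1): interpret as zoom in;
        # pulling back (s < 1): zoom out. Clamp to keep the factor in a usable range.
        zoom = float(np.clip(zoom * s, 0.25, 8.0))
        prev_gray = gray
        print(f"zoom factor: {zoom:.2f}", end="\r")
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
    cap.release()

if __name__ == "__main__":
    main()
```

In practice the zoom factor would drive the camera distance in a 3D viewer rather than being printed; smoothing the per-frame ratio (e.g., with an exponential moving average) would also be needed to suppress hand jitter, which is one of the factors the user studies would reveal.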