We present iRotate, an approach that automatically rotates screens on mobile devices to match users' face orientation. Current approaches to automatic screen rotation are based on gravity and device orientation alone. Our survey of 513 users shows that 42% experience auto-rotation that leads to an incorrect viewing orientation several times a week or more, and 24% rate the problem as very serious to extremely serious. iRotate augments the gravity-based approach by using the front camera on mobile devices to detect the user's face and rotating the screen accordingly. It requires no explicit user input and supports different user postures and device orientations. We have implemented iRotate to run in real time on the iPhone and iPad, and we assess its accuracy and limitations through a 20-participant feasibility study.
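The core decision the abstract describes can be sketched as follows. This is a minimal, hypothetical illustration (not the authors' implementation): it assumes the face detector reports a roll angle in degrees (0 = face upright relative to the screen's portrait axis) or `None` when no face is found, in which case the system falls back to the conventional gravity-based orientation.

```python
# Illustrative sketch of face-orientation-based screen rotation.
# Names and the roll-angle convention are assumptions, not the paper's API.

ORIENTATIONS = ["portrait", "landscape-left", "portrait-upside-down", "landscape-right"]

def orientation_from_face(face_roll_deg):
    """Map a detected face roll angle (degrees) to the nearest of the
    four screen orientations; return None when no face was detected."""
    if face_roll_deg is None:
        return None
    index = round((face_roll_deg % 360) / 90) % 4
    return ORIENTATIONS[index]

def choose_orientation(face_roll_deg, gravity_orientation):
    """Prefer the face-based estimate; fall back to the gravity-based
    orientation when the front camera sees no face."""
    face_based = orientation_from_face(face_roll_deg)
    return face_based if face_based is not None else gravity_orientation
```

For example, a user lying sideways holds the device so gravity suggests "landscape-left", but if the face is detected upright relative to the screen, the face-based estimate keeps the screen in portrait.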