We study interaction modalities for mobile devices (smartphones and tablets) that rely on camera-based head tracking. This technique opens new possibilities for both input and output interaction. For output, computing the position of the device relative to the user's head makes it possible, for example, to realistically control the viewpoint on a 3D scene (Head-Coupled Perspective, HCP). This improves the output interaction bandwidth by enhancing depth perception and by allowing the visualization of large workspaces (virtual window). For input, head movements can serve as a means of interacting with a mobile device; moreover, this input modality requires no additional sensor beyond the built-in front-facing camera. In this paper, we classify the interaction possibilities offered by head tracking on smartphones and tablets. We then focus on output interaction by introducing several applications of HCP on both smartphones and tablets and by presenting the results of a qualitative user experiment.
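To illustrate the output side described above, here is a minimal sketch of the core computation behind head-coupled perspective: given the tracked head position relative to the screen, derive an asymmetric (off-axis) viewing frustum so that the display behaves like a window onto the 3D scene. The function name, units, and parameters are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of head-coupled perspective (HCP): the viewing frustum
# passes through the physical screen rectangle, with its apex at the
# tracked head position. Moving the head shifts the frustum, which is
# what creates the "virtual window" effect.

def hcp_frustum(head, screen_w, screen_h, near=0.1, far=100.0):
    """Return off-axis frustum bounds (left, right, bottom, top, near, far).

    head               -- (x, y, z) head position in metres, relative to the
                          screen centre; +z points from screen toward viewer.
    screen_w, screen_h -- physical screen size in metres.
    """
    ex, ey, ez = head
    if ez <= 0:
        raise ValueError("head must be in front of the screen (z > 0)")
    # Project the screen edges (offset by the head position) onto the
    # near plane; the resulting bounds feed a glFrustum-style projection.
    scale = near / ez
    left = (-screen_w / 2 - ex) * scale
    right = (screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top = (screen_h / 2 - ey) * scale
    return left, right, bottom, top, near, far
```

With the head centred, the frustum is symmetric (an ordinary perspective projection); as the head moves right, the frustum skews left, revealing the left side of the scene, exactly the behaviour a real window would produce.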