Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Interacting with the big screen: pointers to ponder. CHI '02 Extended Abstracts on Human Factors in Computing Systems.
Direct interaction with large-scale display systems using infrared laser tracking devices. APVis '03 Proceedings of the Asia-Pacific Symposium on Information Visualisation, Volume 24.
An Auto-Calibrated Laser-Pointing Interface for Large Screen Displays. DS-RT '03 Proceedings of the Seventh IEEE International Symposium on Distributed Simulation and Real-Time Applications.
The optical tweezers: multiple-point interaction technique. Proceedings of the ACM Symposium on Virtual Reality Software and Technology.
uPen: laser-based, personalized, multi-user interaction on large displays. Proceedings of the 13th Annual ACM International Conference on Multimedia.
A practical system for laser pointer interaction on large displays. Proceedings of the ACM Symposium on Virtual Reality Software and Technology.
Laser Pointer Tracking in Projector-Augmented Architectural Environments. ISMAR '07 Proceedings of the 6th IEEE and ACM International Symposium on Mixed and Augmented Reality.
Structured laser pointer: enabling wrist-rolling movements as a new interactive dimension. Proceedings of the International Conference on Advanced Visual Interfaces.
SideBySide: ad-hoc multi-user interaction with handheld projectors. Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology.
Large area interactive browsing for high resolution digitized Dunhuang murals. Transactions on Edutainment III.
MultiPoint: Comparing laser and manual pointing as remote input in large display interactions. International Journal of Human-Computer Studies.
Lumitrack: low cost, high precision, high speed tracking with projected m-sequences. Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology.
This paper presents a 3D tracking system for virtual environments based on infrared (IR) laser technology. Invisible laser patterns are projected from the user(s) onto the screen by the Sceptre input device or by the attached head-tracking device. IR-sensitive cameras placed near the projectors in a back-projection setup detect these patterns, and from them the position and orientation of the input devices are reconstructed. Because the infrared laser is invisible to the human eye, it does not disturb the immersion.
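The abstract does not give implementation details, but a core step shared by such laser-pointer tracking systems is mapping a laser spot detected in camera image coordinates to screen coordinates. A minimal sketch of that step, assuming a planar screen and a one-time calibration from four known correspondences (all point values below are hypothetical), is a homography estimated with the standard DLT method:

```python
import numpy as np

def compute_homography(cam_pts, scr_pts):
    """Estimate the 3x3 homography H mapping camera points to screen points
    from >= 4 point correspondences, via the direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(cam_pts, scr_pts):
        # Each correspondence contributes two linear equations in the
        # nine entries of H (stacked row-major into a vector h).
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # h is the null-space vector of A: the right singular vector
    # belonging to the smallest singular value.
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def map_point(H, pt):
    """Project one camera-space point through H into screen space."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Hypothetical calibration: the four screen corners as seen by a
# 640x480 camera, mapped to a 1920x1080 display.
cam_corners = [(0, 0), (640, 0), (640, 480), (0, 480)]
scr_corners = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
H = compute_homography(cam_corners, scr_corners)

# A laser spot detected at the camera image center lands at the
# display center.
center = map_point(H, (320, 240))
```

In a real setup the calibration points would come from projecting known fiducials (or, as in the auto-calibrated interface cited above, from an automatic calibration pass), and the detected spot would be the centroid of the IR blob in each camera frame.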