Fast spheres, shadows, textures, transparencies, and image enhancements in pixel-planes
SIGGRAPH '85 Proceedings of the 12th annual conference on Computer graphics and interactive techniques
Sorcerer's apprentice: head-mounted display and wand
The use of a kinesthetic supplement in an interactive graphics system
Self-tracker: a smart optical sensor on silicon (VLSI, graphics)
A demonstrated optical tracker with scalable work area for head-mounted display systems
I3D '92 Proceedings of the 1992 symposium on Interactive 3D graphics
The HiBall Tracker: high-performance wide-area tracking for virtual and augmented environments
Proceedings of the ACM symposium on Virtual reality software and technology
A vision-based head tracker for fish tank virtual reality-VR without head gear
VRAIS '95 Proceedings of the Virtual Reality Annual International Symposium (VRAIS'95)
Case study: observing a volume rendered fetus within a pregnant patient
VIS '94 Proceedings of the conference on Visualization '94
FlightTracker: A Novel Optical/Inertial Tracker for Cockpit Enhanced Vision
ISMAR '04 Proceedings of the 3rd IEEE/ACM International Symposium on Mixed and Augmented Reality
International Journal of Computer Vision
High-Performance Wide-Area Optical Tracking: The HiBall Tracking System
Presence: Teleoperators and Virtual Environments
In this paper, a new optical system for real-time, three-dimensional position tracking is described. The system adopts an "inside-out" tracking paradigm: the working environment is a room whose ceiling is lined with a regular pattern of infrared LEDs that flash under the system's control. Three cameras are mounted on a helmet worn by the user. Each camera uses a lateral-effect photodiode as its recording surface. The 2D positions of the LED images within each camera's field of view are detected and reported in real time. The measured 2D image positions and the known 3D positions of the LEDs are used to compute the position and orientation of the camera assembly in space.

We have designed an iterative algorithm to estimate the 3D position and orientation of the camera assembly. The algorithm is a generalized version of Church's method and allows for multiple cameras with nonconvergent nodal points. Several equations are formulated to predict the system's error analytically. The requirements of accuracy, speed, adequate working volume, light weight, and small size for the tracker are also addressed.

A prototype was designed and built to demonstrate the integration and coordination of all essential components of the new tracker. The prototype uses off-the-shelf components and can be easily duplicated. Our results indicate that the new system significantly outperforms other existing systems. The new tracker provides more than 200 updates per second, registers 0.1-degree rotational movements and 2-millimeter translational movements, and covers a working volume of about 1,000 ft³ (10 ft on each side).
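The core computation the abstract describes, recovering the camera assembly's pose from measured 2D image positions and known 3D LED positions, can be sketched as an iterative least-squares refinement. The sketch below is a generic single-camera Gauss-Newton space resection, not the paper's generalized Church's method for multiple cameras with nonconvergent nodal points; all function names and the synthetic LED data are illustrative assumptions.

```python
import numpy as np

def rot(w):
    """Rodrigues' formula: axis-angle vector -> 3x3 rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def project(leds, w, t):
    """Project known 3D LED positions into normalized image coordinates."""
    cam = (rot(w) @ leds.T).T + t          # world frame -> camera frame
    return cam[:, :2] / cam[:, 2:3]        # pinhole projection

def estimate_pose(leds, obs, iters=20):
    """Gauss-Newton refinement of pose [rotation | translation] from 2D-3D pairs."""
    x = np.zeros(6)                        # start from the identity pose
    def residual(x):
        return (project(leds, x[:3], x[3:]) - obs).ravel()
    for _ in range(iters):
        r = residual(x)
        J = np.zeros((r.size, 6))
        for j in range(6):                 # forward-difference Jacobian
            dx = np.zeros(6)
            dx[j] = 1e-6
            J[:, j] = (residual(x + dx) - r) / 1e-6
        step = np.linalg.lstsq(J, -r, rcond=None)[0]
        x = x + step
        if np.linalg.norm(step) < 1e-12:
            break
    return x

# Synthetic check: ceiling LEDs viewed from a slightly rotated, translated helmet.
leds = np.array([[0, 0, 5], [1, 0, 5], [0, 1, 5], [1, 1, 6],
                 [-1, 0, 4], [0, -1, 6], [2, 1, 5], [-1, -1, 4]], float)
true_w, true_t = np.array([0.05, -0.03, 0.02]), np.array([0.1, -0.2, 0.05])
pose = estimate_pose(leds, project(leds, true_w, true_t))
```

With noise-free observations the refinement drives the reprojection residual to zero, so the recovered pose matches the true one; the real system additionally fuses beacons seen by three cameras and predicts its error analytically.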