Interactive surfaces and related tangible user interfaces often involve everyday objects that are identified, tracked, and augmented with digital information. Traditional approaches for recognizing these objects typically rely on complex pattern-recognition techniques, or on the addition of active electronics or fiducials that alter the visual qualities of the objects, making them less practical for real-world use. Radio Frequency Identification (RFID) technology provides an unobtrusive way to sense the presence of nearby tagged objects and identify them, but has no inherent means of determining their position. Computer vision, on the other hand, is an established approach to tracking objects with a camera. While shapes and movement on an interactive surface can be determined with classic image-processing techniques, object recognition tends to be complex, computationally expensive, and sensitive to environmental conditions. We present a set of techniques in which movement and shape information from the computer vision system is fused with RFID events that identify which objects are in the image. By synchronizing these two complementary sensing modalities, we can associate changes in the image with events in the RFID data, recovering the position, shape, and identity of the objects on the surface while avoiding complex computer vision processing and exotic RFID solutions.
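The abstract does not spell out how the two event streams are associated. As an illustration only, the following is a minimal Python sketch of one plausible fusion policy: pair each RFID tag-appearance event with an unidentified vision blob that appeared within a small time window. All names (`RfidVisionFuser`, `BlobEvent`, `TagEvent`), the 0.5-second window, and the first-match pairing rule are assumptions for this sketch, not the paper's actual method.

```python
from dataclasses import dataclass


@dataclass
class BlobEvent:
    """A new, as-yet unidentified shape detected by the vision system."""
    timestamp: float          # seconds, on a clock shared with the RFID reader
    position: tuple           # (x, y) centroid of the blob on the surface


@dataclass
class TagEvent:
    """An RFID reader report that a tagged object entered the read field."""
    timestamp: float
    tag_id: str


class RfidVisionFuser:
    """Hypothetical fuser: pairs RFID identification events with vision
    blob events that occur within `window` seconds of each other, yielding
    an identity-to-position mapping for objects on the surface."""

    def __init__(self, window=0.5):
        self.window = window
        self.pending_blobs = []   # blobs seen but not yet identified
        self.pending_tags = []    # tags read but not yet located
        self.identified = {}      # tag_id -> (x, y) position

    def on_blob(self, event):
        self.pending_blobs.append(event)
        self._match()

    def on_tag(self, event):
        self.pending_tags.append(event)
        self._match()

    def _match(self):
        # Greedy first-match pairing inside the time window; a real system
        # would also handle ambiguity (several blobs/tags at once).
        for tag in list(self.pending_tags):
            for blob in list(self.pending_blobs):
                if abs(tag.timestamp - blob.timestamp) <= self.window:
                    self.identified[tag.tag_id] = blob.position
                    self.pending_tags.remove(tag)
                    self.pending_blobs.remove(blob)
                    break
```

Used on a synchronized event stream, a tag read at t = 10.0 s and a new blob at t = 10.2 s would be paired, giving that blob the tag's identity; a blob with no temporally close tag read remains unidentified.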