Generic Framework for Transforming Everyday Objects into Interactive Surfaces
Proceedings of the 13th International Conference on Human-Computer Interaction. Part III: Ubiquitous and Intelligent Interaction
This article describes techniques for transforming everyday objects into tactile interfaces, and presents implementation details for three example objects: a light globe, a tray and a table. The techniques fall into two main categories: acoustic techniques and computer-vision techniques. Acoustic techniques exploit the vibrations produced when an object is touched, which propagate through the object and along its surface until they reach piezo sensors attached to it. The computer-vision approach extends the technique used for virtual keyboards: it detects fingers intercepting a plane of infrared light projected just above the surface by a pair of laser modules, and allows multi-touch sensing on any flat surface.
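A common way to turn the piezo signals described above into a touch position is time-difference-of-arrival (TDOA) localization. The sketch below is illustrative rather than the paper's actual method: the sensor layout, wave speed, and grid-search resolution are all assumptions, and it treats the surface as a homogeneous 2D medium.

```python
import math

def locate_tap(sensors, arrival_times, speed, size=1.0, step=0.01):
    """Estimate a tap position on a square surface of side `size`.

    sensors: list of (x, y) piezo positions.
    arrival_times: measured arrival time at each sensor.
    speed: assumed propagation speed of the vibration in the material.

    Grid-searches the surface for the point whose predicted inter-sensor
    arrival-time differences best match the measured ones (sensor 0 is
    the time reference, so absolute tap time need not be known).
    """
    measured = [t - arrival_times[0] for t in arrival_times]
    best, best_err = None, float("inf")
    n = int(size / step) + 1
    for i in range(n):
        for j in range(n):
            x, y = i * step, j * step
            # Distance from the candidate point to each sensor.
            d = [math.hypot(x - sx, y - sy) for sx, sy in sensors]
            # Predicted arrival-time differences relative to sensor 0.
            predicted = [(di - d[0]) / speed for di in d]
            err = sum((m - p) ** 2 for m, p in zip(measured, predicted))
            if err < best_err:
                best, best_err = (x, y), err
    return best
```

With four sensors at the corners of a unit square, a tap simulated at (0.3, 0.7) is recovered to within the grid resolution. Real implementations must also detect the onset of the transient in each noisy signal, which is the harder part in practice.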
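The vision side can be sketched minimally, assuming the infrared camera frame arrives as a 2D grayscale array: a finger crossing the light plane appears as a bright blob, so thresholding followed by connected-component labeling yields one centroid per touch. The threshold value and 4-connectivity below are illustrative assumptions, not the paper's parameters.

```python
def detect_touches(frame, threshold=128):
    """frame: 2D list of grayscale intensities (0-255).
    Returns a list of (row, col) centroids, one per bright blob."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill this connected component (4-connectivity).
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # Blob centroid = estimated touch position.
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                touches.append((cy, cx))
    return touches
```

Because every blob in the frame becomes a touch point, this naturally supports multi-touch; a production system would additionally calibrate camera coordinates to surface coordinates and filter out blobs too small or too large to be fingertips.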