This paper describes the interactive tactile luminous floor that was constructed and used as the skin of the playful interactive space Ada, which ran as a public exhibit for five months in 2002 and received over 550,000 visitors. Ada's floor was custom-built to support both individual and collective user interaction. It consists of 360 hexagonal tiles, each 66 cm across, covering a total area of 136 m², each fitted with analogue tactile load sensors based on force-sensitive resistors and with dimmable red, green, and blue (RGB) neon lamps. The tiles are constructed from extruded aluminum with glass tops, and an Interbus factory-automation bus senses and controls them. Software is described for rendering fluid, dynamic visual effects on the floor, for signal processing of the load information, for real-time visitor tracking, and for a variety of behavioural modes, games, and interactions. Data from single tiles and from tracking are shown. This floor offers new modalities of human-computer and human-robot interaction for autonomous robotic spaces.
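To illustrate the kind of signal processing involved, the sketch below estimates a visitor's position as a load-weighted centroid over active hexagonal tiles. This is only a minimal illustration, not the paper's tracking algorithm: the tile-centre geometry, the 0.66 m pitch, and the noise threshold are all assumptions introduced here for the example.

```python
import math

# Assumed tile pitch: centre-to-centre spacing of the hexagonal tiles,
# taken loosely from the 66 cm tile size; not a value from the paper.
TILE_PITCH = 0.66  # metres

def hex_centre(col, row, pitch=TILE_PITCH):
    """Centre of a hexagonal tile in an offset grid (illustrative layout):
    odd rows are shifted by half a tile, rows are sqrt(3)/2 pitches apart."""
    x = pitch * (col + 0.5 * (row % 2))
    y = pitch * (math.sqrt(3) / 2.0) * row
    return x, y

def track_centroid(loads, threshold=5.0):
    """Load-weighted centroid of all tiles whose reading exceeds a threshold.

    loads: dict mapping (col, row) -> load reading (arbitrary units).
    Returns (x, y) in metres, or None if no tile is above threshold.
    """
    total = 0.0
    sx = sy = 0.0
    for (col, row), load in loads.items():
        if load < threshold:
            continue  # suppress sensor noise on unoccupied tiles
        x, y = hex_centre(col, row)
        sx += load * x
        sy += load * y
        total += load
    if total == 0.0:
        return None
    return sx / total, sy / total
```

A visitor straddling two adjacent tiles with equal load would be placed midway between their centres; tracking over time would then link successive centroid estimates into a trajectory.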