Many Tangible User Interface (TUI) systems employ sensor-equipped physical objects. However, they do not easily scale to users' actual environments: most everyday objects lack the necessary hardware, and modifying them requires hardware and software development by skilled individuals. This limits TUI creation by end users, resulting in inflexible interfaces in which the mapping between sensor input and output events cannot be easily modified to reflect the end user's wishes and circumstances. We introduce OnObject, a small device worn on the hand, which can program physical objects to respond to a set of gestural triggers. Users attach RFID tags to situated objects, grab them by the tag, and program their responses to grab, release, shake, swing, and thrust gestures using a built-in button and a microphone. In this paper, we demonstrate how novice end users, including preschool children, can instantly create engaging gestural object interfaces with sound feedback from toys, drawings, or clay.
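The workflow described above — read a tag to select the object in hand, hold the button while performing a gesture to record a sound for it, then replay that sound when the gesture recurs — can be sketched as follows. This is a minimal illustrative model only: the class and method names (`TaggedObject`, `OnObject`, `program`, `perform`) and the tag identifier are hypothetical, and real gesture recognition and audio capture are abstracted away.

```python
# Hypothetical sketch of OnObject's programming-by-demonstration flow.
# Assumed names and structure; hardware sensing is abstracted away.

GESTURES = {"grab", "release", "shake", "swing", "thrust"}

class TaggedObject:
    """A physical object identified by the RFID tag attached to it."""
    def __init__(self, tag_id):
        self.tag_id = tag_id
        self.bindings = {}  # gesture -> recorded sound clip

class OnObject:
    """Models the hand-worn device: a button to enter programming mode,
    a microphone to capture sound, and gesture detection on the object."""
    def __init__(self):
        self.objects = {}

    def grab(self, tag_id):
        # Reading the tag selects (or registers) the object in hand.
        return self.objects.setdefault(tag_id, TaggedObject(tag_id))

    def program(self, obj, gesture, sound):
        # Holding the button while performing a gesture binds the
        # captured microphone input to that gesture.
        if gesture not in GESTURES:
            raise ValueError(f"unsupported gesture: {gesture}")
        obj.bindings[gesture] = sound

    def perform(self, obj, gesture):
        # Without the button, a recognized gesture plays its binding
        # (here, the bound clip is simply returned).
        return obj.bindings.get(gesture)

device = OnObject()
toy = device.grab("tag-042")
device.program(toy, "shake", "rattle.wav")
print(device.perform(toy, "shake"))  # -> rattle.wav
```

The sketch captures why end users need no hardware or software development: attaching a tag and demonstrating a gesture with sound is the entire authoring step, and the same gesture later triggers the recorded feedback.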