Pervasive computing postulates the invisible integration of technology into everyday objects, turning them into smart things. Not just a single object of this kind, but whole landscapes of them, are meant to form the interface between the "physical world" of atoms and the "digital world" of bits. Interaction between humans and such landscapes of technology-rich artifacts happens confluently rather than on a per-device basis. To address this confluence between humans and computing landscapes, we study human gesticulation and the manipulation of graspable, movable everyday artifacts as a potentially effective means of interacting with the physical environment. In detail, we consider gestures in the general sense of a movement or state (posture) of the human body, as well as a movement or state of any physical object resulting from human manipulation. Further, building on the tangible user interface paradigm, we propose intuitive tangible universal controls that translate physical motions into actions for controlling landscapes of smart things. Such intuitive "everyday" gestures were collected in a series of user tests, yielding a catalogue of generic body and artifact gesture dynamics. We present a systematic approach to selecting and steering with tangible artifacts by associating a flip movement with service selection and a turn movement with parameter steering. We describe an implementation of this approach in a general software framework, along with several experiments with various fully functional artifacts and devices.
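The flip-to-select and turn-to-steer mapping can be sketched in a few lines. The following Python fragment is a hypothetical illustration, not the framework described above: the class name, sensor callbacks, service names, and scaling (one full turn sweeping a parameter across its range) are all assumptions made for the example.

```python
class TangibleControl:
    """Hypothetical sketch: a flip (face-up/face-down change) cycles
    service selection; a turn (rotation about the vertical axis)
    steers the selected service's parameter. Sensor inputs, service
    names, and scaling are illustrative assumptions."""

    def __init__(self, services):
        self.services = services               # landscape of controllable services
        self.selected = 0                      # index of currently selected service
        self.params = {s: 0.0 for s in services}
        self._face_up = True
        self._last_heading = 0.0

    def on_accelerometer(self, ax, ay, az):
        """Detect a flip from the sign of the gravity component (az in g)."""
        face_up = az > 0
        if face_up != self._face_up:           # orientation flipped: select next service
            self._face_up = face_up
            self.selected = (self.selected + 1) % len(self.services)

    def on_heading(self, heading_deg):
        """Map a turn (heading change in degrees) to relative parameter steering."""
        delta = heading_deg - self._last_heading
        delta = (delta + 180) % 360 - 180      # unwrap across the 0/360 boundary
        self._last_heading = heading_deg
        svc = self.services[self.selected]
        # assumed scale: one full turn sweeps the parameter over [0, 1]
        self.params[svc] = min(1.0, max(0.0, self.params[svc] + delta / 360))


ctrl = TangibleControl(["lamp", "radio"])
ctrl.on_accelerometer(0.0, 0.0, -1.0)   # flip face-down: selects "radio"
ctrl.on_heading(90.0)                   # quarter turn: steers parameter by 0.25
```

The design choice of relative (delta-based) rather than absolute heading means the artifact's initial orientation does not matter, which matches the idea of grabbing any everyday object and using it as a universal control.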