Discrete and continuous modes of manual control are fundamentally different: buttons select or change state, while handles persistently modulate an analog parameter. Many electronically aided tasks require both modes, yet their user interfaces typically afford only one. We describe an integration of two kinds of physical interface (tagged objects and force feedback) that enables seamless execution of such multimodal tasks while retaining the benefits of physicality, and we demonstrate application scenarios with conceptual and engineering prototypes. Our emphasis is on sharing insights gained in a design case study, including expert users' reactions.
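The distinction the abstract draws can be made concrete with a minimal sketch: a controller that accepts discrete button events (which select among states) alongside continuous handle input (which modulates an analog parameter). All names here are hypothetical illustrations, not the paper's actual system.

```python
# Hypothetical sketch of discrete vs. continuous manual control.
# Class and method names are illustrative, not from the paper.

class MediaController:
    """Combines discrete state selection with continuous modulation."""

    MODES = ("stopped", "playing", "paused")

    def __init__(self):
        self.mode = "stopped"   # discrete state, changed by button presses
        self.speed = 1.0        # analog parameter, modulated by a handle

    def press_button(self, mode):
        # Discrete control: a button selects or changes state.
        if mode not in self.MODES:
            raise ValueError(f"unknown mode: {mode}")
        self.mode = mode

    def turn_handle(self, delta):
        # Continuous control: a handle persistently modulates an
        # analog parameter, clamped here to a usable range.
        self.speed = max(0.25, min(4.0, self.speed + delta))
```

In this framing, a button-only interface could only jump between preset speeds, while a handle-only interface could not cleanly select playback states; supporting both event types in one controller mirrors the multimodal tasks the abstract describes.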