The touch language we use to interact with computers and devices is still developing. How can we teach users new touch gestures without interfering with their experience of our systems? A team of user experience designers and researchers went through an iterative process to design a teaching method for two new touch interactions. This case study describes the designs they created, the insights they gained from user studies, and the final design that will be implemented in Windows 8.