The inferred mode protocol uses contextual reasoning and local mediators to eliminate the need to select specific modes when performing draw, select, move, and delete operations in a sketch interface. In this paper, we describe an observational experiment examining the learnability, user preference, and frequency of use of mode inferencing in a sketch application. The experiment showed that participants who were instructed in the interface's features liked the fluid transitions between modes. However, the interaction techniques were not self-revealing: participants who received no instruction took longer to discover the inferred mode features and rated the interaction techniques more negatively. Over multiple sketching sessions, as users developed expertise with the system, we found that they combined inferred mode techniques to speed interaction and frequently used scratch space on the display to retrain themselves and to tune their behaviors. Our results inform the design of sketch interface techniques that incorporate noncommand features.