This paper presents the design and evaluation of the Wavelet menu and its implementation on the iPhone. The Wavelet menu is a concentric hierarchical marking menu operated with simple gestures. Its novice mode, i.e. when the menu is displayed, is well adapted to the limited screen space of handheld devices because the representation of the menu hierarchy is inverted: the deepest submenu is always displayed at the center of the screen. The visual design is based on a stacking metaphor that reinforces the perception of the hierarchy and helps users quickly understand how the technique works. The menu also supports submenu previsualization, a key property for navigating efficiently in a hierarchy of commands. A quantitative evaluation shows that the Wavelet menu provides an intuitive way to support efficient gesture-based navigation. The expert mode, i.e. gesturing without waiting for the menu to pop up, is another key property of the Wavelet menu: by providing stroke shortcuts, it favors the selection of frequent commands and makes eyes-free selection possible. A user experiment shows that participants are able to select commands, eyes-free, while walking.
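As an illustrative sketch only (not the paper's implementation), the expert mode of a hierarchical marking menu can be modeled as mapping each stroke to one of eight compass directions and walking a command tree one level per stroke. The `MENU` hierarchy and the `octant`/`select` helpers below are hypothetical names introduced for this example:

```python
import math

# Hypothetical command hierarchy: at each level, one of 8 compass
# directions selects either a command name or a nested submenu dict.
MENU = {
    "E": {"E": "copy", "W": "paste"},
    "N": "undo",
}

def octant(dx, dy):
    """Map a stroke vector to one of 8 compass directions.

    Screen coordinates are assumed, so y grows downward and is negated
    before computing the angle.
    """
    angle = math.degrees(math.atan2(-dy, dx)) % 360
    names = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]
    return names[int((angle + 22.5) // 45) % 8]

def select(menu, strokes):
    """Follow a sequence of stroke vectors through the hierarchy (expert mode)."""
    node = menu
    for dx, dy in strokes:
        if not isinstance(node, dict):
            raise ValueError("stroke sequence longer than the hierarchy is deep")
        node = node[octant(dx, dy)]
    return node
```

In this toy hierarchy, an eastward stroke followed by a westward stroke selects `"paste"`; a single upward stroke selects `"undo"`. A real implementation would also handle the novice-mode pop-up, stroke segmentation, and angular tolerance.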