We investigate the use of auditory feedback in pen-gesture interfaces through a series of informal and formal experiments. Initial iterative exploration showed that a performance advantage from auditory feedback was achievable using absolute cues and state feedback delivered after a gesture was produced and recognized. Gaining a learning or performance advantage from auditory feedback tightly coupled with the pen-gesture articulation and recognition process, however, proved more difficult. To establish a systematic baseline, Experiment 1 formally evaluated gesture production accuracy as a function of auditory and visual feedback. The size of gestures and the aperture of closed gestures were influenced by visual or auditory feedback, while other measures such as shape distance and directional difference were not, supporting the theory that feedback is too slow to strongly influence the production of pen-stroke gestures. Experiment 2 focused on the subjective aspects of auditory feedback in pen-gesture interfaces. Participants' ratings on the dimensions of being wonderful and stimulating were significantly higher with musical auditory feedback. Several lessons regarding pen gestures and auditory feedback emerge from our exploration: a few simple functions, such as indicating the pen-gesture recognition result, can be achieved; gaining a performance or learning advantage through tightly coupled, process-based auditory feedback is difficult; pen-gesture sets and their recognizers can be designed to minimize visual dependence; and people's subjective experience of gesture interaction can be influenced by musical auditory feedback. These lessons may serve as references and stepping stones for future research and development in pen-gesture interfaces with auditory feedback.