As emphasis shifts toward mobile, educational, and other applications that minimize users' cognitive load, it is increasingly important to explore interfaces based on implicit engagement techniques, so that users can stay focused on their tasks. In this research, data were collected from 12 pairs of students who solved complex math problems with a tutorial system that they engaged entirely implicitly, over 100 times per session, via speech-amplitude or pen-pressure cues. Results revealed that users spontaneously, reliably, and substantially adapted these forms of communicative energy to designate, and to repair, an intended interlocutor in a computer-mediated group setting. This behavior was harnessed to achieve system engagement accuracies of 75-86%, with the highest accuracies obtained using speech amplitude, even though students had limited awareness of their own adaptations. Finally, while continually using these implicit engagement techniques, students maintained their level of performance at solving complex mathematics problems throughout a one-hour session.
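The abstract does not describe the system's classifier, but the core idea of inferring an intended interlocutor from communicative energy can be sketched as a simple amplitude-threshold rule. The sketch below is a hypothetical illustration, not the study's implementation: the function names and the -20 dBFS threshold are invented for the example, and a real system would calibrate thresholds per speaker and microphone.

```python
import math

def rms_db(samples):
    """Root-mean-square level of an audio frame, in dB relative to
    full scale, assuming samples are floats in [-1.0, 1.0]."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(max(rms, 1e-10))  # floor avoids log(0)

def classify_addressee(samples, threshold_db=-20.0):
    """Label a speech frame as system- or partner-directed by its energy.

    The -20 dBFS threshold is a made-up value for illustration only;
    it is not a parameter reported in the study.
    """
    if rms_db(samples) >= threshold_db:
        return "system-directed"
    return "partner-directed"

# A loud frame (peak 0.5, about -6 dBFS) vs. a quiet one (peak 0.01, -40 dBFS).
loud = [0.5, -0.5] * 400
quiet = [0.01, -0.01] * 400
print(classify_addressee(loud))   # system-directed
print(classify_addressee(quiet))  # partner-directed
```

An analogous rule could be applied to stylus data by substituting mean pen pressure for frame energy, which is consistent with the abstract's finding that speech amplitude yielded the higher engagement accuracy of the two cues.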