Multimodal interfaces are known to be useful in map-based applications and in complex, time-pressured tasks. Variations in cognitive load during such tasks have been found to affect multimodal behaviour: for example, users become more multimodal and tend towards semantic complementarity as load increases. The richness of multimodal data means that systems could monitor particular input features to detect the level of load a user is experiencing. In this paper, we present an experiment designed to induce controlled levels of cognitive load while soliciting natural speech and pen-gesture input, and we analyse candidate load-indicative features in the pen-gesture modality. The experimental design relies on a map-based Wizard of Oz setup running on a tablet PC. We detail the analysis of pen-gesture interaction across subjects and report suggestive trends: increasing degeneration of pen gestures in some subjects, and possible changes in gesture kinematics, as cognitive load increases.
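To illustrate the kind of stroke-level monitoring described above, the sketch below derives a few kinematic features (path length, duration, mean speed) from time-stamped pen samples. The sample representation and the particular feature set are assumptions chosen for illustration, not the study's own implementation.

```python
import math

def stroke_kinematics(samples):
    """Compute simple kinematic features for one pen stroke.

    `samples` is a list of (x, y, t) tuples, as a digitiser might
    report them; x and y are in arbitrary screen units, t in seconds.
    """
    if len(samples) < 2:
        return {"path_length": 0.0, "duration": 0.0, "mean_speed": 0.0}
    # Path length: sum of Euclidean distances between successive samples.
    path = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1, _), (x2, y2, _) in zip(samples, samples[1:])
    )
    duration = samples[-1][2] - samples[0][2]
    return {
        "path_length": path,
        "duration": duration,
        "mean_speed": path / duration if duration > 0 else 0.0,
    }
```

A load-sensing system might track how such per-stroke features drift over a session; for instance, a single diagonal stroke from (0, 0) to (3, 4) drawn over one second yields a path length of 5.0 and a mean speed of 5.0 units per second.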