We present an architectural model for adaptive interfaces based on eye-gaze patterns and facial-expression analysis. In our approach, each basic visual sign can adapt its appearance and level of detail during the communication process. Atomic Communication Units (ACUs), analogous to graphical output primitives, encapsulate the intended denotation, the encoding of the message, and a method for judging whether the communication goal has been reached. We analyze feedback cycles in human-human communication tasks and propose application scenarios for ACUs.
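The ACU concept can be illustrated with a minimal Python sketch. Everything here is an assumption for illustration: the class and method names, the use of gaze dwell time and a facial-expression-derived confusion score as feedback signals, and the three discrete levels of detail are all hypothetical and not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class ACU:
    """Atomic Communication Unit: a visual sign that encapsulates its
    denotation, its encoding, and a judgment of the communication goal.
    (Illustrative sketch; names and thresholds are hypothetical.)"""
    denotation: str        # intended meaning of the sign
    detail_level: int = 1  # 1 = bare icon, 2 = labeled, 3 = fully elaborated

    def render(self) -> str:
        # Encoding of the message at the current level of detail.
        if self.detail_level == 1:
            return f"[{self.denotation[0]}]"
        if self.detail_level == 2:
            return f"[{self.denotation}]"
        return f"[{self.denotation}: full explanation]"

    def goal_reached(self, gaze_dwell_ms: float, confusion: float) -> bool:
        # Judge the communication goal from hypothetical feedback signals:
        # long enough gaze dwell and a low confusion score mean "understood".
        return gaze_dwell_ms > 300 and confusion < 0.3

    def adapt(self, gaze_dwell_ms: float, confusion: float) -> None:
        # One feedback cycle: if the goal was not reached, elaborate the sign.
        if not self.goal_reached(gaze_dwell_ms, confusion) and self.detail_level < 3:
            self.detail_level += 1

# One adaptation step: a short glance with a puzzled expression
# triggers a more detailed encoding of the same denotation.
acu = ACU("save")
acu.adapt(gaze_dwell_ms=120, confusion=0.8)
print(acu.render())  # → "[save]"
```

The sketch mirrors the feedback cycle described in the abstract: each rendering is followed by a judgment of the communication goal, and the sign re-encodes itself only when that judgment fails.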