Cognitive strategies and eye movements for searching hierarchical computer displays
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Visual search is an important part of human-computer interaction. It is critical that we build theory about how people visually search displays in order to better support users' visual capabilities and limitations in everyday tasks. One way of building such theory is through computational cognitive modeling. The ultimate promise of cognitive modeling in HCI is to provide the science base needed for predictive interface analysis tools. This paper discusses computational cognitive modeling of the perceptual, strategic, and oculomotor processes that people used in a visual search task. This work refines and rounds out previously reported cognitive modeling and eye tracking analysis. A revised "minimal model" of visual search is presented that explains a variety of eye movement data better than the original model. The revised model uses a parsimonious strategy that is not tied to any particular visual structure or feature beyond the location of objects. Three characteristics of the minimal strategy are discussed in detail.
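The abstract does not spell out the mechanics of the minimal strategy, but a strategy that depends only on object locations could be sketched as a nearest-unexamined-object heuristic: each saccade moves to the not-yet-fixated item closest to the current gaze position. This is a hypothetical illustration, not the paper's actual model; the function name, inputs, and the nearest-first rule are all assumptions.

```python
import math

def minimal_search(objects, target, start=(0.0, 0.0)):
    """Hypothetical location-only search strategy (an illustrative
    assumption, not the paper's model): repeatedly fixate the nearest
    not-yet-examined object until the target is found.

    objects: dict mapping label -> (x, y) screen position
    target:  label of the sought item
    start:   initial gaze position
    Returns the sequence of fixated labels (the simulated scanpath).
    """
    gaze = start
    remaining = dict(objects)
    scanpath = []
    while remaining:
        # Choose the unexamined object closest to the current gaze point.
        label = min(remaining, key=lambda k: math.dist(gaze, remaining[k]))
        gaze = remaining.pop(label)
        scanpath.append(label)
        if label == target:
            break
    return scanpath
```

A strategy like this is "parsimonious" in the abstract's sense: it consults no visual feature (color, shape, group structure) other than where objects are, yet it still produces orderly, distance-driven scanpaths.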