This paper highlights a psychological phenomenon that affects the accuracy of mental models. It occurs when two consecutive events unfold as an operator expects. Typically, such a situation reinforces confidence in one's mental model. However, consecutive events can co-occur by chance, for reasons that actually differ from those the operator believes. Nonetheless, because the environmental data are consistent with the operator's expectations, one event can be taken to be the cause of the other. When this false belief arises, the mental model is erroneously assumed to be valid. We discuss this phenomenon and its potentially disastrous consequences using the example of a real commercial air crash. Finally, we address some implications for system design and support tools.