Mental models: towards a cognitive science of language, inference, and consciousness
Does automation bias decision-making?
International Journal of Human-Computer Studies
Accountability and automation bias
International Journal of Human-Computer Studies
The role of trust in automation reliance
International Journal of Human-Computer Studies - Special issue: Trust and technology
Cognition, Technology and Work
A model for types and levels of human interaction with automation
IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans
Why Are People's Decisions Sometimes Worse with Computer Support?
SAFECOMP '09 Proceedings of the 28th International Conference on Computer Safety, Reliability, and Security
Experimental investigation of misuse and disuse in using automation system
HCII'11 Proceedings of the 14th international conference on Human-computer interaction: users and applications - Volume Part IV
The applicability of human-centred automation guidelines in the fighter aircraft domain
Proceedings of the 29th Annual European Conference on Cognitive Ergonomics
Impact of robot failures and feedback on real-time trust
Proceedings of the 8th ACM/IEEE international conference on Human-robot interaction
The present study investigates automation misuse, based on complacency and automation bias, in interacting with a decision aid in a process control system. The effect of a preventive training intervention that exposes participants to rare automation failures is examined. Complacency is reflected in inappropriate checking and monitoring of automated functions. In interaction with automated decision aids, complacency might result in commission errors, i.e., following automatically generated recommendations even though they are false. Yet empirical evidence for this kind of relationship is still lacking. A laboratory experiment (N = 24) was conducted using a process control simulation. An automated decision aid provided advice for fault diagnosis and management. Complacency was directly measured by the participants' information sampling behavior, i.e., the amount of information sampled in order to verify the automated recommendations. Possible commission errors were assessed when the aid provided false recommendations. The results provide clear evidence of complacency, reflected in insufficient verification of the automation, while commission errors were associated with high levels of complacency. Hence, commission errors seem to be a possible, albeit not inevitable, consequence of complacency. Furthermore, exposing operators to automation failures during training significantly decreased complacency and thus represents a suitable means of reducing this risk, even though it might not eliminate it completely. Potential applications of this research include the design of training protocols to prevent automation misuse in interaction with automated decision aids.
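The operationalization described in the abstract — complacency as insufficient information sampling before accepting advice, and a commission error as following a false recommendation — could be scored along these lines. This is a hypothetical sketch: the data structure, variable names, and example values are illustrative assumptions, not taken from the study.

```python
from dataclasses import dataclass

@dataclass
class Trial:
    params_sampled: int    # pieces of information the operator actually checked
    params_required: int   # pieces needed to fully verify the recommendation
    advice_correct: bool   # was the automated recommendation correct?
    advice_followed: bool  # did the operator follow the recommendation?

def verification_ratio(trials):
    """Mean fraction of available information sampled per trial;
    lower values indicate higher complacency."""
    return sum(t.params_sampled / t.params_required for t in trials) / len(trials)

def commission_errors(trials):
    """Count trials in which a false recommendation was followed."""
    return sum(1 for t in trials if not t.advice_correct and t.advice_followed)

# Illustrative data, not from the experiment:
trials = [
    Trial(1, 4, True, True),    # under-verified; advice happened to be correct
    Trial(4, 4, False, False),  # fully verified; false advice correctly rejected
    Trial(0, 4, False, True),   # unverified false advice followed: commission error
]
print(round(verification_ratio(trials), 4))  # 0.4167
print(commission_errors(trials))             # 1
```

Note that the sketch mirrors the abstract's finding only in structure: the trial with the lowest sampling ratio is also the one producing the commission error, while full verification prevents one.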