The concept of trust is believed by some to compensate for feelings of uncertainty, and trust is therefore considered crucial in people's decision to rely on a complex automated system to perform tasks for them. This experiment studied the effects of errors on control allocation, and the mediating role of trust and self-confidence, in the domain of route planning. Using a computer-based route planner, participants completed 10 route-planning trials in manual mode and 10 in automatic mode, so that they became equally experienced in operating both. During these so-called fixed trials, the number of errors in automatic and manual mode was systematically varied. Participants then completed six free trials, during which they were free to choose between modes. Our results showed that high automation error rates (AERs) lowered system trust compared to low AERs. Conversely, high manual error rates (MERs) lowered self-confidence compared to low MERs, although to a lesser extent. Moreover, the difference between trust and self-confidence proved highly predictive of how often automatic mode was selected during the six free trials. Additionally, the results suggest a fundamental bias toward trusting one's own abilities over those of the system. Finally, evidence indicating a relationship between trust and self-confidence is discussed.
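The allocation pattern reported above — automation use driven by the trust/self-confidence difference, shifted by a bias toward one's own abilities — can be sketched as a simple logistic choice rule. This is purely illustrative and not the study's actual statistical model; the function name, rating scale, and all parameter values are hypothetical.

```python
import math


def p_choose_automation(trust: float, self_confidence: float,
                        bias: float = -0.5, slope: float = 1.0) -> float:
    """Illustrative probability of selecting automatic mode on a free trial.

    `trust` and `self_confidence` are assumed to be ratings on a common
    scale (e.g. 0-10). The negative default `bias` encodes the reported
    tendency to favour one's own abilities over the system's when trust
    and self-confidence are equal. All parameter values are hypothetical.
    """
    diff = trust - self_confidence
    return 1.0 / (1.0 + math.exp(-(slope * diff + bias)))


# With equal trust and self-confidence, the self-reliance bias keeps the
# probability of choosing automation below 0.5; a clear trust surplus
# pushes it above 0.5.
print(p_choose_automation(7.0, 7.0))  # below 0.5
print(p_choose_automation(9.0, 5.0))  # above 0.5
```

Under this sketch, raising the automation error rate would lower `trust` (shifting choices toward manual mode), while raising the manual error rate would lower `self_confidence` (shifting choices toward automation), matching the direction of the reported effects.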