Intelligent systems often violate fundamental usability principles, such as control and transparency. Explanations have been shown to improve transparency in such systems, but little research exists on how they affect control. We set out to investigate how explanations affect users' perceptions of control in an intelligent system. We conducted an empirical study in which 15 participants carried out a qualitative data analysis task using an intelligent system. Participants were divided into two groups, one with explanations and one without, and could either indicate agreement with the system or correct it. Our results show that participants without explanations displayed more control-exerting behaviors, but that there was no difference between conditions in participants' perceived control. We discuss our findings and their implications for future work.