Although adaptive (trainable) spam filters are a common example of systems that make (semi-)autonomous decisions on behalf of the user, trust in these filters has been underexplored. This paper reports a study of spam filter usage in the daily workplace and of user behaviour in training these filters (N=43). User observation, interview and survey techniques were applied to investigate attitudes towards two types of filters: a user-adaptive (trainable) filter and a rule-based filter. While many of our participants invested extensive effort in training their filters, training did not influence filter trust. Instead, the findings indicate that users' awareness and understanding of the filter seriously impact attitudes and behaviour. Specific examples of difficulties related to awareness of filter activity and adaptivity are described, showing concerns relevant to all adaptive and (semi-)autonomous systems that rely on explicit user feedback.
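To make the distinction concrete, a user-adaptive filter of the kind studied here typically updates a statistical model from explicit user feedback (marking messages as spam or not spam). The sketch below is an illustrative assumption, not the filters used in the study: a minimal naive-Bayes-style classifier where each `train` call represents one explicit feedback action by the user.

```python
import math
from collections import Counter

class TrainableSpamFilter:
    """Minimal naive-Bayes-style filter updated by explicit user feedback.

    Illustrative sketch only; the filters in the study are not specified
    at this level of detail.
    """

    def __init__(self):
        self.word_counts = {"spam": Counter(), "ham": Counter()}
        self.msg_counts = {"spam": 0, "ham": 0}

    def train(self, text, label):
        # One explicit feedback action: the user labels a message
        # as "spam" or "ham".
        self.msg_counts[label] += 1
        self.word_counts[label].update(text.lower().split())

    def spam_score(self, text):
        # Laplace-smoothed log-odds that the message is spam.
        score = math.log((self.msg_counts["spam"] + 1) /
                         (self.msg_counts["ham"] + 1))
        spam_total = sum(self.word_counts["spam"].values())
        ham_total = sum(self.word_counts["ham"].values())
        for w in text.lower().split():
            p_spam = (self.word_counts["spam"][w] + 1) / (spam_total + 2)
            p_ham = (self.word_counts["ham"][w] + 1) / (ham_total + 2)
            score += math.log(p_spam / p_ham)
        return score

    def is_spam(self, text):
        return self.spam_score(text) > 0
```

A rule-based filter, by contrast, would apply fixed, hand-written conditions (e.g. sender blocklists, keyword rules) and would not change its behaviour in response to such feedback, which is one reason user awareness of *which* kind of filter is running matters for trust.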