Trust has been shown to be a key factor in users' adoption of technology: users prefer applications they trust. While the computer-science literature on trust mostly revolves around information security, authentication, and related mechanisms, research on trust in automation, which originates in the behavioral sciences, focuses almost exclusively on the sociotechnical context in which applications are embedded. The behavioral theory of trust in automation aims to explain how trust forms, helping to identify countermeasures to the user uncertainties that diminish trust in an application. We therefore propose an approach that augments the development process of ubiquitous systems with insights from behavioral trust theory. Our approach enables developers to derive design elements that foster trust in their application through four key activities: identifying users' uncertainties, linking them to trust antecedents from theory, deriving functional requirements, and finally designing trust-supporting design elements (TSDEs). Evaluating user feedback on two recommender-system prototypes in a study with over 160 participants, we show that by following our process we were able to derive four TSDEs that significantly increased users' trust in the system.