Achieving user participation for adaptive applications
UCAmI'12 Proceedings of the 6th international conference on Ubiquitous Computing and Ambient Intelligence
Adaptation on public displays brings both advantages and risks. Because adaptation happens implicitly, users often miss the causality behind the adaptive behavior. Moreover, a high degree of autonomy in adaptive displays may leave users feeling a loss of control. Limited transparency and controllability lead to a loss of user trust; as a result, users feel insecure and frustrated and are likely to abandon the system. The research goal of this work is to optimize the system actions in a ubiquitous display environment in order to make the adaptation design transparent, controllable, and thus trustworthy. By means of a decision-theoretic approach, user trust can be assessed in different trust-critical contexts. These contexts describe the changes in the environment that call for adaptation: privacy of content, social setting, and accuracy of knowledge. The generated decisions enable the system to maintain trust and keep the interaction comfortable.
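The abstract does not spell out the decision model, but a decision-theoretic action selection of this kind can be sketched as choosing the action with the highest expected utility over the trust-critical contexts. The following minimal sketch is purely illustrative: the action names, context probabilities, and utility values are assumptions for demonstration, not the paper's actual model.

```python
# Hypothetical sketch of decision-theoretic action selection for an
# adaptive public display. All action names, probabilities, and utility
# values below are illustrative assumptions.

ACTIONS = ["show_private_content", "show_generic_content", "ask_user_first"]

# Trust-critical contexts named in the abstract, with assumed
# probabilities that each context currently applies.
CONTEXT_PROBS = {
    "private_content": 0.7,     # privacy of content: content is sensitive
    "bystanders_present": 0.4,  # social setting: others can see the display
    "knowledge_uncertain": 0.2, # accuracy of knowledge: user model may be wrong
}

# Assumed utility of each action in each context; higher values mean the
# action better preserves user trust (transparency, control, comfort).
UTILITY = {
    "show_private_content": {"private_content": -1.0, "bystanders_present": -0.8,
                             "knowledge_uncertain": -0.5},
    "show_generic_content": {"private_content": 0.3, "bystanders_present": 0.5,
                             "knowledge_uncertain": 0.2},
    "ask_user_first":       {"private_content": 0.8, "bystanders_present": 0.6,
                             "knowledge_uncertain": 0.7},
}

def expected_utility(action):
    """Expected utility of an action, summed over weighted contexts."""
    return sum(CONTEXT_PROBS[c] * UTILITY[action][c] for c in CONTEXT_PROBS)

def best_action():
    """Pick the action that maximizes expected utility."""
    return max(ACTIONS, key=expected_utility)
```

With these assumed numbers, asking the user before showing sensitive content scores highest, which mirrors the abstract's point that transparency and user control preserve trust.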