Towards modeling trust based decisions: a game theoretic approach
ESORICS'07 Proceedings of the 12th European conference on Research in Computer Security
The human component is a determining factor in the success of the security subsystem. While security policies dictate the set of permissible actions for a user, best practices dictate the efficient mode of execution for these actions. Unfortunately, this efficient mode of execution is not always the easiest to carry out. Users, unaware of the implications of their actions, tend to choose the easier mode of execution over the efficient one, thereby introducing a level of uncertainty that is unacceptable in high-assurance information systems. In this paper, we present a dynamic trust assignment model that evaluates the system's trust in user actions over time. We first discuss the interpretation of trust in the context of the statement "the system trusts the user's actions" as opposed to "the system trusts the user." We then derive the intuition for our trust assignment framework from a game-theoretic model in which trust updates are performed through "compensatory transfer." For each efficient action by a user, we assign a trust value equal to the "best claim for compensation," defined as the maximum difference between the benefit of an alternative action and that of the efficient action the user selected. The user's initial trust and recent actions are both taken into account, and the user is appropriately rewarded or penalized through trust updates. The utility of such a model is twofold: first, it helps the system identify and educate users who consistently avoid (or are unaware of) the organization's best practices; second, when an action's conformance to organizational policy is contentious, it provides the system or a monitoring agent with a basis, namely the trust level, for allowing or disallowing the action. Finally, we demonstrate the application of this model in a Document Management System.
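To make the update rule concrete, here is a minimal sketch of a compensatory-transfer trust update in the spirit described above. All names, the benefit representation, and the exact reward/penalty rule are illustrative assumptions, not the paper's actual formulation: the "best claim" is computed as the maximum benefit difference between any alternative action and the action the user chose, and trust rises on efficient choices and falls in proportion to the claim otherwise.

```python
# Hypothetical sketch of a compensatory-transfer trust update.
# The function names, the dict-of-benefits representation, and the
# linear reward/penalty rule are assumptions for illustration only.

def best_claim(benefits: dict[str, float], chosen: str) -> float:
    """Best claim for compensation: the maximum difference between the
    benefit of an alternative action and the benefit of the chosen one.
    Zero when the chosen action is efficient (has the highest benefit)."""
    return max(benefits[a] - benefits[chosen] for a in benefits)


def update_trust(trust: float, benefits: dict[str, float], chosen: str,
                 rate: float = 0.1) -> float:
    """Reward an efficient choice, penalize an inefficient one in
    proportion to the best claim; trust is clamped to [0, 1]."""
    claim = best_claim(benefits, chosen)
    # claim == 0: the user picked the efficient action -> full reward.
    # claim > 0: a better alternative existed -> penalty scaled by claim.
    delta = rate * (1.0 if claim == 0 else -claim)
    return min(1.0, max(0.0, trust + delta))
```

A monitoring agent could then compare the maintained trust value against a threshold to decide whether to allow a contentious action or flag the user for education.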