Security systems frequently rely on warning messages to convey important information, especially when a machine cannot assess a situation automatically. Researchers have long investigated how warning messages can be designed to optimise their reception by users, and design guidelines and best practices help developers and interaction designers channel urgent information adequately. In this poster, we investigate the application of readability measures to assess the difficulty of the descriptive text in warning messages. Adapting such a measure to the needs of warning message design allows objective feedback on the quality of a warning's descriptive text, and an automated process could assist software developers and designers in creating more readable, and hence more understandable, security warning messages. We present an initial exploration of the use of readability measures on the descriptive text of warning messages: existing measures were evaluated on warning messages extracted from current browsers in an experimental study with 15 undergraduate students. While our data has not yet yielded conclusive results, we argue that readability measures can provide valuable assistance when implementing security systems.
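To illustrate the kind of automated feedback the abstract describes, the sketch below scores a browser-style warning text with the classic Flesch Reading Ease formula. The abstract does not specify which readability measures were evaluated, so both the choice of formula and the sample warning text are assumptions for illustration; the syllable counter is a deliberately crude vowel-group heuristic, not a dictionary-based one.

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic (assumption, not the authors' method): count runs of
    # consecutive vowels, drop a trailing silent 'e', floor at one syllable.
    word = word.lower()
    if word.endswith("e") and len(word) > 2:
        word = word[:-1]
    return max(1, len(re.findall(r"[aeiouy]+", word)))

def flesch_reading_ease(text: str) -> float:
    # Flesch Reading Ease:
    #   206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)
    # Higher scores mean easier text; scores below ~50 are considered hard.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

# Hypothetical warning text resembling a browser SSL dialog.
warning = ("This site's security certificate is not trusted. "
           "Proceeding may expose your data to attackers.")
score = flesch_reading_ease(warning)
```

A developer tool could flag warning texts whose score falls below a chosen threshold and suggest shorter sentences or simpler vocabulary before the dialog ships.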