Probabilistic reasoning in intelligent systems: networks of plausible inference
User interface design
Tasks, errors, and mental models
Modeling and predicting human error
Human performance models for computer-aided engineering
Developing user interfaces: ensuring usability through product & process
A probabilistic logic for the development of safety-critical, interactive systems
International Journal of Man-Machine Studies
The cognitive walkthrough method: a practitioner's guide
Usability inspection methods
Safeware: system safety and computers
The invisible computer
Supporting Scenario-Based Requirements Engineering
IEEE Transactions on Software Engineering
Usability Engineering
Formal Specification as a Tool for Objective Assessment of Safety-Critical Interactive Systems
INTERACT '97 Proceedings of the IFIP TC13 International Conference on Human-Computer Interaction
Human Errors and System Requirements
RE '99 Proceedings of the 4th IEEE International Symposium on Requirements Engineering
A task centered approach to analysing human error tolerance requirements
RE '95 Proceedings of the Second IEEE International Symposium on Requirements Engineering
Analogical Reuse of Requirements Frameworks
RE '97 Proceedings of the 3rd IEEE International Symposium on Requirements Engineering
On the effective use and reuse of HCI knowledge
ACM Transactions on Computer-Human Interaction (TOCHI) - Special issue on human-computer interaction in the new millennium, Part 2
Preventing user errors by systematic analysis of deviations from the system task model
International Journal of Human-Computer Studies
Blending Descriptive and Numeric Analysis in Human Reliability Design
DSV-IS '02 Proceedings of the 9th International Workshop on Interactive Systems. Design, Specification, and Verification
Scenario-Based Assessment of Nonfunctional Requirements
IEEE Transactions on Software Engineering
Using Bayesian belief networks for change impact analysis in architecture design
Journal of Systems and Software
Toward a more accurate view of when and how people seek help with computer applications
SIGDOC '07 Proceedings of the 25th annual ACM international conference on Design of communication
Usability inspection methods after 15 years of research and practice
SIGDOC '07 Proceedings of the 25th annual ACM international conference on Design of communication
Dynamic positioning systems: usability and interaction styles
Proceedings of the 5th Nordic conference on Human-computer interaction: building bridges
Towards evidence-based architectural design for safety-critical software applications
Architecting dependable systems IV
Assessing the effectiveness of direct gesture interaction for a safety critical maritime application
International Journal of Human-Computer Studies
SAFECOMP'07 Proceedings of the 26th international conference on Computer Safety, Reliability, and Security
We describe a method of assessing the implications of human error for the user interface design of safety-critical systems. In previous work we proposed a taxonomy of influencing factors that contribute to error. In this article, components of the taxonomy are combined into a mathematical and causal model of error, represented as a Bayesian Belief Net (BBN). The BBN quantifies error influences arising from user knowledge, ability, and the task environment, combined with factors describing the complexity of user action and the quality of the user interface. The BBN model predicts the probabilities of different types of error (slips and mistakes) for each component action of a task involving user-system interaction. We propose an Impact Analysis Method that runs test scenarios against this causal model of error in order to identify user interactions that are prone to different types of error. Applying the proposed method enables the designer to determine the combinations of influencing factors, and the interactions among them, that are most likely to lead to human error. Finally, we show how such scenario-based causal analysis can be used to focus attention on the guidelines most relevant to safe user interface design. The proposed method is demonstrated through a case study of an operator performing a task using the control system for a laser spectrophotometer.
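To make the idea concrete, the following is a minimal sketch of the kind of BBN-style error model the abstract describes: three influencing factors (user knowledge, action complexity, interface quality) are marginalised over to predict the probabilities of slips and mistakes. All node names, priors, and conditional probabilities here are hypothetical placeholders for illustration, not values from the paper.

```python
# Toy BBN-style error model: marginalise over influencing factors
# to obtain overall P(slip) and P(mistake) for one user action.
# All probabilities are illustrative assumptions, not from the paper.

from itertools import product

# Prior probabilities that each influencing factor is in its favourable state.
PRIORS = {
    "user_knowledge": 0.7,   # P(user knowledge is good)
    "low_complexity": 0.6,   # P(action complexity is low)
    "ui_quality":     0.8,   # P(user interface quality is good)
}

def error_cpt(knowledge, low_complexity, ui_quality):
    """Hypothetical conditional probabilities of each error type,
    given the states of the three influencing factors."""
    p_slip = 0.02 + (0.05 if not low_complexity else 0.0) \
                  + (0.04 if not ui_quality else 0.0)
    p_mistake = 0.01 + (0.08 if not knowledge else 0.0) \
                     + (0.03 if not ui_quality else 0.0)
    return p_slip, p_mistake

def marginal_error_probabilities():
    """Sum P(error | factor states) * P(factor states) over all
    2^3 combinations of factor states."""
    p_slip_total = p_mistake_total = 0.0
    for states in product([True, False], repeat=3):
        knowledge, low_complexity, ui_quality = states
        weight = 1.0
        for name, state in zip(PRIORS, states):
            weight *= PRIORS[name] if state else 1 - PRIORS[name]
        p_slip, p_mistake = error_cpt(knowledge, low_complexity, ui_quality)
        p_slip_total += weight * p_slip
        p_mistake_total += weight * p_mistake
    return p_slip_total, p_mistake_total

p_slip, p_mistake = marginal_error_probabilities()
print(f"P(slip) = {p_slip:.4f}, P(mistake) = {p_mistake:.4f}")
```

A scenario-based impact analysis of the kind the abstract proposes would then amount to fixing some factors to the states a test scenario implies (e.g. poor interface quality) and recomputing the error probabilities, to see which factor combinations drive the predicted error rates.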