Design rules based on analyses of human error
Communications of the ACM
Critiquing Human Error: A Knowledge Based Human-Computer Collaboration Approach
Expert Critiquing Systems
Critiquing Software Specifications
IEEE Software
A General Framework for Debugging
IEEE Software
Building a Better Critic: Recent Empirical Results
IEEE Expert: Intelligent Systems and Their Applications
The CRITTER system: Automated critiquing of digital circuit designs
DAC '84 Proceedings of the 21st Design Automation Conference
Empirical research in software engineering: a workshop
ACM SIGSOFT Software Engineering Notes
Empirical Software Engineering
This paper summarizes an empirical study of the performance of, and reactions of, programmers using expert critiquing systems during a programming task. The study tests hypotheses about the value of various approaches to critic timing, agency, dialogue, and strategy. Performance statistics and reactions were collected from 39 competent programmers participating in the trials. Among other findings, the results indicate that textual explanations and repair suggestions speed up programming time 3.5-fold relative to non-textual debuggers. However, a tenth of the subjects refuse to use the critics, and another fifth of the subjects indicate they do not like to read the textual suggestions. These and other lessons learned are reviewed herein.