End-user programmers, because they are human, make mistakes. However, past research has not considered how visual end-user debugging devices could be designed to ameliorate the effects of mistakes. This paper empirically examines oracle mistakes (mistakes users make about which values are right and which are wrong) to reveal differences in how different types of oracle mistakes impact the quality of visual feedback about bugs. We then consider the implications of these empirical results for designers of end-user software engineering environments.