Remote usability evaluation: can users report their own critical incidents?
CHI '98 Conference Summary on Human Factors in Computing Systems
Remote evaluation for post-deployment usability improvement
AVI '98 Proceedings of the working conference on Advanced visual interfaces
Automating Software Failure Reporting
ACM Queue - System Failures
A Linguistic Analysis of How People Describe Software Problems
VL/HCC '06 Proceedings of the IEEE Symposium on Visual Languages and Human-Centric Computing
Proceedings of the 16th ACM SIGSOFT International Symposium on Foundations of Software Engineering
The Design of Everyday Things
Participatory usability: supporting proactive users
CHINZ '03 Proceedings of the 4th Annual Conference of the ACM Special Interest Group on Computer-Human Interaction
User feedback for deployed software systems ranges from simple one-bit feedback to full-blown bug reports. While detailed bug reports are very helpful for developers tracking down problems, they demand considerable expertise and commitment from the user. We analyzed existing user-report systems and propose a flexible, independent hardware and software architecture for collecting user feedback. We report results from a preliminary two-week field study of the system and discuss challenges and solutions for collecting multiple levels of user feedback through different modalities.