User involvement in software engineering has been researched for over three decades. However, existing studies concentrate mainly on the early phases of user-centered design projects, while little is known about how professionals work with post-deployment end-user feedback. In this paper, we report on an empirical case study that explores the current practice of user involvement during software evolution. We found that user feedback contains important information for developers and helps to improve software quality and identify missing features. To assess its relevance and potential impact, developers need to analyze the gathered feedback; this is mostly done manually and consequently requires considerable effort. Overall, our results show the need for tool support to consolidate, structure, analyze, and track user feedback, particularly when feedback volume is high. Our findings call for a hypothesis-driven analysis of user feedback to establish the foundations for future user feedback tools.
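To make the kind of tool support called for here concrete, the sketch below shows one simple way feedback items might be consolidated and structured automatically. It is a hypothetical illustration, not part of the study: the `Feedback` structure, the category names, and the keyword lists are all assumptions chosen for the example, and a real triage tool would use far richer analysis.

```python
# Hypothetical sketch of automated feedback triage (illustrative only).
# Keyword lists and categories are assumptions, not drawn from the study.
from dataclasses import dataclass

BUG_KEYWORDS = {"crash", "error", "freeze", "broken", "fails"}
FEATURE_KEYWORDS = {"wish", "add", "missing", "would be nice", "support"}

@dataclass
class Feedback:
    text: str
    category: str = "other"  # default bucket when no keyword matches

def triage(items):
    """Assign each feedback item to a coarse category by keyword matching."""
    for item in items:
        lowered = item.text.lower()
        if any(k in lowered for k in BUG_KEYWORDS):
            item.category = "bug"
        elif any(k in lowered for k in FEATURE_KEYWORDS):
            item.category = "feature-request"
    return items

reports = triage([
    Feedback("The app crashes when I rotate the screen"),
    Feedback("Would be nice to have dark mode"),
    Feedback("Great app, thanks!"),
])
```

Even this naive classifier hints at the consolidation step the abstract describes as manual today; a hypothesis-driven tool could start from such coarse buckets and refine them against developer-defined hypotheses about what the feedback contains.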