Evaluation and usability of programming languages and tools (PLATEAU)
Proceedings of the ACM international conference companion on Object oriented programming systems languages and applications companion
Multicore computers are ubiquitous, and proposals to extend existing languages with parallel constructs are mushrooming. While every proposal claims to make parallel programming easier and less error-prone, empirical evaluations of language usability are rarely conducted in the field with many users and real programs. The key obstacles are cost and the lack of appropriate environments for gathering enough data to support representative conclusions. This paper discusses the idea of automating the usability evaluation of parallel language constructs by collecting subjective and objective data directly in every software engineer's IDE. It presents an Eclipse prototype suite that can aggregate such data from potentially hundreds of thousands of programmers. Detecting mismatches between subjective and objective feedback, together with mining construct usage, can improve language design at an early stage and thus reduce the risk of developing and maintaining inappropriate constructs. New research directions arising from this idea are outlined for software repository mining, debugging, and software economics.
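To make the mismatch-detection idea concrete, the following is a minimal sketch (not the paper's actual prototype): it assumes each IDE report pairs a subjective ease-of-use rating with an objective measure such as compile-error counts, and flags constructs that users rate as easy but that cause many errors in practice. All field names, thresholds, and construct names are illustrative assumptions.

```python
from statistics import mean

# Hypothetical per-construct reports aggregated from many IDEs.
# "rating" is a subjective ease-of-use score (1-5); "compile_errors"
# is an objective count gathered while the construct was used.
reports = [
    {"construct": "parallel_for", "rating": 5, "compile_errors": 1},
    {"construct": "parallel_for", "rating": 4, "compile_errors": 2},
    {"construct": "async_block",  "rating": 5, "compile_errors": 9},
    {"construct": "async_block",  "rating": 4, "compile_errors": 11},
]

def find_mismatches(reports, rating_ok=4, error_budget=5):
    """Flag constructs rated easy (subjective) yet producing many
    compile errors in practice (objective) -- a subjective/objective
    mismatch worth a language designer's attention."""
    by_construct = {}
    for r in reports:
        by_construct.setdefault(r["construct"], []).append(r)
    mismatches = []
    for construct, rs in by_construct.items():
        avg_rating = mean(r["rating"] for r in rs)
        avg_errors = mean(r["compile_errors"] for r in rs)
        if avg_rating >= rating_ok and avg_errors > error_budget:
            mismatches.append(construct)
    return mismatches

print(find_mismatches(reports))  # → ['async_block']
```

A real deployment would of course aggregate far richer telemetry and apply statistical tests rather than fixed thresholds, but the core comparison of subjective and objective signals per construct is the same.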