Performance analysts rely heavily on load testing to measure the performance of their applications under a given load. During a load test, analysts closely monitor and record thousands of performance counters that capture run-time system properties such as CPU utilization, disk I/O, memory consumption, and network traffic. The most pressing problem analysts face is the time and effort required to analyse these large counter logs and to locate relevant information spread across thousands of counters. We present a methodology that helps analysts by automatically identifying the important performance counters of a load test and comparing them across tests to detect performance gains or losses. Our methodology also helps analysts understand the root cause of a load test failure by retrieving previously solved problems from test repositories. A case study on load test data from a large enterprise application shows that our methodology effectively guides performance analysts to identify and compare the top performance counters across tests in limited time, achieving an 88% reduction in counter data.
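To make the workflow concrete, the following is a minimal sketch of the two steps the abstract describes: selecting a small set of important counters and comparing them across two load tests. The abstract does not specify the importance measure or the comparison test, so this sketch assumes counter logs are held in pandas DataFrames (rows are sampling intervals, columns are counters), uses normalized variance as a stand-in importance score, and uses Welch's t-test to flag counters whose behaviour differs between runs; the function names and the keep_fraction parameter are illustrative only.

```python
import pandas as pd
from scipy import stats


def important_counters(log: pd.DataFrame, keep_fraction: float = 0.12) -> list[str]:
    """Rank counters by normalized variance and keep only the top fraction.

    Variance is an assumed proxy for "importance"; the paper's actual
    selection criterion may differ.
    """
    variability = (log.std() / (log.mean().abs() + 1e-9)).sort_values(ascending=False)
    k = max(1, int(len(variability) * keep_fraction))
    return list(variability.index[:k])


def compare_tests(baseline: pd.DataFrame, new_run: pd.DataFrame, alpha: float = 0.05) -> dict[str, str]:
    """Flag important counters whose distributions differ between two load tests.

    Returns, per flagged counter, whether the new run trends higher or lower;
    whether that is a gain or a loss depends on the counter's semantics
    (e.g. higher throughput is good, higher CPU utilization may not be).
    """
    flags = {}
    shared = set(important_counters(baseline)) & set(new_run.columns)
    for counter in shared:
        _, p_value = stats.ttest_ind(baseline[counter], new_run[counter], equal_var=False)
        if p_value < alpha:
            flags[counter] = "higher" if new_run[counter].mean() > baseline[counter].mean() else "lower"
    return flags
```

Under these assumptions, the data reduction reported in the abstract corresponds to discarding the counters that are never selected: keeping roughly the top 12% of counters yields about an 88% reduction in the counter data an analyst has to inspect.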