System administrators must analyze a large number of system parameters to identify performance bottlenecks in a system. The major contribution of this paper is a utility, EvoPerf, that autonomously monitors system-wide parameters, requiring no user intervention, to accurately identify performance anomalies (bottlenecks). EvoPerf uses the Windows Perfmon utility to collect performance counters from the Windows kernel. We then show that artificial-intelligence-based techniques operating on these counters can be used to design an accurate and efficient performance-monitoring utility. We evaluate the feasibility of six classifiers - UCS, GAssist-ADI, GAssist-Int, NN-MLP, NN-RBF, and J48 - and conclude that all of them achieve more than 99% classification accuracy with fewer than 1% false positives. However, the processing overhead of J48 and the neural-network-based classifiers is significantly smaller than that of the evolutionary classifiers.
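The pipeline described above - collect kernel performance counters, then classify each sample as normal or anomalous - can be sketched as follows. This is a minimal illustration, not the paper's implementation: the counter names and synthetic data are invented for the example, and scikit-learn's `DecisionTreeClassifier` stands in for Weka's J48 (both are C4.5-style decision trees).

```python
# Sketch of counter-based bottleneck classification (illustrative data only).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Each sample is a vector of hypothetical Perfmon-style counters:
# [% processor time, disk queue length, available MBytes, pages/sec]
normal = rng.normal([30, 0.5, 2000, 50], [10, 0.2, 300, 20], size=(500, 4))
anomalous = rng.normal([95, 5.0, 200, 900], [3, 1.0, 100, 200], size=(500, 4))

X = np.vstack([normal, anomalous])
y = np.array([0] * 500 + [1] * 500)  # 0 = normal, 1 = bottleneck

# Shuffle, then train on half of the samples and evaluate on the rest.
idx = rng.permutation(len(X))
train, test = idx[:500], idx[500:]
clf = DecisionTreeClassifier(random_state=0).fit(X[train], y[train])

pred = clf.predict(X[test])
accuracy = float((pred == y[test]).mean())
false_pos = float(((pred == 1) & (y[test] == 0)).mean())
print(f"accuracy={accuracy:.3f} false-positive rate={false_pos:.3f}")
```

On cleanly separated synthetic data like this, a decision tree easily reaches near-perfect accuracy; the paper's contribution is showing that comparable accuracy holds on real kernel counters across six classifier families.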