Radiation-induced lung injury, or radiation pneumonitis (RP), is a potentially fatal side effect of thoracic radiation therapy. In this work, we build a binary RP risk model from clinical and dosimetric parameters using an ensemble of support vector machines (SVMs). Patient and treatment data are partitioned into balanced subsets to prevent model bias. On each subset, forward feature selection is performed to maximize the area under the curve (AUC) of a cross-validated receiver operating characteristic (ROC) curve. Model parameter selection and construction occur concurrently via alternating SVM training and gradient descent steps that minimize the estimated generalization error. We show that an ensemble classifier with a mean fusion function, five component SVMs, and a limit of five features per classifier achieves a mean AUC of 0.818, an improvement over previous SVM models of RP risk.
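The core of the approach above (balanced subsets, one SVM per subset, mean fusion of decision scores) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn's `SVC`, uses synthetic data in place of the clinical/dosimetric features, and omits the paper's forward feature selection and gradient-descent parameter tuning.

```python
# Hedged sketch: ensemble of SVMs trained on balanced subsets with mean fusion.
# Synthetic data stands in for the paper's clinical/dosimetric parameters.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Imbalanced synthetic dataset: ~16% positive (RP) cases.
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 1.2).astype(int)

pos = np.flatnonzero(y == 1)
neg = np.flatnonzero(y == 0)

n_models = 5  # five component SVMs, as in the reported ensemble
models = []
for _ in range(n_models):
    # Balanced subset: all positives plus an equal-sized random negative sample.
    neg_sample = rng.choice(neg, size=len(pos), replace=False)
    idx = np.concatenate([pos, neg_sample])
    clf = SVC(kernel="rbf")
    clf.fit(X[idx], y[idx])
    models.append(clf)

# Mean fusion: average the component SVM decision scores.
scores = np.mean([m.decision_function(X) for m in models], axis=0)
print(round(roc_auc_score(y, scores), 3))
```

The fused score can then be thresholded for a binary risk prediction, or evaluated directly via the ROC AUC as in the paper.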