Several studies have demonstrated that ensemble classification algorithms, which combine multiple member classifiers into one aggregated and more powerful classification model, outperform single models. In this paper, two rotation-based ensemble classifiers are proposed as modeling techniques for customer churn prediction. In Rotation Forests, feature extraction is applied to feature subsets in order to rotate the input data for training base classifiers, while RotBoost combines Rotation Forest with AdaBoost. In an experimental validation based on data sets from four real-life customer churn prediction projects, Rotation Forest and RotBoost are compared to a set of well-known benchmark classifiers. Moreover, variations of Rotation Forest and RotBoost are compared, implementing three alternative feature extraction algorithms: principal component analysis (PCA), independent component analysis (ICA) and sparse random projections (SRP). The performance of rotation-based ensemble classifiers is found to depend upon: (i) the performance criterion used to measure classification performance, and (ii) the implemented feature extraction algorithm. In terms of accuracy, RotBoost outperforms Rotation Forest, but none of the considered variations offers a clear advantage over the benchmark algorithms. However, in terms of AUC and top-decile lift, the results clearly demonstrate the competitive performance of Rotation Forests compared to the benchmark algorithms. Moreover, ICA-based Rotation Forests outperform all other considered classifiers and are therefore recommended as a well-suited alternative classification technique for the prediction of customer churn that allows for improved marketing decision making.
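The Rotation Forest mechanism described above can be illustrated with a minimal sketch: features are split into disjoint subsets, PCA is fitted on a bootstrap sample of each subset, the resulting loadings are assembled into a block-diagonal rotation matrix, and a decision tree is trained on the rotated data. This is an illustrative simplification, not the authors' implementation — the function names and parameters (e.g. `n_subsets`) are hypothetical, and details of the original algorithm such as class-based bootstrapping and mean-centering before rotation are omitted for brevity.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def fit_rotation_forest(X, y, n_trees=10, n_subsets=3):
    """Simplified Rotation Forest: per tree, partition features into
    disjoint subsets, fit PCA on a bootstrap sample of each subset,
    build a block-diagonal rotation matrix, and train a decision tree
    on the rotated input data."""
    n, d = X.shape
    ensemble = []
    for _ in range(n_trees):
        subsets = np.array_split(rng.permutation(d), n_subsets)
        R = np.zeros((d, d))
        for cols in subsets:
            # Bootstrap 75% of the instances for this feature subset.
            boot = rng.integers(0, n, size=int(0.75 * n))
            pca = PCA(n_components=len(cols)).fit(X[np.ix_(boot, cols)])
            # PCA loadings form this subset's block of the rotation matrix
            # (centering is dropped for brevity; trees are shift-invariant).
            R[np.ix_(cols, cols)] = pca.components_.T
        tree = DecisionTreeClassifier(random_state=0).fit(X @ R, y)
        ensemble.append((R, tree))
    return ensemble

def predict_rotation_forest(ensemble, X):
    # Majority vote of the base trees, each in its own rotated space.
    votes = np.stack([tree.predict(X @ R) for R, tree in ensemble])
    return (votes.mean(axis=0) >= 0.5).astype(int)

# Toy usage on synthetic binary data.
X, y = make_classification(n_samples=300, n_features=12, random_state=0)
model = fit_rotation_forest(X, y)
acc = (predict_rotation_forest(model, X) == y).mean()
```

Swapping `PCA` for an ICA or sparse-random-projection transformer yields the ICA- and SRP-based variations the paper compares.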
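Top-decile lift, one of the evaluation criteria mentioned above, measures how much more churn is captured among the 10% of customers with the highest predicted churn scores than in the customer base overall. A small sketch (the helper name is illustrative, not from the paper):

```python
import numpy as np

def top_decile_lift(y_true, scores):
    """Churn rate in the top 10% of customers ranked by predicted
    churn score, divided by the overall churn rate."""
    y_true = np.asarray(y_true, dtype=float)
    scores = np.asarray(scores, dtype=float)
    k = max(1, len(scores) // 10)           # size of the top decile
    top = np.argsort(scores)[::-1][:k]      # indices of highest scores
    return y_true[top].mean() / y_true.mean()

# If the model's single top-decile pick is a churner in a base with a
# 20% churn rate, the lift is 1.0 / 0.2 = 5.0.
y = [1, 0, 0, 0, 0, 0, 0, 0, 0, 1]
s = [0.9, 0.1, 0.2, 0.1, 0.3, 0.1, 0.2, 0.1, 0.1, 0.4]
lift = top_decile_lift(y, s)
```

A lift of 1.0 means the model ranks no better than random targeting; values well above 1.0 indicate useful prioritization for retention campaigns.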