The Random Subspace Method for Constructing Decision Forests. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Journal of Optimization Theory and Applications.
An Introduction to Support Vector Machines: And Other Kernel-Based Learning Methods.
Neural, Novel and Hybrid Algorithms for Time Series Prediction.
A Tutorial on Support Vector Machines for Pattern Recognition. Data Mining and Knowledge Discovery.
A family of algorithms for approximate Bayesian inference. The Journal of Machine Learning Research.
Kernel Methods for Pattern Analysis.
A DC-programming algorithm for kernel selection. ICML '06: Proceedings of the 23rd International Conference on Machine Learning.
The lack of a priori distinctions between learning algorithms. Neural Computation.
LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology (TIST).
Expectation propagation for approximate Bayesian inference. UAI '01: Proceedings of the Seventeenth Conference on Uncertainty in Artificial Intelligence.
No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation.
First, the all-important no free lunch theorems are introduced. Next, kernel methods, support vector machines (SVMs), preprocessing, model selection, feature selection, SVM software and the Fisher kernel are introduced and discussed. A hidden Markov model is trained on foreign exchange data to derive a Fisher kernel for an SVM; the DC algorithm and the Bayes point machine (BPM) are also used to learn the kernel on the same foreign exchange data. Further, the DC algorithm is used to learn the parameters of the hidden Markov model within the Fisher kernel, creating a hybrid algorithm. The mean net returns were positive for the BPM, and the BPM, the Fisher kernel, the DC algorithm and the hybrid algorithm all improved on a standard SVM in both gross and net returns, although none achieved net returns as high as the genetic programming approach employed by Neely, Weller, and Dittmar (1997) and published in Neely, Weller, and Ulrich (2009). Finally, two Windows implementations of SVMs with semi-automated parameter selection are built.
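The HMM-to-Fisher-kernel step described in the abstract can be sketched in miniature. This is a toy illustration, not the thesis's implementation: it assumes a small discrete HMM with hand-picked parameters (`pi`, `A`, `B` below are made up), computes Fisher scores by finite differences on the emission probabilities (ignoring the simplex constraint for simplicity), and approximates the Fisher information matrix by the identity, a common simplification. The resulting kernel values could then be fed to any SVM that accepts a precomputed kernel.

```python
import math

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    via the scaled forward algorithm (scaling avoids underflow)."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    s = sum(alpha)
    logp = math.log(s)
    alpha = [a / s for a in alpha]
    for t in range(1, len(obs)):
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][obs[t]]
                 for i in range(n)]
        s = sum(alpha)
        logp += math.log(s)
        alpha = [a / s for a in alpha]
    return logp

def fisher_score(obs, pi, A, B, eps=1e-5):
    """Gradient of the log-likelihood w.r.t. the emission probabilities,
    approximated by central finite differences (toy approach)."""
    score = []
    for i in range(len(B)):
        for k in range(len(B[i])):
            Bp = [row[:] for row in B]
            Bm = [row[:] for row in B]
            Bp[i][k] += eps
            Bm[i][k] -= eps
            score.append((forward_loglik(obs, pi, A, Bp)
                          - forward_loglik(obs, pi, A, Bm)) / (2 * eps))
    return score

def fisher_kernel(x, y, pi, A, B):
    """Fisher kernel K(x, y) = U_x^T F^{-1} U_y with F approximated
    by the identity, i.e. a plain dot product of Fisher scores."""
    ux, uy = fisher_score(x, pi, A, B), fisher_score(y, pi, A, B)
    return sum(a * b for a, b in zip(ux, uy))

# Toy 2-state HMM with binary emissions (parameters are assumptions).
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
x, y = [0, 1, 1, 0], [1, 0, 0, 1]
k_xy = fisher_kernel(x, y, pi, A, B)
```

By construction the kernel is symmetric and K(x, x) is a squared norm, so it is non-negative; in the thesis, the HMM parameters would first be fitted to the foreign exchange data rather than fixed by hand.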
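The semi-automated parameter selection mentioned for the SVM software is typically an exhaustive grid search over the regularization constant C and a kernel parameter such as the RBF gamma. A minimal sketch follows; the `train_and_score` callback is hypothetical and stands in for whatever routine trains an SVM at the given settings and returns a validation score.

```python
def select_svm_params(train_and_score, Cs, gammas):
    """Exhaustive grid search: try every (C, gamma) pair and return
    (best_score, best_C, best_gamma). `train_and_score` is a caller-supplied
    function that trains an SVM and returns a validation score to maximize."""
    best = (float("-inf"), None, None)
    for C in Cs:
        for gamma in gammas:
            score = train_and_score(C, gamma)
            if score > best[0]:
                best = (score, C, gamma)
    return best

# Stand-in scorer for demonstration only: peaks at C=10, gamma=0.1.
def toy_score(C, gamma):
    return -((C - 10.0) ** 2) - (gamma - 0.1) ** 2

best = select_svm_params(toy_score, [0.1, 1.0, 10.0, 100.0], [0.01, 0.1, 1.0])
```

Logarithmically spaced grids for C and gamma, as used here, are the conventional choice because SVM performance tends to vary over orders of magnitude in these parameters.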