Support vector regression (SVR) is a promising regression tool based on the support vector machine (SVM). It identifies an estimated model by minimizing Vapnik's loss function on the residuals, representing the regression function as a linear combination of displaced replicas of a kernel function. A single kernel is ineffective when the function being approximated is nonstationary, because it cannot follow variations in frequency content across different regions of the input space. Hierarchical modified regularized least squares fuzzy support vector regression (HMRLFSVR) addresses this problem. It is developed from modified regularized least squares fuzzy support vector regression (MRLFSVR) and regularized least squares fuzzy support vector regression (RLFSVR). HMRLFSVR consists of a set of hierarchical layers, each containing an MRLFSVR with a Gaussian kernel at a given scale; as the scale increases layer by layer, finer details are incorporated into the regression function. The approach interleaves regression estimation with pruning, adapting the local scale to the data while keeping the number of support vectors and the configuration time comparable with classical SVR. It denoises the original data, yielding an effective multiscale reconstruction, and simplifies the tuning of the SVR configuration parameters. Favourable results are obtained on noisy synthetic and real datasets in comparison with multikernel approaches.
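The layer-by-layer idea can be illustrated with a minimal sketch: each layer fits the residual left by the previous, coarser layer, using a Gaussian kernel with a smaller bandwidth. This is not the authors' HMRLFSVR algorithm (the fuzzy weighting and the interleaved support-vector pruning are omitted); plain kernel ridge regression stands in for each layer, and the function names and the `sigmas`/`lam` parameters are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    """Gaussian (RBF) kernel matrix between row-sample sets X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_krr(X, y, sigma, lam):
    """Kernel ridge regression: solve (K + lam*I) alpha = y."""
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def hierarchical_fit(X, y, sigmas, lam=1e-2):
    """Fit one layer per bandwidth in `sigmas` (coarse to fine).

    Each layer is trained on the residual of the layers before it,
    so finer scales only model details the coarse scales missed.
    """
    layers = []
    residual = y.copy()
    for sigma in sigmas:
        alpha = fit_krr(X, residual, sigma, lam)
        residual = residual - gaussian_kernel(X, X, sigma) @ alpha
        layers.append((sigma, alpha))
    return layers

def hierarchical_predict(layers, X_train, X):
    """Sum the contributions of all layers at the query points X."""
    out = np.zeros(len(X))
    for sigma, alpha in layers:
        out += gaussian_kernel(X, X_train, sigma) @ alpha
    return out
```

On a nonstationary target (a low-frequency sinusoid plus a small high-frequency term), the coarse layer alone smooths away the fast component, while adding a fine-scale layer recovers it; this mirrors the single-kernel limitation the abstract describes.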