Laplacian Eigenmaps for dimensionality reduction and data representation
Neural Computation
An Improved Cluster Labeling Method for Support Vector Clustering
IEEE Transactions on Pattern Analysis and Machine Intelligence
Dynamic Characterization of Cluster Structures for Robust and Inductive Support Vector Clustering
IEEE Transactions on Pattern Analysis and Machine Intelligence
Clustering Based on Gaussian Processes
Neural Computation
Prediction of pricing and hedging errors for equity linked warrants with Gaussian process models
Expert Systems with Applications: An International Journal
Constructing sparse kernel machines using attractors
IEEE Transactions on Neural Networks
Fast support-based clustering method for large-scale problems
Pattern Recognition
Dynamic Dissimilarity Measure for Support-Based Clustering
IEEE Transactions on Knowledge and Data Engineering
Predicting a distribution of implied volatilities for option pricing
Expert Systems with Applications: An International Journal
Dynamic pattern denoising method using multi-basin system with kernels
Pattern Recognition
Multi-basin particle swarm intelligence method for optimal calibration of parametric Lévy models
Expert Systems with Applications: An International Journal
Sequential manifold learning for efficient churn prediction
Expert Systems with Applications: An International Journal
Discriminative Orthogonal Nonnegative matrix factorization with flexibility for data representation
Expert Systems with Applications: An International Journal
H-index: 12.05
Over the last few decades, many studies have addressed reliable prediction for high-dimensional data, which are often non-linearly correlated and exhibit complex patterns. In this paper, we propose a novel Bayesian regression method based on non-linear dimensionality reduction. The method incorporates prior information on the underlying structure of the original input features to preserve input-output patterns in the reduced feature space and to provide distributions of predicted values. To verify the effectiveness of the proposed method, we conducted experiments on benchmark and real-world data. The results show that the method not only predicts a distribution of forecast estimates better than competing methods, but also delivers more robust and consistent prediction performance.
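The abstract does not specify the concrete pipeline, but the idea of combining non-linear dimensionality reduction with a Bayesian regressor that outputs predictive distributions can be sketched with off-the-shelf components. The sketch below is an illustrative assumption, not the authors' method: it uses Laplacian Eigenmaps (scikit-learn's `SpectralEmbedding`) for the reduction step and a Gaussian process for the Bayesian regression step. Because spectral embedding is transductive, all points are embedded together and then split into train/test indices; the synthetic swiss-roll data and the target `y = sin(t)` are likewise placeholders.

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import SpectralEmbedding
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic high-dimensional-style inputs whose target depends on the
# intrinsic manifold coordinate t (a stand-in for real data).
X, t = make_swiss_roll(n_samples=300, noise=0.05, random_state=0)
y = np.sin(t)

# Step 1: non-linear dimensionality reduction via Laplacian Eigenmaps.
# Transductive: new points cannot be embedded without refitting.
Z = SpectralEmbedding(n_components=2, n_neighbors=10,
                      random_state=0).fit_transform(X)

# Step 2: Bayesian regression (a Gaussian process) on the reduced
# features; the predictive mean and standard deviation together give
# a distribution over forecast values, not just point estimates.
train, test = np.arange(250), np.arange(250, 300)
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), random_state=0)
gp.fit(Z[train], y[train])
mean, std = gp.predict(Z[test], return_std=True)
```

The predictive standard deviation `std` is what distinguishes this Bayesian setup from a point-estimate regressor: it quantifies forecast uncertainty per test point.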