In many real-world problems, the presence of irrelevant input variables (features) degrades the predictive quality of the models used to estimate the output variables. In particular, time series prediction often involves building large regressors of artificial variables that can contain irrelevant or misleading information. Many techniques have been proposed to tackle the problem of accurate variable selection, including both local and global search strategies. This paper presents a method based on genetic algorithms that aims to find a globally optimal set of input variables minimizing the Delta Test criterion. Execution speed has been improved by replacing the exact nearest-neighbor computation with its approximate version. The problems of scaling and projection of variables have also been addressed. The developed method works in conjunction with MATLAB's Genetic Algorithm and Direct Search Toolbox. The effectiveness of the proposed methodology has been evaluated on several popular time series examples and has also been generalized to other, non-time-series datasets.
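To illustrate the fitness criterion named above, here is a minimal NumPy sketch of the Delta Test: the noise-variance estimate obtained from the squared output difference between each sample and its nearest neighbor in the selected input space. This is an illustrative reimplementation, not the paper's MATLAB code, and it uses an exact brute-force neighbor search, whereas the paper substitutes an approximate nearest-neighbor search for speed.

```python
import numpy as np

def delta_test(X, y):
    """Delta Test estimate of the output noise variance for the
    variable subset given by the columns of X.

    X : (n_samples, n_selected_vars) input matrix
    y : (n_samples,) output vector
    Returns 0.5 * mean of (y_i - y_{NN(i)})^2, where NN(i) is the
    nearest neighbor of sample i in the selected input space.
    """
    # Pairwise squared Euclidean distances (exact, brute force;
    # the paper speeds this step up with approximate neighbors).
    d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d, np.inf)   # a point must not be its own neighbor
    nn = d.argmin(axis=1)         # index of each sample's nearest neighbor
    return 0.5 * np.mean((y - y[nn]) ** 2)

# Toy demonstration: a relevant variable yields a much lower Delta
# Test value than an irrelevant (pure noise) one, which is what the
# genetic algorithm exploits when searching over variable subsets.
rng = np.random.default_rng(0)
x_rel = rng.uniform(0.0, 1.0, size=(300, 1))     # drives the output
x_irr = rng.uniform(0.0, 1.0, size=(300, 1))     # unrelated noise input
y = np.sin(2 * np.pi * x_rel[:, 0]) + 0.01 * rng.normal(size=300)

print(delta_test(x_rel, y) < delta_test(x_irr, y))
```

In the variable-selection setting, each GA chromosome encodes a subset (or scaling) of the candidate inputs, and `delta_test` evaluated on the corresponding columns serves as the fitness to minimize; the subset names and data above are purely illustrative.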