Numerical recipes in C (2nd ed.): the art of scientific computing
Modeling with constructive backpropagation
Neural Networks
Multi-agent reinforcement learning: weighting and partitioning
Neural Networks
Incremental Learning with Respect to New Incoming Input Attributes
Neural Processing Letters
A Fuzzy Approach to Partitioning Continuous Attributes for Classification
IEEE Transactions on Knowledge and Data Engineering
A Dual-Objective Evolutionary Algorithm for Rules Extraction in Data Mining
Computational Optimization and Applications
An evolutionary artificial neural networks approach for breast cancer diagnosis
Artificial Intelligence in Medicine
Evolutionary computing for knowledge discovery in medical diagnosis
Artificial Intelligence in Medicine
A sequential neural network model for diabetes prediction
Artificial Intelligence in Medicine
A survey of fuzzy logic monitoring and control utilisation in medicine
Artificial Intelligence in Medicine
Using localizing learning to improve supervised learning algorithms
IEEE Transactions on Neural Networks
Recursive hybrid decomposition with reduced pattern training
International Journal of Hybrid Intelligent Systems
Recursive Learning of Genetic Algorithms with Task Decomposition and Varied Rule Set
International Journal of Applied Evolutionary Computation
Input Space Partitioning for Neural Network Learning
International Journal of Applied Evolutionary Computation
Conventional neural network (NN) training does not segregate the input space, which often causes interference within the network. The interference-less neural network training (ILNNT) method employed in this paper reduces interference among input attributes by identifying attributes that interfere with one another and separating them into different batches, while mutually beneficial attributes are grouped together. Attributes in different batches do not share hidden neurons, whereas attributes within the same batch connect to the same hidden neurons. ILNNT is applied to widely used benchmark binary and multi-class classification problems. Experimental results from K-fold cross-validation show that the attributes of these datasets exhibit varying degrees of interference, and that NNs trained with reduced interference achieve high classification accuracy.
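The partitioned first layer the abstract describes can be sketched with a block-structured weight mask: each batch of attributes connects only to its own group of hidden neurons. This is a minimal illustrative sketch, not the paper's implementation; the function names (`make_partition_mask`, `forward`) and the tanh/linear architecture are assumptions.

```python
import numpy as np

def make_partition_mask(n_inputs, n_hidden, batches, hidden_groups):
    """Build a 0/1 mask so each attribute batch connects only to its
    own group of hidden neurons (block structure, no sharing)."""
    mask = np.zeros((n_inputs, n_hidden))
    for attrs, hidden in zip(batches, hidden_groups):
        for i in attrs:
            for j in hidden:
                mask[i, j] = 1.0
    return mask

def forward(x, W1, mask, b1, W2, b2):
    """Forward pass; masking W1 keeps the attribute batches from
    sharing hidden neurons (the interference-reduction idea)."""
    h = np.tanh(x @ (W1 * mask) + b1)
    return h @ W2 + b2

# Example: 4 attributes in two batches, 6 hidden neurons in two groups.
rng = np.random.default_rng(0)
batches = [[0, 1], [2, 3]]          # hypothetical attribute grouping
hidden_groups = [[0, 1, 2], [3, 4, 5]]
mask = make_partition_mask(4, 6, batches, hidden_groups)

W1 = rng.standard_normal((4, 6))
b1 = np.zeros(6)
W2 = rng.standard_normal((6, 1))
b2 = np.zeros(1)

x = rng.standard_normal((1, 4))
y = forward(x, W1, mask, b1, W2, b2)

# Hidden neurons 0-2 receive no signal from batch-1 attributes (2, 3):
assert np.all((W1 * mask)[2:, :3] == 0)
```

Training would update `W1` normally but re-apply the mask (or mask the gradient) after each step, so the separation between batches is preserved throughout learning.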