MICAI'12 Proceedings of the 11th Mexican international conference on Advances in Artificial Intelligence - Volume Part I
This work presents a technique that combines the backpropagation learning algorithm with a method for computing the initial weights of the Multilayer Perceptron (MLP). The initial weights are derived from the quality of similarity measure proposed within the framework of extended Rough Set Theory. Experimental results show that the proposed initialization outperforms other methods for calculating feature weights, making it an interesting alternative to conventional random initialization.
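The idea described in the abstract can be sketched in a few lines of NumPy: compute a per-feature relevance weight, scale the MLP's input-to-hidden initial weights by it, then train with plain backpropagation. This is a minimal sketch, not the authors' method: the rough-set quality-of-similarity measure is replaced here by a simple correlation-based proxy, and the data, network size, and learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (assumption): two informative features, two noise features.
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

# Stand-in for the rough-set "quality of similarity" feature weights:
# absolute correlation of each feature with the target, normalized.
w_feat = np.abs([np.corrcoef(X[:, j], y[:, 0])[0, 1] for j in range(X.shape[1])])
w_feat = w_feat / w_feat.sum()

# Initialize input-to-hidden weights scaled by feature relevance, so more
# relevant features start with larger-magnitude connections.
n_hidden = 8
W1 = rng.normal(scale=0.5, size=(4, n_hidden)) * w_feat[:, None]
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Plain batch backpropagation with a cross-entropy loss.
lr = 0.5
for _ in range(500):
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d_out = out - y                      # dL/d(pre-activation) for sigmoid + CE
    d_h = (d_out @ W2.T) * h * (1 - h)   # backpropagated hidden-layer error
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0)

acc = ((out > 0.5) == (y > 0.5)).mean()
print(f"train accuracy: {acc:.2f}")
```

The only difference from conventional random initialization is the `w_feat[:, None]` scaling of `W1`; connections from features judged irrelevant start near zero, while backpropagation proceeds exactly as usual.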