This paper introduces a class of neuron models that accept heterogeneous inputs and weights. The neuron computes a user-defined similarity function between inputs and weights; its transfer function is the composition of an adapted logistic function with the power mean of the partial input-weight similarities. The resulting model deals directly with mixtures of continuous quantities (crisp or fuzzy) and discrete quantities (ordinal, integer, binary or nominal), and makes provision for missing values. An artificial neural network built from these neuron models is trained with a breeder genetic algorithm until convergence. Experiments on several real-world benchmark problems compare the network to a standard radial basis function network and to a multi-layer perceptron; it learns from non-trivial data sets with superior generalization ability in most cases, at a comparable computational cost. A further advantage is the interpretability of the learned weights.
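The abstract describes the neuron's forward pass: partial similarities between each input component and the corresponding weight, aggregated by a power mean and squashed through an adapted logistic function, with missing values simply left out of the aggregation. The following is a minimal sketch of that idea, not the paper's exact formulation: the particular partial similarity forms, the logistic centering at 0.5, and the gain `k` are illustrative assumptions.

```python
import math

def partial_similarity(x, w, kind):
    """Partial similarity between one input and one weight.
    The concrete forms below are hypothetical stand-ins for the
    paper's user-defined similarity functions."""
    if x is None or w is None:            # missing value: contributes nothing
        return None
    if kind == "continuous":              # assumes values pre-scaled to [0, 1]
        return 1.0 - abs(x - w)
    if kind == "nominal":                 # overlap similarity for categories
        return 1.0 if x == w else 0.0
    raise ValueError(f"unsupported kind: {kind}")

def power_mean(values, p):
    """Power (generalized) mean of the non-missing partial similarities."""
    return (sum(v ** p for v in values) / len(values)) ** (1.0 / p)

def heterogeneous_neuron(x, w, kinds, p=1.0, k=4.0):
    """Similarity-based neuron: adapted logistic applied to the power mean
    of the partial input-weight similarities; missing entries are skipped."""
    sims = [s for s in (partial_similarity(xi, wi, kind)
                        for xi, wi, kind in zip(x, w, kinds))
            if s is not None]
    s = power_mean(sims, p)
    # logistic shifted to be centered on similarity 0.5 (an assumption here)
    return 1.0 / (1.0 + math.exp(-k * (s - 0.5)))
```

For example, with input `[0.2, "red", None]` and weight `[0.4, "red", 0.9]`, the missing third component is ignored, the partial similarities are 0.8 and 1.0, the arithmetic power mean (p = 1) is 0.9, and the logistic maps that to a value in (0, 1).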