Neural networks have been shown to model complex relations among input attributes in sample data better than induction-tree methods. Such relations can be obtained as a set of linear classifiers from a network trained with back-propagation: each classifier is a linear combination of the input attributes and the weights of one neuron in the first hidden layer. The training data are projected onto the hyperplanes defined by these linear classifiers, and an information-gain measure is then applied to the projected data. We propose that this reduces the computational complexity of extracting rules from neural networks. As a result, concise rules that capture relations among continuous-valued input attributes can be extracted from the network.
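The pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn's `MLPClassifier` as the back-propagation network and uses an entropy-criterion decision tree as the information-gain step; the dataset and all hyperparameters are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic data with continuous-valued attributes (sizes are illustrative).
X, y = make_classification(n_samples=400, n_features=4, n_informative=3,
                           n_redundant=0, random_state=0)

# Train a small feedforward network with back-propagation.
net = MLPClassifier(hidden_layer_sizes=(3,), max_iter=2000, random_state=0)
net.fit(X, y)

# Each first-hidden-layer neuron defines a linear classifier w.x + b
# over the input attributes.
W, b = net.coefs_[0], net.intercepts_[0]   # W has shape (n_features, n_hidden)

# Project the training data onto the classifier hyperplanes: each column
# of Z is one neuron's linear combination of the inputs.
Z = X @ W + b

# Apply an information-gain (entropy) measure to the projected data:
# an axis-parallel split on a column of Z is an oblique rule on X.
tree = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=0)
tree.fit(Z, y)
acc = tree.score(Z, y)
```

Because the tree splits on the one-dimensional projections rather than on all attribute combinations directly, the rule-extraction search stays small while each extracted rule can still express a relation over several continuous-valued attributes.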