Reverse Engineering the Neural Networks for Rule Extraction in Classification Problems
Neural Processing Letters
Induction trees are useful for obtaining a proper set of rules from a large number of examples. However, they have difficulty capturing relations between continuous-valued data points. Many data sets show significant correlations between input variables, and a large amount of useful information is hidden in the data as nonlinearities. It has been shown that neural networks are better than direct application of induction trees at modeling the nonlinear characteristics of sample data. In this paper, we propose deriving a compact set of rules that captures these input-variable relations. The relations, expressed as a set of linear classifiers, can be obtained from neural network modeling based on back-propagation. This also alleviates the overgeneralization and overspecialization problems often seen in induction trees. We have tested this scheme on several data sets and compared the results with those of decision trees.
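The idea of reading linear classifiers out of a back-propagation network can be illustrated with a minimal sketch. The following is an assumed toy example, not the authors' exact method: a single sigmoid unit is trained by gradient descent (the degenerate one-layer case of back-propagation) on data whose class depends on a relation between two inputs (x1 + x2 > 1), which an induction tree could only approximate with many axis-parallel splits. The learned weights are then extracted as one human-readable linear rule.

```python
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


# Toy 2-D data: the class depends on a *relation* between the inputs,
# not on either input alone (hypothetical data, for illustration only).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = (X[:, 0] + X[:, 1] > 1.0).astype(float)

# Train a single sigmoid unit with full-batch gradient descent on
# cross-entropy loss (back-propagation for a one-layer network).
w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(2000):
    p = sigmoid(X @ w + b)            # forward pass
    grad = p - y                      # dLoss/dz for cross-entropy
    w -= lr * (X.T @ grad) / len(y)   # weight update
    b -= lr * grad.mean()             # bias update

# Extract the learned hyperplane as a symbolic rule of the kind an
# induction tree cannot express with axis-parallel tests.
rule = f"IF {w[0]:.2f}*x1 + {w[1]:.2f}*x2 + {b:.2f} > 0 THEN class = 1"
acc = float(((sigmoid(X @ w + b) > 0.5) == (y == 1)).mean())
print(rule)
print(f"training accuracy: {acc:.2f}")
```

In a multilayer network the same extraction would be applied per hidden unit, each sigmoid unit contributing one linear classifier; the paper's compact rule set is then built from these classifiers rather than from raw attribute thresholds.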