TREPAN is a decision tree algorithm that utilises artificial neural networks (ANNs) to improve partitioning conditions when sample data is sparse. When sample sizes become limited during the tree-induction process, TREPAN relies on an ANN oracle to label artificially generated sample instances. The original TREPAN implementation was limited to ANNs designed as classification models; in other words, TREPAN was incapable of building decision trees from ANN models whose outputs were continuous. The objective of this research was therefore to modify the original implementation of TREPAN in order to develop and test decision trees derived from continuous-output ANN models. Though the modifications were minor, they are significant because they provide researchers and practitioners with an additional strategy for extracting knowledge from a trained ANN regardless of its design. This research also explores how TREPAN's adjustable settings influence predictive performance based on a dataset's complexity and size.
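The oracle idea described above can be sketched in a few lines: synthetic instances are drawn from the empirical marginal distributions of the training data, then labelled by querying the trained network as a black box. The sketch below is purely illustrative, not the paper's implementation; the toy `classification_oracle` and `regression_oracle` functions stand in for a trained ANN (classification vs. continuous-output), and all names are hypothetical.

```python
import random

# Hypothetical stand-ins for a trained ANN. TREPAN treats the network
# purely as a black-box oracle, so any callable mapping features to an
# output works for the sketch.
def classification_oracle(x):
    # toy discrete-output "network": class 1 if the feature sum exceeds 1.0
    return 1 if sum(x) > 1.0 else 0

def regression_oracle(x):
    # toy continuous-output "network", as targeted by the regression extension
    return 0.5 * sum(x)

def draw_instances(training_data, n, rng):
    """Draw n synthetic instances, sampling each feature independently
    from the empirical marginal distribution of the training data."""
    dims = len(training_data[0])
    columns = [[row[d] for row in training_data] for d in range(dims)]
    return [[rng.choice(columns[d]) for d in range(dims)] for _ in range(n)]

def oracle_label(instances, oracle):
    """Label synthetic instances by querying the network oracle."""
    return [(x, oracle(x)) for x in instances]

rng = random.Random(0)
data = [[0.2, 0.1], [0.9, 0.8], [0.4, 0.7]]
synthetic = draw_instances(data, 5, rng)
labeled_cls = oracle_label(synthetic, classification_oracle)
labeled_reg = oracle_label(synthetic, regression_oracle)
```

Because the tree inducer only ever sees `(instance, oracle output)` pairs, swapping the classification oracle for a continuous-output one is a small change in this view, which is consistent with the abstract's point that the modification is minor yet broadens what the extraction strategy covers.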