The knowledge-based artificial neural network (KBANN) proceeds in phases: expressing domain knowledge, mapping that knowledge into a neural network, training the network, and finally extracting rules from the trained network. KBANN attempts to open up the neural-network black box and generates symbolic rules with (approximately) the same predictive power as the network itself. An advantage of KBANN is that the neural network considers the contribution of the inputs toward classification as a group, whereas rule-based algorithms such as C5.0 measure the individual contribution of the inputs one at a time as the tree is grown. The knowledge consolidation model (KCM) combines the rules extracted with KBANN (NeuroRule), a frequency matrix (similar to the Naive Bayesian technique), and the C5.0 algorithm, effectively integrating multiple rule sets into one centralized knowledge base. Pooling the rules from the single models can improve overall performance, reducing the error term and increasing R-squared. The key idea in KCM is to combine a number of classifiers so that the resulting composite system achieves higher classification accuracy and efficiency than any of the original single classifiers. A further advantage of KCM is that it does not need memory space to store the dataset, since only the extracted knowledge is required to build the integrated model; this also reduces storage, memory, and scheduling costs. To verify the feasibility and effectiveness of KCM, a personal credit rating dataset provided by a local bank in Seoul, Republic of Korea, is used in this study.
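The consolidation step described above can be sketched as pooling the decisions of several already-extracted rule sets by majority vote. This is a minimal illustration only, not the paper's actual KCM: the three rule functions below (stand-ins for NeuroRule, frequency-matrix, and C5.0 rule sets) and the credit-applicant features are invented for the example.

```python
from collections import Counter

def rules_neurorule(x):
    # Hypothetical rules extracted from a trained neural network.
    return "good" if x["income"] > 30000 and x["delinquencies"] == 0 else "bad"

def rules_frequency_matrix(x):
    # Hypothetical frequency-matrix (Naive-Bayes-like) rule.
    return "good" if x["age"] >= 25 else "bad"

def rules_c50(x):
    # Hypothetical C5.0 decision-tree rule.
    return "good" if x["delinquencies"] <= 1 else "bad"

def consolidated_predict(x, classifiers):
    """Pool the single classifiers' votes and return the majority label."""
    votes = Counter(clf(x) for clf in classifiers)
    return votes.most_common(1)[0][0]

classifiers = [rules_neurorule, rules_frequency_matrix, rules_c50]
applicant = {"income": 42000, "age": 31, "delinquencies": 0}
print(consolidated_predict(applicant, classifiers))  # all three rule sets vote "good"
```

Note that only the extracted rule functions are kept in memory, not the training dataset, which is the storage advantage the KCM claims.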
The test results show that KCM outperforms single models such as multiple discriminant analysis, logistic regression, the frequency matrix, neural networks, decision trees, and NeuroRule. Moreover, our model is superior to a previous algorithm for extracting rules from general neural networks.