This paper describes two kinds of neural networks for text categorization, multi-output perceptron learning (MOPL) and the back-propagation neural network (BPNN), and then proposes a novel algorithm based on an improved back-propagation neural network. The improved algorithm overcomes shortcomings of the traditional back-propagation neural network, such as slow training and a tendency to become trapped in local minima. We compared the training time and categorization performance of the three methods on the standard Reuters-21578 collection. The results show that the proposed algorithm achieves high categorization effectiveness as measured by precision, recall and F-measure.
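The evaluation metrics named above can be made concrete with a short sketch. The following is a minimal illustration (not the paper's own evaluation code) of micro-averaged precision, recall and F-measure for multi-label text categorization, assuming each document's true and predicted category assignments are represented as Python sets; the function name `micro_prf` is illustrative.

```python
def micro_prf(true_labels, pred_labels):
    """Micro-averaged precision, recall and F1 over per-document label sets.

    true_labels, pred_labels: parallel lists of sets of category names.
    Counts true positives, false positives and false negatives globally
    across all documents, then derives the three measures.
    """
    tp = fp = fn = 0
    for truth, pred in zip(true_labels, pred_labels):
        tp += len(truth & pred)   # categories correctly assigned
        fp += len(pred - truth)   # categories assigned but not true
        fn += len(truth - pred)   # true categories that were missed
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return precision, recall, f1


# Example with two documents and Reuters-style category names:
truth = [{"grain", "wheat"}, {"trade"}]
pred = [{"grain"}, {"trade", "money-fx"}]
p, r, f = micro_prf(truth, pred)
```

Micro-averaging pools the counts over all documents before computing the ratios, so frequent categories dominate; macro-averaging (per-category metrics averaged afterwards) is the common alternative when rare categories matter equally.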