Recurrent Neural Networks for Robust Real-World Text Classification
WI '07 Proceedings of the IEEE/WIC/ACM International Conference on Web Intelligence
As the amount of on-line text on the World-Wide Web grows, methods for automatically summarizing that text become increasingly important. The primary goal of this research is to create an efficient tool that summarizes large documents automatically. We propose an evolving connectionist system: an adaptive, incremental learning and knowledge-representation system that evolves its structure and functionality over time. In this paper, we present a novel approach to part-of-speech disambiguation using a recurrent neural network, a paradigm capable of dealing with sequential data. We observed that the connectionist approach to text summarization offers a natural way of learning grammatical structure through experience. Experimental results show that our approach achieves acceptable performance.
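To illustrate the general idea of a recurrent network processing sequential data for part-of-speech disambiguation, here is a minimal sketch of an Elman-style simple recurrent network forward pass. The toy vocabulary, tag set, layer sizes, and random weights are hypothetical illustrations, not taken from the paper; a trained system would learn the weights from tagged text.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = {"the": 0, "dog": 1, "runs": 2}    # toy vocabulary (assumption)
tags = ["DET", "NOUN", "VERB"]             # toy tag set (assumption)
V, H, T = len(vocab), 8, len(tags)

W_xh = rng.normal(scale=0.1, size=(H, V))  # input -> hidden weights
W_hh = rng.normal(scale=0.1, size=(H, H))  # hidden -> hidden (recurrence)
W_hy = rng.normal(scale=0.1, size=(T, H))  # hidden -> output weights

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def tag_sequence(words):
    """Return a tag probability distribution for each word. The previous
    hidden state is fed back at every step, so each prediction can depend
    on the words seen so far -- the 'context' that makes the network
    sensitive to sequential structure."""
    h = np.zeros(H)
    outputs = []
    for w in words:
        x = np.zeros(V)
        x[vocab[w]] = 1.0                  # one-hot encode the word
        h = np.tanh(W_xh @ x + W_hh @ h)   # hidden state carries context
        outputs.append(softmax(W_hy @ h))
    return outputs

probs = tag_sequence(["the", "dog", "runs"])
```

With random weights the distributions are near-uniform; the point of the sketch is only the recurrence: the same word can receive different tag probabilities depending on the hidden state accumulated from earlier words.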