The integration of knowledge representation, reasoning and learning into a robust and computationally effective model is a key challenge in Artificial Intelligence. Temporal models are fundamental to describing the behaviour of computing and information systems. In addition, acquiring a description of the desired behaviour of a system is a complex task in several AI domains. In this paper, we evaluate a neural framework that adapts temporal models to satisfy given properties and also learns through the observation of examples. In this framework, a symbolically described model is translated into a recurrent neural network, and algorithms are proposed that integrate learning from both examples and properties. Finally, the knowledge is represented symbolically once more, incorporating both the initial model and the learned specification, as our case study shows. The case study illustrates how integrating methodologies and principles from distinct AI areas can be relevant to building robust intelligent systems.
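To make the translation step more concrete, the sketch below shows one way a set of symbolic rules could be compiled into a simple recurrent network in the connectionist inductive learning (CILP-style) tradition: one hidden neuron per rule, weights and biases set so a neuron fires exactly when its rule body holds, and the output fed back as the next input. This is only an illustrative propositional sketch under assumed details; the weight value `W`, the thresholding scheme, and the function names are our assumptions, not the paper's exact construction.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def build_network(rules, atoms, W=5.0):
    """Compile rules (head, body) into hidden neurons, CILP-style.

    Each rule gets one hidden neuron: weight W from every positive body
    literal, -W from every negated literal ('~a'), and a bias chosen so
    the neuron activates only when all body literals are satisfied.
    (W=5.0 is an illustrative choice, not a prescribed value.)
    """
    hidden = []
    for head, body in rules:
        w_in, n_pos = {}, 0
        for lit in body:
            if lit.startswith('~'):
                w_in[lit[1:]] = -W
            else:
                w_in[lit] = W
                n_pos += 1
        bias = -W * (n_pos - 0.5)  # fires iff all positives on, negatives off
        hidden.append((head, w_in, bias))
    return hidden

def step(hidden, state, atoms):
    """One recurrent pass: current truth values in, derived heads out.

    The thresholded output is fed back as the next input state, so
    iterating `step` computes the rules' consequences to a fixed point.
    """
    new_state = {a: 0.0 for a in atoms}
    for head, w_in, bias in hidden:
        act = sigmoid(sum(w * state.get(a, 0.0) for a, w in w_in.items()) + bias)
        new_state[head] = max(new_state[head], act)  # OR over rules per head
    return {a: (1.0 if v > 0.5 else 0.0) for a, v in new_state.items()}

# Example: the chain program {p. q <- p. r <- q.}
atoms = ['p', 'q', 'r']
hidden = build_network([('q', ['p']), ('r', ['q']), ('p', [])], atoms)
state = {a: 0.0 for a in atoms}
for _ in range(3):
    state = step(hidden, state, atoms)
# after three steps the state reaches the program's fixed point: p = q = r = 1
```

Because the recurrent pass is differentiable before thresholding, the same weights could in principle be adjusted by gradient-based learning from examples, which is the kind of integration the framework described above exploits.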