A connectionist cognitive model for temporal synchronisation and learning
AAAI'07 Proceedings of the 22nd national conference on Artificial intelligence - Volume 1
The importance of efforts to bridge the gap between the connectionist and symbolic paradigms of artificial intelligence has been widely recognized. The merging of theory (background knowledge) and data learning (learning from examples) in neural-symbolic systems has indicated that such learning systems can be more effective than purely symbolic or purely connectionist systems. Until recently, however, neural-symbolic systems could not fully represent, reason about, and learn expressive languages other than classical propositional logic and fragments of first-order logic. In this article, we show that nonclassical logics, in particular propositional temporal logic and combinations of temporal and epistemic (modal) reasoning, can be effectively computed by artificial neural networks. We present the language of a connectionist temporal logic of knowledge (CTLK). We then present a temporal algorithm that translates CTLK theories into ensembles of neural networks, and we prove that the translation is correct. Finally, we apply CTLK to the muddy children puzzle, which has been widely used as a testbed for distributed knowledge representation. We provide a complete solution to the puzzle using simple neural networks that are capable of reasoning about the evolution of knowledge in time and of acquiring knowledge through learning.
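The abstract's translation of logic theories into networks can be illustrated with a minimal sketch. The construction below is a simplified, hypothetical rendering in the style of the CILP system the authors build on (it is not the paper's own algorithm): each propositional rule becomes a hidden neuron whose weights and threshold are chosen so that the neuron fires only when every literal in the rule's body is satisfied, under a bipolar semi-linear activation.

```python
import math

W = 5.0  # weight magnitude; large enough to saturate the activation

def act(x):
    # bipolar semi-linear (sigmoid) activation with outputs in (-1, 1)
    return 2.0 / (1.0 + math.exp(-x)) - 1.0

def rule_neuron(pos, neg, values):
    """Hidden neuron encoding a rule  head <- pos literals, not neg literals.

    `values` maps each atom to +1 (true) or -1 (false).  Positive body
    literals get weight +W, negated ones -W, and the threshold is set so
    the neuron is active only when all body literals are satisfied
    (a CILP-style construction, simplified here for illustration).
    """
    k = len(pos) + len(neg)
    theta = W * (k - 1)
    net = (sum(W * values[a] for a in pos)
           + sum(-W * values[a] for a in neg)
           - theta)
    return act(net)

# Example rule: wet <- rain, not umbrella
v = {"rain": 1, "umbrella": -1}
print(rule_neuron(["rain"], ["umbrella"], v) > 0.5)   # body satisfied: fires
v["umbrella"] = 1
print(rule_neuron(["rain"], ["umbrella"], v) < -0.5)  # body violated: silent
```

In the paper's setting, an ensemble of such networks is used, one network per time point or per agent, with connections between networks carrying temporal and epistemic information; the sketch above shows only the single-rule building block.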