Natural gradient works efficiently in learning
Neural Computation
Sparse matrix computations on manycore GPU's
Proceedings of the 45th Annual Design Automation Conference
Probabilistic Logic Networks: A Comprehensive Framework for Uncertain Inference
Proceedings of the First AGI Conference (AGI 2008)
BioNLP '06 Proceedings of the Workshop on Linking Natural Language Processing and Biology: Towards Deeper Biological Literature Analysis
Three hypotheses about the geometry of mind
AGI'11: Proceedings of the 4th International Conference on Artificial General Intelligence
Inspired by a broader perspective that views intelligent-system dynamics in terms of the geometry of "cognitive spaces," we conduct a preliminary investigation of applying information-geometry-based learning to ECAN (Economic Attention Networks), the component of the integrative OpenCog AGI system concerned with attention allocation and credit assignment. We generalize Amari's "natural gradient" algorithm for network learning to encompass ECAN and other recurrent networks, and apply it to small example cases of ECAN, demonstrating a dramatic improvement in the effectiveness of attention allocation compared to prior (Hebbian-learning-like) ECAN methods. Scaling the method up to realistically sized ECAN networks as used in OpenCog remains future work, but should be achievable using sparse matrix methods on GPUs.
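The abstract's ECAN-specific generalization is not given here, but the core of Amari's natural gradient can be illustrated generically: instead of the raw gradient, the update preconditions by the inverse Fisher information matrix, theta ← theta − lr · F⁻¹∇L. The sketch below is a minimal illustration of that idea, not the paper's method; the function name, the damping term, and the toy quadratic loss (where the curvature matrix plays the role of the Fisher matrix) are all assumptions introduced for the example.

```python
import numpy as np

def natural_gradient_step(theta, grad, fisher, lr=0.1, damping=1e-4):
    """One natural-gradient update: theta <- theta - lr * F^{-1} grad.

    A small damping term keeps the Fisher-matrix solve numerically stable.
    (Illustrative sketch; not the ECAN-specific algorithm from the paper.)
    """
    F = fisher + damping * np.eye(len(theta))
    return theta - lr * np.linalg.solve(F, grad)

# Toy quadratic loss L(theta) = 0.5 * theta^T A theta with ill-conditioned A.
# Plain gradient descent converges at very different rates along the two axes;
# the natural gradient rescales by the curvature so both shrink together.
A = np.diag([100.0, 1.0])
theta = np.array([1.0, 1.0])
for _ in range(50):
    grad = A @ theta   # gradient of the quadratic loss
    fisher = A         # for this quadratic, the curvature stands in for F
    theta = natural_gradient_step(theta, grad, fisher)
print(theta)  # both components contract toward zero at nearly the same rate
```

The damping (often called Tikhonov regularization) matters in practice because an empirical Fisher estimate can be singular or near-singular, exactly the situation that sparse, large-scale ECAN matrices would present.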