An exponential family or mixture family of probability distributions has a natural hierarchical structure. This paper gives an “orthogonal” decomposition of such a system based on information geometry. A typical example is the decomposition of stochastic dependency among a number of random variables, which in general have a complex structure of dependencies. Pairwise dependency is easily represented by correlation, but it is more difficult to measure the effects of pure triplewise or higher-order interactions (dependencies) among these variables. Stochastic dependency is decomposed quantitatively into an “orthogonal” sum of pairwise, triplewise, and further higher-order dependencies. This gives a new invariant decomposition of joint entropy. This problem is important for extracting intrinsic interactions in firing patterns of an ensemble of neurons and for estimating its functional connections. The orthogonal decomposition is given for a wide class of hierarchical structures, including both exponential and mixture families. As an example, we decompose the dependency in a higher-order Markov chain into a sum of the dependencies in various lower-order Markov chains.
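To make the decomposition concrete, the following is a minimal sketch for the binary-variable case; the symbols used here (the coefficients θ, the submanifolds E_k, and the projections p_k) are illustrative notation in the standard log-linear parametrization, not quotations from the paper. For n binary variables x = (x_1, ..., x_n), the joint distribution can be written in exponential-family form

\log p(x) = \sum_i \theta_i x_i + \sum_{i<j} \theta_{ij} x_i x_j + \cdots + \theta_{12\cdots n}\, x_1 x_2 \cdots x_n - \psi(\theta),

and E_k denotes the e-flat submanifold on which all interaction coefficients of order greater than k vanish. Let p_k be the m-projection of p onto E_k, that is, the maximum-entropy distribution whose marginals agree with those of p up to order k, with p_n = p and p_1 the product of single-variable marginals. The Pythagorean relation of the dually flat geometry then gives the orthogonal decomposition

D(p \,\|\, p_1) = \sum_{k=2}^{n} D(p_k \,\|\, p_{k-1}),

where D is the Kullback–Leibler divergence. The left-hand side is the total dependence (the sum of the marginal entropies minus the joint entropy), and each term D(p_k \| p_{k-1}) isolates the pure order-k interaction, which is the “orthogonal” sum of pairwise, triplewise, and higher-order dependencies described above.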