Elements of information theory
A direct approach to conformational dynamics based on hybrid Monte Carlo
Journal of Computational Physics - Special issue on computational molecular biophysics
Normalized Cuts and Image Segmentation
IEEE Transactions on Pattern Analysis and Machine Intelligence
Markov Chains and Stochastic Stability
The Kullback-Leibler divergence rate between Markov sources
IEEE Transactions on Information Theory
Oja's algorithm for graph clustering, Markov spectral decomposition, and risk sensitive control
Automatica (Journal of IFAC)
Compositional approximate Markov chain aggregation for PEPA models
EPEW'12: Proceedings of the 9th European Conference on Computer Performance Engineering
Sparse semi-supervised learning on low-rank kernel
Neurocomputing
This paper develops an information-theoretic framework for aggregating a large-scale Markov chain into a reduced-order Markov model. The Kullback-Leibler (K-L) divergence rate serves as the metric for the distance between two stationary Markov chains, and model reduction is posed as an optimization problem with respect to this metric, whose solution is the optimal aggregated Markov model. We show that the solution of the bi-partition problem is given by an eigenvalue problem. To construct a reduced-order model with m super-states, a recursive algorithm is proposed and illustrated with examples.
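The abstract's spectral bi-partition idea can be illustrated with a minimal sketch: split the state space into two super-states according to the sign structure of the eigenvector associated with the second-largest eigenvalue of a symmetrized transition matrix. This is a generic spectral-partitioning heuristic under assumed details (the symmetrization by the stationary distribution, the sign-based split), not the paper's exact algorithm or its K-L-optimal aggregation.

```python
import numpy as np

def stationary_distribution(P):
    # Left eigenvector of P for eigenvalue 1, normalized to sum to 1.
    vals, vecs = np.linalg.eig(P.T)
    i = np.argmin(np.abs(vals - 1.0))
    pi = np.real(vecs[:, i])
    return pi / pi.sum()

def spectral_bipartition(P):
    # Symmetrize P using the stationary distribution (a common
    # construction for reversible-like spectral analysis; an assumption
    # here, not necessarily the paper's formulation).
    pi = stationary_distribution(P)
    D = np.diag(np.sqrt(pi))
    Dinv = np.diag(1.0 / np.sqrt(pi))
    A = D @ P @ Dinv
    S = 0.5 * (A + A.T)
    # eigh returns eigenvalues in ascending order; take the eigenvector
    # for the second-largest eigenvalue and split states by its sign.
    vals, vecs = np.linalg.eigh(S)
    u = vecs[:, -2]
    return u >= 0  # boolean mask defining the two super-states

# Nearly decomposable chain: states {0,1} and {2,3} are tightly coupled
# within groups and weakly coupled across groups.
P = np.array([[0.890, 0.100, 0.005, 0.005],
              [0.100, 0.890, 0.005, 0.005],
              [0.005, 0.005, 0.890, 0.100],
              [0.005, 0.005, 0.100, 0.890]])
mask = spectral_bipartition(P)
print(mask)  # groups {0,1} and {2,3} fall on opposite sides of the split
```

Applying this bi-partition recursively to each super-state's sub-chain, as the abstract describes, yields a hierarchy of m super-states.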