IEEE Transactions on Information Theory
We formulate and prove an axiomatic characterization of the Riemannian geometry underlying manifolds of conditional models. The characterization holds for both normalized and nonnormalized conditional models. In the normalized case, the characterization extends the derivation of the Fisher information by Cencov, while in the nonnormalized case it extends Campbell's theorem. Due to the close connection between the conditional I-divergence and the product Fisher information metric, we provide a new axiomatic interpretation of the geometries underlying logistic regression and AdaBoost.
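For concreteness, one standard form of the Fisher information metric and its product extension to conditional models can be sketched as follows (the weights $\lambda_x$ are illustrative; the paper's exact normalization may differ):

```latex
% Fisher information metric on the interior of the probability
% simplex: for a point p and tangent vectors u, v with
% \sum_i u_i = \sum_i v_i = 0,
g_p(u, v) \;=\; \sum_i \frac{u_i \, v_i}{p_i}.

% Product Fisher information metric on conditional models
% p(\cdot \mid x): a (possibly weighted) sum of simplex metrics,
% one per input x,
g_p(u, v) \;=\; \sum_x \lambda_x \sum_y
  \frac{u(y \mid x)\, v(y \mid x)}{p(y \mid x)}.
```

The product structure reflects that a conditional model is a point in a product of simplices, one simplex per conditioning value $x$.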