2005 Special Issue: On the relationship between deterministic and probabilistic directed graphical models: from Bayesian networks to recursive neural networks

  • Authors:
  • Pierre Baldi; Michal Rosen-Zvi

  • Affiliations:
  • School of Information and Computer Sciences, University of California, Irvine, CA 92697-3425, USA and Institute for Genomics and Bioinformatics, University of California, Irvine, CA 92697-3425, USA
  • School of Computer Science and Engineering, The Hebrew University of Jerusalem, 91904 Jerusalem, Israel

  • Venue:
  • Neural Networks - Special issue on neural networks and kernel methods for structured domains
  • Year:
  • 2005

Abstract

Machine learning methods that can handle variable-size structured data such as sequences and graphs include Bayesian networks (BNs) and Recursive Neural Networks (RNNs). In both classes of models, the data is modeled using a set of observed and hidden variables associated with the nodes of a directed acyclic graph. In BNs, the conditional relationships between parent and child variables are probabilistic, whereas in RNNs they are deterministic and parameterized by neural networks. Here, we study the formal relationship between the two classes of models and show that, when the source node variables are observed, RNNs can be viewed as limits, both in distribution and in probability, of BNs with local conditional distributions that have vanishing covariance matrices and converge to delta functions. Conditions for uniform convergence are also given, together with an analysis of the behavior and exactness of Belief Propagation (BP) in 'deterministic' BNs. Implications for the design of mixed architectures and the corresponding inference algorithms are briefly discussed.
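
For intuition, the following is a minimal sketch of the limit the abstract describes; it is not code from the paper, and the toy map f_theta, its tanh transition, and all parameter values are assumptions chosen purely for illustration. A BN node whose local conditional is a Gaussian centered on a neural-network map of its parents concentrates on the deterministic RNN value as the covariance vanishes.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny neural network parameterizing a child node given its parents
# (weights and biases are arbitrary here, not taken from the paper).
W = rng.normal(size=(3, 2))
b = rng.normal(size=3)

def f_theta(parents):
    """Deterministic RNN-style transition: child = tanh(W @ parents + b)."""
    return np.tanh(W @ parents + b)

parents = np.array([0.5, -1.0])  # observed source-node values

for sigma in (1.0, 0.1, 0.01):
    # BN local conditional: child ~ N(f_theta(parents), sigma^2 I).
    samples = f_theta(parents) + sigma * rng.normal(size=(10000, 3))
    spread = samples.std(axis=0).max()
    print(f"sigma={sigma:5.2f}  max std of child samples = {spread:.4f}")

# As sigma shrinks, the sampled child variable concentrates on f_theta(parents):
# the probabilistic BN conditional tends to a delta function, recovering the
# deterministic RNN computation in the limit.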