Probabilistic inference with noisy-threshold models based on a CP tensor decomposition

  • Authors:
  • Jiří Vomlel; Petr Tichavský

  • Affiliations:
  • -;-

  • Venue:
  • International Journal of Approximate Reasoning
  • Year:
  • 2014

Abstract

The specification of conditional probability tables (CPTs) is a difficult task in the construction of probabilistic graphical models. Several types of canonical models have been proposed to ease this difficulty. Noisy-threshold models generalize the two most popular canonical models: the noisy-or and the noisy-and. With standard inference techniques, the inference complexity is exponential in the number of parents of a variable. More efficient inference techniques can be employed for CPTs that take a special form. CPTs can be viewed as tensors, and tensors can be decomposed into linear combinations of rank-one tensors, where a rank-one tensor is an outer product of vectors. Such a decomposition is referred to as the Canonical Polyadic (CP) or CANDECOMP-PARAFAC decomposition. The tensor decomposition offers a compact representation of CPTs that can be exploited efficiently in probabilistic inference. In this paper we propose a CP decomposition of tensors corresponding to CPTs of threshold functions, exactly ℓ-out-of-k functions, and their noisy counterparts. We prove results about the symmetric rank of these tensors in the real and complex domains. The proofs are constructive and provide methods for the CP decomposition of these tensors. An analytical and experimental comparison with the parent-divorcing method (which also has polynomial complexity) shows the superiority of the CP decomposition-based method. The experiments were performed on subnetworks of the well-known QMR-DT network, generalized by replacing noisy-or models with noisy-threshold models.
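
As a concrete illustration of how a CPT can be viewed as a tensor and given a low-rank CP decomposition, the sketch below builds the CPT of a noisy-or model (the special case that noisy-threshold models generalize) and numerically checks its known rank-2 decomposition. This is a minimal sketch, not the authors' implementation: the inhibition probabilities `q` are arbitrary illustrative values, and the decomposition shown is the standard one for noisy-or, not the paper's construction for general (noisy-)threshold or exactly ℓ-out-of-k functions.

```python
import numpy as np
from functools import reduce

# Noisy-or CPT for a binary child Y with k binary parents X_1..X_k.
# q[i] is the probability that an active parent X_i fails to turn Y on
# (inhibition probability). Values are illustrative only.
q = np.array([0.3, 0.5, 0.2, 0.4])
k = len(q)

# Full CPT as a tensor T of shape (2, 2, ..., 2):
# axis 0 indexes Y, axes 1..k index the parents.
# P(Y=0 | x) = prod_i q_i^{x_i},  P(Y=1 | x) = 1 - P(Y=0 | x).
T = np.zeros((2,) * (k + 1))
for x in np.ndindex(*(2,) * k):
    p0 = np.prod(q ** np.array(x))
    T[(0,) + x] = p0
    T[(1,) + x] = 1.0 - p0

# Known rank-2 CP decomposition of the noisy-or CPT:
# T = (0, 1) o 1 o ... o 1  +  (1, -1) o (1, q_1) o ... o (1, q_k),
# where "o" denotes the outer product of vectors.
def outer(vectors):
    return reduce(np.multiply.outer, vectors)

term1 = outer([np.array([0.0, 1.0])] + [np.ones(2)] * k)
term2 = outer([np.array([1.0, -1.0])] + [np.array([1.0, qi]) for qi in q])
T_cp = term1 + term2

# The two representations agree, so this CPT has rank at most 2
# regardless of the number of parents k.
assert np.allclose(T, T_cp)
print("max abs error:", np.abs(T - T_cp).max())
```

Roughly speaking, a rank-r decomposition of this kind lets the full table over k+1 variables be replaced by an auxiliary variable with r states connected to each original variable by a small two-dimensional factor, which is what keeps inference complexity polynomial in the number of parents.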