Given a fixed dependency graph G that describes a Bayesian network of binary variables X1,…,Xn, our main result is a tight bound on the mutual information I_c(Y1,…,Yk) = (1/c)·(H(Y1) + ⋯ + H(Yk)) − H(Y1,…,Yk) of an observed subset Y1,…,Yk of the variables X1,…,Xn. Our bound depends on certain quantities that can be computed from the connective structure of the nodes in G. It thus makes it possible to discriminate between different dependency graphs for a probability distribution, as we show in numerical experiments.
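As a hedged illustration only (the bound itself is not reproduced here), the quantity I_c can be computed directly from a joint distribution over a small set of binary variables; the function names below are ours, not the paper's.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector; 0*log(0) is treated as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def I_c(joint, c=1.0):
    """I_c(Y1,...,Yk) = (1/c) * sum_j H(Yj) - H(Y1,...,Yk).
    `joint` is a table of shape (2,)*k holding the joint distribution;
    c = 1 recovers the usual multi-information."""
    k = joint.ndim
    sum_marginal_entropies = sum(
        entropy(joint.sum(axis=tuple(a for a in range(k) if a != j)))
        for j in range(k)
    )
    return sum_marginal_entropies / c - entropy(joint.ravel())

# Two perfectly correlated binary variables: I_1 = 1 + 1 - 1 = 1 bit.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(I_c(joint))
```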
In this paper, we generalize the noisy-or model. The generalizations are three-fold. First, we allow parents to be multivalued ordinal variables. Second, parents can have both positive and negative influences on their common child. Third, we describe how the suggested generalization can be extended to multivalued child variables. The major advantage of our generalizations is that they require only one parameter per parent. We suggest a model learning method and report results of experiments on the Reuters text classification data. The generalized noisy-or models achieve equal or better performance than the standard noisy-or model. An important property of the noisy-or model and of the generalizations suggested in this paper is that they allow more efficient exact inference than logistic regression models do.
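For context, here is a minimal sketch of the standard binary-parent noisy-or that the paper generalizes; the multivalued-parent and negative-influence extensions are not reproduced, and the parameter names are illustrative.

```python
def noisy_or(parent_states, link_probs, leak=0.0):
    """P(child = 1 | parents) under the standard noisy-or model.

    parent_states: 0/1 value for each parent
    link_probs:    one parameter per parent, the probability that an
                   active parent alone activates the child
    leak:          probability the child is active with no active parent
    """
    prob_inhibited = 1.0 - leak
    for x, p in zip(parent_states, link_probs):
        if x == 1:
            prob_inhibited *= 1.0 - p
    return 1.0 - prob_inhibited

# Two active parents with link probabilities 0.8 and 0.6:
print(noisy_or([1, 1], [0.8, 0.6]))  # 1 - 0.2 * 0.4 = 0.92
```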
We investigate solution sets of a special kind of linear inequality system. In particular, we derive characterizations of these sets in terms of minimal solution sets. The studied inequalities emerge as information inequalities in the context of Bayesian networks. This makes it possible to deduce structural properties of Bayesian networks, which is important for causal inference.
Bayesian networks have become a popular framework for reasoning with uncertainty. Efficient methods have been developed for probabilistic reasoning with new evidence. However, when new evidence is uncertain or imprecise, several different methods have been proposed for incorporating it. The original contributions of this paper are guidelines for the treatment of different types of uncertain evidence, rules for combining evidence from different sources, and a procedure for revising the model with uncertain evidence.
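As a hedged illustration of two standard treatments of uncertain evidence (not necessarily the guidelines proposed in the paper), the sketch below contrasts soft evidence applied via Jeffrey's rule with virtual (likelihood) evidence on a single binary variable; all numbers are invented.

```python
import numpy as np

prior = np.array([0.3, 0.7])  # P(X=0), P(X=1) before the uncertain observation

# Soft evidence (Jeffrey's rule): the report fixes the new marginal of X itself;
# for a single variable the updated distribution is simply the reported one.
reported_marginal = np.array([0.6, 0.4])
posterior_soft = reported_marginal

# Virtual evidence (Pearl): the report enters as a likelihood L(x) = P(report | X=x)
# and reweights the prior instead of replacing it.
likelihood = np.array([0.9, 0.2])
posterior_virtual = prior * likelihood
posterior_virtual /= posterior_virtual.sum()

print(posterior_soft)     # [0.6 0.4]
print(posterior_virtual)  # approx. [0.66 0.34]
```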
Bayesian networks are a popular model for reasoning under uncertainty. We study the problem of efficient probabilistic inference with these models when some of the conditional probability tables represent deterministic or noisy ℓ-out-of-k functions. These tables appear naturally in real-world applications when we observe a state of a variable that depends on its parents via an addition or noisy-addition relation. We provide a lower bound on the rank and an upper bound on the symmetric border rank of tensors representing ℓ-out-of-k functions. We propose an approximation of tensors representing noisy ℓ-out-of-k functions by a sum of r tensors of rank one, where r is an upper bound on the symmetric border rank of the approximated tensor. We apply the suggested approximation to probabilistic inference in probabilistic graphical models. Numerical experiments reveal a gain of about two orders of magnitude, at the expense of a certain loss of precision.
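To make the objects concrete, the sketch below constructs the order-k tensor of one possible noisy ℓ-out-of-k CPT over binary parents (each active parent's contribution is lost independently with probability q); the exact noise model and the rank-r approximation studied in the paper are not reproduced here.

```python
import numpy as np
from itertools import product
from math import comb

def noisy_l_out_of_k_tensor(k, l, q=0.0):
    """Order-k tensor T[x1,...,xk] = P(child = 1 | parent states x),
    where the child fires iff at least l parental contributions survive
    and each active parent's contribution is lost with probability q.
    q = 0 gives the deterministic l-out-of-k function."""
    T = np.zeros((2,) * k)
    for x in product((0, 1), repeat=k):
        m = sum(x)  # number of active parents
        T[x] = sum(comb(m, s) * (1 - q) ** s * q ** (m - s)
                   for s in range(l, m + 1))
    return T

T = noisy_l_out_of_k_tensor(k=4, l=2, q=0.1)
print(T.shape)        # (2, 2, 2, 2)
print(T[1, 1, 0, 0])  # exactly two active parents -> 0.9**2 = 0.81
```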