Let $P$ be a discrete multidimensional probability distribution over a finite set of variables $N$, specified only partially by the requirement that it has the prescribed marginals $\{P_A;\, A\in\mathcal{S}\}$, where $\mathcal{S}$ is a class of subsets of $N$ with $\bigcup\mathcal{S}=N$. The paper deals with the problem of approximating $P$ on the basis of these given marginals. The divergence of an approximation $\hat{P}$ from $P$ is measured by the relative entropy $H(P|\hat{P})$. Two methods for approximating $P$ are compared. One of them uses the formerly introduced concept of {\em dependence structure simplification\/} (see Perez \cite{Per79}). The other is based on an {\em explicit expression}, which has to be normalized. We give examples showing that neither of these two methods is universally better than the other. If one of the considered approximations $\hat{P}$ actually has the prescribed marginals, then it is the distribution $P$ with the minimal possible multiinformation. A simple condition on the class $\mathcal{S}$ implying the existence of an approximation $\hat{P}$ with the prescribed marginals is recalled. If the condition holds, then both methods for approximating $P$ give the same result.
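For concreteness, a minimal sketch in standard notation (the product form of the explicit expression is an assumption of this sketch, not quoted from the paper): the relative entropy and a normalized product-of-marginals approximation read
\[
H(P\,|\,\hat{P}) = \sum_{x} P(x)\log\frac{P(x)}{\hat{P}(x)},
\qquad
\hat{P}(x) = \frac{1}{Z}\prod_{A\in\mathcal{S}} P_A(x_A),
\quad
Z = \sum_{x}\prod_{A\in\mathcal{S}} P_A(x_A),
\]
where $x_A$ denotes the restriction of a configuration $x$ to the variables in $A$.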
Stochastic interdependence of a probability distribution on a product space is measured by its Kullback-Leibler distance from the exponential family of product distributions; this distance is called the multi-information. Here we investigate low-dimensional exponential families that contain the maximizers of stochastic interdependence in their closure. Based on a detailed description of the structure of probability distributions with globally maximal multi-information, we obtain our main result: the exponential family of pure pair-interactions contains all global maximizers of the multi-information in its closure.
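For reference, with $P_1,\dots,P_n$ the one-dimensional marginals of $P$, the multi-information admits the standard closed forms
\[
I(P) = \min_{Q_1,\dots,Q_n} D\bigl(P \,\|\, Q_1\otimes\cdots\otimes Q_n\bigr)
= D\bigl(P \,\|\, P_1\otimes\cdots\otimes P_n\bigr)
= \sum_{i=1}^{n} H(P_i) - H(P),
\]
so the minimum over product distributions is attained at the product of the marginals.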
The problem of maximizing the information divergence from an exponential family is generalized to the setting of Bregman divergences and suitably defined Bregman families.
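For reference, the Bregman divergence generated by a strictly convex differentiable function $f$ is
\[
B_f(p, q) = f(p) - f(q) - \bigl\langle \nabla f(q),\, p - q \bigr\rangle,
\]
with the Kullback-Leibler divergence recovered on the probability simplex from the negative entropy $f(p)=\sum_x p(x)\log p(x)$.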
This work studies the standard exponential families of probability measures on Euclidean spaces that have finite supports. In such a family, parameterized by the mean, the mean moves along a segment inside the convex support towards an endpoint on the boundary of the support. The limit behavior of several quantities related to the exponential family is described explicitly. In particular, the variance functions and information divergences are studied near the boundary.
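A sketch of the objects involved, under the usual conventions (the symbols $t$, $\psi$, $V$ are notation of this sketch): for an exponential family with sufficient statistic $t$ and log-partition function $\psi$, the mean parameterization and the variance function are
\[
\mu(\theta) = \nabla\psi(\theta) = \mathbb{E}_{\theta}[t(X)],
\qquad
V(\mu) = \nabla^{2}\psi\bigl(\theta(\mu)\bigr) = \operatorname{Cov}_{\theta(\mu)}[t(X)],
\]
and a mean approaching the boundary of the convex support typically corresponds to natural parameters $\theta$ diverging along some direction.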
The information divergence of a probability measure $P$ from an exponential family $\mathcal{E}$ over a finite set is defined as the infimum of the divergences of $P$ from $Q$ subject to $Q\in\mathcal{E}$. All directional derivatives of the divergence from $\mathcal{E}$ are found explicitly. To this end, the behaviour of the conjugate of a log-Laplace transform on the boundary of its domain is analysed. First-order conditions for $P$ to be a maximizer of the divergence from $\mathcal{E}$ are presented, including new ones for the case when $P$ is not projectable to $\mathcal{E}$.
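In the usual notation for a finite base set $X$ (a sketch; the statistic $t$ generating $\mathcal{E}$ is an assumption of this sketch), the divergence from the family and the log-Laplace transform with its convex conjugate read
\[
D(P\,\|\,\mathcal{E}) = \inf_{Q\in\mathcal{E}} D(P\,\|\,Q),
\qquad
\Lambda(\theta) = \log\sum_{x\in X} e^{\langle \theta,\, t(x)\rangle},
\qquad
\Lambda^{*}(s) = \sup_{\theta}\bigl[\langle \theta, s\rangle - \Lambda(\theta)\bigr];
\]
it is the behaviour of $\Lambda^{*}$ on the boundary of its effective domain that the analysis of the directional derivatives relies on.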