This paper deals with Bayesian models given by statistical experiments and standard loss functions. The Bayes probability of error and the Bayes risk are estimated by means of classical and generalized information criteria applicable to the experiment, and the accuracy of these estimates is studied. Among the information criteria considered is the class of posterior power entropies, which includes the Shannon entropy as the special case of power α=1. It is shown that the most accurate estimate in this class is achieved by the quadratic posterior entropy of power α=2. The paper also introduces and studies a new class of alternative power entropies which in general estimate the Bayes error and risk more tightly than the classical power entropies. Concrete examples, tables and figures illustrate the results obtained.
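As a concrete illustration of such estimates, the quadratic posterior entropy brackets the Bayes probability of error from both sides. The following minimal Python sketch checks the elementary two-sided bound e ≤ H2 ≤ 2e numerically; it assumes the Havrda–Charvát (Tsallis) normalization Hα(p) = (1 − Σᵢ pᵢ^α)/(α − 1), which need not coincide with the paper's normalization, and the Dirichlet sampling is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def bayes_error(p):
    # Bayes probability of error of the MAP decision: e(p) = 1 - max_i p_i
    return 1.0 - p.max()

def quadratic_entropy(p):
    # Power entropy of order alpha = 2 in the assumed Havrda-Charvat
    # (Tsallis) normalization: H2(p) = 1 - sum_i p_i^2
    return 1.0 - np.sum(p ** 2)

# Sample random posteriors on m = 4 states and check e <= H2 <= 2e.
for _ in range(10_000):
    p = rng.dirichlet(np.ones(4))
    e, h2 = bayes_error(p), quadratic_entropy(p)
    assert e - 1e-12 <= h2 <= 2.0 * e + 1e-12
print("e <= H2 <= 2e held on all sampled posteriors")
```

The lower bound follows from Σ pᵢ² ≤ max pᵢ and the upper bound from Σ pᵢ² ≥ (max pᵢ)², so the check is an elementary sanity test rather than the paper's sharper estimates.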
The exact range of the joint values of several Rényi entropies is determined. The method is based on topology, with special emphasis on the orientation of the objects studied. As in the case where only two orders of Rényi entropy are considered, one can parametrize the boundary of the range. An explicit formula for a tight upper or lower bound on one order of entropy in terms of another cannot be given.
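To get a feel for this result, one can sample distributions on a fixed alphabet and record the joint values of two Rényi orders; the scatter fills a two-dimensional region whose boundary can be parametrized but is not the graph of a closed-form bound. A rough Python sketch (the orders 1/2 and 2 and the alphabet size are arbitrary choices, not taken from the paper):

```python
import numpy as np

def renyi(p, a):
    # Renyi entropy of order a in nats; a = 1 is the Shannon limit
    if np.isclose(a, 1.0):
        q = p[p > 0]
        return -np.sum(q * np.log(q))
    return np.log(np.sum(p ** a)) / (1.0 - a)

rng = np.random.default_rng(1)
m = 5  # alphabet size (arbitrary)
pairs = np.array([(renyi(p, 0.5), renyi(p, 2.0))
                  for p in rng.dirichlet(np.full(m, 0.5), size=20_000)])
# The scatter of (H_{1/2}, H_2) values fills a two-dimensional region,
# not the graph of a single function of one entropy in terms of the other.
print("H_1/2 range:", pairs[:, 0].min(), pairs[:, 0].max())
print("H_2   range:", pairs[:, 1].min(), pairs[:, 1].max())
```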
The paper summarizes and extends the theory of generalized ϕ-entropies Hϕ(X) of random variables X, obtained as ϕ-informations Iϕ(X;Y) about X maximized over random variables Y. Among the new results is a proof that these entropies need not be concave functions of the distribution pX. An extended class of power entropies Hα(X) is introduced, parametrized by α∈R, where Hα(X) is concave in pX for α≥0 and convex for α<0. It is proved that all power entropies with α≤2 are maximal ϕ-informations Iϕ(X;X) for appropriate ϕ depending on α. Prominent members of this subclass of power entropies are the Shannon entropy H1(X) and the quadratic entropy H2(X). The paper also investigates the tightness of practically important, previously established relations between these two entropies and the errors e(X) of Bayesian decisions about possible realizations of X. The quadratic entropy is shown to provide estimates which are on average more than 100 % tighter than those based on the Shannon entropy, and this tightness is shown to increase even further as α increases beyond α=2. Finally, the paper studies various measures of statistical diversity and introduces a general measure of anisotony between them. This measure is numerically evaluated for the entropic measures of diversity H1(X) and H2(X).
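For orientation, a standard formulation of the quantities named in the abstract, in Csiszár's conventions (the paper's exact normalizations may differ): for a convex function ϕ with ϕ(1) = 0,

```latex
D_\phi(P\|Q) = \sum_x q(x)\,\phi\!\left(\frac{p(x)}{q(x)}\right),
\qquad
I_\phi(X;Y) = D_\phi\!\left(P_{XY}\,\big\|\,P_X \otimes P_Y\right),
\qquad
H_\phi(X) = \sup_Y I_\phi(X;Y).
```

In this reading, the statement that the power entropies with α≤2 equal Iϕ(X;X) says that the supremum is attained by the maximally informative observation Y = X.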