We propose a simple method for constructing new families of ϕ-divergences. This method, called convex standardization, is applicable to convex and concave functions ψ(t) that are twice continuously differentiable in a neighborhood of t=1 with a nonzero second derivative at t=1. Using this method we introduce several extensions of the LeCam, power, χ^a, and Matusita divergences. The extended families are shown to smoothly connect these divergences with the Kullback divergence, or to connect various pairs of these particular divergences with each other. We also investigate the metric properties of divergences from these extended families.
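For orientation, a minimal sketch of what such a standardization might look like, assuming the usual normalization by the second derivative at t=1; this is a plausible instance, not necessarily the paper's exact definition:

% Hedged sketch: assumes the standardization divides out psi''(1);
% the paper's exact convention may differ.
\[
  \varphi(t) \;=\; \frac{\psi(t) - \psi(1) - \psi'(1)\,(t-1)}{\psi''(1)},
  \qquad \varphi(1) = \varphi'(1) = 0,\quad \varphi''(1) = 1,
\]

so that ϕ is a standardized convex generator of a ϕ-divergence. For example, the concave ψ(t) = ln t gives ϕ(t) = t − 1 − ln t, which generates the Kullback divergence with reversed arguments.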
This paper deals with Bayesian models given by statistical experiments and standard loss functions. The Bayes probability of error and the Bayes risk are estimated by means of classical and generalized information criteria applicable to the experiment, and the accuracy of the estimation is studied. Among the information criteria studied in the paper is the class of posterior power entropies, which includes the Shannon entropy as the special case of power α=1. It is shown that within this class the most accurate estimate is achieved by the quadratic posterior entropy of power α=2. The paper also introduces and studies a new class of alternative power entropies which, in general, estimate the Bayes errors and risks more tightly than the classical power entropies. Concrete examples, tables and figures illustrate the obtained results.
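As a point of reference, a hedged sketch of the power entropy family in its common Havrda–Charvát normalization; the paper's own constant may differ:

% Assumption: Havrda-Charvat normalization of the power entropy;
% the paper may use a different normalizing constant.
\[
  H_\alpha(p_1,\dots,p_M) \;=\; \frac{1}{\alpha-1}\Bigl(1 - \sum_{i=1}^{M} p_i^{\alpha}\Bigr),
  \qquad \alpha > 0,\ \alpha \neq 1,
\]

with the Shannon entropy H_1(p) = −Σ_i p_i ln p_i recovered in the limit α → 1, and the quadratic case reducing to H_2(p) = 1 − Σ_i p_i².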