We propose a simple method for constructing new families of ϕ-divergences. This method, called convex standardization, is applicable to convex and concave functions ψ(t) that are twice continuously differentiable in a neighborhood of t = 1 with a nonzero second derivative at t = 1. Using this method we introduce several extensions of the Le Cam, power, χa and Matusita divergences. The extended families are shown to connect these divergences smoothly with the Kullback divergence, or to connect various pairs of these particular divergences with each other. We also investigate the metric properties of the divergences from these extended families.
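As an illustration of the idea, the following sketch shows what such a standardization could look like numerically: the function ψ is shifted, tilted and rescaled so that the result φ vanishes at t = 1 together with its first derivative and has unit second derivative there, which makes φ a valid (convex) divergence generator whether ψ was convex or concave. The formula and the helper name `standardize` are our own illustrative assumptions, not taken from the paper.

```python
import math

def standardize(psi, t, eps=1e-5):
    """Hypothetical 'convex standardization' of a smooth function psi:
    return phi(t) = (psi(t) - psi(1) - psi'(1)*(t-1)) / psi''(1),
    so that phi(1) = 0, phi'(1) = 0 and phi''(1) = 1.
    Dividing by psi''(1) (assumed nonzero) flips the sign for concave psi,
    yielding a convex generator in both cases.  Derivatives at t = 1 are
    approximated here by central finite differences."""
    d1 = (psi(1 + eps) - psi(1 - eps)) / (2 * eps)             # psi'(1)
    d2 = (psi(1 + eps) - 2 * psi(1) + psi(1 - eps)) / eps**2   # psi''(1)
    return (psi(t) - psi(1) - d1 * (t - 1)) / d2

# Example: psi(t) = t*log(t) already satisfies the normalization
# (psi(1) = 0, psi''(1) = 1 up to the linear tilt), so standardization
# recovers the Kullback divergence generator t*log(t) - t + 1.
phi = lambda t: standardize(lambda u: u * math.log(u), t)
```

For ψ(t) = t log t the output is φ(t) = t log t − t + 1 (up to discretization error), the standard generator of the Kullback divergence mentioned in the abstract.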
Point estimators based on minimizing an information-theoretic divergence between the empirical and hypothetical distributions pose a problem when working with continuous families, which are measure-theoretically orthogonal to the family of empirical distributions. In this case the ϕ-divergence always equals its upper bound, and the minimum ϕ-divergence estimates are trivial. Broniatowski and Vajda \cite{IV09} proposed several modifications of the minimum divergence rule that solve this problem. We examine these new estimation methods with respect to consistency, robustness and efficiency through an extended simulation study. We focus on the well-known family of power divergences parametrized by α ∈ R in the Gaussian model, and we perform comparative computer simulations for several randomly selected contaminated and uncontaminated data sets, different sample sizes, and different ϕ-divergence parameters.
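For concreteness, a minimal sketch of the power divergence family on discrete distributions follows. It uses the common Cressie–Read convention φ_α(t) = (t^α − α(t − 1) − 1)/(α(α − 1)); the paper's parametrization of α may be shifted or scaled relative to this, so the exact formula here is an assumption.

```python
import math

def power_divergence(p, q, alpha):
    """Power divergence D_alpha(p || q) between two discrete
    distributions p and q (assumed strictly positive where needed),
    under the Cressie-Read convention
    phi_alpha(t) = (t**alpha - alpha*(t - 1) - 1) / (alpha*(alpha - 1)).
    The limiting cases alpha = 1 and alpha = 0 are the Kullback and
    reverse Kullback divergences.  Illustrative sketch only."""
    if alpha in (0, 1):
        # Limits: KL(p||q) for alpha = 1, KL(q||p) for alpha = 0.
        a, b = (p, q) if alpha == 1 else (q, p)
        return sum(x * math.log(x / y) for x, y in zip(a, b) if x > 0)
    c = alpha * (alpha - 1)
    return sum(qi * ((pi / qi) ** alpha - alpha * (pi / qi - 1) - 1) / c
               for pi, qi in zip(p, q))

p, q = [0.5, 0.5], [0.9, 0.1]
d2 = power_divergence(p, q, 2)    # alpha = 2: half the Pearson chi-square
d1 = power_divergence(p, q, 1)    # alpha = 1: Kullback divergence
```

At α = 2 the generator reduces to (t − 1)²/2, so the divergence is half the Pearson χ² statistic; this is one way the family interpolates between the particular divergences named in the abstract.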