We consider observations of a random process (or a random field) modeled by a nonlinear regression with a parametrized mean (or trend) and a parametrized covariance function. Here, optimality criteria for parameter estimation are based on the mean square errors (MSE) of the estimators. We briefly mention expressions obtained for very small samples via the probability densities of the estimators. We then show that an approximation of the MSE via the Fisher information matrix is possible, even for small or moderate samples, when the observation errors are normal and small. Finally, we summarize some properties of optimality criteria known for the uncorrelated case that can be transferred to the correlated case, in particular a recently published concept of universal optimality.
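To make the role of the Fisher information matrix concrete, the following sketch computes it for normal observations with a parametrized mean and covariance, using the standard formula I_ij = (∂μ/∂θ_i)' Σ⁻¹ (∂μ/∂θ_j) + ½ tr(Σ⁻¹ ∂Σ/∂θ_i Σ⁻¹ ∂Σ/∂θ_j). The linear trend and exponential covariance below are illustrative assumptions, not models taken from the text.

```python
import numpy as np

# Hypothetical model (illustrative only): observations of a process at
# design points t, with mean m(t) = beta * t and exponential covariance
# C(s, t) = exp(-lam * |s - t|); parameters theta = (beta, lam).

def mean_vec(t, beta):
    return beta * t

def cov_mat(t, lam):
    return np.exp(-lam * np.abs(t[:, None] - t[None, :]))

def fisher_information(t, theta, eps=1e-6):
    """Fisher information matrix for normally distributed observations:
    I_ij = dmu_i' S^{-1} dmu_j + 0.5 * tr(S^{-1} dS_i S^{-1} dS_j)."""
    S = cov_mat(t, theta[1])
    S_inv = np.linalg.inv(S)
    # Parameter-wise derivatives via central finite differences.
    dmu, dS = [], []
    for k in range(2):
        dp = np.array(theta, float); dp[k] += eps
        dm = np.array(theta, float); dm[k] -= eps
        dmu.append((mean_vec(t, dp[0]) - mean_vec(t, dm[0])) / (2 * eps))
        dS.append((cov_mat(t, dp[1]) - cov_mat(t, dm[1])) / (2 * eps))
    I = np.empty((2, 2))
    for i in range(2):
        for j in range(2):
            I[i, j] = dmu[i] @ S_inv @ dmu[j] \
                + 0.5 * np.trace(S_inv @ dS[i] @ S_inv @ dS[j])
    return I

t = np.linspace(0.0, 1.0, 5)            # five design points
I = fisher_information(t, (1.0, 2.0))   # 2x2 information matrix
```

Under the small-error normal model, the inverse of this matrix approximates the covariance (and hence the MSE) of the maximum likelihood estimators, which is what makes information-based optimality criteria usable for small or moderate samples.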
A random process (field) with a given parametrized mean and covariance function is observed at a finite number of chosen design points. The information about its parameters is measured via the Fisher information matrix (for normally distributed observations) or via information functionals depending on that matrix. Conditions are stated under which the contribution of a single design point to this information is zero. Explicit expressions are obtained for the amount of information coming from a selected subset of a given design. Relations to some algorithms for the optimum design of experiments in the case of correlated observations are indicated.
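The contribution of a design point (or a subset of points) can be sketched numerically by comparing the Fisher information of the full design with that of the design with the point deleted. This is an illustrative numerical sketch, not the explicit expressions of the text; the model is a hypothetical zero-mean process with a single covariance parameter lam, so the information is the scalar I(lam) = ½ tr(S⁻¹ (∂S/∂lam) S⁻¹ (∂S/∂lam)).

```python
import numpy as np

# Hypothetical model: known mean 0, exponential covariance
# C(s, t) = exp(-lam * |s - t|) with one unknown parameter lam.

def cov_mat(t, lam):
    return np.exp(-lam * np.abs(t[:, None] - t[None, :]))

def info(t, lam, eps=1e-6):
    """Scalar Fisher information about lam from observations at points t."""
    S = cov_mat(t, lam)
    dS = (cov_mat(t, lam + eps) - cov_mat(t, lam - eps)) / (2 * eps)
    A = np.linalg.solve(S, dS)          # S^{-1} dS/dlam
    return 0.5 * np.trace(A @ A)

t = np.linspace(0.0, 1.0, 6)
lam = 2.0
full = info(t, lam)
# Contribution of point k = information lost when k is deleted.
gains = [full - info(np.delete(t, k), lam) for k in range(len(t))]
```

Since deleting an observation cannot increase the information, each entry of `gains` is nonnegative; a contribution near zero flags a point that adds essentially no information, which is the situation characterized by the conditions stated above and exploited by deletion/exchange algorithms for design under correlation.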