On minimax estimation in uncertain-stochastic models with probability criteria
In this work we study the problem of optimal estimation in the multivariate uncertain-stochastic observation model under the minimax criterion with generalized probabilistic risk functions. The most general results in this area have been obtained using the mean-square error as the loss. Nevertheless, statistical inferences based on the mean-square error can lead to inadequate decisions if the exact joint distribution of the random parameters differs from the Gaussian law. At the same time, given a priori statistical information in the form of restrictions on the moment characteristics, one can find tight bounds for various non-mean-square risk functions at linear decision rules. This makes it possible to propose efficient optimization procedures for designing linear estimation algorithms that are optimal in the minimax sense. Both practical and theoretical interests motivate the following question: are linear estimators minimax-optimal over the class of all measurable decision rules when the second-order moments of the random parameters are fixed? For various linear uncertain-stochastic systems this problem has been investigated in detail for the mean-square risk.
In this work we show that there exists a linear operator that is minimax over the family of all unbiased estimators for a broad class of risk functions monotone with respect to the Euclidean norm of the estimation error. In addition, we treat three kinds of estimation criteria based on expectation, probability, and quantile risk functions.
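The three criteria can be illustrated numerically. The sketch below is not the paper's method; it is a minimal Monte Carlo example, assuming a hypothetical linear-Gaussian observation model y = Hx + v with arbitrarily chosen matrices H, P, R, in which a linear estimator is evaluated under the expectation (mean-square), probability, and quantile risks of the Euclidean error norm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model for illustration only: y = H x + v,
# x ~ N(0, P), v ~ N(0, R); H, P, R are arbitrary choices.
H = np.array([[1.0, 0.5],
              [0.0, 1.0],
              [1.0, 1.0]])
P = np.diag([2.0, 1.0])   # covariance of the random parameter x
R = 0.5 * np.eye(3)       # noise covariance

# Linear (MMSE) estimator x_hat = K y, K = P H^T (H P H^T + R)^{-1}.
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)

n = 200_000
x = rng.multivariate_normal(np.zeros(2), P, size=n)
v = rng.multivariate_normal(np.zeros(3), R, size=n)
y = x @ H.T + v
err = np.linalg.norm(x - y @ K.T, axis=1)  # Euclidean error norm ||e||

t = 1.0       # threshold for the probability criterion
alpha = 0.95  # confidence level for the quantile criterion
expectation_risk = np.mean(err ** 2)  # mean-square risk E||e||^2
probability_risk = np.mean(err > t)   # P(||e|| > t)
quantile_risk = np.quantile(err, alpha)  # alpha-quantile of ||e||
print(expectation_risk, probability_risk, quantile_risk)
```

All three risks are monotone functions of the error norm, so in the Gaussian case they are minimized by the same linear estimator; the point of the non-mean-square criteria is that they rank estimators differently once the joint distribution deviates from the Gaussian law.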