
4.1 Bayesian estimation

We assign a cost function C(h', h) for estimating the true value of h as h'. We then associate with an estimator $\hat{h}$ a conditional risk, or cost averaged over all realizations of data x for each value of the parameter h:
\[
  R_h(\hat{h}) = E_h\bigl[C(\hat{h}, h)\bigr] = \int_X C\bigl(\hat{h}(x), h\bigr)\, p(x, h)\, dx, \tag{23}
\]
where X is the set of observations and p(x, h) is the probability density of the data x for a given value h of the parameter. We further assume that there is a certain a priori probability distribution p(h) of the parameter h. We then define the Bayes estimator as the estimator that minimizes the average risk, defined as
\[
  r(\hat{h}) = E\bigl[R_h(\hat{h})\bigr] = \int_X \int_Q C\bigl(\hat{h}(x), h\bigr)\, p(x, h)\, p(h)\, dh\, dx, \tag{24}
\]
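As a numerical illustration of Eqs. (23)-(24), the following minimal sketch in Python (with NumPy) approximates the average risk (24) by Monte Carlo for two estimators under the quadratic cost of Eq. (25) below. The conjugate Gaussian toy model, the variances, and all variable names are assumptions made for illustration, not taken from the text.

\begin{verbatim}
import numpy as np

# Toy conjugate model (an assumption, not from the text):
# prior h ~ N(0, sigma_p^2); data x = h + n, noise n ~ N(0, sigma_n^2).
rng = np.random.default_rng(0)
sigma_p, sigma_n = 2.0, 1.0
n_draws = 1_000_000

h = rng.normal(0.0, sigma_p, n_draws)      # draws from the prior p(h)
x = h + rng.normal(0.0, sigma_n, n_draws)  # one data realization per draw

# For this model the conditional mean of Eq. (26) is a linear shrinkage of x.
shrink = sigma_p**2 / (sigma_p**2 + sigma_n**2)
h_bayes = shrink * x   # posterior mean E[h|x]
h_ml = x               # maximum-likelihood estimate, for comparison

# Monte Carlo approximation of the average risk r(h_hat) of Eq. (24)
# with the quadratic cost C(h', h) = (h' - h)^2 of Eq. (25).
print("average risk, posterior mean:", np.mean((h_bayes - h)**2))  # ~0.8
print("average risk, raw data x:   ", np.mean((h_ml - h)**2))      # ~1.0
\end{verbatim}

In this toy model the posterior mean attains the smaller average risk (sigma_p^2 sigma_n^2 / (sigma_p^2 + sigma_n^2) = 0.8 versus sigma_n^2 = 1.0), consistent with its optimality for the quadratic cost shown below.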
where E is the expectation value with respect to the a priori distribution p(h), and Q is the set of values of the parameter h. It is not difficult to show (see the sketch after Eq. (26)) that for a commonly used cost function
\[
  C(h', h) = (h' - h)^2, \tag{25}
\]
the Bayes estimator is the conditional mean of the parameter h given the data x, i.e.,
\[
  \hat{h}(x) = E[h|x] = \int_Q h\, p(h|x)\, dh, \tag{26}
\]
where p(h|x) is the conditional probability density of the parameter h given the data x.
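The omitted step is short. With the cost (25), the average risk (24) is an expectation over x of the conditional expected cost, so it suffices to minimize, for each fixed x,
\[
  E\bigl[(h' - h)^2 \,\big|\, x\bigr] = h'^2 - 2\, h'\, E[h|x] + E[h^2|x],
\]
which is a parabola in h' with its minimum at h' = E[h|x]; this is precisely Eq. (26).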