
3.1 Bayesian approach

In the Bayesian approach we assign costs to our decisions; in particular, we introduce positive numbers C_{ij}, i, j = 0, 1, where C_{ij} is the cost incurred by choosing hypothesis H_i when hypothesis H_j is true. We define the conditional risk R_j of a decision rule d for each hypothesis as
R_j(d) = C_{0j} P_j(R) + C_{1j} P_j(R'),   j = 0, 1,   (15)
where P_j is the probability distribution of the data when hypothesis H_j is true, R is the region of the data space in which the rule d selects hypothesis H_0, and R' is its complement. Next we assign probabilities p_0 and p_1 = 1 - p_0 to the occurrences of hypotheses H_0 and H_1, respectively. These probabilities are called a priori probabilities or priors. We define the Bayes risk as the overall average cost incurred by the decision rule d:
r(d) = p_0 R_0(d) + p_1 R_1(d).   (16)
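To make Eqs. (15) and (16) concrete, here is a minimal sketch under a hypothetical setup not taken from the text: the data x is drawn from N(0, 1) under H_0 and from N(1, 1) under H_1, and we consider threshold rules d_k that select H_1 when x > k, so that R = (-inf, k] is the region where H_0 is chosen. The cost values and priors below are arbitrary illustrative choices (with zero cost for correct decisions, for simplicity).

```python
import math

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical example (not from the text):
# x ~ N(0, 1) under H0, x ~ N(1, 1) under H1;
# rule d_k: choose H1 when x > k, so R = (-inf, k] is where we pick H0.
mu = {0: 0.0, 1: 1.0}

# C[(i, j)]: cost of choosing H_i when H_j is true (illustrative values;
# correct decisions are taken to cost nothing here).
C = {(0, 0): 0.0, (1, 1): 0.0, (1, 0): 1.0, (0, 1): 2.0}
p0, p1 = 0.7, 0.3  # illustrative priors, p1 = 1 - p0

def conditional_risk(j, k):
    """Eq. (15): R_j(d_k) = C_{0j} P_j(R) + C_{1j} P_j(R')."""
    P_R = Phi(k - mu[j])          # P_j(R):  probability rule d_k decides H0
    P_Rc = 1.0 - P_R              # P_j(R'): probability rule d_k decides H1
    return C[(0, j)] * P_R + C[(1, j)] * P_Rc

def bayes_risk(k):
    """Eq. (16): r(d_k) = p0 R_0(d_k) + p1 R_1(d_k)."""
    return p0 * conditional_risk(0, k) + p1 * conditional_risk(1, k)
```

As k varies, the two conditional risks trade off against each other (raising k lowers R_0 but raises R_1); the Bayes risk weights them by the priors into a single number to be minimized.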
Finally, we define the Bayes rule as the rule that minimizes the Bayes risk r(d).
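The minimization defining the Bayes rule can be carried out explicitly in simple cases. The sketch below uses a hypothetical Gaussian example (x ~ N(0, 1) under H_0, x ~ N(1, 1) under H_1, illustrative costs and priors not supplied by the text). It relies on the standard result that, for a monotone likelihood ratio, the Bayes rule is a likelihood-ratio test with threshold p_0 (C_{10} - C_{00}) / (p_1 (C_{01} - C_{11})); a grid search over threshold rules recovers the same cutoff.

```python
import math

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical Gaussian example: N(0, 1) under H0, N(1, 1) under H1.
p0, p1 = 0.7, 0.3          # illustrative priors
C10, C01 = 1.0, 2.0        # costs of wrong decisions; C00 = C11 = 0 here

def bayes_risk(k):
    # r(d_k) = p0 C10 P0(x > k) + p1 C01 P1(x <= k)
    # (Eqs. (15)-(16) with zero diagonal costs)
    return p0 * C10 * (1.0 - Phi(k)) + p1 * C01 * Phi(k - 1.0)

# Grid search for the threshold that minimizes the Bayes risk.
ks = [i / 1000.0 for i in range(-2000, 3000)]
k_best = min(ks, key=bayes_risk)

# For these Gaussians the likelihood ratio Lambda(x) = exp(x - 1/2) is
# monotone in x, so the Bayes rule reduces to a threshold test at
# k* = 1/2 + ln(p0 C10 / (p1 C01)).
k_star = 0.5 + math.log(p0 * C10 / (p1 * C01))
```

The numerically found minimizer k_best agrees with the analytic likelihood-ratio cutoff k_star to within the grid spacing, illustrating that the Bayes rule is the likelihood-ratio test in this setting.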