In terms of the likelihood function, the decision rule is to choose $H_1$ when

$$
\frac{p_1(y)}{p_0(y)} > \frac{P(H_0)(C_{10} - C_{00})}{[1 - P(H_0)](C_{01} - C_{11})}
$$

In terms of the likelihood ratio $\Lambda(y)$, the decision rule is: choose $H_1$ if

$$
\Lambda(y) = \frac{p_1(y)}{p_0(y)} > \frac{P(H_0)(C_{10} - C_{00})}{[1 - P(H_0)](C_{01} - C_{11})} \tag{5-6}
$$

We have found the region $R_1$ which minimizes the average cost given by Eq. (5-2). The resulting minimum cost is called the Bayes risk, and this criterion is called the Bayes criterion. By comparing Eqs. (5-6) and (5-1) for the case when $C_{10} - C_{00} = C_{01} - C_{11}$, it follows that the maximum a posteriori probability criterion is a special case of the Bayes criterion.

5.4 Minimum Error Probability Criterion

In communication systems it is usual to minimize the average error probability. No cost is associated with a correct decision, and the errors of each kind are assigned equal cost. Therefore, assume that

$$
C_{00} = C_{11} = 0 \quad \text{and} \quad C_{01} = C_{10} = 1 \tag{5-7}
$$

Using these costs, the average cost, Eq. (5-2), is

$$
C = P(H_0)P(D_1 \mid H_0) + [1 - P(H_0)]P(D_0 \mid H_1) \tag{5-8}
$$

Thus, for the assumptions in Eq. (5-7), the average cost is the average error probability $P_e$, and minimizing the average error probability is equivalent to minimizing the Bayes risk. The decision rule becomes: choose $H_1$ if

$$
\frac{p_1(y)}{p_0(y)} > \frac{P(H_0)}{1 - P(H_0)} \tag{5-9}
$$

Note that this test is identical to that of Eq. (5-1) for the maximum a posteriori probability criterion. This test is also referred to as the ideal observer test (1, 2).

5.5 Neyman–Pearson Criterion

In most communication systems, where the errors are assumed to be of equal importance and the a priori probabilities are known, the criterion of minimum error probability is generally used. In a radar system, however, the a priori probabilities and the cost of each kind of error are difficult to determine. For such cases there is another criterion which involves neither a priori probabilities nor cost estimates: the Neyman–Pearson criterion. In radar terminology its objective is to maximize the probability of detection for a given probability of false alarm. This objective can be accomplished by using a likelihood ratio test. Specifically, there exists some nonnegative number $\eta$ such that if hypothesis $H_1$ is chosen when

$$
\Lambda(y) = \frac{p_1(y)}{p_0(y)} \ge \eta \tag{5-10}
$$

and hypothesis $H_0$ is chosen otherwise, then this rule yields the maximum $P(D_1 \mid H_1)$ for all tests subject to the constraint that $P(D_1 \mid H_0)$ is less than some predetermined constant.

This can be proven by using the Bayes criterion. We wish to maximize $P(D_1 \mid H_1)$ subject to the constraint $P(D_1 \mid H_0) = \alpha$. Since $P(D_1 \mid H_1) = 1 - P(D_0 \mid H_1)$, we may equivalently minimize $P(D_0 \mid H_1)$. Since $P(D_1 \mid H_0)$ is constant, adding it to $P(D_0 \mid H_1)$ will not influence the minimization. Consequently, maximizing $P(D_1 \mid H_1)$ is equivalent to minimizing

$$
Q = P(D_0 \mid H_1) + \mu P(D_1 \mid H_0) \tag{5-11}
$$

where $\mu$ is an arbitrary constant.† Substituting $C_{00} = C_{11} = 0$, $[1 - P(H_0)]C_{01} = 1$, and $P(H_0)C_{10} = \mu$ in Eq. (5-2), the average cost becomes

$$
C = P(D_0 \mid H_1) + \mu P(D_1 \mid H_0) \tag{5-12}
$$

This is the same as Eq. (5-11) and is the quantity to be minimized. Thus, the Neyman–Pearson criterion is a special case of the Bayes criterion. It has already been determined that $C$ is a minimum when hypothesis $H_1$ is chosen to satisfy Eq. (5-6). Substituting the assumed values for $C_{00}$, $C_{01}$, $C_{11}$, $C_{10}$ into this equation yields the rule: choose $H_1$ when

$$
\frac{p_1(y)}{p_0(y)} \ge \mu
$$

Thus a likelihood ratio test will maximize $P(D_1 \mid H_1)$ for a given $P(D_1 \mid H_0)$.

† The constant $\mu$ is a Lagrange undetermined multiplier [Goldstein (3)].
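The relationship between the Bayes and minimum-error-probability rules can be checked numerically. The following is a minimal sketch, not part of the text; the helper names `bayes_threshold` and `decide` are illustrative assumptions. It evaluates the right-hand side of Eq. (5-6) and confirms that, with the unit costs of Eq. (5-7), it reduces to the threshold of Eq. (5-9).

```python
import math

def bayes_threshold(p_h0, c00, c01, c10, c11):
    """Right-hand side of Eq. (5-6): P(H0)(C10 - C00) / {[1 - P(H0)](C01 - C11)}."""
    return p_h0 * (c10 - c00) / ((1.0 - p_h0) * (c01 - c11))

def decide(likelihood_ratio, threshold):
    """Choose H1 (return 1) when the likelihood ratio exceeds the threshold, else H0 (return 0)."""
    return 1 if likelihood_ratio > threshold else 0

# With the costs of Eq. (5-7), C00 = C11 = 0 and C01 = C10 = 1, the Bayes
# threshold reduces to P(H0)/[1 - P(H0)], i.e. the rule of Eq. (5-9).
p_h0 = 0.4  # illustrative prior probability of H0
assert math.isclose(bayes_threshold(p_h0, 0, 1, 1, 0), p_h0 / (1.0 - p_h0))
print(decide(likelihood_ratio=2.0, threshold=bayes_threshold(p_h0, 0, 1, 1, 0)))
```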
EXAMPLE 5.5-1. Define the random variable $y$ as $y = s + n$, where $n$ is a Gaussian random variable having zero mean and variance $\sigma^2 = 2$, and $s$ is a constant equal to either 0 or 1. On the basis of a single sample $y$, determine an optimum decision rule to choose between the hypotheses

$$
H_0\colon\; s = 0, \qquad H_1\colon\; s = 1
$$

using a Neyman–Pearson test with $P(D_1 \mid H_0) = 0.1$.

The likelihood functions $p_0(y)$ and $p_1(y)$ are Gaussian density functions since $n$ is a Gaussian variable. In particular,

$$
p_0(y) = \frac{1}{(4\pi)^{1/2}} e^{-y^2/4} \quad \text{and} \quad p_1(y) = \frac{1}{(4\pi)^{1/2}} e^{-(y-1)^2/4}
$$

The likelihood ratio is

$$
\Lambda(y) = \frac{p_1(y)}{p_0(y)} = e^{(2y-1)/4}
$$

and the Neyman–Pearson test is to choose $H_1$ if

$$
\Lambda(y) \ge \eta
$$

The threshold $\eta$ is chosen to satisfy the false alarm probability constraint. Since the exponential term is monotonically increasing with $y$, an equivalent test is to choose $H_1$ if $y \ge \gamma$. To determine the threshold, the false alarm probability is

$$
P(D_1 \mid H_0) = \int_{\gamma}^{\infty} p_0(y)\, dy = \int_{\gamma}^{\infty} \frac{1}{(4\pi)^{1/2}} e^{-y^2/4}\, dy = 0.1
$$

With a change of variable ($x = y/2^{1/2}$) this becomes

$$
P(D_1 \mid H_0) = 0.1 = \int_{\gamma/2^{1/2}}^{\infty} \frac{1}{(2\pi)^{1/2}} e^{-x^2/2}\, dx
$$

Therefore $\gamma = 1.8$, and the decision rule is to choose $H_1$ if $y \ge 1.8$; choose $H_0$ otherwise. The probability of detection based on the single observation $y$ is

$$
P(D_1 \mid H_1) = \int_{1.8}^{\infty} p_1(y)\, dy = \int_{1.8}^{\infty} \frac{1}{(4\pi)^{1/2}} e^{-(y-1)^2/4}\, dy = 0.285
$$

To express the decision rule in terms of the likelihood ratio $\Lambda(y)$ and $\eta$, observe that $H_1$ is chosen when $\Lambda(y) \ge \eta$. Therefore, since $\gamma = 1.8$,

$$
\eta = \frac{p_1(\gamma)}{p_0(\gamma)} = \frac{(4\pi)^{-1/2} e^{-(\gamma-1)^2/4}}{(4\pi)^{-1/2} e^{-\gamma^2/4}} = e^{[2(1.8)-1]/4} \approx 1.9
$$

The decision rule is then: choose $H_1$ if $\Lambda(y) \ge 1.9$ and choose $H_0$ otherwise.
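The numbers in Example 5.5-1 can be reproduced with a short numerical sketch; this is an illustrative check, not part of the text, and it assumes SciPy's normal distribution routines (`norm.isf`, `norm.sf`) are available.

```python
import numpy as np
from scipy.stats import norm

sigma = np.sqrt(2.0)   # standard deviation; the example specifies variance sigma^2 = 2
pfa = 0.1              # required false-alarm probability P(D1 | H0)

# Threshold on y: solve Pr{y >= gamma | s = 0} = 0.1, giving gamma ~= 1.8
gamma = norm.isf(pfa, loc=0.0, scale=sigma)

# Probability of detection for a single observation: Pr{y >= gamma | s = 1} ~= 0.28
pd = norm.sf(gamma, loc=1.0, scale=sigma)

# Equivalent threshold on the likelihood ratio: eta = exp[(2*gamma - 1)/4] ~= 1.9
eta = np.exp((2.0 * gamma - 1.0) / 4.0)

print(f"gamma = {gamma:.2f}, Pd = {pd:.3f}, eta = {eta:.2f}")
```

Small differences in the third decimal place relative to the text's 0.285 arise because the sketch carries $\gamma$ to full precision rather than rounding it to 1.8.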
EXAMPLE 5.5-2. It is of some interest to take the previous example and assume a priori probabilities for the signal $s$. This is then analogous to a binary communication problem. The criterion shall be minimum error probability.