# Detection and Estimation Theory ECE 642

UNM

These 238 pages of class notes were uploaded by Roel Green on Wednesday, September 23, 2015. The notes belong to ECE 642 at the University of New Mexico, taught by Sudharman Jayaweera in the Fall. Since the upload they have received 35 views. For similar materials, see /class/212147/ece-642-university-of-new-mexico in Electrical & Computer Engineering at the University of New Mexico.


### ECE642 Detection and Estimation Theory: Lecture 15 (October 11th, Thursday), Fall 2007

Dr. Sudharman K. Jayaweera, Assistant Professor, Department of Electrical and Computer Engineering, University of New Mexico

#### Basic Performance Measures for a Binary Decision Rule

- Optimum rules based on likelihood-ratio tests are of the form

$$
\tilde\delta(y) = \begin{cases} 1 & \text{if } T(y) > \tau \\ \gamma & \text{if } T(y) = \tau \\ 0 & \text{if } T(y) < \tau, \end{cases} \qquad (1)
$$

where $T : (\Gamma, \mathcal G) \to (\mathbb R, \mathcal B)$ is a mapping from the observation space to the real line.

- Detector performance is primarily given by the two conditional error probabilities $P_F$ and $P_M$:

$$
P_F(\tilde\delta) = P_0\left(\tilde\delta \text{ chooses } H_1\right) = P_0(\Gamma_1), \qquad (2)
$$

$$
P_M(\tilde\delta) = P_1\left(\tilde\delta \text{ chooses } H_0\right) = P_1(\Gamma_0). \qquad (3)
$$

#### Performance Evaluation of Optimal Detectors

- Thus performance evaluation involves computing the probabilities of the regions $\{T(y) > \tau\}$, $\{T(y) < \tau\}$ and $\{T(y) = \tau\}$ under both hypotheses.
- Conceptually this looks very simple, but the actual computation can often be difficult.
- For example, if $Y = (Y_1, \ldots, Y_n)$ has joint pdf $p_0$ under $H_0$,

$$
P_F(\tilde\delta) = \int_{\{T(y) > \tau\}} p_0(y_1,\ldots,y_n)\, dy_1 \cdots dy_n + \gamma \int_{\{T(y) = \tau\}} p_0(y_1,\ldots,y_n)\, dy_1 \cdots dy_n .
$$

- If $n$ is large, this $n$-fold integral may be difficult to compute unless we have further simplifying assumptions.

#### Direct Performance Computation

- Denote by $F_{T_j}$ the cumulative distribution function (cdf) of $T(Y)$ under hypothesis $H_j$, where $T(Y)$ is the decision statistic of a detector of the form of (1), i.e.,

$$
F_{T_j}(t) = P_j\left(T(Y) \le t\right) = P\left(T(Y) \le t \mid H_j\right).
$$

- Then, for a detector of the form of (1),

$$
P_F(\tilde\delta_\tau) = P\left(T(Y) > \tau \mid H_0\right) + \gamma\, P\left(T(Y) = \tau \mid H_0\right) \qquad (4)
$$

$$
= 1 - P\left(T(Y) \le \tau \mid H_0\right) + \gamma \left[ P\left(T(Y) \le \tau \mid H_0\right) - \lim_{t \uparrow \tau} P\left(T(Y) \le t \mid H_0\right) \right],
$$

since the cdf is right-continuous; hence, from (4),

$$
P_F(\tilde\delta_\tau) = 1 - F_{T_0}(\tau) + \gamma\left[ F_{T_0}(\tau) - F_{T_0}(\tau^-) \right]. \qquad (5)
$$
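When closed forms are unavailable, the error probabilities (2) and (3) can always be estimated by Monte Carlo simulation. A minimal sketch, using an illustrative scalar problem with $T(y) = y$, $H_0: Y \sim \mathcal N(0,1)$, $H_1: Y \sim \mathcal N(1,1)$ and threshold $\tau = 0.5$ (these particular choices are not from the notes):

```python
import random
import statistics

def error_probabilities(tau, n_trials=200_000, seed=1):
    """Monte Carlo estimates of P_F and P_M for the rule 'choose H1 iff Y > tau',
    under the illustrative pair H0: Y ~ N(0,1) and H1: Y ~ N(1,1)."""
    rng = random.Random(seed)
    false_alarms = sum(rng.gauss(0.0, 1.0) > tau for _ in range(n_trials))
    misses = sum(rng.gauss(1.0, 1.0) <= tau for _ in range(n_trials))
    return false_alarms / n_trials, misses / n_trials

pf, pm = error_probabilities(0.5)
# By symmetry, both exact values are Q(0.5) = 1 - Phi(0.5) here.
exact = 1 - statistics.NormalDist().cdf(0.5)
```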
#### Direct Performance Computation (ctd.)

- Similarly,

$$
P_M(\tilde\delta_\tau) = P\left(T(Y) < \tau \mid H_1\right) + (1-\gamma)\, P\left(T(Y) = \tau \mid H_1\right)
$$

$$
= F_{T_1}(\tau) - \gamma \left[ F_{T_1}(\tau) - F_{T_1}(\tau^-) \right]. \qquad (6)
$$

- Observe from (5) and (6) that the performance evaluation of any LRT of the form of (1) is possible if the cdfs $F_{T_j}$ of $T(Y)$ under $H_0$ and $H_1$ can be determined easily in a neighborhood of the threshold $\tau$.

#### Special Case: i.i.d. Observations

- A special case in which $F_{T_j}$ can be written straightforwardly is that in which the observation $Y = (Y_1, \ldots, Y_n)$ is a vector of independent random variables and the decision statistic $T(y)$ has the form

$$
T(y) = \sum_{k=1}^{n} g_k(y_k), \qquad (7)
$$

where $g_k$, $k = 1, 2, \ldots, n$, is a sequence of nonlinearities.

- Note that for independent observations the log-likelihood ratio (LLR) is of the form of (7).
- In this case we can compute $F_{T_j}$ using the fact that a cdf $F$ and its characteristic function form a unique pair.

#### i.i.d. Observations: Characteristic Function of the Decision Statistic

- Recall that the characteristic function (chf) of a random variable $X$ is

$$
\phi_X(u) = E\left[ e^{iuX} \right], \qquad u \in \mathbb{R}. \qquad (8)
$$

- Define

$$
\phi_{T_j}(u) \triangleq E_j\left[ e^{iuT(Y)} \right] \qquad (9)
$$

(the chf of $T(Y)$ under $H_j$), and

$$
\phi_{g_k,j}(u) \triangleq E_j\left[ e^{iug_k(Y_k)} \right] \qquad (10)
$$

(the chf of $g_k(Y_k)$ under $H_j$), where $E_j$ denotes expectation under $H_j$.

- From the independence of the $Y_k$'s and from (7) and (9),

$$
\phi_{T_j}(u) = E_j\left[ e^{iu\sum_{k=1}^n g_k(Y_k)} \right] = E_j\left[ \prod_{k=1}^n e^{iug_k(Y_k)} \right] = \prod_{k=1}^{n} E_j\left[ e^{iug_k(Y_k)} \right] = \prod_{k=1}^{n} \phi_{g_k,j}(u), \qquad (11)
$$

because the $Y_k$'s are independent.

#### Distribution of the Decision Statistic with i.i.d. Observations

- Since the cdf $F_{T_j}$ and the chf $\phi_{T_j}$ form a unique pair, we can invert (11) to obtain the cdf $F_{T_j}$ [Billingsley, 1979]:

$$
F_{T_j}(b) - F_{T_j}(a) = \lim_{c \to \infty} \frac{1}{2\pi} \int_{-c}^{c} \frac{e^{-iua} - e^{-iub}}{iu}\, \phi_{T_j}(u)\, du \qquad (12)
$$

for all $a, b$ that are continuity points of $F_{T_j}$.
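Because the chf determines the distribution uniquely, this kind of inversion can be sanity-checked numerically: density values can be recovered from a known chf by a Riemann-sum approximation of the Fourier inversion integral. A sketch, using the standard Gaussian chf $e^{-u^2/2}$ purely as a test case (not an example from the notes):

```python
import cmath
import math

def pdf_from_chf(chf, t, u_max=40.0, n=4000):
    """Recover a pdf value from its characteristic function by a midpoint-rule
    approximation of (1/2pi) * integral of chf(u) * exp(-i*u*t) over u."""
    du = 2 * u_max / n
    total = 0.0
    for k in range(n):
        u = -u_max + (k + 0.5) * du
        total += (cmath.exp(-1j * u * t) * chf(u)).real * du
    return total / (2 * math.pi)

gauss_chf = lambda u: math.exp(-0.5 * u * u)  # chf of N(0,1)
p0 = pdf_from_chf(gauss_chf, 0.0)
# p0 should be close to the N(0,1) density at 0, i.e. 1/sqrt(2*pi)
```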
- The above inversion simplifies if $T(Y)$ is a continuous random variable under $H_j$; this is guaranteed when $\phi_{T_j}(u)$ is absolutely integrable, i.e., $\int_{-\infty}^{\infty} \left|\phi_{T_j}(u)\right| du < \infty$.

#### pdf of the Decision Statistic with i.i.d. Observations

- When $T(Y)$ is continuous under $H_j$, $F_{T_j}$ has a corresponding pdf $p_{T_j}$ given by

$$
p_{T_j}(t) = \frac{1}{2\pi} \int_{-\infty}^{\infty} \phi_{T_j}(u)\, e^{-iut}\, du. \qquad (13)
$$

- Since, from (9),

$$
\phi_{T_j}(u) = \int_{-\infty}^{\infty} e^{iut}\, p_{T_j}(t)\, dt, \qquad (14)
$$

note that in this case $\phi_{T_j}$ and $p_{T_j}$ are a Fourier-transform pair.

#### Performance Analysis with i.i.d. Observations (ctd.)

- Thus, if the observations are independent and $T(Y)$ is a continuous random variable, then from (5), (11) and (13) we have

$$
P_F(\tilde\delta_\tau) = \int_{\tau}^{\infty} \frac{1}{2\pi} \int_{-\infty}^{\infty} \prod_{k=1}^{n} \phi_{g_k,0}(u)\, e^{-iut}\, du\, dt, \qquad (15)
$$

since $T(Y)$ is continuous, so the boundary $\{T(y) = \tau\}$ has probability zero.

- Similarly,

$$
P_M(\tilde\delta_\tau) = \int_{-\infty}^{\tau} \frac{1}{2\pi} \int_{-\infty}^{\infty} \prod_{k=1}^{n} \phi_{g_k,1}(u)\, e^{-iut}\, du\, dt. \qquad (16)
$$

#### Example: Signal Detection in Cauchy Noise

- Let us consider the following hypothesis pair:

$$
H_0: Y_k = N_k, \quad k = 1, \ldots, n \qquad \text{versus} \qquad H_1: Y_k = s_k + N_k, \quad k = 1, \ldots, n, \qquad (17)
$$

where $\{N_k\}_{k=1}^{n}$ is a sequence of i.i.d. Cauchy random variables and $\{s_k\}_{k=1}^{n}$ is a sequence of known signal values.

- Recall that if $N_k$ is a (standard) Cauchy random variable, then its pdf is

$$
p_{N_k}(x) = \frac{1}{\pi\left(1 + x^2\right)}, \qquad x \in \mathbb{R}. \qquad (18)
$$

#### Correlation Detection in Cauchy Noise

- Suppose that we employ a correlation detector for detecting the coherent signal $\{s_k\}_{k=1}^{n}$ in this additive Cauchy noise. (Note: this may not be the optimal detector for this case; see Appendix K.)
- The decision statistic $T(y)$ of a correlation detector is

$$
T(y) = \sum_{k=1}^{n} s_k y_k. \qquad (19)
$$

- Hence, comparing with (7),

$$
g_k(y_k) = s_k y_k, \qquad k = 1, 2, \ldots, n. \qquad (20)
$$

#### Correlation Detection in Cauchy Noise (ctd.)

- Hence the characteristic function $\phi_{g_k,0}$ of $g_k(Y_k)$ under $H_0$ is, from (10), (20) and (17),

$$
\phi_{g_k,0}(u) = E_0\left[ e^{iug_k(Y_k)} \right] = E_0\left[ e^{ius_kY_k} \right] = E\left[ e^{ius_kN_k} \right] = \phi_{N_k}(us_k), \qquad (21)
$$
where $\phi_{N_k}$ is the chf of the Cauchy noise $N_k$, and the third equality holds since $Y_k = N_k$ under $H_0$ by (17).

#### Characteristic Function of Cauchy Noise

- Since $N_k$ is Cauchy, from (14) and (18),

$$
\phi_{N_k}(\upsilon) = \int_{-\infty}^{\infty} e^{i\upsilon x}\, \frac{1}{\pi\left(1+x^2\right)}\, dx = e^{-|\upsilon|}, \qquad \upsilon \in \mathbb{R}. \qquad (22)
$$

- Hence, from (21) and (22),

$$
\phi_{g_k,0}(u) = \phi_{N_k}(us_k) = e^{-|us_k|}. \qquad (23)
$$

#### Under $H_0$: chf of the Decision Statistic of the Correlation Detector in Cauchy Noise

- From (11) and (23),

$$
\phi_{T_0}(u) = \prod_{k=1}^{n} \phi_{g_k,0}(u) = \prod_{k=1}^{n} e^{-|u||s_k|} = e^{-|u|\sum_{k=1}^{n}|s_k|}. \qquad (24)
$$

- Define

$$
\overline{|s|} \triangleq \frac{1}{n} \sum_{k=1}^{n} |s_k|. \qquad (25)
$$

- Then (24) becomes

$$
\phi_{T_0}(u) = e^{-n\overline{|s|}\,|u|}. \qquad (26)
$$

#### Under $H_0$: pdf of the Decision Statistic of the Correlation Detector in Cauchy Noise

- Clearly, (26) shows that $\phi_{T_0}(u)$ is absolutely integrable. Hence $T(Y)$ is continuous under $H_0$ and, from (13) and (26),

$$
p_{T_0}(t) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-n\overline{|s|}\,|u|}\, e^{-iut}\, du
= \frac{1}{2\pi} \left[ \frac{1}{n\overline{|s|} - it} + \frac{1}{n\overline{|s|} + it} \right],
$$

where the two terms come from integrating separately over $u < 0$ and $u > 0$.

#### Under $H_0$: pdf of the Decision Statistic of the Correlation Detector in Cauchy Noise (ctd.)

- Hence

$$
p_{T_0}(t) = \frac{1}{\pi} \cdot \frac{n\overline{|s|}}{t^2 + \left(n\overline{|s|}\right)^2}, \qquad (27)
$$

i.e., under $H_0$ the statistic $T(Y)$ is Cauchy with scale parameter $n\overline{|s|}$.

#### False-Alarm Probability of the Correlation Detector in Cauchy Noise

- Hence the false-alarm probability of the correlator in Cauchy noise is

$$
P_F(\tilde\delta_\tau) = P_0\left(T(Y) \ge \tau\right) = \int_{\tau}^{\infty} \frac{1}{\pi} \cdot \frac{n\overline{|s|}}{t^2 + \left(n\overline{|s|}\right)^2}\, dt
= \frac{1}{2} - \frac{1}{\pi} \tan^{-1}\!\left( \frac{\tau}{n\overline{|s|}} \right), \qquad (28)
$$

since $T(Y)$ is continuous (so the boundary has probability zero).
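The closed form (28) can be checked against a direct simulation of the correlator under $H_0$; standard Cauchy variates can be generated as $\tan\left(\pi\left(U - \tfrac12\right)\right)$ with $U$ uniform on $(0,1)$. A sketch, with an arbitrary illustrative signal vector:

```python
import math
import random

def pf_closed_form(tau, s):
    """Eq. (28): false-alarm probability of the correlator sum_k s_k*Y_k in i.i.d.
    Cauchy noise; the scale of T(Y) under H0 is n*|s-bar| = sum_k |s_k|."""
    scale = sum(abs(sk) for sk in s)
    return 0.5 - math.atan(tau / scale) / math.pi

def pf_monte_carlo(tau, s, n_trials=200_000, seed=7):
    """Monte Carlo estimate of P0(T(Y) >= tau) for the same correlator."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        # Under H0, Y_k = N_k with N_k standard Cauchy.
        t = sum(sk * math.tan(math.pi * (rng.random() - 0.5)) for sk in s)
        hits += t >= tau
    return hits / n_trials

s = [1.0, -2.0, 0.5, 1.5]  # illustrative known signal, not from the notes
pf_exact = pf_closed_form(3.0, s)
pf_mc = pf_monte_carlo(3.0, s)
```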
#### Under $H_1$: chf of the Decision Statistic of the Correlation Detector in Cauchy Noise

- Similarly, from (10), (20) and (17),

$$
\phi_{g_k,1}(u) = E_1\left[ e^{iug_k(Y_k)} \right] = E_1\left[ e^{ius_kY_k} \right] = E\left[ e^{ius_k(s_k+N_k)} \right]
= e^{ius_k^2}\, E\left[ e^{ius_kN_k} \right] = e^{ius_k^2}\, \phi_{N_k}(us_k) = e^{ius_k^2}\, e^{-|us_k|}. \qquad (29)
$$

- Hence, from (11),

$$
\phi_{T_1}(u) = \prod_{k=1}^{n} \phi_{g_k,1}(u) = \prod_{k=1}^{n} e^{ius_k^2} e^{-|us_k|}
= e^{iu\sum_{k=1}^{n} s_k^2}\, e^{-|u|\sum_{k=1}^{n} |s_k|} = e^{iun\overline{s^2}}\, e^{-n\overline{|s|}\,|u|}, \qquad (30)
$$

where we have defined

$$
\overline{s^2} \triangleq \frac{1}{n} \sum_{k=1}^{n} s_k^2. \qquad (31)
$$

#### Miss Probability of the Correlation Detector in Cauchy Noise

- Hence, from (13) and (30),

$$
p_{T_1}(t) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{iun\overline{s^2}}\, e^{-n\overline{|s|}\,|u|}\, e^{-iut}\, du
= \frac{1}{\pi} \cdot \frac{n\overline{|s|}}{\left(t - n\overline{s^2}\right)^2 + \left(n\overline{|s|}\right)^2}. \qquad (32)
$$

- Hence

$$
P_M(\tilde\delta_\tau) = P_1\left(T(Y) < \tau\right) = 1 - P_1\left(T(Y) \ge \tau\right) = 1 - \int_{\tau}^{\infty} p_{T_1}(t)\, dt,
$$

which, from (32), gives

$$
P_M(\tilde\delta_\tau) = \frac{1}{2} + \frac{1}{\pi} \tan^{-1}\!\left( \frac{\tau - n\overline{s^2}}{n\overline{|s|}} \right)
= \frac{1}{2} + \frac{1}{\pi} \tan^{-1}\!\left( \frac{\tau}{n\overline{|s|}} - \frac{\overline{s^2}}{\overline{|s|}} \right). \qquad (33)
$$

#### Performance of an $\alpha$-Level Correlation Detector in Cauchy Noise

- If we want a size-$\alpha$ detector, we can choose the threshold from (28), solving $\alpha = \frac{1}{2} - \frac{1}{\pi} \tan^{-1}\!\left(\tau/(n\overline{|s|})\right)$:

$$
\tau_\alpha = n\overline{|s|}\, \tan\!\left( \pi\left( \tfrac{1}{2} - \alpha \right) \right). \qquad (34)
$$

- Then the ROC is given by

$$
P_D = 1 - P_M = \frac{1}{2} - \frac{1}{\pi} \tan^{-1}\!\left( \tan\!\left( \pi\left( \tfrac{1}{2} - \alpha \right) \right) - \frac{\overline{s^2}}{\overline{|s|}} \right). \qquad (35)
$$

#### Remarks on the Performance of the Correlator in Cauchy Noise

1. From (35), observe that the detection performance improves if we increase the average signal power $\overline{s^2}$. In fact, note that

$$
\lim_{\overline{s^2} \to \infty} P_D(\tilde\delta) = 1.
$$

2. But the performance does not seem to depend on the number of samples $n$. This is because the correlation detector is not the optimum detector for detecting a coherent signal in Cauchy noise. If we were to employ that optimum detector, we would find that the performance does improve with increased sample size $n$.
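The ROC (35) depends on the signal only through the ratio $\overline{s^2}/\overline{|s|}$, which makes the two remarks above easy to check numerically. A sketch:

```python
import math

def roc_cauchy_correlator(alpha, ratio):
    """Eq. (35): detection probability of the size-alpha correlation detector in
    i.i.d. Cauchy noise, where ratio = s2_bar / |s|_bar (no dependence on n)."""
    return 0.5 - math.atan(math.tan(math.pi * (0.5 - alpha)) - ratio) / math.pi
```

With `ratio = 0` (no signal) the detector is no better than chance, $P_D = \alpha$; as the average signal power grows, $P_D \to 1$.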
#### Direct Performance Evaluation: Remarks

1. Another example where direct performance evaluation can be done using the above characteristic-function approach is the case of detecting an i.i.d. Gaussian signal in i.i.d. Gaussian noise; we have already worked out this case earlier.
2. However, the characteristic-function approach may not always lead to tractable closed-form expressions for the error probabilities; i.e., (15) and (16) for $P_F$ and $P_M$ may not be solvable in closed form. In those cases, (15) and (16) may usually be used to find approximations for $P_M$ and $P_F$. Also, for moderate error-probability values, we may use numerical integration to evaluate (15) and (16).
3. If either one or both of $P_F$ and $P_M$ are very small ($\lesssim 10^{-5}$, for example), then usually there is a parameter (e.g., the threshold, the SNR, or the number of samples $n$) that is very large. Then the performance can be approximated by using asymptotic expressions for the error probabilities in terms of this parameter.

#### Appendix J: Cumulative Distribution Function (cdf)

$$
F_X(x) = P\left(X \le x\right) = \int_{-\infty}^{x} p_X(x')\, dx'.
$$

Note that a cdf is always right-continuous, i.e.,

$$
\lim_{x \downarrow x_0} F_X(x) = F_X(x_0),
$$

so that

$$
P\left(X = x_0\right) = P\left(X \le x_0\right) - P\left(X < x_0\right) = F_X(x_0) - \lim_{x \uparrow x_0} F_X(x) = F_X(x_0) - F_X(x_0^-).
$$

#### Appendix K: Optimum Detectors for Coherent Detection in Cauchy Noise

- Since $\{Y_k\}_{k=1}^{n}$ are independent under both hypotheses, from (17) and (18),

$$
L(y) = \frac{p_1(y)}{p_0(y)} = \prod_{k=1}^{n} \frac{p_{N_k}(y_k - s_k)}{p_{N_k}(y_k)} = \prod_{k=1}^{n} \frac{1 + y_k^2}{1 + \left(y_k - s_k\right)^2}.
$$

- Hence optimum tests are of the form

$$
\tilde\delta(y) = \begin{cases} 1 & \text{if } \displaystyle\sum_{k=1}^{n} \log\frac{1 + y_k^2}{1 + (y_k - s_k)^2} > \log\tau \\ \gamma & \text{if equal} \\ 0 & \text{otherwise.} \end{cases}
$$

**Next time:** Chernoff Bound for Signal Detectors.
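The optimum log-likelihood-ratio statistic of Appendix K is straightforward to compute; a sketch (the signal values are illustrative):

```python
import math

def cauchy_llr(y, s):
    """Appendix K: log-likelihood ratio for the coherent signal s in i.i.d. Cauchy
    noise, log L(y) = sum_k log((1 + y_k^2) / (1 + (y_k - s_k)^2))."""
    return sum(math.log((1 + yk * yk) / (1 + (yk - sk) ** 2)) for yk, sk in zip(y, s))

s = [1.0, 0.5, -1.5]           # illustrative signal
on_signal = cauchy_llr(s, s)   # observation equal to the signal: favors H1
off_signal = cauchy_llr([-v for v in s], s)  # observation opposing the signal
```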
### ECE642 Detection and Estimation Theory: Lecture 10 (September 25th, Tuesday), Fall 2007

Dr. Sudharman K. Jayaweera, Assistant Professor, Department of Electrical and Computer Engineering, University of New Mexico

#### Detection of Coherent Signals with Unknown Amplitude in i.i.d. Noise

- In many detection problems we know the form of the received signal, except for its amplitude.
- We can use the following composite hypothesis-testing problem to model this situation:

$$
H_0: Y_k = N_k, \quad k = 1, \ldots, n \qquad \text{versus} \qquad H_1: Y_k = \theta s_k + N_k, \quad k = 1, \ldots, n, \quad \theta > 0. \qquad (1)
$$

- Here $\theta > 0$ is a signal-strength parameter.

#### Detection of Coherent Signals with Unknown Amplitude: Vector Formulation

- In vector notation,

$$
H_0: Y = N \qquad \text{versus} \qquad H_1: Y = \theta s + N, \quad \theta > 0, \qquad (2)
$$

where $s = (s_1, \ldots, s_n)^T$ is known and $N = (N_1, \ldots, N_n)^T$ is a continuous random noise vector with i.i.d. components and common marginal probability density function $p_N$.

#### Coherent Signals with Unknown Amplitude in i.i.d. Noise: UMP Detection

- Since the noise components are i.i.d., given $\theta$ we can write the likelihood ratio between $H_0$ and $H_1$ as

$$
L_\theta(y) = \prod_{k=1}^{n} \frac{p_N\left(y_k - \theta s_k\right)}{p_N(y_k)}. \qquad (3)
$$

- Recall that model (1) is in fact a composite hypothesis-testing problem. From previous discussions we know that, in general, the critical region $\Gamma_\theta = \left\{ y \in \mathbb{R}^n : L_\theta(y) > \tau \right\}$ will depend on $\theta$, except in some special cases. Hence UMP tests for model (2) exist only for some special noise models (e.g., the Gaussian-noise case considered in Example A.1).
#### Locally Optimum (Locally Most Powerful) Detection of Coherent Signals with Unknown Amplitude

- But, of course, we can easily find LMP tests for model (2).
- These LMP tests have a simple and intuitively reasonable structure that makes their investigation interesting.
- Assuming sufficient regularity, the locally optimum test for $H_0$ versus $H_1$ is

$$
\tilde\delta_{lo}(y) = \begin{cases} 1 & \text{if } \left.\dfrac{\partial L_\theta(y)}{\partial\theta}\right|_{\theta=0} > \tau \\[2mm] \gamma & \text{if } \left.\dfrac{\partial L_\theta(y)}{\partial\theta}\right|_{\theta=0} = \tau \\[2mm] 0 & \text{if } \left.\dfrac{\partial L_\theta(y)}{\partial\theta}\right|_{\theta=0} < \tau. \end{cases} \qquad (4)
$$

#### Likelihood Ratio for Locally Optimum Detection of Coherent Signals in i.i.d. Noise

- Differentiating (3),

$$
\frac{\partial L_\theta(y)}{\partial\theta} = \sum_{j=1}^{n} \frac{-s_j\, \dot p_{N_j}\left(y_j - \theta s_j\right)}{p_{N_j}(y_j)} \prod_{k \ne j} \frac{p_{N_k}\left(y_k - \theta s_k\right)}{p_{N_k}(y_k)},
$$

so that

$$
\left.\frac{\partial L_\theta(y)}{\partial\theta}\right|_{\theta=0} = \sum_{j=1}^{n} s_j \left( \frac{-\dot p_{N_j}(y_j)}{p_{N_j}(y_j)} \right), \qquad (5)
$$

where

$$
\dot p_N(x) \triangleq \frac{d\, p_N(x)}{dx}. \qquad (6)
$$

#### Locally Optimum Detection of Coherent Signals in i.i.d. Noise (ctd.)

- Define the function

$$
g_{lo}(x) \triangleq \frac{-\dot p_{N_1}(x)}{p_{N_1}(x)}. \qquad (7)
$$

- Since we have assumed that the noise samples are i.i.d.,

$$
p_{N_j} = p_{N_1}, \qquad j = 1, 2, \ldots, n. \qquad (8)
$$

- Hence, from (7) and (8), we can write (5) as

$$
\left.\frac{\partial L_\theta(y)}{\partial\theta}\right|_{\theta=0} = \sum_{j=1}^{n} s_j\, g_{lo}(y_j). \qquad (9)
$$

#### Locally Optimal Detector for Coherent Signals in i.i.d. Noise

- Hence, from (4) and (9), the LMP test for $H_0$ versus $H_1$ is

$$
\tilde\delta_{lo}(y) = \begin{cases} 1 & \text{if } \sum_{k=1}^{n} s_k\, g_{lo}(y_k) > \tau \\ \gamma & \text{if } \sum_{k=1}^{n} s_k\, g_{lo}(y_k) = \tau \\ 0 & \text{if } \sum_{k=1}^{n} s_k\, g_{lo}(y_k) < \tau. \end{cases} \qquad (10)
$$

- *Figure 1: Locally optimum detector structure for coherent signals in i.i.d. noise (the observations $y_k$ pass through the memoryless nonlinearity $g_{lo}(\cdot)$ and are then correlated with $s_k$).* Since this structure has a memoryless nonlinearity followed by a correlator, it is called a nonlinear correlator. Just like the likelihood ratio, the locally optimum nonlinearity $g_{lo}(\cdot)$ shapes the observations $y_k$ so as to reduce the detrimental effects of the noise as much as possible.

#### Example A.3: Locally Optimum Detection of Coherent Signals in i.i.d. Gaussian Noise

- Suppose the noise samples are all $\mathcal{N}\left(0, \sigma^2\right)$. Then, for $k = 1, \ldots, n$,

$$
p_{N_1}(y_k) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-y_k^2/2\sigma^2}. \qquad (11)
$$

- Hence, from (7) and (11),

$$
g_{lo}(y_k) = \frac{y_k}{\sigma^2}. \qquad (12)
$$

- Hence, in this case, $g_{lo}(y_k)$ is in fact a linear function of the observation $y_k$.
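The Gaussian result (12) can be verified numerically by evaluating $g_{lo}(x) = -\dot p_N(x)/p_N(x)$ with a finite-difference derivative. A sketch (the value $\sigma^2 = 2$ is an arbitrary illustrative choice):

```python
import math

def g_lo_numeric(p, x, h=1e-6):
    """Locally optimum nonlinearity g_lo(x) = -p'(x)/p(x), with the derivative
    approximated by a central difference."""
    dp = (p(x + h) - p(x - h)) / (2 * h)
    return -dp / p(x)

sigma2 = 2.0
gauss_pdf = lambda x: math.exp(-x * x / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)
# For Gaussian noise, g_lo should reduce to the linear function x / sigma^2.
```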
#### Example A.3: Locally Optimum Detection of Coherent Signals in i.i.d. Gaussian Noise (ctd.)

- Hence, in this special case, Figure 1 is in fact the same correlator that we obtained in Example A.1.
- But, of course, from the discussion on UMP tests we know that in the Gaussian-noise case that correlator is in fact UMP for $H_0$ versus $H_1$.
- Since it is UMP, it is of course also LMP.

#### Example A.4: Locally Optimum Detection of Coherent Signals in i.i.d. Laplacian Noise

- Suppose now that the noise samples are i.i.d. Laplacian:

$$
p_{N_1}(y_k) = \frac{\alpha}{2}\, e^{-\alpha|y_k|}, \qquad y_k \in \mathbb{R}, \quad \alpha > 0. \qquad (13)
$$

- Then

$$
\dot p_{N_1}(y_k) = \frac{\partial}{\partial y_k}\left( \frac{\alpha}{2}\, e^{-\alpha|y_k|} \right) = -\frac{\alpha^2}{2}\, e^{-\alpha|y_k|}\, \mathrm{sgn}(y_k) = -\alpha\, p_{N_1}(y_k)\, \mathrm{sgn}(y_k). \qquad (14)
$$

- Hence, from (7) and (14),

$$
g_{lo}(y_k) = \alpha\, \mathrm{sgn}(y_k). \qquad (15)
$$

- Thus the LMP test for $H_0$ versus $H_1$ in Laplacian noise, from (10), is

$$
\tilde\delta_{lo}(y) = \begin{cases} 1 & \text{if } \sum_{k=1}^{n} \alpha\, s_k\, \mathrm{sgn}(y_k) > \tau \\ \gamma & \text{if equal} \\ 0 & \text{otherwise.} \end{cases} \qquad (16)
$$

- *Figure 2: LMP detector for coherent signals in i.i.d. Laplacian noise.* Thus, in this case, the LMP test correlates a scaled version of the signal with the sequence of signs of the observations. The function $g_{lo}$ as given by (15) is called a hard limiter. Compare this with the soft limiter we encountered in Example A.2 as the optimal detector for the same Laplacian noise when the hypotheses are simple.
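The hard-limiting LMP statistic (16) depends on the observations only through their signs, which is easy to confirm in code. A sketch (the sample values are illustrative):

```python
def lmp_laplacian_statistic(y, s, alpha):
    """Eq. (16): LMP statistic for a coherent signal in i.i.d. Laplacian noise;
    the signal is correlated with the signs of the observations (hard limiter)."""
    sgn = lambda v: (v > 0) - (v < 0)
    return sum(alpha * sk * sgn(yk) for yk, sk in zip(y, s))
```

Observations with the same sign pattern give identical statistics, regardless of their magnitudes.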
#### Example A.5: Locally Optimum Detection of Coherent Signals in i.i.d. Cauchy Noise

- A noise model that has an even heavier tail than the Laplacian noise is Cauchy noise.
- Cauchy noise has the pdf

$$
p_{N_1}(y_k) = \frac{1}{\pi\left(1 + y_k^2\right)}, \qquad y_k \in \mathbb{R}. \qquad (17)
$$

- Then

$$
\dot p_{N_1}(y_k) = \frac{\partial}{\partial y_k}\, \frac{1}{\pi\left(1+y_k^2\right)} = \frac{-2y_k}{\pi\left(1+y_k^2\right)^2} = -\frac{2y_k}{1+y_k^2}\, p_{N_1}(y_k). \qquad (18)
$$

- Hence, from (7) and (18),

$$
g_{lo}(y_k) = \frac{2y_k}{1 + y_k^2}.
$$

- Then the LMP detector for Cauchy noise, from (10), is

$$
\tilde\delta_{lo}(y) = \begin{cases} 1 & \text{if } \displaystyle\sum_{k=1}^{n} \frac{2y_k}{1+y_k^2}\, s_k > \tau \\ \gamma & \text{if equal} \\ 0 & \text{otherwise.} \end{cases} \qquad (19)
$$

- *Figure 3: LMP detector for coherent signals in i.i.d. Cauchy noise.*

#### Example A.5: Locally Optimum Detection of Coherent Signals in i.i.d. Cauchy Noise (ctd.)

- Note that the function $g_{lo}(y_k)$ in this case is approximately linear near $y_k = 0$.
- Hence, for small values of the observations, the detector approximately correlates the observation $y_k$ with the signal $s_k$.
- However, for observations with large magnitudes, the function $g_{lo}(y_k)$ asymptotically goes to zero. Hence the detector makes the contribution of large observations to the accumulated sum very small, because there is a high probability that those large values are due to the Cauchy-distributed noise.

**Next time:** Detection of Deterministic Signals in Correlated Gaussian Noise (Section III.B).
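The three locally optimum nonlinearities derived in Examples A.3 through A.5 can be compared side by side; a sketch (the parameter values are illustrative):

```python
def g_lo_gaussian(y, sigma2=1.0):
    """Eq. (12): linear, grows without bound."""
    return y / sigma2

def g_lo_laplacian(y, alpha=1.0):
    """Eq. (15): hard limiter, keeps only the sign."""
    return alpha * ((y > 0) - (y < 0))

def g_lo_cauchy(y):
    """Eq. (19): approximately linear near 0, redescends toward 0 for large |y|."""
    return 2 * y / (1 + y * y)
```

The heavier the noise tail, the more aggressively the locally optimum nonlinearity discounts large observations.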
### ECE642 Detection and Estimation Theory: Lecture 05 (September 4th, Tuesday), Fall 2007

Dr. Sudharman K. Jayaweera, Assistant Professor, Department of Electrical and Computer Engineering, University of New Mexico

**Just a reminder:** no classes next week; the next class is on September 11th.

#### Examples of Neyman-Pearson Tests: Example 1, Location Testing with Gaussian Error

- Recall the location-testing problem with Gaussian errors introduced in Example 1 of Lecture 3:

$$
H_0: Y \sim \mathcal{N}\left(\mu_0, \sigma^2\right) \qquad \text{versus} \qquad H_1: Y \sim \mathcal{N}\left(\mu_1, \sigma^2\right), \quad \mu_1 > \mu_0. \qquad (1)
$$

- Hence the false-alarm probability is

$$
P_F = P_0\left( p_1(Y) > \eta\, p_0(Y) \right). \qquad (2)
$$

#### Example 1: Location Testing with Gaussian Error (ctd.)

- Recall from equation (4) of Lecture 3 that

$$
L(y) = \frac{p_1(y)}{p_0(y)} = \exp\!\left( \frac{\mu_1 - \mu_0}{\sigma^2} \left( y - \frac{\mu_0 + \mu_1}{2} \right) \right). \qquad (3)
$$

- Hence (2) can be written as

$$
P_0\left( p_1(Y) > \eta\, p_0(Y) \right) = P_0\left( Y > \eta' \right), \qquad (4)
$$

where

$$
\eta' = \frac{\sigma^2}{\mu_1 - \mu_0}\, \log\eta + \frac{\mu_0 + \mu_1}{2}. \qquad (5)
$$

- Since $Y \sim \mathcal{N}\left(\mu_0, \sigma^2\right)$ under $H_0$,

$$
P_0\left( Y > \eta' \right) = Q\!\left( \frac{\eta' - \mu_0}{\sigma} \right). \qquad (6)
$$

- Hence, from (4) and (6),

$$
P_0\left( p_1(Y) > \eta\, p_0(Y) \right) = Q\!\left( \frac{\eta' - \mu_0}{\sigma} \right). \qquad (7)
$$

- *Figure 1: Computation of the Neyman-Pearson test threshold for location testing with Gaussian error.* Note that $Q(x) = 1 - \Phi(x)$.

#### Example 1: Neyman-Pearson Threshold

- Since $P_0\left(p_1(Y) > \eta\, p_0(Y)\right)$ is a continuous function of $\eta$, as can be seen from (7) and the previous figure, any value of $\alpha$ can be achieved exactly by setting $P_0\left(p_1(Y) > \eta_0\, p_0(Y)\right) = \alpha$, i.e.,

$$
Q\!\left( \frac{\eta_0' - \mu_0}{\sigma} \right) = \alpha \;\Longrightarrow\; \eta_0' = \sigma\, Q^{-1}(\alpha) + \mu_0, \qquad (8)
$$

where $Q^{-1}$ is the inverse of the $Q$ function.

#### Example 1: Neyman-Pearson Optimal Decision Rule

- Note that since $P\left(Y = \eta_0'\right) = 0$ in this case, the randomization can be chosen arbitrarily, say $\gamma_0 = 1$.
- Thus an $\alpha$-level Neyman-Pearson test for this problem is given by

$$
\tilde\delta_{NP}(y) = \begin{cases} 1 & \text{if } y \ge \eta_0' \\ 0 & \text{if } y < \eta_0'. \end{cases} \qquad (9)
$$

- The detection probability of the test $\tilde\delta_{NP}$ is

$$
P_D(\tilde\delta_{NP}) = E_1\left[ \tilde\delta_{NP}(Y) \right] = P_1\left( Y \ge \eta_0' \right). \qquad (10)
$$
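The threshold formula (8) can be evaluated with the standard-normal quantile function; Python's `statistics.NormalDist` provides both `cdf` and `inv_cdf`. A sketch (the values of $\mu_0$, $\sigma$ and $\alpha$ are illustrative):

```python
from statistics import NormalDist

def np_threshold(alpha, mu0, sigma):
    """Eq. (8): eta'_0 = sigma * Q^{-1}(alpha) + mu0, achieving size alpha,
    using Q^{-1}(alpha) = Phi^{-1}(1 - alpha)."""
    return sigma * NormalDist().inv_cdf(1 - alpha) + mu0

mu0, sigma, alpha = 0.0, 1.5, 0.05  # illustrative values
eta = np_threshold(alpha, mu0, sigma)
# size check: P0(Y >= eta) with Y ~ N(mu0, sigma^2) should equal alpha
size = 1 - NormalDist(mu0, sigma).cdf(eta)
```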
- Since $Y \sim \mathcal{N}\left(\mu_1, \sigma^2\right)$ under $H_1$,

$$
P_D(\tilde\delta_{NP}) = Q\!\left( \frac{\eta_0' - \mu_1}{\sigma} \right) = Q\!\left( Q^{-1}(\alpha) - \frac{\mu_1 - \mu_0}{\sigma} \right).
$$

#### Example 1: Location Testing with Gaussian Error (ctd.)

- Let us define the parameter $d$, related to the SNR, as in equation (15) of Lecture 3:

$$
d \triangleq \frac{\mu_1 - \mu_0}{\sigma}. \qquad (11)
$$

- Then we can rewrite the above detection probability as

$$
P_D(\tilde\delta_{NP}) = Q\!\left( Q^{-1}(\alpha) - d \right). \qquad (12)
$$

#### Power Function of a Test

- For a given $\alpha$, (12) gives the detection probability of the test (9) as a function of $d$. Viewed like this, the relationship (12) is known as the power function of the test (since $P_D$ is the power of the test).
- *Figure 2: Power function $P_D(\tilde\delta_{NP}) = Q\left(Q^{-1}(\alpha) - d\right)$ for Neyman-Pearson testing in the Gaussian location-testing problem.*

#### ROC Curve of a Test

- Also, (12) can be seen as giving the detection probability as a function of the false-alarm probability $\alpha$ for a fixed SNR value $d$. A parametric plot of this relationship is called the receiver operating characteristic (ROC).
- *Figure 3: Receiver operating characteristics for Neyman-Pearson testing in the Gaussian location-testing problem.*
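The power function (12) is a one-liner given a normal cdf and quantile function; a sketch using `statistics.NormalDist`:

```python
from statistics import NormalDist

def power_function(alpha, d):
    """Eq. (12): P_D = Q(Q^{-1}(alpha) - d) for the alpha-level NP test of the
    Gaussian location-testing problem, with d = (mu1 - mu0) / sigma."""
    Q = lambda x: 1 - NormalDist().cdf(x)
    Q_inv = lambda a: NormalDist().inv_cdf(1 - a)
    return Q(Q_inv(alpha) - d)
```

With $d = 0$ the test is no better than chance ($P_D = \alpha$), and $P_D$ increases monotonically in $d$, as Figure 2 indicates.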
#### Example 2: Neyman-Pearson Testing for the Binary Channel

- Let us revisit the binary channel problem, with Neyman-Pearson optimality as the design criterion.
- *Figure 4: Binary channel.*
- This example will illustrate when we need randomization in the Neyman-Pearson test.
- Recall that the likelihood ratio for this problem is given by

$$
L(y) = \begin{cases} \dfrac{\lambda_1}{1 - \lambda_0} & \text{if } y = 0 \\[2mm] \dfrac{1 - \lambda_1}{\lambda_0} & \text{if } y = 1. \end{cases} \qquad (13)
$$

- Suppose that we want an $\alpha$-level Neyman-Pearson test. First we need to find the required threshold, and for this we need to consider the probability $P_0\left(L(Y) > \eta\right)$.

#### Example 2: An Assumption on the Binary Channel

- For simplicity, let us assume that

$$
\lambda_0 + \lambda_1 < 1. \qquad (14)
$$

- Then we have

$$
\lambda_1 < 1 - \lambda_0 \;\Longrightarrow\; \frac{\lambda_1}{1 - \lambda_0} < 1, \qquad (15)
$$

and

$$
\lambda_0 < 1 - \lambda_1 \;\Longrightarrow\; \frac{1 - \lambda_1}{\lambda_0} > 1. \qquad (16)
$$

#### Example 2: Binary Channel (ctd.)

- From (15) and (16),

$$
\frac{\lambda_1}{1 - \lambda_0} < 1 < \frac{1 - \lambda_1}{\lambda_0}. \qquad (17)
$$

- Then

$$
P_0\left( L(Y) > \eta \right) = \begin{cases} 1 & \text{if } \eta < \dfrac{\lambda_1}{1-\lambda_0} \\[2mm] \lambda_0 & \text{if } \dfrac{\lambda_1}{1-\lambda_0} \le \eta < \dfrac{1-\lambda_1}{\lambda_0} \\[2mm] 0 & \text{if } \eta \ge \dfrac{1-\lambda_1}{\lambda_0}. \end{cases} \qquad (18)
$$

- *Figure 5: A plot of $P_0\left(L(Y) > \eta\right)$ for the binary channel.*

#### Example 2: Neyman-Pearson Threshold for the Binary Channel

- By inspection of this plot, the desired threshold $\eta_0$ for an $\alpha$-level test can be obtained as (recall that it is the smallest $\eta$ for which $P_0\left(L(Y) > \eta\right) \le \alpha$):

$$
\eta_0 = \begin{cases} 0 & \text{if } \alpha = 1 \\[1mm] \dfrac{\lambda_1}{1-\lambda_0} & \text{if } \lambda_0 \le \alpha < 1 \\[2mm] \dfrac{1-\lambda_1}{\lambda_0} & \text{if } 0 \le \alpha < \lambda_0. \end{cases} \qquad (19)
$$

#### Example 2: Randomization for Neyman-Pearson Testing of the Binary Channel

- Also by inspection, the randomization constant is given by (recall that $\gamma_0 = \dfrac{\alpha - P_0\left(L(Y) > \eta_0\right)}{P_0\left(L(Y) = \eta_0\right)}$):

$$
\gamma_0 = \begin{cases} \text{arbitrary} & \text{if } \alpha = 1 \\[1mm] \dfrac{\alpha - \lambda_0}{1 - \lambda_0} & \text{if } \lambda_0 \le \alpha < 1 \\[2mm] \dfrac{\alpha}{\lambda_0} & \text{if } 0 \le \alpha < \lambda_0, \end{cases} \qquad (20)
$$

since $P_0\left(L(Y) = \eta_0\right)$ equals $P_0(Y = 0) = 1 - \lambda_0$ in the middle case and $P_0(Y = 1) = \lambda_0$ in the last case.
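Equations (19) and (20) can be packaged as a small helper; a sketch (assuming, as in (14), that $\lambda_0 + \lambda_1 < 1$):

```python
def np_threshold_randomization(alpha, lam0, lam1):
    """Eqs. (19)-(20): threshold eta0 and randomization gamma0 for the binary
    channel with crossover probabilities lam0, lam1, assuming lam0 + lam1 < 1."""
    if alpha == 1:
        return 0.0, 1.0                      # gamma0 is arbitrary; choose 1
    if alpha >= lam0:                        # lam0 <= alpha < 1
        return lam1 / (1 - lam0), (alpha - lam0) / (1 - lam0)
    return (1 - lam1) / lam0, alpha / lam0   # 0 <= alpha < lam0
```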
#### Example 2: Neyman-Pearson Testing for the Binary Channel, if $\alpha = 1$

- Now, using (13), (19) and (20), we construct the resulting Neyman-Pearson test.
- (i) If $\alpha = 1$, then from (19) $\eta_0 = 0$, so

$$
\tilde\delta_{NP}(y) = \begin{cases} 1 & \text{if } L(y) > 0 \\ \gamma_0 & \text{if } L(y) = 0 \\ 0 & \text{if } L(y) < 0. \end{cases} \qquad (21)
$$

- But since $\gamma_0$ is arbitrary when $\alpha = 1$, from (20) we may choose $\gamma_0 = 1$ in (21), so that the Neyman-Pearson test simplifies to

$$
\tilde\delta_{NP}(y) = \begin{cases} 1 & \text{if } L(y) \ge 0 \\ 0 & \text{if } L(y) < 0. \end{cases} \qquad (22)
$$

- Using (13): if $y = 0$, then $L(y) = \dfrac{\lambda_1}{1-\lambda_0} \ge 0$ by (14); hence, from (22),

$$
\tilde\delta_{NP}(0) = 1. \qquad (23)
$$

- If $y = 1$, then $L(y) = \dfrac{1-\lambda_1}{\lambda_0} > 0$ by (14); hence, from (22),

$$
\tilde\delta_{NP}(1) = 1. \qquad (24)
$$

- Thus, from (23) and (24),

$$
\tilde\delta_{NP}(y) = 1 \quad \forall y. \qquad (25)
$$

#### Neyman-Pearson Testing for the Binary Channel, if $\lambda_0 \le \alpha < 1$

- (ii) If $\lambda_0 \le \alpha < 1$, then from (19)

$$
\eta_0 = \frac{\lambda_1}{1 - \lambda_0}. \qquad (26)
$$

- Hence

$$
\tilde\delta_{NP}(y) = \begin{cases} 1 & \text{if } L(y) > \frac{\lambda_1}{1-\lambda_0} \\ \gamma_0 & \text{if } L(y) = \frac{\lambda_1}{1-\lambda_0} \\ 0 & \text{if } L(y) < \frac{\lambda_1}{1-\lambda_0}, \end{cases} \qquad (27)
$$

where, from (20), $\gamma_0 = \dfrac{\alpha - \lambda_0}{1 - \lambda_0}$ in this case.

- If $y = 0$: from (13),

$$
L(y) = \frac{\lambda_1}{1-\lambda_0} = \eta_0, \qquad (28)
$$

and hence, from (27) and (28),

$$
\tilde\delta_{NP}(0) = \gamma_0 = \frac{\alpha - \lambda_0}{1 - \lambda_0}. \qquad (29)
$$

- If $y = 1$: from (13),

$$
L(y) = \frac{1 - \lambda_1}{\lambda_0}, \qquad (30)
$$

but from (17),

$$
\frac{1-\lambda_1}{\lambda_0} > \frac{\lambda_1}{1-\lambda_0} = \eta_0. \qquad (31)
$$

Hence, from (27), (30) and (31),

$$
\tilde\delta_{NP}(1) = 1. \qquad (32)
$$

- Combining (29) and (32),

$$
\tilde\delta_{NP}(y) = \begin{cases} \dfrac{\alpha - \lambda_0}{1 - \lambda_0} & \text{if } y = 0 \\[2mm] 1 & \text{if } y = 1. \end{cases} \qquad (33)
$$

#### Neyman-Pearson Testing for the Binary Channel, if $0 \le \alpha < \lambda_0$

- (iii) If $0 \le \alpha < \lambda_0$, then from (19)

$$
\eta_0 = \frac{1 - \lambda_1}{\lambda_0}, \qquad (34)
$$

and from (20),

$$
\gamma_0 = \frac{\alpha}{\lambda_0}. \qquad (35)
$$

- Hence, from (34) and (35),

$$
\tilde\delta_{NP}(y) = \begin{cases} 1 & \text{if } L(y) > \eta_0 \\ \dfrac{\alpha}{\lambda_0} & \text{if } L(y) = \eta_0 \\ 0 & \text{if } L(y) < \eta_0. \end{cases} \qquad (36)
$$

- If $y = 0$: from (13),

$$
L(y) = \frac{\lambda_1}{1-\lambda_0}. \qquad (37)
$$

But from (17), $\dfrac{\lambda_1}{1-\lambda_0} < \dfrac{1-\lambda_1}{\lambda_0} = \eta_0$; using this in (37),

$$
L(y) < \eta_0. \qquad (38)
$$

Hence, from (38) and (36),

$$
\tilde\delta_{NP}(0) = 0. \qquad (39)
$$
- If $y = 1$: from (13),

$$
L(y) = \frac{1 - \lambda_1}{\lambda_0} = \eta_0. \qquad (40)
$$

- Hence, from (36),

$$
\tilde\delta_{NP}(1) = \gamma_0 = \frac{\alpha}{\lambda_0}. \qquad (41)
$$

- Combining (39) and (41),

$$
\tilde\delta_{NP}(y) = \begin{cases} 0 & \text{if } y = 0 \\ \dfrac{\alpha}{\lambda_0} & \text{if } y = 1. \end{cases} \qquad (42)
$$

#### Summary of Neyman-Pearson Testing for the Binary Channel

- Hence, from (25), (33) and (42), the Neyman-Pearson test for the binary channel is given as follows.
- If $\alpha = 1$:

$$
\tilde\delta_{NP}(y) = 1 \quad \forall y. \qquad (43)
$$

- If $\lambda_0 \le \alpha < 1$:

$$
\tilde\delta_{NP}(y) = \begin{cases} \dfrac{\alpha - \lambda_0}{1 - \lambda_0} & \text{if } y = 0 \\[2mm] 1 & \text{if } y = 1. \end{cases} \qquad (44)
$$

- If $0 \le \alpha < \lambda_0$:

$$
\tilde\delta_{NP}(y) = \begin{cases} 0 & \text{if } y = 0 \\ \dfrac{\alpha}{\lambda_0} & \text{if } y = 1. \end{cases} \qquad (45)
$$

- Combining (43) and (44), we can write that if $\lambda_0 \le \alpha \le 1$,

$$
\tilde\delta_{NP}(y) = \begin{cases} \dfrac{\alpha - \lambda_0}{1 - \lambda_0} & \text{if } y = 0 \\[2mm] 1 & \text{if } y = 1. \end{cases}
$$

#### Detection Probability of the Neyman-Pearson Test for the Binary Channel

- We may also compute the detection probability, or the power, of the Neyman-Pearson test for the binary channel, as below:

$$
P_D(\tilde\delta_{NP}) = P_1\left( L(Y) > \eta_0 \right) + \gamma_0\, P_1\left( L(Y) = \eta_0 \right). \qquad (46)
$$

- (i) If $\alpha = 1$: from (25), $\tilde\delta_{NP}(y) = 1$ for all $y$. Hence

$$
P_D(\tilde\delta_{NP}) = 1. \qquad (47)
$$

- (ii) If $\lambda_0 \le \alpha < 1$: from (19) and (13),

$$
P_1\left( L(Y) > \eta_0 \right) = P_1\!\left( L(Y) > \tfrac{\lambda_1}{1-\lambda_0} \right) = P_1(Y = 1) = 1 - \lambda_1, \qquad (48)
$$

and

$$
P_1\left( L(Y) = \eta_0 \right) = P_1(Y = 0) = \lambda_1. \qquad (49)
$$

- Then, using (48) and (49) in (46), and using (20), we have

$$
P_D(\tilde\delta_{NP}) = 1 - \lambda_1 + \lambda_1\, \frac{\alpha - \lambda_0}{1 - \lambda_0}. \qquad (50)
$$

- (iii) If $0 \le \alpha < \lambda_0$: from (19), (13) and (17),

$$
P_1\left( L(Y) > \eta_0 \right) = 0, \qquad (51)
$$

and

$$
P_1\left( L(Y) = \eta_0 \right) = P_1(Y = 1) = 1 - \lambda_1. \qquad (52)
$$

- Using (51) and (52) in (46), with $\gamma_0$ from (20),

$$
P_D(\tilde\delta_{NP}) = \frac{\alpha}{\lambda_0}\left( 1 - \lambda_1 \right). \qquad (53)
$$
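Since the observation is binary, the size and power of the randomized rule can be computed exactly (no simulation needed) by averaging $\tilde\delta_{NP}$ over the two channel outputs. A sketch, checking that the size equals $\alpha$ and the power matches (50) and (53) for illustrative crossover probabilities:

```python
def np_rule(alpha, lam0, lam1):
    """Randomized NP rule for the binary channel: returns (delta(0), delta(1)),
    the probabilities of deciding H1 upon observing y = 0 and y = 1."""
    if alpha >= lam0:                                # covers lam0 <= alpha <= 1
        return (alpha - lam0) / (1 - lam0), 1.0
    return 0.0, alpha / lam0                         # 0 <= alpha < lam0

def size_and_power(alpha, lam0, lam1):
    d0, d1 = np_rule(alpha, lam0, lam1)
    pf = d0 * (1 - lam0) + d1 * lam0    # E0[delta]: P0(Y=0)=1-lam0, P0(Y=1)=lam0
    pd = d0 * lam1 + d1 * (1 - lam1)    # E1[delta]: P1(Y=0)=lam1,   P1(Y=1)=1-lam1
    return pf, pd
```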
#### Detection Probability of the Neyman-Pearson Test for the Binary Channel (ctd.)

- Combining (47) and (50): if $\lambda_0 \le \alpha \le 1$,

$$
P_D(\tilde\delta_{NP}) = 1 - \lambda_1 + \lambda_1\, \frac{\alpha - \lambda_0}{1 - \lambda_0}. \qquad (54)
$$

- Hence, from (53) and (54), the detection probability of the Neyman-Pearson test is

$$
P_D(\tilde\delta_{NP}) = \begin{cases} 1 - \lambda_1 + \lambda_1\, \dfrac{\alpha - \lambda_0}{1 - \lambda_0} & \text{if } \lambda_0 \le \alpha \le 1 \\[2mm] \dfrac{\alpha}{\lambda_0}\left( 1 - \lambda_1 \right) & \text{if } 0 \le \alpha \le \lambda_0. \end{cases} \qquad (55)
$$

#### Receiver Operating Characteristics for the Binary Channel

- Note from (55) that the ROCs (i.e., the $P_D$ vs. $P_F$ plots) are piecewise linear, with a change in slope at $P_F = \lambda_0$:
  - for $P_F < \lambda_0$, the slope is $\dfrac{1-\lambda_1}{\lambda_0} > 1$ (from (16));
  - for $P_F \ge \lambda_0$, the slope is $\dfrac{\lambda_1}{1-\lambda_0} < 1$ (from (15)).
- For $\lambda_0 = \lambda_1 \triangleq \lambda$ (a binary symmetric channel), (55) simplifies to

$$
P_D(\tilde\delta_{NP}) = \begin{cases} 1 - \lambda + \lambda\, \dfrac{\alpha - \lambda}{1 - \lambda} & \text{if } \lambda \le \alpha \le 1 \\[2mm] \dfrac{\alpha}{\lambda}\left( 1 - \lambda \right) & \text{if } 0 \le \alpha \le \lambda. \end{cases}
$$

- *Figure 6: Receiver operating characteristics for a binary symmetric channel (BSC).*

**Next time:** Minimax Hypothesis Testing (Section II.C).

### ECE642 Detection and Estimation Theory: Lecture 14 (October 09th, Tuesday), Fall 2007

Dr. Sudharman K. Jayaweera, Assistant Professor, Department of Electrical and Computer Engineering, University of New Mexico

#### Detection of Gaussian Signals in Gaussian Noise: Independent But Nonidentical Signal Samples

- Consider the detection of Gaussian signals in Gaussian noise, now with nonzero signal means:

$$
H_0: Y = N \qquad \text{versus} \qquad H_1: Y = S + N, \qquad (1)
$$

where $N \sim \mathcal{N}\left(0, \sigma^2 I\right)$ and $S \sim \mathcal{N}\left(\mu, \Sigma_S\right)$ with $\Sigma_S = \mathrm{diag}\!\left( \sigma_{S_1}^2, \ldots, \sigma_{S_n}^2 \right)$.

- This is the case in which the noise samples are i.i.d. $\mathcal{N}\left(0, \sigma^2\right)$ and the signal samples are independent $\mathcal{N}\left(\mu_k, \sigma_{S_k}^2\right)$.
Detection of Gaussian Signals in Gaussian Noise: Independent But Non-identical Signal Samples (ctd.)

• Similar to (1.38), for this model we have Σ0 = σ²I and

    Σ_S = diag(σ²_s1, σ²_s2, …, σ²_sn).

Log LR for Detection of Independent But Non-identical Gaussian Signals

• Since the signals and noise are independent,

    Σ1 = Σ_S + σ²I = diag(σ²_s1 + σ², σ²_s2 + σ², …, σ²_sn + σ²).

• Hence we can write the log-likelihood ratio for the problem in (1) as

    log L(y) = log [ p1(y) / p0(y) ]
             = log [ (2π)^{−n/2} |Σ1|^{−1/2} exp(−(y − μ)ᵀ Σ1⁻¹ (y − μ)/2) ]
               − log [ (2π)^{−n/2} |Σ0|^{−1/2} exp(−yᵀ Σ0⁻¹ y/2) ].

Log LR for Detection of Independent But Non-identical Gaussian Signals (ctd.)

• Hence

    log L(y) = Σ_{k=1}^n [ y_k²/(2σ²) − (y_k − μ_k)²/(2(σ²_sk + σ²)) − (1/2) log((σ²_sk + σ²)/σ²) ]. (5)

Detection of Gaussian Signals in Gaussian Noise: Dependent (Correlated) Signal Samples

• Now let us consider the same detection problem in (1), but this time with dependent signal samples, i.e., when Σ_S is not diagonal.
• Let us denote by p_j(y) = p_j(y1, …, yn) the conditional density of Y = (Y1, …, Yn) under H_j.

Detection of Correlated Gaussian Signals in Gaussian Noise: Conditional pdfs

• We have, straightforwardly,

    p_j(y) = p_j(y1, y2, …, yn)
           = p_j(yn | y1, …, y_{n−1}) · p_j(y1, y2, …, y_{n−1})
           = p_j(yn | y1, …, y_{n−1}) · p_j(y_{n−1} | y1, …, y_{n−2}) · p_j(y1, …, y_{n−2})
           = p_j(y1) ∏_{k=2}^n p_j(y_k | y1, …, y_{k−1}). (6)

• Note that (6) holds for any density defined on ℝⁿ.

Detection of Correlated Gaussian Signals in Gaussian Noise: Observation pdf Under H0

• From (1), under hypothesis H0 the Y_k's are independent (because the noise N is i.i.d.); hence Y_k is independent of Y1, …, Y_{k−1}.
• Thus, under H0,

    p_0(y_k | y1, …, y_{k−1}) = p_0(y_k), for k = 2, …, n. (7)

• Hence, from (6) and (7), under H0,

    p_0(y) = ∏_{k=1}^n p_0(y_k) (8)
           = ∏_{k=1}^n (1/√(2πσ²)) e^{−y_k²/(2σ²)}. (9)

Detection of Correlated Gaussian Signals in Gaussian Noise: Observation pdf Under H1

• When the signal samples are correlated, under H1, Y_k is NOT independent of Y1, …, Y_{k−1}.
• However, since Y is a Gaussian random vector under H1, given Y1 = y1, Y2 = y2, …, Y_{k−1} = y_{k−1}, Y_k is conditionally Gaussian; i.e., p_1(y_k | y1, …, y_{k−1}) is a Gaussian pdf.
• Hence we only need to find the mean and the variance of this density in order to specify it.

Under H1: Mean of the Conditional Density p_1(y_k | y1, …, y_{k−1})

• The mean of the conditional density p_1(y_k | y1, …, y_{k−1}) is

    E_1{Y_k | Y1 = y1, …, Y_{k−1} = y_{k−1}}
      = E_1{S_k + N_k | Y1 = y1, …, Y_{k−1} = y_{k−1}}
      = E_1{S_k | Y1 = y1, …, Y_{k−1} = y_{k−1}} + E_1{N_k}
        (since N_k is independent of S_{k′} and N_{k′} for k′ < k)
      = ŝ_k, (10)

  where we have defined

    ŝ_k ≜ E_1{S_k | Y1 = y1, …, Y_{k−1} = y_{k−1}}. (11)

Under H1: Variance of the Conditional Density p_1(y_k | y1, …, y_{k−1})

• The variance of the conditional density p_1(y_k | y1, …, y_{k−1}) is

    Var_1(Y_k | Y1 = y1, …, Y_{k−1} = y_{k−1})
      = Var_1(S_k + N_k | Y1 = y1, …, Y_{k−1} = y_{k−1})
      = Var_1(S_k | Y1 = y1, …, Y_{k−1} = y_{k−1}) + Var_1(N_k | Y1 = y1, …, Y_{k−1} = y_{k−1})
        (since S_k and N_k are independent of each other)
      = Var_1(S_k | Y1 = y1, …, Y_{k−1} = y_{k−1}) + Var_1(N_k)
        (since N_k is independent of the past Y_{k′}'s, i.e., of S_{k′} and N_{k′} for k′ < k)
      = σ̂²_sk + σ², (12)

  where we have defined

    σ̂²_sk ≜ Var_1(S_k | Y1 = y1, …, Y_{k−1} = y_{k−1}). (13)

Detection of Correlated Gaussian Signals in Gaussian Noise: Observation pdf Under H1 (ctd.)

• Note that since (Y1, …, Y_k) is jointly Gaussian under both hypotheses, (Y1, …, Y_{k−1}, S_k) is also jointly Gaussian (since Y_k = S_k + N_k). Hence, given Y1, …, Y_{k−1}, we know that S_k is also conditionally Gaussian, and ŝ_k and σ̂²_sk are the mean and the variance of this conditional Gaussian density.
• A property of the multivariate Gaussian density is that σ̂²_sk does not depend on the conditioned values of Y1, …, Y_{k−1} (see Appendix G).
• Hence, from (10) and (12), for k = 2, …, n,

    p_1(y_k | y1, …, y_{k−1}) ~ N( ŝ_k(y1, …, y_{k−1}), σ̂²_sk + σ² ), (14)

  where ŝ_k(y1, …, y_{k−1}) = E_1{S_k | y1, …, y_{k−1}}.

Observation pdf Under H1

• Then, from (6), p_1(y) is the product of n Gaussian N(ŝ_k, σ̂²_sk + σ²) densities, where we define E_1{S1} = ŝ_1 and Var_1(S1) = σ̂²_s1.
• Hence

    p_1(y) = ∏_{k=1}^n ( 1/√(2π(σ̂²_sk + σ²)) ) e^{−(y_k − ŝ_k)²/(2(σ̂²_sk + σ²))}. (15)

Log LLR for Detection of Dependent Gaussian Signals

• From (9) and (15), we can write the log-likelihood ratio for the correlated-Gaussian-signal scenario as

    log L(y) = log [ p1(y) / p0(y) ]
             = log ∏_{k=1}^n [ √(σ²/(σ̂²_sk + σ²)) e^{−(y_k − ŝ_k)²/(2(σ̂²_sk + σ²))} / e^{−y_k²/(2σ²)} ]
             = −(1/2) Σ_{k=1}^n log((σ̂²_sk + σ²)/σ²) + (1/2) Σ_{k=1}^n y_k²/σ² − (1/2) Σ_{k=1}^n (y_k − ŝ_k)²/(σ̂²_sk + σ²). (16)

A Relationship Between Dependent and Independent Gaussian Signal Detection

• Comparing (16) with the LLR for an independent signal given in (5), we see that detecting a dependent Gaussian signal is analogous to detecting an independent Gaussian signal with mean vector ŝ and covariance matrix diag(σ̂²_s1, …, σ̂²_sn).
• The only difference is that in the independent-signal case the mean value μ_k of the k-th signal sample does not depend on the past observations y1, …, y_{k−1}; in the case of a correlated Gaussian signal, the conditional mean ŝ_k of the k-th signal sample depends on the past observations y1, …, y_{k−1}.

Estimator-Correlator Interpretation of the Optimum Detector for Dependent Gaussian Signals in i.i.d. Gaussian Noise

• We can write (16) for dependent Gaussian signals as below:

    log L(y) = (1/(2σ²)) Σ_{k=1}^n [ y_k² − (σ²/(σ̂²_sk + σ²)) (y_k − ŝ_k)² ] − (1/2) Σ_{k=1}^n log(1 + σ̂²_sk/σ²). (17)

• Note that σ̂²_sk is the variance of S_k given Y1, …, Y_{k−1}, i.e., the variance of the part of S_k that cannot be predicted from the past Y1, …, Y_{k−1}.
• Suppose that the noise variance σ² is large relative to the maximum of these prediction-error variances:

    max_{1≤k≤n} σ̂²_sk << σ², (18)

  i.e., S_k is strongly correlated with Y1, …, Y_{k−1} (or S1, …, S_{k−1}), and thus the prediction of S_k is very good.

Estimator-Correlator Interpretation of the Optimum Detector for Dependent Gaussian Signals in i.i.d. Gaussian Noise (ctd.)

• From assumption (18) we have

    σ²/(σ̂²_sk + σ²) ≈ 1  (and log(1 + σ̂²_sk/σ²) ≈ 0). (19)

• Using (19) in (17) gives

    log L(y) ≈ (1/(2σ²)) [ Σ_{k=1}^n y_k² − Σ_{k=1}^n (y_k − ŝ_k)² ] (20)
             = (1/σ²) [ Σ_{k=1}^n y_k ŝ_k − (1/2) Σ_{k=1}^n ŝ_k² ]
             = (1/σ²) ( ⟨ŝ, y⟩ − (1/2) ‖ŝ‖² ). (21)

Estimator-Correlator Interpretation of the Optimum Detector for Dependent Gaussian Signals in i.i.d. Gaussian Noise (ctd.)

• If we compare (21) with (10) of Example A.1, we notice that (21) has the same structure that would be needed to detect the deterministic signal ŝ, i.e., coherent detection of ŝ.
• That is, when approximation (18) holds, the optimum detector for Gaussian signals in Gaussian noise can be viewed as a detector that estimates the signal from the past observations (i.e., finds ŝ_k) and then treats it as a known signal (i.e., uses the coherent detector for ŝ_k).
• The required estimate ŝ_k is indeed given in (11):

    ŝ_k = E_1{S_k | Y1 = y1, …, Y_{k−1} = y_{k−1}}. (22)

Estimator-Correlator Receivers for Non-Gaussian Stochastic Signals in i.i.d. Gaussian Noise

• The above estimator-correlator implementation can be generalized to more general stochastic signals — not necessarily multivariate Gaussian — in independent N(0, σ²I) noise.
• Let us assume the following generalized version of model (1):

    H0: Y = N  versus  H1: Y = S + N, (23)

  where N ~ N(0, σ²Iₙ) and S has the pdf p_S(s).

LR for Non-Gaussian Stochastic Signal Detection in i.i.d. Gaussian Noise

• The likelihood ratio for this general model is

    L(y) = E{ p_N(y − S) } / p_N(y)   (where E{·} is w.r.t. S)
         = ∫_{ℝⁿ} (2πσ²)^{−n/2} e^{−‖y−s‖²/(2σ²)} p_S(s) ds / [ (2πσ²)^{−n/2} e^{−‖y‖²/(2σ²)} ]
         = E{ e^{(1/σ²)(SᵀY − ‖S‖²/2)} }
         = ∫_{ℝⁿ} e^{(1/σ²)(sᵀy − ‖s‖²/2)} p_S(s) ds. (24)

Estimator-Correlator Receivers for Non-Gaussian Stochastic Signals in i.i.d. Gaussian Noise (ctd.)

• Within regularity conditions on p_S, the mean-value theorem for integrals then implies (see Appendix H) that

    L(y) = e^{(1/σ²)(ŝᵀy − ‖ŝ‖²/2)}, for some ŝ ∈ ℝⁿ. (25)

  Of course, ŝ depends on the value of y, i.e., ŝ = ŝ(y).
• I.e., for the general stochastic-signal detection model (23) with independent Gaussian noise, we can write the likelihood ratio as in (25) for some ŝ = ŝ(y), which is a function of y. But note that (25) is the same as the coherent detector ((10) of Example A.1, and (22)) for a known signal ŝ(y).

Estimator-Correlator Receivers for Non-Gaussian Stochastic Signals in i.i.d. Gaussian Noise (ctd.)

• Thus the optimum detector for a general stochastic signal (not necessarily Gaussian) in i.i.d. Gaussian noise can be interpreted as an estimator ŝ(y), followed by an optimum detector for ŝ(y) as if it were a coherent (known) signal, i.e., a correlator.
• This structure is known as an estimator-correlator detector.

Estimator-Correlator Receivers for Non-Gaussian Stochastic Signals in i.i.d. Gaussian Noise: Remarks

1. Note that for a general stochastic signal the above estimator-correlator may not be particularly simple, since the function ŝ(y) can be difficult to find and/or implement.
2. Also, if you recall from (11), in the Gaussian stochastic-signal case ŝ_k only depends on the past observations y1, …, y_{k−1}. For a general stochastic-signal model this may not be the case, and thus ŝ(y) could depend on all of y1, …, yn, including future observations. This will make it difficult or impossible to compute ŝ(y) in real time.

What is the Use of the Estimator-Correlator Interpretation for Non-Gaussian Stochastic Signals in i.i.d. Gaussian Noise?

• This estimator-correlator interpretation gives an idea of how one might at least design a suboptimal detector for detecting a stochastic signal (not necessarily Gaussian) in i.i.d. Gaussian noise:
  (i) First, build a system that can estimate the signal with sufficient accuracy from the observations.
  (ii) Then, implement a coherent detector for the estimated signal, as if it were the actual deterministic signal to be detected.

Appendix G: Conditional Gaussian Density

• Consider a Gaussian random vector X ~ N(μ, Σ) of the form

    X = (X1; X2),  with  E{X} = μ = (μ1; μ2)

  and

    Cov(X) = E{(X − μ)(X − μ)ᵀ} = [ Σ11  Σ12 ; Σ21  Σ22 ].

Appendix G: Conditional Gaussian Density (ctd.)

• Given X2 = x2, X1 is a conditionally Gaussian random vector with the distribution

    N( E{X1 | X2 = x2}, Cov(X1 | X2 = x2) ),

  where

    E{X1 | X2 = x2} = E{X1} + Σ12 Σ22⁻¹ (x2 − μ2) = μ1 + Σ12 Σ22⁻¹ (x2 − μ2)

  and

    Cov(X1 | X2 = x2) = Σ11 − Σ12 Σ22⁻¹ Σ21.

  (Note that the conditional covariance does not depend on x2 — this is the property used for σ̂²_sk above.)

Appendix H: Mean-value Theorem for Integrals

• Let A be a compact and connected set, let f be continuous on A, and let g be integrable over A with g(x) ≥ 0 for every x ∈ A. Then there exists an x̂ ∈ A such that

    ∫_A f(x) g(x) dx = f(x̂) ∫_A g(x) dx.

Appendix H: An Application of the Mean-value Theorem for Integrals — Derivation of (25)

• Consider (24); the integration region is ℝⁿ, which is connected.
• Take ℝⁿ as the set A. The function e^{(1/σ²)(sᵀy − ‖s‖²/2)} is a continuous function of s, for s ∈ ℝⁿ.
• Since p_S is a pdf on ℝⁿ, p_S(s) ≥ 0 for ∀s ∈ ℝⁿ, and ∫_{ℝⁿ} p_S(s) ds = 1 < ∞.
• Hence p_S is integrable.

Appendix H: An Application of the Mean-value Theorem for Integrals — Derivation of (25) (ctd.)

• Thus, for some ŝ ∈ ℝⁿ,

    ∫_{ℝⁿ} e^{(1/σ²)(sᵀy − ‖s‖²/2)} p_S(s) ds = e^{(1/σ²)(ŝᵀy − ‖ŝ‖²/2)} ∫_{ℝⁿ} p_S(s) ds = e^{(1/σ²)(ŝᵀy − ‖ŝ‖²/2)},

  which is (25).

Next Time: Performance Evaluation of Signal Detection Procedures

ECE642 Detection and Estimation Theory
Dr. Sudharman K. Jayaweera
Assistant Professor, Department of Electrical and Computer Engineering, University of New Mexico
Lecture 04: August 30th (Thursday), Fall 2007

Neyman-Pearson (Classical) Hypothesis Testing

• In the Bayesian formulation of hypothesis testing, optimality was defined in terms of minimizing the overall expected cost, defined as the average risk. This required a specific cost structure on the decisions, and also knowledge of the prior probabilities of the two hypotheses.
• In many practical problems of interest, however, imposing such a specific cost structure on the decisions, or such prior probabilities, is not possible or desirable.
• In such cases another optimality criterion, named Neyman-Pearson optimality, is often used.

Terminology in Neyman-Pearson Hypothesis Testing

• Recall our simple binary hypothesis testing problem:

    H0: Y ~ P0  versus  H1: Y ~ P1. (1)

• There are two types of errors that can be made:
  (a) H0 can be falsely rejected when it is true (a Type I error);
  (b) H1 can be falsely rejected when it is true (a Type II error).

False Alarms

• In radar or sonar problems, the two hypotheses H0 and H1 usually correspond to the absence and the presence of a target.
• Thus a Type I error corresponds to declaring that there is a target (accepting H1) when there is no target. For this reason, Type I errors are called false alarms.
• For a decision rule δ, the probability of a Type I error is known as the size of δ, or the false-alarm probability (or the false-alarm rate) of δ, and is denoted by P_F(δ).

Misses

• Since a Type II error corresponds to declaring that there is no target present (i.e., accepting H0 as true) when in fact there is a target (i.e., H1 is the true hypothesis), it represents a miss; thus Type II errors are called misses.
• For a decision rule δ, the probability of a Type II error is called the miss probability and is denoted by P_M(δ).
• In the above terminology, a correct acceptance of H1 (i.e., declaring that there is a target when in fact there is one) is called a detection. We denote by P_D(δ) the detection probability; this is called the power of δ. Clearly,

    P_D(δ) = 1 − P_M(δ). (2)

False Alarms vs. Misses: A Trade-off

• The design of a decision rule for H0 versus H1 involves a trade-off between these two types of errors, false alarms and misses: we can make one of them arbitrarily small at the expense of making the other large.
• For example, we can make P_F(δ) = 0 by always declaring that hypothesis H0 is true; but this will also make P_M(δ) = 1 — in other words, P_D(δ) = 0.
• The Bayes formulation was one way of trading these off: with the uniform cost function, we basically minimize the average of these two types of errors.
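The trade-off can be made concrete with a scalar example of my own (not from the notes): for Y ~ N(0, 1) under H0 and Y ~ N(μ, 1) under H1, the threshold rule "choose H1 iff y > τ" has P_F = Q(τ) and P_D = Q(τ − μ), where Q is the standard Gaussian tail probability — so raising τ lowers the false-alarm probability and the detection probability together.

```python
import math


def Q(x):
    """Gaussian tail probability Q(x) = P(N(0, 1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))


def pf_pd(tau, mu):
    """(P_F, P_D) of the rule 'choose H1 iff y > tau' for the
    N(0, 1)-versus-N(mu, 1) pair (illustrative example only)."""
    return Q(tau), Q(tau - mu)
```

For example, with μ = 1: τ = 0 gives (P_F, P_D) ≈ (0.5, 0.841), while τ = 2 gives ≈ (0.023, 0.159) — P_F can be driven toward 0 only at the cost of P_D.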
Neyman-Pearson Optimality Criterion

• The Neyman-Pearson criterion for making this trade-off is to place a bound on the false-alarm probability P_F(δ) and then to minimize the miss probability P_M(δ) subject to this constraint. Using (2), the Neyman-Pearson design criterion is

    max_δ P_D(δ)  subject to  P_F(δ) ≤ α, (3)

  where α is the upper bound on the false-alarm probability.
• α is known as the level of the test, or the significance level of the test.
• Thus, according to (3), the Neyman-Pearson design goal is to find the most powerful α-level test of H0 versus H1.

Few Remarks

• The Neyman-Pearson criterion allows one to recognize the basic asymmetry in importance of the two hypotheses, unlike the Bayes criterion.
• Neyman-Pearson hypothesis testing is also called classical hypothesis testing.
• Traditionally it is the most commonly used optimality criterion in radar and sonar applications, while the Bayes criterion is the common choice in communication systems.

Randomized Decision Rules (Randomized Tests)

• Definition: A randomized decision rule δ̃ for H0 versus H1 is defined as a function that maps the observation set Γ to the interval [0, 1], with the interpretation that, for y ∈ Γ, δ̃(y) is the conditional probability with which we accept the hypothesis H1 given the observation Y = y.
• Recall that in a nonrandomized test δ, the value of δ is the index of the accepted hypothesis.
• However, in a randomized test δ̃, the value of δ̃ is the probability with which we accept H1. When δ̃ takes only the two values 0 and 1, these two definitions coincide.

Randomized and Nonrandomized Tests

• Thus, according to the above definition, the nonrandomized decision rules that we used earlier are a special case of the randomized decision rules.
• In particular, a nonrandomized rule δ corresponds to the randomized rule δ̃ = δ.

False Alarm Probability of a Test

• Recall that the false-alarm probability of a decision rule is the probability with which it accepts H1 given that H0 is the true hypothesis.
• Since δ̃(y) of a randomized test is the conditional probability of accepting H1 given Y = y, we may obtain the false-alarm probability of a randomized test by averaging δ̃(y) over the distribution of Y under H0.
• Since the density of Y under H0 is p0(y), the false-alarm probability of the test δ̃ is

    P_F(δ̃) = E_0{δ̃(Y)} = ∫_Γ δ̃(y) p0(y) dy, (4)

  where E_0 denotes expectation under hypothesis H0 (we may sometimes write this more informatively as E_{Y|H0}).

Detection Probability of a Test

• Similarly, the detection probability (the probability with which it accepts H1, given that H1 is the true hypothesis) of a randomized rule δ̃ is

    P_D(δ̃) = E_1{δ̃(Y)} = ∫_Γ δ̃(y) p1(y) dy. (5)

• We summarize the general solution to the Neyman-Pearson hypothesis testing problem in the following lemma.

Proposition 4.1: The Neyman-Pearson Lemma

• Consider the hypothesis pair of (1), in which distribution P_j has density p_j for j = 0 and j = 1. Suppose that α > 0. Then the following statements are true.
 (i) Optimality: Let δ̃ be any decision rule satisfying P_F(δ̃) ≤ α, and let δ̃′ be any decision rule of the form

    δ̃′(y) = { 1 if p1(y) > η p0(y);  γ(y) if p1(y) = η p0(y);  0 if p1(y) < η p0(y) }, (6)

  where η ≥ 0 and 0 ≤ γ(y) ≤ 1 are such that P_F(δ̃′) = α. Then P_D(δ̃′) ≥ P_D(δ̃). That is, any size-α decision rule of the form of (6) is an NP rule.

Proposition 4.1: The Neyman-Pearson Lemma (ctd.)

 (ii) Existence: For every α ∈ (0, 1) there is a decision rule δ̃_NP of the form of (6), with γ(y) = γ0 (a constant), for which P_F(δ̃_NP) = α.
 (iii) Uniqueness: Suppose that δ̃″ is any α-level Neyman-Pearson optimal decision rule for H0 versus H1. Then δ̃″ must be of the form of (6), except possibly on a subset of Γ having zero probability under H0 and H1.

Proof of the Neyman-Pearson Lemma: (i) Optimality

• Assume that δ̃′ and δ̃ are as defined above (note that both δ̃′ and δ̃ take values only in [0, 1]).
• Because of the way δ̃′ is defined, we always have

    ( δ̃′(y) − δ̃(y) ) ( p1(y) − η p0(y) ) ≥ 0, ∀y ∈ Γ. (7)

• Thus, from (7),

    ∫_Γ ( δ̃′(y) − δ̃(y) ) ( p1(y) − η p0(y) ) dy ≥ 0. (8)

• Expanding and rearranging the terms,

    ∫_Γ δ̃′ p1 dy − ∫_Γ δ̃ p1 dy ≥ η ( ∫_Γ δ̃′ p0 dy − ∫_Γ δ̃ p0 dy ). (9)

Proof of the Neyman-Pearson Lemma: (i) Optimality (ctd.)

• Using (4) and (5) in (9),

    P_D(δ̃′) − P_D(δ̃) ≥ η ( P_F(δ̃′) − P_F(δ̃) ) = η ( α − P_F(δ̃) ) ≥ 0, since P_F(δ̃) ≤ α. (10)

• Thus P_D(δ̃′) ≥ P_D(δ̃), as required, and any size-α decision rule of the form of (6) is a Neyman-Pearson rule.

Proof of the Neyman-Pearson Lemma: (ii) Existence

• Let η0 be the smallest number such that

    P_0( p1(Y) > η0 p0(Y) ) ≤ α. (11)

[Figure: P_0(p1(Y) > η p0(Y)) as a function of η — note that P_0(p1(Y) > η p0(Y)) increases as η decreases.]

Proof of the Neyman-Pearson Lemma: (ii) Existence (ctd.)

• If P_0(p1(Y) > η0 p0(Y)) = α, then choose γ0 arbitrarily, say γ0 = 0. If P_0(p1(Y) > η0 p0(Y)) < α, then choose

    γ0 = ( α − P_0(p1(Y) > η0 p0(Y)) ) / P_0( p1(Y) = η0 p0(Y) ). (12)

• Then, if we define δ̃_NP to be the decision rule of (6) with η = η0 and γ(y) ≡ γ0 above, we have

    P_F(δ̃_NP) = E_0{δ̃_NP(Y)} = P_0( p1(Y) > η0 p0(Y) ) + γ0 P_0( p1(Y) = η0 p0(Y) ) = α. (13)

• Thus we have chosen a decision rule of the form of (6), with γ(y) = γ0 a constant, and false-alarm probability α, as required.

Proof of the Neyman-Pearson Lemma: (iii) Uniqueness

• Suppose that δ̃′ is an α-level Neyman-Pearson rule of the form of (6) with P_F(δ̃′) = α, and let δ̃ be any other α-level Neyman-Pearson rule.
• Then, by the definition of the Neyman-Pearson optimality of the two detectors, they must have

    P_D(δ̃′) = P_D(δ̃). (14)

• Then, from (10), applying it to this δ̃,

    0 = P_D(δ̃′) − P_D(δ̃) ≥ η ( α − P_F(δ̃) ) ≥ 0  (since P_F(δ̃) ≤ α),

  so that

    η ( α − P_F(δ̃) ) = 0. (15)

Proof of the Neyman-Pearson Lemma: (iii) Uniqueness (ctd.)

• By starting with (14) and (15), and then working backward from (10) to (8), we get

    0 = P_D(δ̃′) − P_D(δ̃) − η ( α − P_F(δ̃) )
      = ∫_Γ δ̃′ p1 dy − ∫_Γ δ̃ p1 dy − η ( ∫_Γ δ̃′ p0 dy − ∫_Γ δ̃ p0 dy )
      = ∫_Γ ( δ̃′(y) − δ̃(y) ) ( p1(y) − η p0(y) ) dy. (16)

• Since the integrand is nonnegative (note that δ̃ also can take values only in [0, 1]), (16) implies that the integrand is equal to zero, except possibly on a set of zero probability under H0 and H1.
• Thus δ̃ and δ̃′ differ only on the set {y ∈ Γ : p1(y) = η p0(y)}, which implies that δ̃ is also of the form of (6), possibly differing from δ̃′ only in the function γ(·).
• This completes the proof.

Bayesian vs. Neyman-Pearson Optimality

• The Neyman-Pearson test for a given hypothesis pair differs from the Bayes test only in the choice of threshold and randomization:

    δ̃_NP(y) = { 1 if L(y) ≜ p1(y)/p0(y) > η;  γ(y) if L(y) = η;  0 if L(y) < η }, (17)

  where η ≥ 0 and 0 ≤ γ(y) ≤ 1 are chosen such that P_F(δ̃_NP) = α.
• Under Bayesian optimality the threshold is determined by the prior probabilities and the cost function; in the Neyman-Pearson test it is determined by the required false-alarm probability.
• The Neyman-Pearson lemma reinforces the optimality of likelihood-ratio tests.
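The constructive step (11)-(12) can be sketched for a finite observation set (a toy implementation of my own, not from the notes): scan the distinct likelihood-ratio values in increasing order for the smallest η0 with P_0(L(Y) > η0) ≤ α, then set the randomization γ0 so that the false-alarm probability comes out exactly α.

```python
def np_threshold(p0, p1, alpha):
    """(eta0, gamma0) of eqs. (11)-(12) for pmfs p0, p1 on a finite set,
    with 0 < alpha < 1 (illustrative sketch)."""
    L = [b / a for a, b in zip(p0, p1)]            # likelihood ratios L(y)
    for eta in sorted(set(L)):                     # candidate thresholds, ascending
        tail = sum(a for a, l in zip(p0, L) if l > eta)
        if tail <= alpha:                          # smallest eta with P0(L > eta) <= alpha
            mass = sum(a for a, l in zip(p0, L) if l == eta)
            return eta, (alpha - tail) / mass      # eq. (12); gamma0 = 0 if tail == alpha
    raise ValueError("no threshold found; check that alpha is in (0, 1)")
```

For the binary channel of the earlier lecture — p0 = (1 − λ0, λ0), p1 = (λ1, 1 − λ1) — with λ0 = 0.4, λ1 = 0.2, and α = 0.1, this returns η0 = (1 − λ1)/λ0 = 2 and γ0 = α/λ0 = 0.25, matching the case analysis above.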
Next Time: Examples of Neyman-Pearson Hypothesis Testing (Section II.D)

ECE642 Detection and Estimation Theory
Dr. Sudharman K. Jayaweera
Assistant Professor, Department of Electrical and Computer Engineering, University of New Mexico
Lecture 08: September 18th (Tuesday), Fall 2007

Composite Hypothesis Testing

• Consider the general binary hypothesis testing problem

    H0: Y ~ P0  versus  H1: Y ~ P1. (1)

 1. Simple hypothesis testing problems: each of the two hypotheses in (1) corresponds to only a single distribution for the observation; this is what we have considered so far.
 2. Composite hypothesis testing problems: under each of the hypotheses, there can be many possible distributions for the observation.
under the two hypotheses Dr S K Jayarweera Fall 07 bk n 39I39Hl LlN39IRI39l ii NEW MEXICO ECE642 Detection and Estimation Theory Bayes CompOSIte HypotheSIS Testing In the Bayesian formulationwe assume that the parameter is a random quantity 9 that takes values in A Thus in this case P9 is interpreted as the conditional distribution of Y given 9 6 To choose an optimum decision rule we must assign costs to our decisions through a cost function C i 0 where C i 6 is the cost of choosing decision 139 or the hypothesis 139 when Y N P9 for i E 0 l and 6 E A Note that For simplicity we are considering only nonrandomized tests and assume that C is nonnegative and bounded ie 0 S C lt 00 k J 5 Dr S K Jayarweera Fall 07 39I39H I39 LlNI39lR39l H NEW MEXICO ECE642 Detection and Estimation Theory r Average Risk in Bayesian Composite Hypothesis Testing o For a decision rule 8 we can then de ne conditional risks as we did for the simple hypothesis problems Via 1365 E9C8YG for 6 E A EC5Y7 9l 9 2 where Re8 is the risk incurred by a test 8 when the parameter is equal to 6 and IE9 means the expectation assuming Y N P9 0 Then the average or Bayes risk is 45 i E 1385 3 where expectation is wrt 9 o A Bayes optimal rule is a rule that minimizes r8 in 3 k Dr S K Jayarweera Fall 07 0k 39I39HI39 LIN39IR39IW quot NEW MEXICO ECE642 Detection and Estimation Theory Average Risk in Bayesian Composite Hypothesis Testing ctd Since EeC5Y79 IEC5Y79 97 from 3 r5 IEIEC5Y7 IEC8Y sinceEXEEXlY 4 c From 4 r8 is the cost of using 8 averaged over both 9 and Y k Dr S K Jayaweera Fall 07 k n 39139le UNIVHHI39H n s NEW MEXICO ECE642 Detection and Estimation Theory N Bayes Optimal Rule for Composite Hypothesis Testing 0 But we can write 4 as rlt6gt EEClt6ltYgt mm 5 EEClt6ogtegtlr y 6 c From 6 we see that r8 is minimized over 8 if for each y E P we choose 8y to be the decision that minimizes the posterior cost EC5y7lY y 0 Since 80 can only take either 0 or 1 we see that a Bayes rule for this problem is then 1 if 
EClt1 gtIY y lt ECltoegtlr y 830 0 or 1 if EC1Y y EC0 Y y 7 0 if EClt1 gtIY y gt ECltoegtlr y k J 8 Dr S K Jayarweera Fall 07 139le LlNI39lRI39l v 3 NEW MEXICO ECE642 Detection and Estimation Theory Bayes Composite Hypothesis Testing Interpretation of 7 N given the observation y in 226 Then I3C7l397 lY y k Dr S K Jayarweera Fall 07 0 Hence 83 chooses the hypothesis that is least costly on average 0 In the case where A 07 l 7 reduces to the Bayes rule for simple hypothesis testing we discussed earlier Recall that we interpreted it also as minimizing the posterior cost Ci0PHoY yCi1PH1Y y Ci70750yCi71751y posterior cost of choosing HZ 0k 39I39HI39 UNIVHHI39H if NEW MEXICO ECE642 Detection and Estimation Theory f Bayes Composite Hypothesis Testing Special Case o For many problems of interest the parameter space A can be decomposed into two disjoint sets A0 and A1 representing the two hypotheses H0 and H1 respectively with costs being uniform over these sets ie Ci0 CU for 06A and jE0l 8 0 Then EC1 IYJ CIOPiGEAOIYylC11P A1Yyl where P E A J Y y is the conditional probability that 9 lies in A J given Y y and EC0 Y y CO0P E AolY y C01P 6 MW y k J Dr S K Jayarweera Fall 07 39I39HI39 LlN39lRl39l quot NEW MEXICO ECE642 Detection and Estimation Theory Bayes Composite Hypothesis Testing Special Case ctd 0 Hence IECLGNY y lt IEC079IY y 9 is equivalent to C10P 6 MW y C11P e AIIY y lt C00P 6 MW y C01P e AIIY y o SinceP E A0Y y 2 0 andP E A1Y y 2 0 C10 C00P EAOYy lt C01 C11P EA1Yy P A1Yyi C C C C 10 00 lt 01 11P AOIYy k Dr S K Jayaweera Fall 07 1 1 139le LINI39IRI39I v J NEW MEXICO ECE642 Detection and Estimation Theory Bayes Composite Hypothesis Testing Special Case ctd 0 Assuming that C11 lt C01 as before 9 becomes P A1Yyi Clo Coo 10 P A0Yyi C01 C11 Usmg 9 and 10 in 7 the Bayes rule becomes 1 gt i P oeA Y C C 830 i 0 or 1 1f PbeA H i C3761 0 lt K J 12 Dr S K Jayaweera Fall 07 39I39HI39 LlN39lRl39l quot NEW MEXICO ECE642 Detection and Estimation Theory Bayes Composite Hypothesis 
Testing Special Case ctd 0 To further simplify 11 let us assume that Y has conditional densities py E Aj for 6 01 0 From the Bayes formula we have that for j E 07 1 i i W i Py AjP Aj 11 my where 1 My 2 19y E AJP E Aj 12 K 1 1 o k Dr S K Jayaweera Fall 07 139le LIN39IRI39I v J NEW MEXICO Bayes Rule for Composite Hypothesis Testing Special Case ECE642 Detection and Estimation Theory k 0 Using 11 P EAIIY y 19049 6A1P 6A1 13 P EAOIY y 19049 EAOP 6A0 0 Substituting 12 in 11 1 gt 83y 0 or 1 ifLy 553333 14 0 lt where 9049 6 A1 L and 75PGEA for 15 14 Dr S K Jayaweera Fall 07 39I39HI39 LIN39IR39IW quot NEW MEXICO ECE642 Detection and Estimation Theory k Bayes Rule for Composite Hypothesis Testing Special Case ctd 0 Thus from 14 and 15 we see that the Bayesian composite hypothesis testing problem is equivalent to a simple Bayesian hypothesis testing problem with pJy py E Aj Dr S K Jayaweera Fall 07 n 39139le UNIVHHI39H n NEW MEXICO ECE642 Detection and Estimation Theory Finding Conditional Density p048 E Aj 0 Suppose that for each 9 E A the distribution P9 has density 199 ie observation Y has the conditional distribution P9 y and conditional density 199 given 9 0 0 Also assume that 9 has density w6 Then fAjpeyw9d9 IA W9d9 0 De ne conditional densities w 16 of 9 given 9 E A for j 07 l as 19y9 6 A1 16 e y ireeAj e J 17 WJ 0 ifeAj where 7 P6 Aj w6yd6 18 AJ39 J Dr S K Jayarweera Fall 07 39 39I39HI39 LlNIVliRSI39lW quot NEW MEXICO ECE642 Detection and Estimation Theory Finding Conditional Density py 6 AJ ctd 0 Hence using 17 and 18 in 16 me 6A APGyWj9d9 lt19 k J Dr S K Jayaweera Fall 07 r 139le UN39IRI39I J NEW MEXICO Example 1 Testing 0n the Radius of a Point in the Plane ECE642 Detection and Estimation Theory k 0 Suppose a vector observation model where F R2 ie Y Y 1Y2T 0 Our hypotheses are Y 8 H0 1 1 Y 2 82 versus 20 Y A cos I 8 H1 1 1 Y 2 A sin 1 82 A is a positive constant 1 is a random variable distributed unifome in 0271 81 and 82 are both 5M0 62 random variables that are 
independent of each other and of $\Theta$.

### Testing on the Radius of a Point in the Plane

- The observation $Y$ can then be thought of as a noisy measurement of the coordinates of a point in the plane that is either at the origin or is uniformly distributed on a circle of radius $A$ (we will see applications of this model later).
- The parameter in this case can also be taken to be a vector parameter of the form $\theta = (\theta_1, \theta_2)$, with $\theta_1 \in \{0, A\}$ and $\theta_2 \in [0, 2\pi)$.
- Hence the parameter set $\Lambda$ in this case is
$$\Lambda = \{0, A\} \times [0, 2\pi). \tag{21}$$
- The corresponding sets $\Lambda_0$ and $\Lambda_1$ for the two hypotheses can then be written as
$$\Lambda_0 = \{\theta \in \Lambda : \theta_1 = 0\} \quad\text{and}\quad \Lambda_1 = \{\theta \in \Lambda : \theta_1 = A\}. \tag{22}$$
- Then, given $\Theta = \theta$, the observation vector $Y$ under both hypotheses is a vector of two independent components, where the two components are $\mathcal{N}(\cdot, \sigma^2)$ with means shifted by $\theta_1 \cos\theta_2$ and $\theta_1 \sin\theta_2$.
- Thus, given $\Theta = \theta$, the density of $Y$ is
$$p_\theta(y) = p_\theta(y_1, y_2) = p_\theta(y_1)\, p_\theta(y_2), \tag{23}$$
since $Y_1$ and $Y_2$ are (conditionally) independent, with $Y_1 \sim \mathcal{N}(\theta_1\cos\theta_2, \sigma^2)$ and $Y_2 \sim \mathcal{N}(\theta_1\sin\theta_2, \sigma^2)$:
$$p_\theta(y_1) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(y_1 - \theta_1\cos\theta_2)^2}{2\sigma^2}} \quad\text{and}\quad p_\theta(y_2) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(y_2 - \theta_1\sin\theta_2)^2}{2\sigma^2}}.$$

### Densities under the Two Composite Hypotheses

- Hence, from (23),
$$p_\theta(y) = \frac{1}{2\pi\sigma^2}\, e^{-\frac{q(y,\theta)}{2\sigma^2}}, \quad y \in \mathbb{R}^2, \tag{24}$$
where
$$q(y, \theta) = (y_1 - \theta_1\cos\theta_2)^2 + (y_2 - \theta_1\sin\theta_2)^2. \tag{25}$$
- Hence, from (16), with $\Theta_2$ uniform on $[0, 2\pi)$,
$$p(y \mid \theta \in \Lambda_0) = \frac{1}{2\pi}\int_0^{2\pi} p_\theta(y)\Big|_{\theta_1 = 0}\, d\theta_2 = \frac{1}{2\pi\sigma^2}\, e^{-\frac{y_1^2 + y_2^2}{2\sigma^2}}$$
and
$$p(y \mid \theta \in \Lambda_1) = \frac{1}{2\pi}\int_0^{2\pi} p_\theta(y)\Big|_{\theta_1 = A}\, d\theta_2.$$

### Testing on the Radius of a Point in the Plane: Likelihood Ratio

- From (15), the likelihood ratio is
$$L(y) = \frac{p(y \mid \theta \in \Lambda_1)}{p(y \mid \theta \in \Lambda_0)}. \tag{26}$$
- Substituting for $p(y \mid \theta \in \Lambda_0)$ and $p(y \mid \theta \in \Lambda_1)$ in (26),
$$L(y) = \frac{\frac{1}{2\pi}\int_0^{2\pi} e^{-\frac{(y_1 - A\cos\theta_2)^2 + (y_2 - A\sin\theta_2)^2}{2\sigma^2}}\, d\theta_2}{e^{-\frac{y_1^2 + y_2^2}{2\sigma^2}}} = e^{-\frac{A^2}{2\sigma^2}}\, \frac{1}{2\pi}\int_0^{2\pi} e^{\frac{A}{\sigma^2}\left(y_1\cos\theta_2 + y_2\sin\theta_2\right)}\, d\theta_2. \tag{27}$$
- Introduce the following change of variables:
$$r = \sqrt{y_1^2 + y_2^2} \tag{28}$$
and
$$\phi = \tan^{-1}\!\left(\frac{y_2}{y_1}\right). \tag{29}$$
- Then
$$y_1 = r\cos\phi \tag{30}$$
and
$$y_2 = r\sin\phi. \tag{31}$$
- Substituting (30) and (31) in (27),
$$L(y) = e^{-\frac{A^2}{2\sigma^2}}\, \frac{1}{2\pi}\int_0^{2\pi} e^{\frac{Ar}{\sigma^2}\left(\cos\phi\cos\theta_2 + \sin\phi\sin\theta_2\right)}\, d\theta_2. \tag{32}$$
- Using the identity
$$\cos(A - B) = \cos A \cos B + \sin A \sin B, \tag{33}$$
the likelihood ratio becomes
$$L(y) = e^{-\frac{A^2}{2\sigma^2}}\, \frac{1}{2\pi}\int_0^{2\pi} e^{\frac{Ar}{\sigma^2}\cos(\theta_2 - \phi)}\, d\theta_2 \tag{34}$$
$$= e^{-\frac{A^2}{2\sigma^2}}\, I_0\!\left(\frac{Ar}{\sigma^2}\right), \tag{35}$$
where $I_0(x)$ is the zeroth-order modified Bessel function of the first kind, defined as
$$I_0(x) = \frac{1}{2\pi}\int_0^{2\pi} e^{x\cos\theta}\, d\theta. \tag{36}$$

### Likelihood-Ratio Tests for Testing on the Radius of a Point in the Plane

- Thus the Bayes, Neyman–Pearson, and minimax tests are all of the form
$$\tilde\delta_0(y) = \begin{cases} 1, & e^{-\frac{A^2}{2\sigma^2}}\, I_0\!\left(\frac{Ar}{\sigma^2}\right) > \eta \\[2pt] \gamma, & e^{-\frac{A^2}{2\sigma^2}}\, I_0\!\left(\frac{Ar}{\sigma^2}\right) = \eta \\[2pt] 0, & e^{-\frac{A^2}{2\sigma^2}}\, I_0\!\left(\frac{Ar}{\sigma^2}\right) < \eta, \end{cases} \tag{37}$$
where $r = \sqrt{y_1^2 + y_2^2}$, and the threshold $\eta$ and randomization $\gamma$ depend on the specific optimality criterion.

(Figure: graphs of the modified Bessel functions $I_n(x)$, $n = 0, 1, 2$; all are monotone increasing for $x \ge 0$.)

- Since $I_0(x)$ is a monotone increasing function of its argument, comparing $L(y)$ to a threshold $\eta$ is equivalent to comparing $r$ to another threshold $\tau''$, given by
$$\tau'' = \frac{\sigma^2}{A}\, I_0^{-1}\!\left(\eta\, e^{\frac{A^2}{2\sigma^2}}\right). \tag{38}$$
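As a quick numerical sanity check on the closed form (35), the sketch below compares a direct average of $p_\theta(y)$ over $\theta_2$ (as in (26)–(27)) with the Bessel-function expression, using `numpy.i0` for $I_0$. It assumes NumPy is available; the values of $A$, $\sigma^2$, and $y$ are illustrative.

```python
import numpy as np

def lr_numeric(y1, y2, A, sigma2, K=20000):
    """Likelihood ratio by averaging p_theta(y) over theta_2 uniform on
    [0, 2*pi), as in (26)-(27)."""
    th = 2 * np.pi * np.arange(K) / K
    num = np.mean(np.exp(-((y1 - A * np.cos(th))**2 +
                           (y2 - A * np.sin(th))**2) / (2 * sigma2)))
    den = np.exp(-(y1**2 + y2**2) / (2 * sigma2))
    return num / den

def lr_closed_form(y1, y2, A, sigma2):
    """Closed form (35): exp(-A^2/(2 sigma^2)) * I0(A r / sigma^2)."""
    r = np.hypot(y1, y2)
    return np.exp(-A**2 / (2 * sigma2)) * np.i0(A * r / sigma2)

# The two agree, and L depends on y only through r = sqrt(y1^2 + y2^2):
print(lr_numeric(1.0, 0.5, A=2.0, sigma2=1.0))
print(lr_closed_form(1.0, 0.5, A=2.0, sigma2=1.0))
```

Note that the closed form depends on $y$ only through $r$, which is exactly why the optimum tests reduce to a threshold test on the radius.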
### Testing on the Radius of a Point in the Plane (ctd.)

- Thus, from (37) and (38), the Bayes, minimax, and Neyman–Pearson tests are all of the form
$$\tilde\delta_0(y) = \begin{cases} 1, & r > \tau'' \\ \gamma, & r = \tau'' \\ 0, & r < \tau'', \end{cases} \tag{39}$$
where $r = \sqrt{y_1^2 + y_2^2}$.
- Note that, since $(y_1, y_2)$ are the coordinates of a point in the plane, $r$ is the distance of the point from the origin. Hence the optimum tests for problem (20) operate by comparing this distance $r$ to a threshold.

(Figure 1: decision regions for Example 1; $\Gamma_1$ is the exterior and $\Gamma_0$ the interior of the circle of radius $\tau''$ centered at the origin.)

### Uniformly Most Powerful (UMP) Tests for Composite Hypothesis Testing Problems

- When we do not have a prior distribution $w(\theta)$ (or conditional priors $w_j(\theta)$) for the parameter $\theta$ in a composite hypothesis testing problem, we look at a generalization of the Neyman–Pearson optimality criterion.
- Suppose, as before, that we can decompose the parameter space into two disjoint sets $\Lambda_0$ and $\Lambda_1$.
- For a randomized decision rule $\tilde\delta$ we can define false-alarm and detection probabilities as follows:
$$P_F(\tilde\delta; \theta) = E_\theta\{\tilde\delta(Y)\}, \quad \theta \in \Lambda_0, \tag{40}$$
$$P_D(\tilde\delta; \theta) = E_\theta\{\tilde\delta(Y)\}, \quad \theta \in \Lambda_1. \tag{41}$$
- Suppose that, as in the Neyman–Pearson formulation, we wish to be assured that the false-alarm probability does not exceed a given value $\alpha$. Then an ideal test would be one that maximizes $P_D(\tilde\delta; \theta)$ for every $\theta \in \Lambda_1$, subject to the constraint $P_F(\tilde\delta; \theta) \le \alpha$ for $\theta \in \Lambda_0$; i.e., an $\alpha$-level test that is most powerful for each $\theta \in \Lambda_1$.
- Such a test is known as a uniformly most powerful (UMP) test of level $\alpha$.
- Of course we would like to have such UMP tests, but they exist only under special circumstances.

### Existence of UMP Tests for Composite Hypothesis Testing Problems

- Consider a case in which the null hypothesis $H_0$ is simple, so that $\Lambda_0$ consists of the single element $\theta_0$, and $H_1$ is composite with $\theta \in \Lambda_1$. Assume that $P_\theta$ has density $p_\theta$ for each $\theta \in \Lambda$.
- Then the most powerful (i.e., $P_D$-maximizing) $\alpha$-level test for $H_0$ versus $Y \sim P_\theta$ has the critical region
$$\Gamma_\theta = \{y \in \Gamma \mid p_\theta(y) > \tau\, p_{\theta_0}(y)\},$$
with $\tau$ chosen to give size $\alpha$ (and randomization if necessary).
- From the Neyman–Pearson lemma we know that this test is essentially unique, so that any other $\alpha$-level test will have smaller power.
- For example, suppose that we choose two elements $\theta'$ and $\theta''$ of $\Lambda_1$. Then the test with critical region $\Gamma_{\theta'}$ will have smaller power in testing $H_0$ versus $Y \sim P_{\theta''}$ than does the test with critical region $\Gamma_{\theta''}$, unless these two critical regions are essentially identical.
- Thus we have the following condition for the existence of UMP tests: a UMP test exists for $H_0$ versus the composite hypothesis $H_1: Y \sim P_\theta,\ \theta \in \Lambda_1$, if and only if the critical region $\Gamma_\theta$ is the same for all $\theta \in \Lambda_1$.

### Example 1: UMP Testing of Location

- Consider the parametric family of distributions $\{P_\theta,\ \theta \in \Lambda\}$, where $P_\theta = \mathcal{N}(\theta, \sigma^2)$ and $\Lambda$ is a subset of $\mathbb{R}$.
- Suppose we have the following hypothesis pair:
$$H_0: \theta = \mu_0 \quad\text{versus}\quad H_1: \theta > \mu_0, \tag{42}$$
where $\mu_0$ is a fixed constant.
- This is a problem with a simple null hypothesis, $\Lambda_0 = \{\mu_0\}$, and a composite alternative hypothesis, $\Lambda_1 = (\mu_0, \infty)$.
- From the earlier example of location testing with Gaussian error under Neyman–Pearson tests, we know that for each $\theta \in \Lambda_1$ the most powerful $\alpha$-level test of $H_0$ versus $Y \sim \mathcal{N}(\theta, \sigma^2)$, $\theta > \mu_0$, is of the form
$$\tilde\delta_{NP}(y) = \begin{cases} 1, & y \ge \eta' \\ 0, & y < \eta', \end{cases} \tag{43}$$
where
$$\eta' = \sigma\, Q^{-1}(\alpha) + \mu_0. \tag{44}$$
- Hence for each $\theta \in \Lambda_1$ it has the critical region
$$\Gamma_\theta = \{y \in \Gamma \mid y \ge \sigma\, Q^{-1}(\alpha) + \mu_0\}. \tag{45}$$
- Clearly, from (45), the critical region $\Gamma_\theta$ does not depend on $\theta$. Thus (43) is in fact a UMP test for (42), which we denote by $\tilde\delta_1$.
- Recall also (from the earlier Neyman–Pearson analysis) that the detection probability of this test $\tilde\delta_1$ for each $\theta \in \Lambda_1$ is
$$P_D(\tilde\delta_1; \theta) = Q\!\left(Q^{-1}(\alpha) - \frac{\theta - \mu_0}{\sigma}\right). \tag{46}$$
- Note that the detection probability does depend on the specific $\theta$ value, but it is still the best detection probability achievable for each $\theta$ with a false-alarm rate no larger than $\alpha$.

### Example 2: UMP Testing of Location

- Now consider, for the same family of distributions (i.e., $Y \sim \mathcal{N}(\theta, \sigma^2)$), the following hypothesis pair:
$$H_0: \theta = \mu_0 \quad\text{versus}\quad H_1: \theta \ne \mu_0. \tag{47}$$
- In this case the composite alternative corresponds to $\Lambda_1 = (-\infty, \mu_0) \cup (\mu_0, \infty)$ (note that still $\Lambda_0 = \{\mu_0\}$).
- For $\theta > \mu_0$: the most powerful test is of the form (43), with the critical region (45).
- For $\theta < \mu_0$: it is straightforward to show that the most powerful $\alpha$-level test has the critical region (see Appendix D)
$$\Gamma_\theta = \{y \in \Gamma \mid y < \sigma\, Q^{-1}(1 - \alpha) + \mu_0\}. \tag{48}$$
- This critical region is also independent of $\theta \in \Lambda_1$; however, it is different from the critical region of the test $\tilde\delta_1$ given in (45). Thus no UMP test exists for the problem (47).
- Let us denote by $\tilde\delta_2$ the test with critical region (48). Then, similarly to (46), we can show that
$$P_D(\tilde\delta_2; \theta) = 1 - Q\!\left(Q^{-1}(1 - \alpha) - \frac{\theta - \mu_0}{\sigma}\right). \tag{49}$$

(Figure 2: power curves for the test of $\theta = \mu_0$ versus $\theta > \mu_0$ and of $\theta = \mu_0$ versus $\theta < \mu_0$, for location testing with Gaussian error.)
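The UMP test (43)–(44) and its power (46) are easy to check numerically. The sketch below uses only the Python standard library, building $Q^{-1}$ from `statistics.NormalDist`; the values of $\alpha$, $\mu_0$, $\sigma$, and $\theta$ are illustrative, and the Monte Carlo counts are just a sanity check.

```python
import random
from statistics import NormalDist

def ump_location_test(alpha, mu0, sigma):
    """UMP alpha-level test (43)-(44) for H0: theta = mu0 vs H1: theta > mu0.
    Q^{-1}(alpha) is the upper-tail standard normal quantile."""
    q_inv_alpha = NormalDist().inv_cdf(1 - alpha)  # Q^{-1}(alpha)
    eta = sigma * q_inv_alpha + mu0                # threshold (44)
    return lambda y: 1 if y >= eta else 0

def power(theta, alpha, mu0, sigma):
    """Detection probability (46): Q(Q^{-1}(alpha) - (theta - mu0)/sigma)."""
    nd = NormalDist()
    return 1 - nd.cdf(nd.inv_cdf(1 - alpha) - (theta - mu0) / sigma)

# Monte Carlo sanity check with illustrative parameters
random.seed(0)
alpha, mu0, sigma, theta = 0.1, 0.0, 1.0, 1.5
delta1 = ump_location_test(alpha, mu0, sigma)
M = 200000
pf = sum(delta1(random.gauss(mu0, sigma)) for _ in range(M)) / M
pd = sum(delta1(random.gauss(theta, sigma)) for _ in range(M)) / M
print(pf, pd, power(theta, alpha, mu0, sigma))
```

The key point, mirrored in the code, is that the threshold depends only on $(\alpha, \mu_0, \sigma)$ and not on $\theta$, which is exactly why this test is UMP for the one-sided problem.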
### Example 2: UMP Testing of Location (ctd.)

- From the figure it is clear that neither test performs well when $\theta$ is outside of its region of optimality; i.e., neither is a UMP test.

### UMP Is Too Strong a Requirement

- The conclusion we draw from this example is that the UMP criterion is too strong for many situations.
- Sometimes this can be overcome by applying other constraints to eliminate unreasonable tests from consideration.
- One such condition is *unbiasedness*. This means that we require
$$P_D(\tilde\delta; \theta) \ge \alpha \quad \text{for all } \theta \in \Lambda_1, \tag{50}$$
in addition to the constraint $P_F(\tilde\delta; \theta) \le \alpha$.
- Note that this requirement would have eliminated both $\tilde\delta_1$ and $\tilde\delta_2$ from consideration (for more details see Lehmann, 1986).

### Locally Most Powerful (LMP) Tests for Composite Hypothesis Testing Problems

- In many applications we have a parameter set of the form $\Lambda = [\theta_0, \infty)$, with $\Lambda_0 = \{\theta_0\}$ and $\Lambda_1 = (\theta_0, \infty)$.
- In this case we have the hypothesis pair
$$H_0: \theta = \theta_0 \quad\text{versus}\quad H_1: \theta > \theta_0. \tag{51}$$
- For example, this happens in signal detection problems in which $\theta_0 = 0$ and $\theta$ is a signal amplitude parameter.
- In many problems of the form (51) we are primarily interested in the case in which, under $H_1$, $\theta$ is near $\theta_0$. If $\theta$ is a signal amplitude parameter, this corresponds to the case when the signal strength is small.
- Now consider a decision rule $\tilde\delta$. Within regularity, we can expand $P_D(\tilde\delta; \theta)$ in a Taylor series about $\theta = \theta_0$:
$$P_D(\tilde\delta; \theta) = P_D(\tilde\delta; \theta_0) + (\theta - \theta_0)\, P_D'(\tilde\delta; \theta_0) + O\!\left((\theta - \theta_0)^2\right), \tag{52}$$
where
$$P_D'(\tilde\delta; \theta_0) \triangleq \frac{\partial}{\partial\theta} P_D(\tilde\delta; \theta)\bigg|_{\theta = \theta_0}. \tag{53}$$
- Note that
$$P_D(\tilde\delta; \theta_0) = P_F(\tilde\delta). \tag{54}$$
- Thus, using (54) in (52), for all size-$\alpha$ tests we can write, for $\theta$ near $\theta_0$ and $\theta > \theta_0$,
$$P_D(\tilde\delta; \theta) \approx \alpha + (\theta - \theta_0)\, P_D'(\tilde\delta; \theta_0). \tag{55}$$
- Thus, for $\theta$ near $\theta_0$ and $\theta > \theta_0$, we can achieve approximately maximum power with size $\alpha$ by choosing $\tilde\delta$ to maximize $P_D'(\tilde\delta; \theta_0)$. Such a test, maximizing $P_D'(\tilde\delta; \theta_0)$ subject to the false-alarm constraint $P_F(\tilde\delta) \le \alpha$, is called an $\alpha$-level locally most powerful (LMP) test, or a locally optimum test.

### General Structure of LMP Tests

- Assuming that $P_\theta$ has density $p_\theta$ for each $\theta \in \Lambda_1$, we have
$$P_D(\tilde\delta; \theta) = E_\theta\{\tilde\delta(Y)\} = \int_\Gamma \tilde\delta(y)\, p_\theta(y)\, \mu(dy). \tag{56}$$
- Again assuming sufficient regularity on $p_\theta$, $\theta \in \Lambda_1$, so that we can interchange the order of integration and differentiation,
$$P_D'(\tilde\delta; \theta_0) = \int_\Gamma \tilde\delta(y)\, \frac{\partial p_\theta(y)}{\partial\theta}\bigg|_{\theta = \theta_0} \mu(dy). \tag{57}$$
- Comparing (57) with the corresponding expression for Neyman–Pearson tests, we see that the $\alpha$-level LMP design problem is the same as the $\alpha$-level Neyman–Pearson design problem with $p_1$ replaced by $\frac{\partial p_\theta}{\partial\theta}\big|_{\theta = \theta_0}$.
- Using this analogy, we can easily show that, within regularity, an $\alpha$-level LMP test for (51) is of the form
$$\tilde\delta(y) = \begin{cases} 1, & \dfrac{\partial p_\theta(y)}{\partial\theta}\Big|_{\theta = \theta_0} > \eta\, p_{\theta_0}(y) \\[4pt] \gamma, & \dfrac{\partial p_\theta(y)}{\partial\theta}\Big|_{\theta = \theta_0} = \eta\, p_{\theta_0}(y) \\[4pt] 0, & \dfrac{\partial p_\theta(y)}{\partial\theta}\Big|_{\theta = \theta_0} < \eta\, p_{\theta_0}(y), \end{cases} \tag{58}$$
where $\eta$ and $\gamma$ are chosen so that $P_F(\tilde\delta) = \alpha$.
- Note: (58) can be proved by following the proof of the Neyman–Pearson lemma, with the maximized quantity being $P_D'(\tilde\delta; \theta_0)$, as required in (55), rather than $P_D(\tilde\delta; \theta)$.
- We will see applications of LMP tests later.

### Generalized Likelihood Ratio Test (GLRT), or Maximum Likelihood Test

- When none of the above-mentioned optimality criteria is applicable for a particular composite hypothesis testing problem in which $\Lambda$ is the union of disjoint $\Lambda_0$ and $\Lambda_1$, a test that is often used is based on comparing the following quantity to a threshold:
$$\frac{\sup_{\theta \in \Lambda_1} p_\theta(y)}{\sup_{\theta \in \Lambda_0} p_\theta(y)}.$$
- This test is known as the generalized likelihood ratio test, or a maximum likelihood test.
- We will later justify this type of test in a more systematic way.
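As a concrete illustration of the GLR statistic, consider again the two-sided Gaussian location problem, for which no UMP test exists. Since $\sup_\theta p_\theta(y)$ is attained at the maximum-likelihood estimate $\hat\theta = y$, the statistic reduces to $\exp\!\big((y-\mu_0)^2/(2\sigma^2)\big)$, a monotone function of $|y-\mu_0|$. The sketch below checks the closed form against a brute-force grid maximization; all parameters and the grid are illustrative.

```python
import math

def glrt_statistic(y, mu0, sigma):
    """GLRT statistic for H0: theta = mu0 vs H1: theta != mu0, Y ~ N(theta, sigma^2).
    The supremum over theta is attained at the ML estimate theta_hat = y,
    giving exp((y - mu0)^2 / (2 sigma^2)), monotone in |y - mu0|."""
    return math.exp((y - mu0) ** 2 / (2 * sigma ** 2))

def glrt_statistic_by_search(y, mu0, sigma, half_width=10.0, K=200001):
    """Same statistic by brute-force maximization of p_theta(y) / p_mu0(y)
    over a fine grid of theta values (illustration only)."""
    def p(theta):
        # common Gaussian normalizer cancels in the ratio
        return math.exp(-(y - theta) ** 2 / (2 * sigma ** 2))
    best = max(p(mu0 - half_width + 2 * half_width * k / (K - 1)) for k in range(K))
    return best / p(mu0)

print(glrt_statistic(1.7, 0.0, 1.0))
print(glrt_statistic_by_search(1.7, 0.0, 1.0))
```

Comparing this statistic to a threshold is equivalent to comparing $|y - \mu_0|$ to a threshold, i.e., the natural two-sided test that neither one-sided UMP candidate provides.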
### Appendix D

- Consider $H_0: Y \sim \mathcal{N}(\mu_0, \sigma^2)$ versus $H_1: Y \sim \mathcal{N}(\theta, \sigma^2)$ with $\theta < \mu_0$. The likelihood ratio is
$$L(y) = \frac{p_\theta(y)}{p_{\mu_0}(y)} = e^{\frac{\theta - \mu_0}{\sigma^2}\left(y - \frac{\theta + \mu_0}{2}\right)}.$$
- Since $\theta - \mu_0 < 0$,
$$L(y) > \eta \iff y \le \eta' \triangleq \frac{\sigma^2 \log\eta}{\theta - \mu_0} + \frac{\theta + \mu_0}{2}.$$
- Hence the most powerful test is
$$\tilde\delta_\theta(y) = \begin{cases} 1, & y \le \eta' \\ 0, & y > \eta'. \end{cases}$$
- The false-alarm probability is $P_F(\tilde\delta_\theta) = P_0(Y \le \eta')$. Since under $H_0$, $Y \sim \mathcal{N}(\mu_0, \sigma^2)$,
$$P_F(\tilde\delta_\theta) = 1 - Q\!\left(\frac{\eta' - \mu_0}{\sigma}\right).$$
- Hence, setting $P_F(\tilde\delta_\theta) = \alpha$,
$$1 - Q\!\left(\frac{\eta' - \mu_0}{\sigma}\right) = \alpha \implies \eta' = \sigma\, Q^{-1}(1 - \alpha) + \mu_0.$$
- Hence the critical region of an $\alpha$-level NP test is
$$\Gamma_\theta = \{y \in \Gamma \mid y < \sigma\, Q^{-1}(1 - \alpha) + \mu_0\}.$$
- The detection probability of this test is
$$P_D(\tilde\delta_2; \theta) = P_\theta(Y \le \eta') = 1 - Q\!\left(Q^{-1}(1 - \alpha) - \frac{\theta - \mu_0}{\sigma}\right).$$

### Next Time

- Deterministic signal detection in discrete time (Chapter III).

## Lecture 02, August 23, Thursday, Fall 2007

### Hypothesis Testing

- Consider an $M$-ary communications receiver. We observe an electrical waveform that consists of one of $M$ possible signals corrupted by random channel or receiver noise. We wish to decide which of the $M$ possible signals is actually present, based on this observation.
- This is an $M$-ary hypothesis testing problem.
- Most signal detection problems can be cast into such an $M$-ary hypothesis testing problem.
### The M-ary Hypothesis Testing Problem

On the basis of an observation (possibly a vector or a function), we wish to decide among $M$ possible statistical situations describing the observations.

### Optimality Criteria for Hypothesis Testing

- Given such a problem, there are a number of possible decision strategies or rules that can be applied. But it makes sense to choose a decision rule that is optimum in some sense.
- There are several meaningful optimality criteria that can be used. The three most commonly used are:
  1. Bayes
  2. Neyman–Pearson
  3. Minimax

### Binary Hypothesis Testing

- In binary ($M = 2$) hypothesis testing, we assume that there are two possible hypotheses, or states of nature, denoted by $H_0$ and $H_1$, corresponding to two possible probability distributions $P_0$ and $P_1$, respectively, on the observation space $(\Gamma, \mathcal{G})$, where $\Gamma$ is the observation set and $\mathcal{G}$ is the set of observation events (a collection of subsets of $\Gamma$).
- We may write this problem as
$$H_0: Y \sim P_0 \quad\text{versus}\quad H_1: Y \sim P_1, \tag{1}$$
where the notation $Y \sim P$ means that $Y$ has distribution $P$.
- Terminology: $H_0$ is the null hypothesis (signal absent); $H_1$ is the alternative hypothesis.

### Decision Rules (Hypothesis Tests)

- A decision rule $\delta$ for $H_0$ versus $H_1$ is any partition of the observation set $\Gamma$ into sets $\Gamma_1 \in \mathcal{G}$ and $\Gamma_0 = \Gamma_1^c$, such that (i) we choose hypothesis $H_1$ as true when $y \in \Gamma_1$, and (ii) we choose hypothesis $H_0$ as true when $y \in \Gamma_0$. $\Gamma_1$ is called the critical region.
- Note that the decision rule $\delta$ can also be thought of as a function defined on $\Gamma$:
$$\delta(y) = \begin{cases} 1, & y \in \Gamma_1 \\ 0, & y \in \Gamma_0; \end{cases} \tag{2}$$
i.e., the value of the function $\delta$ for a given $y \in \Gamma$ is the index of the hypothesis accepted by the decision rule $\delta$.
- Clearly, designing $\delta$ is equivalent to finding the set $\Gamma_1$ (the critical region). Since we would like to choose $\Gamma_1$ (or $\delta$) in some optimum way, we assign costs to our decisions.

### Costs and Conditional Risks

- Costs: $C_{ij}$ is the cost incurred by choosing hypothesis $i$ when in fact hypothesis $j$ is true, for $i = 0, 1$ and $j = 0, 1$. (Note: we will take the $C_{ij}$'s to be nonnegative.)
- Conditional risks: the conditional risk for each hypothesis is the average (expected) cost incurred by decision rule $\delta$ when that hypothesis is true.
- For example, the conditional risk for hypothesis $H_0$ under the decision rule $\delta$ is
$$R_0(\delta) = C_{00}\, P_0(\Gamma_0) + C_{10}\, P_0(\Gamma_1),$$
where $P_j(\Gamma_i)$ denotes the probability of choosing $H_i$ when $H_j$ is true.
- Thus, for $j = 0, 1$, the conditional risk $R_j(\delta)$ in general is
$$R_j(\delta) = C_{0j}\, P_j(\Gamma_0) + C_{1j}\, P_j(\Gamma_1), \quad j = 0, 1. \tag{3}$$

### A Priori Probabilities

- The a priori probability $\pi_j$ is the probability that hypothesis $H_j$ is true, unconditioned on the value of $Y$: $\pi_0$ is the probability of occurrence of hypothesis $H_0$, and $\pi_1$ is the probability of occurrence of hypothesis $H_1$.
- Note that
$$\pi_1 = 1 - \pi_0. \tag{4}$$

### Bayes Risk

- The Bayes (average) risk $r(\delta)$ is the average cost incurred by decision rule $\delta$ for given values of the priors; i.e.,
$$r(\delta) = \pi_0\, R_0(\delta) + \pi_1\, R_1(\delta). \tag{5}$$
Estimation Theory Minimization of Bayes Risk ctd 0 Hence the Bayes optimal decision rule is F1 y E Flp1y Z 11909 13 where the threshold 17 is given by i 750C10 C00 TiTE1C01 C11 14 0 NOTE The region y E Flpl quotcpoy does not contribute to the Bayes risk ie the average error and thus this region can be omitted in whole or in part from Flwithout affecting the risk incurred k J Dr S K Jayarweera Fall 07 39I39HI39 UNIVHLNI39H if NEW MEXICO ECE642 Detection and Estimation Theory f Few Remarks 0 We may rewrite the above Bayes decision rule as r1 yeFli gizr y FLyZI 15 where i 191y My 7 My yeF 16 o The quantity L0 de ned above is known as the likelihood ratio or the likelihood ratio statistic between H0 and H1 In computing Ly we interpret 1 as 0 for any k 2 0 k J Dr S K Jayarweera Fall 07 39I39HI39 UNIVHHI39H if NEW MEXICO ECE642 Detection and Estimation Theory f Few Remarks ctd 0 Thus the above decision rule de ned by the rejection region F1 is known as a likelihoodratio test LRT or a probabilityratio test The likelihoodratio tests play a central role in the theory of hypothesis testing as we will see throughout this course 0 Thus the Bayes decision rule given above computes the likelihood ratio for the observed value of Y and then makes its decision by comparing this ratio to the threshold 17 k J Dr S K Jayarweera Fall 07 39I39HI39 LlNIVliRSI39H quot NEW MEXICO ECE642 Detection and Estimation Theory Bayes Rule for Binary Hypothes1s Testing Bayes optimal decision rule for binary hypothesis testing is 1 if Ly 2 quotC 5 17 BO 0 if My lt T7 where 191 0 Ly i 1900 for yEF and T 750C10 C00 751C01 C11 K Dr S K Jayaweera Fall 07 n 39139le LlN39lRl39l n i NEW MEXICO ECE642 Detection and Estimation Theory f A Special Case Uniform Cost Assignment o A commonly used cost assignment is the uniform cost assignment C 0 if ji 1 18 1 1f j7 i o ie there is no cost for the correct decisions which of course makes sense 0 With this cost function the conditional risks simplify to from 3 1305 
P003 1315 P1Fo 19 k J Dr S K Jayarweera Fall 07 139le LlNIVIiRsI I i v J NEW MEXICO ECE642 Detection and Estimation Theory k Special Case Bayes Risk with Uniform Cost Assignment o The Bayes risk for a decision rule 8 with critical region F1 is given by from 5 TEoRo 751R1 TEoP0F1 751P1F0 Dr S K Jayaweera Fall 07 20 NEW MEXICO 39I39HI39 LIN39IR39I quot ECE642 Detection and Estimation Theory k Dr S K Jayarweera Fall 07 Uniform Cost Assignment Minimum Probability of Error Detection Note that PlFj i probability of choosing H J when H is true Hence if i y j then PlFj is the conditional probability of making an error given that H is true Thus in this case r8 is the average probability of error incurred by the decision rule 8 Since Bayes rule given by the above likelihoodratio test minimizes r8 Bayes rule for the uniform cost assignment is in fact a minimum probability of error decision scheme ie Minimum probability of error optimality criteria can be obtained as a special case of Bayes optimality J 24 139le LIN39IRI39I39 v J NEW MEXICO ECE642 Detection and Estimation Theory k Minimum Probability of Error Decision Rule for Binary Hypothesis Testing 0 Thus minimum probability of error decision rule for binary hypothesis testing is 1 if Ly 2 quotC 530 i 0 1f Lylt I WhereLy 283 fory E Fandr Dr S K Jayaweera Fall 07 39I39HI39 UNIVHHI39H if NEW MEXICO ECE642 Detection and Estimation Theory f a Posteriori Probability 0 Let us denote by 790 the conditional probability that hypothesis H is true given that the random observation Y takes on value y ie TcJy PrHJistrueYy 21 Note that 7 de ned earlier was the unconditional or a priori probability of hypothesis H j 0 Recall the Bayes formula PBAlPAl PBAlPAz P A B 22 39 PltBgt 21PltBIAJgtPltAJgt 0 Using the Bayes formula i PJCVW i MOW 23 W my poltygtnop1ltygtm k J Dr S K Jayarweera Fall 07 39I39HI39 UNIVHHI39H if NEW MEXICO ECE642 Detection and Estimation Theory r Reformulation 0f Bayes Rule Via a Posteriori Probabilities c As 7 is 
called the a priori probability of hypothesis H j the probability 7 y is called the a posteriori probability of the hypothesis given Y y o By using the expression for the a posteriori probability 7 y the critical region of the Bayes rule can be rewritten in a new form from 10 1 1 yEFITE1C11 C01I91y S 750C00 C10I90V yEFI 751y19yC11 C01 S W0I9yCoo C10 Hence 1ll y 6 1ll 7107509 C11751V S 7007500 C01751y 24 k J Dr S K Jayarweera Fall 07 n 39I39HIV LlN39IRI39l n s NEW MEXICO ECE642 Detection and Estimation Theory f Another Interpretation of Bayes Rule 0 Note that according to this new formulation of the Bayes rule given by 24 the optimum decisions are based on the posterior probabilities 750 y and 751 y We can think of the observation process as being a mechanism for updating the prior probabilities of the hypotheses into posterior probabilities via 23 0 Note also that the quantity Cloico y Cl1751 is the average cost incurred by choosing hypothesis H given Yy This quantity is thus called the posterior cost of choosing hypothesis H given the observation y Dr S K Jayaweera Fall 07 39I39HI39 UNIVHHI39H if NEW MEXICO ECE642 Detection and Estimation Theory f Another Interpretation of Bayes Rule ctd o In the new formulation the Bayes rule makes its decisions by choosing the hypothesis that yields the minimum posterior cost 0 Thus Bayes rule for the binary hypothesis testing is in new formulation 8 y 1 ifCiOTEoy C11751y S 70075000 C017c1y B 01fC10750yC11751V gt 70075000 C017c1y where 1 W751 7v 26 JO py and py poyTEo 191 00751 and 19 is the density oij K J Dr S K Jayarweera Fall 07 k 139le LlNI39lRI39l iv 4 NEW MEXICO Special Case with Uniform Costs MAP Decision Rule ECE642 Detection and Estimation Theory N o ReVisiting the uniform cost assignment the minimum probability of error decision rule becomes from 25 8 y 11f7c1y2 70y 01f7r1ylt my ie Bayes Rule or the minimum probability of error decision rule in this case chooses the hypothesis that has the maximum a posteriori 
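The posterior formula (23) makes it easy to check that, under uniform costs, deciding in favor of the larger posterior is the same decision as the likelihood-ratio test with $\tau = \pi_0/\pi_1$. A sketch with illustrative Gaussian densities (all parameter values are hypothetical):

```python
import math

# Gaussian shift example: p_j(y) = N(mu_j, sigma^2), illustrative parameters.
mu0, mu1, sigma, pi0 = 0.0, 1.0, 1.0, 0.6
pi1 = 1 - pi0

def pdf(y, mu):
    return math.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def posterior1(y):
    """pi_1(y) from (23)."""
    return pdf(y, mu1) * pi1 / (pdf(y, mu0) * pi0 + pdf(y, mu1) * pi1)

def map_rule(y):
    """Decide the hypothesis with the larger posterior probability."""
    return 1 if posterior1(y) >= 1 - posterior1(y) else 0

def lrt(y):
    """Likelihood-ratio test (17) with tau = pi0/pi1 (uniform costs)."""
    return 1 if pdf(y, mu1) / pdf(y, mu0) >= pi0 / pi1 else 0

ys = [k / 10 for k in range(-50, 51)]
print(all(map_rule(y) == lrt(y) for y in ys))
```

Both rules put the decision boundary where $\pi_1 p_1(y) = \pi_0 p_0(y)$, i.e., where the posterior equals $1/2$.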
probability MAP of having occurred given Y y 0 Thus it is also known as a MAP decision rule Dr S K Jayarweera Fall 07 I39Hr LINI39IRI39I w 4 NEW MEXICO ECE642 Detection and Estimation Theory f Next Tlme Applications of Bayesian Hypothesis Testing Section 11B k Dr S K Jayaweera Fall 07 3 1 39I39HI39 LlN39lRl39l quot NEW MEXICO ECE642 Detection and Estimation Theory f ECE642 Detection and Estimation Theory Dr Sudharman K Jayaweera Assistant Professor Department of Electrical and Computer Engineering University of New Mexico Lecture 27 December 6th Thursday Fall 2007 k J Dr S K Jayaweera Fall 07 A 139le LlNI39lRI39l iv 4 NEW MEXICO ECE642 Detection and Estimation Theory Dr S K Jayarweera Fall 07 f N Rational Spectra 0 Power spectrum In0 of a random sequence Ynoo is rational if it can be written as no 22 nkcosk0 Y W ful 1 do 22k1dkcosk0 where m and p are positive integers and no w np 010739 dm are real numbers 0 Many random sequences have this type of rational spectra 0 A rational spectra can be rewritten as I w i no 2 1nkeikm eiikm i 2277pn k elm 7k i Neiw Y do 2221dkleik we 22 ldkKeka well2 where Nz 2 7pn k z k and Dz 217 d k z k k J 2 39I39HI39 LlN39lRl39l quot NEW MEXICO ECE642 Detection and Estimation Theory f Rational Spectra 0 Note that p p ZpNZ 2127 2 n k zik 2 n k zp k 67 67 ZZP 2E1 k O quotp 0 Hence ZpNZ is a 2 pth order polynomial Thus it has 2 roots which we denote by 2122 w 22p 0 Then 2 2 ZpNZ anz zk gt NZ npz pHz zk 3 k1 k1 k J Dr S K Jayarweera Fall 07 3 39I39HI39 UNIVHLNI39H u NEW MEXICO ECE642 Detection and Estimation Theory f Rational Spectra 0 But note that 0 Hence the roots of ZpNZ must be in reciprocal pairs ie if 2k is a root then zgl must also be a root 0 Suppose that the roots are ordered such that le Z 22 w 2 22p Then 1 l l 7 5 IZZPI lzll7 3922 ll 227 7 IZP1 lZpl 0 Since 2p 2 zp1 the last equality then implies that lzpl Z 1 6 k Dr S K Jayarweera Fall 07 pk 39I39HI39 LlN39lRl39l quot NEW MEXICO ECE642 Detection and Estimation Theory f Rational 
Spectra 0 From 3 1 2p NZ anipHZ Zk H Z Zk 7 k1 kp1 w p w p 71 f Hi zk 11312 Zkgtlt H 1Zkkl1z 29 Bltz lgtBltzgt 8 where 7 w p 1 32 H 12k 1152 Zk with 21 Z 22 Z Z IZpl Z 1 k Dr S K Jayaweera Fall 07 wk 39I39HI39 LlNIVliRSI39H quot NEW MEXICO ECE642 Detection and Estimation Theory f Rational Spectra 0 Since B z is an order p polynomial of 2 1 it can be expressed as 17 32 2 bkz k 10 k0 o RootsofBz are 1 1 72iwherei w i l P l 21 7 22 7 lZil 22 k Dr S K Jayarweera Fall 07 0k 139le UNIVLRM39I i iv J NEW MEXICO r7 Rational Spectra 0 Similarly we can write DZ AltZ 1Az where 11 dm1m m 71 m 7k AZ m Z pk akz 12 Hk119k k1lt gt IE6 where lpll 2 Ipzl 2 2 Ipml 2 pi 2morder polynomial z D V o RootsofAz arepilwifw where gmlt Pm Dr S K Jayarweera Fall 07 1 ECE642 Detection and Estimation Theory 2 k n 39139le UNIVHHI39H n s NEW MEXICO ECE642 Detection and Estimation Theory f PaleyWiener Condition 0 Let us assume that lZpl gt 1 and lpml gt 1 so that both Az and 32 are stable transfer functions This ensures that DY 0 is bounded from above and bounded away from zero from below 0 The above assumption zp gt 1 and lpml gt 1 in turn ensures that In0 satis es the PaleyWiener condition required for the existence of a spectral factorization 1 TE Elnlog ywdw gt ltgto l3 k Dr S K Jayarweera Fall 07 k 39I39HI39 UNIVHHI39H if NEW MEXICO ECE642 Detection and Estimation Theory f Spectral Factors of Rational Spectra c From 2 8 and 11 a rational spectra can be written as Mm 14gt o It can be shown that 1 both BM A m and 467 are causal and stable transfer functions 6 36 2 both if and 481 are anticausal and stable transfer 6 Be functions 0 Hence required spectral factors of by 0 are DWOJ lt causal 15 PROD IY lt anticausal 16 K J 9 Dr S K Jayaweera Fall 07 39I39HI39 LlNIVliRSI39H quot NEW MEXICO ECE642 Detection and Estimation Theory f Spectral Factors of Ratlonal Spectra ctd 0 Observe also that a Mm w w ar101ltIgtIV03 WGDM2 ltIgt1F032 l FWD is anticausal 1 b W00 1s causal and k J Dr S K Jayaweera Fall 07 
n 39139le UNIVHHI39H n s NEW MEXICO ECE642 Detection and Estimation Theory r Example 1 Prediction of a Widesense Markov Sequence o A covariance stationary observation sequence Yn oo is widesense Markov if its correlation function is of the form of Cyn PrW form 6 Z 17 whereP gt 0 and r lt l 0 Suppose we want to causally predict YHX based on Yn w ie X Ytx 18 0 Optimal causal WienerKolmogorov lter transfer function is A i 1 XY03 W we wen i where H 0 operation means that if H 0 2700 hne i o l then H03l 22 ohne imquot Dr S K Jayarweera Fall 07 1 1 19 39I39HI39 UNIVHLNI39H u NEW MEXICO ECE642 Detection and Estimation Theory Example 2 Filtering of a Widesense Markov Sequence in White Noise 0 Consider the observation sequence given by Yn Sn Nn for n E Z 20 where Sn2 oo and Nnoo are zeromean orthogonal wss sequences 0 NHM is a white sequence with IE 62 Hence CNn 62807 for 176 gt DA0 6 TE 3 0 at o Snl oo is widesense Markov with Cyn Prini for 176 gt 1503 whereP gt 0 and r lt l Dr S K Jayarweera Fall 07 k Dr S K Jayarweera Fall 07 n 39139le LlN39lRl39l ii NEW MEXICO ECE642 Detection and Estimation Theory Example 2 Filtering of a Widesense Markov Sequence in White Noise 0 Suppose we want to causally estimate the signal sequence at time t A X Stx 23 0 Optimal causal WienerKolmogorov lter transfer function is A l H w 7 W 0 24 Y 03 Y 03 00 where H 0 operation means that if H 0 W700 hne im then H03 22 ohne m 39I39HI39 LlN39lRl39l quot NEW MEXICO ECE642 Detection and Estimation Theory f ECE642 Detection and Estimation Theory Dr Sudharman K Jayaweera Assistant Professor Department of Electrical and Computer Engineering University of New Mexico Lecture 26 November 29th Thursday Fall 2007 k J Dr S K Jayaweera Fall 07 A r k 39I39HI39 LlN39lRl39l39 H NEW MEXICO ECE642 Detection and Estimation Theory N Linear Estimation and WienerHopf Equation o In general given the observation sequence Y j we seek an estimate of the form of b t htnYnct 710 where c is chosen to match the means so that b 
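For the covariance (17), carrying the causal filter formula (19) through yields the classical one-tap predictor $\hat Y_{t+\lambda} = r^{\lambda}\, Y_t$, a standard result for wide-sense Markov sequences. The sketch below (assuming NumPy; all parameter values are illustrative) checks this by simulating a Gaussian AR(1) process, which has exactly the covariance (17), and regressing $Y_{t+\lambda}$ on the two most recent samples: the fitted weights should be close to $(r^\lambda, 0)$, i.e., the older sample carries no extra predictive weight.

```python
import numpy as np

rng = np.random.default_rng(0)
r, P, lam, N = 0.8, 1.0, 2, 400000

# AR(1) with innovation variance P(1 - r^2) has C_Y(n) = P r^{|n|}.
w = rng.normal(scale=np.sqrt(P * (1 - r**2)), size=N)
y = np.zeros(N)
for n in range(1, N):
    y[n] = r * y[n - 1] + w[n]

# Regress Y_{t+lam} on (Y_t, Y_{t-1}); least-squares weights approximate
# the optimal linear predictor coefficients over these two taps.
X = np.column_stack([y[1:N - lam], y[0:N - lam - 1]])
target = y[1 + lam:]
coef, *_ = np.linalg.lstsq(X, target, rcond=None)
print(coef)  # expected to be near [r**lam, 0]
```

That the weight on $Y_{t-1}$ vanishes is the wide-sense Markov property at work: given $Y_t$, older samples contribute nothing further to the linear prediction.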
$\hat{X}_t = \sum_{n=a}^{b} h_t(n) Y_n$, and the coefficients $h_t(a), \ldots, h_t(b)$ are determined by the Wiener-Hopf equation

$C_{XY}(t, l) = \sum_{n=a}^{b} h_t(n)\, C_Y(n, l)$ for all $a \le l \le b$. (3)

Wiener-Kolmogorov Filtering

- As in Levinson filtering, Wiener-Kolmogorov filtering problems also assume that the observation sequence $\{Y_n\}$ is wide-sense stationary, i.e., $C_Y(n, l) = C_Y(n - l, 0) \triangleq C_Y(n - l)$ for all $n$ and $l$.
- Furthermore, it also assumes that the observation sequence $\{Y_n\}$ and the sequence $\{X_n\}$ are jointly wide-sense stationary, i.e., $C_{XY}(t, n) = C_{XY}(t - n, 0) \triangleq C_{XY}(t - n)$ for all $t$ and $n$. (4)
- We also assume that both the $\{Y_n\}$ and $\{X_n\}$ sequences are zero-mean. Most importantly, we assume that the number of observations is infinite:
  - Noncausal Wiener-Kolmogorov filtering: $a = -\infty$ and $b = \infty$.
  - Causal Wiener-Kolmogorov filtering: $a = -\infty$ and $b = t$.

Noncausal Wiener-Kolmogorov Filtering

- Here we try to estimate $X_t$ using observations of $Y_n$ for all $-\infty < n < \infty$, i.e., $a = -\infty$ and $b = \infty$. (18)
- Since $h_t(n)$ may not be zero for $n > t$, (18) is called noncausal filtering. When the samples are spatial samples, as in images or antenna arrays, this noncausality is not necessarily a problem.
- The Wiener-Hopf equation for (18) is

$C_{XY}(t, l) = \sum_{n=-\infty}^{\infty} h_t(n)\, C_Y(n, l)$ for all $-\infty < l < \infty$. (6)

Noncausal Wiener-Kolmogorov Filtering: Wiener-Hopf Equation

- Using the wide-sense stationarity assumption, (6) becomes

$C_{XY}(t - l) = \sum_{n=-\infty}^{\infty} h_t(n)\, C_Y(n - l)$ for all $-\infty < l < \infty$. (7)

- Substituting $\tau = t - l$ in (7):

$C_{XY}(\tau) = \sum_{n=-\infty}^{\infty} h_t(n)\, C_Y(n - t + \tau)$ for all $-\infty < \tau < \infty$. (8)

- Substituting $n = t - \alpha$ in (8):

$C_{XY}(\tau) = \sum_{\alpha=-\infty}^{\infty} h_t(t - \alpha)\, C_Y(\tau - \alpha)$ for all $-\infty < \tau < \infty$. (9)

Solution to Wiener-Hopf Equation is Time-Invariant

- From (9) we see that the solution to the Wiener-Hopf equation can be chosen independently of $t$; i.e., an optimum $h_t(n)$ can be chosen such that $h_t(t - \alpha)$ depends only on $\alpha$: $h_t(t - \alpha) \triangleq h(\alpha)$. In other words, if a solution to the Wiener-Hopf equation exists, we can choose it to be time-invariant (shift-invariant), so that $h_t(n) = h(t - n)$.
- With this, the Wiener-Hopf equation (9) becomes the discrete-time convolution of the two sequences $h(n)$ and $C_Y(n)$:

$C_{XY}(\tau) = \sum_{\alpha=-\infty}^{\infty} h(\alpha)\, C_Y(\tau - \alpha)$ for all $-\infty < \tau < \infty$ (10)

$= (h * C_Y)(\tau)$ for all $-\infty < \tau < \infty$. (11)

Discrete-time Fourier Transforms

- Suppose that the following discrete-time Fourier transforms exist:

$\Phi_{XY}(\omega) = \sum_{\tau=-\infty}^{\infty} C_{XY}(\tau)\, e^{-j\omega\tau}$ for $-\pi \le \omega \le \pi$ (cross power spectral density of the sequences $\{X_n\}$ and $\{Y_n\}$), (12)

$H(\omega) = \sum_{n=-\infty}^{\infty} h(n)\, e^{-j\omega n}$ for $-\pi \le \omega \le \pi$ (transfer function of the filter $h(n)$), (13)

$\Phi_Y(\omega) = \sum_{\tau=-\infty}^{\infty} C_Y(\tau)\, e^{-j\omega\tau}$ for $-\pi \le \omega \le \pi$ (power spectral density, or spectrum, of the sequence $\{Y_n\}$). (14)

Noncausal Wiener-Kolmogorov Filtering: Solution to Wiener-Hopf Equation

- With the definitions (12)-(14), the Wiener-Hopf equation (10) can be written in the frequency domain simply as

$\Phi_{XY}(\omega) = H(\omega)\, \Phi_Y(\omega)$ for all $-\pi \le \omega \le \pi$. (15)

- Hence the transfer function of the optimum filter is given by

$H(\omega) = \dfrac{\Phi_{XY}(\omega)}{\Phi_Y(\omega)}$ for all $-\pi \le \omega \le \pi$. (16)

- From (16) we can obtain the optimum filter coefficients as

$h(n) = \dfrac{1}{2\pi} \int_{-\pi}^{\pi} \dfrac{\Phi_{XY}(\omega)}{\Phi_Y(\omega)}\, e^{j\omega n}\, d\omega$ for $-\infty < n < \infty$. (17)
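The frequency-domain solution lends itself to a direct numerical sketch. The example below is an illustrative assumption, not from the notes: it takes the common signal-plus-noise model $Y_n = X_n + N_n$ with an AR(1) signal independent of white noise, so that $\Phi_{XY}(\omega) = \Phi_X(\omega)$ and $\Phi_Y(\omega) = \Phi_X(\omega) + \Phi_N(\omega)$, and approximates the inverse DTFT giving $h(n)$ by an inverse DFT on a frequency grid.

```python
import numpy as np

def noncausal_wiener_filter(phi_xy, phi_y):
    """Transfer function of the optimum noncausal filter, H(w) = Phi_XY(w) / Phi_Y(w)."""
    return phi_xy / phi_y

# Hypothetical model (assumption for illustration): Y_n = X_n + N_n, with
# X_n an AR(1) process independent of the white noise N_n, so that
# Phi_XY = Phi_X and Phi_Y = Phi_X + Phi_N.
M = 512                                      # frequency-grid size
w = 2 * np.pi * np.fft.fftfreq(M)            # radian frequencies covering [-pi, pi)
a = 0.8                                      # AR(1) coefficient, |a| < 1
phi_x = 1.0 / np.abs(1 - a * np.exp(-1j * w)) ** 2   # AR(1) power spectrum
phi_n = 0.5 * np.ones(M)                     # white-noise power spectrum
H = noncausal_wiener_filter(phi_x, phi_x + phi_n)

# Impulse response h(n): the inverse DTFT, approximated by an inverse DFT.
h = np.real(np.fft.ifft(H))

print(H.min() > 0 and H.max() < 1)      # the filter attenuates, never amplifies
print(np.allclose(h[1:], h[1:][::-1]))  # h(n) = h(-n): weights on future samples
```

Because the assumed spectra are real and even in $\omega$, $H(\omega)$ is too, so $h(n)$ comes out symmetric about $n = 0$: the filter weights future observations as heavily as past ones, which is exactly the noncausality discussed above.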
