# Detection and Estimation Theory ECE 642
These 20 pages of class notes were uploaded by Roel Green on Wednesday, September 23, 2015. The notes are for ECE 642 at the University of New Mexico, taught by Sudharman Jayaweera in Fall 2007. Since its upload, the document has received 38 views. For similar materials, see /class/212147/ece-642-university-of-new-mexico in Electrical & Computer Engineering at the University of New Mexico.

Dr. Sudharman K. Jayaweera, Assistant Professor, Department of Electrical and Computer Engineering, University of New Mexico

Lecture 23: Tuesday, November 20th, Fall 2007

## Nonrandom Parameter Estimation: Problems with the MVUE

- Although MVUEs are a natural choice for estimating nonrandom parameters, in practice they may not be easy to find:
  - We may not know the pdf of the data $Y$, or we may not want to assume a model for the data $Y$.
  - It may be difficult to find a complete sufficient statistic, even if we knew the pdf of the data.
- In these situations we have to resort to suboptimal estimators. As long as we can guarantee that the variance of a suboptimal estimator is small enough to satisfy system specifications, it might be good enough.

## Best Linear Unbiased Estimator (BLUE)

- A common approach is to restrict the estimator to be linear, and then to find an estimate that is unbiased and has minimum variance within this class.
- Such an estimator is called the Best Linear Unbiased Estimator (BLUE).
- When it exists, the BLUE can be found with only the knowledge of first- and second-order statistics of the data $Y$; no knowledge of the complete pdf is necessary.

## BLUE Estimation

- However, for some problems we may not be able to find a linear estimator that is unbiased. In those cases we cannot find a BLUE, and it is not an appropriate criterion for finding an estimator at all.
- We may be able to get around this problem by first transforming the data and then finding a BLUE in terms of the transformed data (see p. 134 of S. M. Kay, *Estimation Theory*).
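The data-transformation workaround can be made concrete with a standard noise-power example (my own numerical sketch, in the spirit of the Kay reference; the specific values are illustrative, not from the lecture). For zero-mean Gaussian samples, $E[Y_k] = 0$ for every value of the noise power $\sigma^2$, so no linear function of the data is unbiased for $\sigma^2$; squaring the data yields transformed observations whose mean is $\sigma^2$, and a BLUE in the transformed data exists.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2 = 3.0                                  # true noise power (illustrative)
y = rng.normal(0.0, np.sqrt(sigma2), 100_000)

# E[Y_k] = 0 for every sigma2, so no linear function a^T y can be
# unbiased for sigma2: a linear unbiased estimator does not exist.
# Transforming the data fixes this: z_k = y_k^2 has E[z_k] = sigma2.
z = y ** 2

# The z_k are i.i.d. with a common variance, so the BLUE built on z
# reduces to the sample mean.
sigma2_hat = z.mean()
print(sigma2_hat)                             # close to 3.0
```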
## The BLUE Estimator

- Suppose that our observations are an $n$-vector $Y = (Y_1, Y_2, \ldots, Y_n)^T$, and the parameter $\theta$ to be estimated is related statistically to $Y$; i.e., we have an observation $Y \sim P_\theta$, where the distribution of $Y$ is a member of a class of distributions $\{P_\theta : \theta \in \Lambda\}$ indexed by a real parameter $\theta$ lying in some set $\Lambda$.
- The BLUE criterion restricts the estimator to be linear in the data:

$$\hat{\theta}_{\text{BLUE}}(y) = \sum_{k=1}^{n} a_k y_k = a^T y, \tag{1}$$

where the coefficient vector $a = (a_1, a_2, \ldots, a_n)^T$ is to be determined.
- We find $a$ such that the estimator $\hat{\theta}_{\text{BLUE}}(y)$ in (1) is unbiased for $\theta$ and has the minimum variance among all linear unbiased estimators.

## Unbiasedness Requirement of the BLUE Estimator

- Unbiasedness of $\hat{\theta}_{\text{BLUE}}(y)$ requires

$$E\!\left[\hat{\theta}_{\text{BLUE}}(Y)\right] = \theta \quad \text{for all } \theta \in \Lambda. \tag{2}$$

- From (1), this requires that

$$E\!\left[a^T Y\right] = a^T E[Y] = \theta \quad \text{for all } \theta \in \Lambda. \tag{3}$$

- Clearly, for (3) to be satisfied, the data vector $Y$ should satisfy the condition

$$E[Y] = s\theta \quad \text{for all } \theta \in \Lambda, \tag{4}$$

where $s = (s_1, s_2, \ldots, s_n)^T$ is assumed to be known.

## Variance of the BLUE Estimator

- With the assumption (4), the unbiasedness criterion (3) is equivalent to the following normalization condition on the coefficient vector $a$:

$$a^T s = 1. \tag{5}$$

- The variance of the BLUE estimator (1) is given by

$$\operatorname{Var}\!\left(\hat{\theta}_{\text{BLUE}}(Y)\right) = E\!\left[\left(\hat{\theta}_{\text{BLUE}}(Y) - E\!\left[\hat{\theta}_{\text{BLUE}}(Y)\right]\right)^2\right] = E\!\left[\left(a^T Y - a^T E[Y]\right)^2\right] = E\!\left[a^T \left(Y - E[Y]\right)\left(Y - E[Y]\right)^T a\right] = a^T \operatorname{Cov}(Y)\, a = a^T \Sigma_Y a, \tag{6}$$

where $\operatorname{Cov}(Y) = \Sigma_Y$ is the covariance matrix of the data $Y$.

## Derivation of the BLUE Estimator

- From (5) and (6), finding the BLUE is equivalent to finding the coefficient vector $a$ that solves the constrained optimization problem

$$\arg\min_{a} \; a^T \Sigma_Y a \quad \text{subject to } a^T s = 1. \tag{7}$$

- The corresponding Lagrangian function $L$ is

$$L = a^T \Sigma_Y a - \lambda\left(a^T s - 1\right), \tag{8}$$

where $\lambda$ is the Lagrange multiplier.
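As a numerical sanity check on this formulation, the sketch below (the specific $s$ and $\Sigma_Y$ are made-up illustrative values of my own) computes the minimizer of $a^T \Sigma_Y a$ subject to $a^T s = 1$, which the derivation on the following slides shows to be $a = \Sigma_Y^{-1} s / (s^T \Sigma_Y^{-1} s)$, and confirms the normalization constraint and the achieved variance.

```python
import numpy as np

# Illustrative known quantities (not from the lecture): E[Y] = s*theta,
# Cov(Y) = Sigma_Y with independent components of unequal variance.
s = np.array([1.0, 1.0, 1.0])
Sigma_Y = np.diag([1.0, 4.0, 9.0])

# Minimizer of a^T Sigma_Y a subject to a^T s = 1:
#   a = Sigma_Y^{-1} s / (s^T Sigma_Y^{-1} s)
Sinv_s = np.linalg.solve(Sigma_Y, s)
a = Sinv_s / (s @ Sinv_s)

print(a @ s)             # normalization constraint: ~ 1.0
print(a @ Sigma_Y @ a)   # achieved variance: 1/(s^T Sigma_Y^{-1} s) = 36/49
```

Note how the less noisy components of the data (smaller diagonal entries of $\Sigma_Y$) receive larger weights in $a$.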
## Derivation of the BLUE Estimator (cont'd)

- Setting the gradient of $L$ with respect to $a$ to zero gives

$$\frac{\partial L}{\partial a} = 2\Sigma_Y a - \lambda s = 0 \;\Longrightarrow\; a = \frac{\lambda}{2}\,\Sigma_Y^{-1} s. \tag{9}$$

- We can find the Lagrange multiplier $\lambda$ by substituting (9) into the constraint in (7):

$$\frac{\lambda}{2} = \frac{1}{s^T \Sigma_Y^{-1} s}. \tag{10}$$

- Back-substituting (10) into (9) gives the optimal coefficient vector

$$a = \frac{\Sigma_Y^{-1} s}{s^T \Sigma_Y^{-1} s}. \tag{11}$$

- You may verify that this is indeed the global minimizer by using the positive-definiteness of the covariance matrix $\Sigma_Y$.

## The BLUE Estimator

- Using the optimal coefficient vector $a$ of (11) in (1), the BLUE estimator can be written as

$$\hat{\theta}_{\text{BLUE}}(y) = \frac{s^T \Sigma_Y^{-1} y}{s^T \Sigma_Y^{-1} s}. \tag{12}$$

- Note that the determination of the BLUE estimator only requires knowledge of $s$ (i.e., the mean of the data $Y$) and knowledge of the covariance matrix $\Sigma_Y$ of the data $Y$.

## Minimum Variance Achieved by the BLUE Estimator

- The minimum variance among all linear unbiased estimators, achieved by the BLUE estimator, can be obtained by substituting (11) into (6):

$$\operatorname{Var}\!\left(\hat{\theta}_{\text{BLUE}}(Y)\right) = a^T \Sigma_Y a = \frac{s^T \Sigma_Y^{-1}\, \Sigma_Y\, \Sigma_Y^{-1} s}{\left(s^T \Sigma_Y^{-1} s\right)^2} = \frac{1}{s^T \Sigma_Y^{-1} s}. \tag{13}$$

## Extension to Vector Parameters: BLUE Estimator of a Parameter Vector

- We can easily generalize the above BLUE estimator to the case of an $m$-dimensional parameter vector $\theta$ to be estimated:

$$\hat{\theta}_{\text{BLUE}}(y) = A y, \tag{14}$$

where $A$ is an $m \times n$ coefficient matrix to be determined.
- Note that (14) implies that the BLUE estimator of the $i$th component of the parameter vector $\theta$ is of the form

$$\hat{\theta}_{i,\text{BLUE}}(y) = a_i^T y = \sum_{k=1}^{n} a_{ik}\, y_k, \tag{15}$$

where $a_i = (a_{i1}, a_{i2}, \ldots, a_{in})^T$ is the $i$th row of the matrix $A$.

## Unbiasedness of the BLUE Estimator of a Parameter Vector

- Unbiasedness of $\hat{\theta}_{\text{BLUE}}(y)$ requires

$$E\!\left[\hat{\theta}_{\text{BLUE}}(Y)\right] = A\, E[Y] = \theta \quad \text{for all } \theta \in \Lambda. \tag{16}$$

- Again, for (16) to be satisfied, the data vector $Y$ should satisfy the condition

$$E[Y] = S\theta \quad \text{for all } \theta \in \Lambda, \tag{17}$$

where $S = (s_1, s_2, \ldots, s_m)$ is an $n \times m$ matrix that is assumed to be known.
- Substitution of (17) into (16) shows the unbiasedness criterion to be equivalent to the following condition on $A$:

$$AS = I, \tag{18}$$

i.e.,

$$a_i^T s_j = \delta_{ij} \quad \text{for } i = 1, \ldots, m \text{ and } j = 1, \ldots, m. \tag{19}$$

## Variance of the BLUE Estimator of a Parameter Vector

- The variance of the BLUE estimator $\hat{\theta}_{i,\text{BLUE}}(y)$ of the $i$th component of the parameter vector $\theta$ is

$$\operatorname{Var}\!\left(\hat{\theta}_{i,\text{BLUE}}(Y)\right) = E\!\left[\left(\hat{\theta}_{i,\text{BLUE}}(Y) - E\!\left[\hat{\theta}_{i,\text{BLUE}}(Y)\right]\right)^2\right] = E\!\left[\left(a_i^T Y - a_i^T E[Y]\right)^2\right] = a_i^T \Sigma_Y a_i, \tag{20}$$

where $\operatorname{Cov}(Y) = \Sigma_Y$ is again the covariance matrix of the data $Y$.
- From (19) and (20), it is clear that what we have is $m$ uncoupled BLUE-estimator design problems.

## The BLUE Estimator of a Vector Parameter

- Following the same procedure as previously, we can show that the BLUE estimator of a vector parameter $\theta$ is given by

$$\hat{\theta}_{\text{BLUE}}(y) = \left(S^T \Sigma_Y^{-1} S\right)^{-1} S^T \Sigma_Y^{-1} y. \tag{21}$$

- Note that, again, the determination of the BLUE estimator only requires first- and second-order statistics of the data: knowledge of $S$ (i.e., the mean of the data $Y$) and knowledge of the covariance matrix $\Sigma_Y$ of the data $Y$.

## Minimum Variance Achieved by the BLUE Estimator of a Vector Parameter

- The covariance matrix of the above BLUE estimator can be shown to be

$$\operatorname{Cov}\!\left(\hat{\theta}_{\text{BLUE}}(Y)\right) = \left(S^T \Sigma_Y^{-1} S\right)^{-1}. \tag{22}$$

- I.e., the variance of the $i$th component of the BLUE estimator is

$$\operatorname{Var}\!\left(\hat{\theta}_{i,\text{BLUE}}(Y)\right) = \left[\left(S^T \Sigma_Y^{-1} S\right)^{-1}\right]_{ii}. \tag{23}$$

## Gauss-Markov Theorem

- Suppose that the data are generated by the general linear model

$$Y = H\theta + N, \tag{24}$$

where $H$ is a known $n \times m$ matrix, $\theta$ is an $m$-vector of parameters to be estimated, and $N$ is an $n$-vector of noise with zero mean and covariance matrix $C_N$. Then the BLUE estimate $\hat{\theta}_{\text{BLUE}}(y)$ of $\theta$ is given by

$$\hat{\theta}_{\text{BLUE}}(y) = \left(H^T C_N^{-1} H\right)^{-1} H^T C_N^{-1} y, \tag{25}$$

and the minimum variance of the $i$th component of the estimator is

$$\operatorname{Var}\!\left(\hat{\theta}_{i,\text{BLUE}}(Y)\right) = \left[\left(H^T C_N^{-1} H\right)^{-1}\right]_{ii}. \tag{26}$$

- Moreover, the covariance matrix of the BLUE estimator $\hat{\theta}_{\text{BLUE}}(y)$ of $\theta$ is

$$\operatorname{Cov}\!\left(\hat{\theta}_{\text{BLUE}}(Y)\right) = \left(H^T C_N^{-1} H\right)^{-1}. \tag{27}$$

- PROOF: Straightforward.

## References

- S. M. Kay, *Estimation Theory*, Chapter 6.

## Next Time

- Maximum Likelihood Estimation (Section IV.D).
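As a closing numerical check, the Gauss-Markov estimator $\hat{\theta} = (H^T C_N^{-1} H)^{-1} H^T C_N^{-1} y$ can be sketched as follows (a minimal NumPy sketch of my own; $H$, $C_N$, and $\theta$ are arbitrary illustrative values, not from the lecture). With noiseless data $y = H\theta$, the estimate recovers $\theta$ exactly, since the estimator is unbiased and linear in $y$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative general linear model Y = H @ theta + N.
n, m = 6, 2
H = rng.standard_normal((n, m))               # known n x m model matrix
C_N = np.diag(rng.uniform(0.5, 2.0, size=n))  # noise covariance matrix
theta = np.array([2.0, -1.0])                 # true parameter vector

def blue(y):
    """Gauss-Markov / BLUE estimate: (H^T C_N^{-1} H)^{-1} H^T C_N^{-1} y."""
    Cinv_H = np.linalg.solve(C_N, H)          # C_N^{-1} H
    return np.linalg.solve(H.T @ Cinv_H, Cinv_H.T @ y)

# Sanity check: with noiseless data y = H @ theta, the estimate is exact.
theta_hat = blue(H @ theta)
print(theta_hat)                              # recovers [2., -1.]

# Covariance of the estimator: (H^T C_N^{-1} H)^{-1}
Cov = np.linalg.inv(H.T @ np.linalg.solve(C_N, H))
```

Using `np.linalg.solve` instead of forming explicit inverses is the numerically preferable way to evaluate these expressions.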
