
# ST Found Eng Electromagnetics ECE 595

UNM


These 79-page class notes were uploaded by Roel Green on Wednesday, September 23, 2015. The notes belong to ECE 595 at the University of New Mexico, taught by Sudharman Jayaweera in Fall. Since upload, they have received 11 views. For similar materials see /class/212153/ece-595-university-of-new-mexico in Engineering Electrical & Compu at the University of New Mexico.


## ECE 595 Multiuser Communications: Lecture 10

**Dr. Sudharman K. Jayaweera**, Assistant Professor, Department of Electrical and Computer Engineering, University of New Mexico. Tuesday, November 6, Fall 2007.

### Linear Multiuser Detection and Equalization: Outline

- Performance of the linear MUDs
- Constrained minimum mean output energy (MMOE) detectors

### Linear Multiuser Detectors: Synchronous Case

1. Extract the sufficient statistic from the observation as the output of a bank of $K$ matched filters, each matched to a particular user's signature waveform: $\mathbf{y}(i) = [y_1(i), \ldots, y_K(i)]^T$.
2. Apply a linear transformation $L$ to the sufficient statistic $\mathbf{y}(i)$: $\mathbf{z}(i) = L\,\mathbf{y}(i)$.
3. Quantize the real-valued transformation output $\mathbf{z}(i)$ to get the symbol estimates: $\hat{b}_k(i) = \operatorname{sgn}(z_k(i)) = \operatorname{sgn}\left([L\,\mathbf{y}(i)]_k\right)$.

### Linear Multiuser Detectors in the Synchronous Case (ctd.)

The transformation matrix $L$ is determined by the optimality criterion:

1. Conventional detector: $L = I$
2. Zero-forcing (decorrelating) detector: $L = R^{-1}$
3. MMSE detector: $L = \left(R + \sigma^2 A^{-2}\right)^{-1}$

### Linear Multiuser Detectors: Asynchronous Multiple-access Channel

- Recall that a $K$-user, $B = 2M+1$-symbol-block asynchronous channel can be considered as a $K(2M+1)$ virtual-user system.
- We define these virtual users first by sorting by symbol number and then by user number, i.e., $b_{(i+M)K+k} = b_k(i)$ for $i = -M, \ldots, M$ and $k = 1, 2, \ldots, K$.
- Recall that the signalling waveform of the $j$th virtual user, for $j = (i+M)K+1, (i+M)K+2, \ldots, (i+M)K+K$, is $v_j(t) = v_{(i+M)K+k}(t) = s_k(t - iT - \tau_k)$.

### Linear Multiuser Detectors: Asynchronous Multiple-access Channel (ctd.)

- A sufficient statistic for detecting $\mathbf{b}$ can be written as $\bar{\mathbf{y}} = \bar{R}\bar{A}\bar{\mathbf{b}} + \bar{\mathbf{n}}$, where $\bar{\mathbf{n}} \sim \mathcal{N}\left(\mathbf{0}, \sigma^2 \bar{R}\right)$ and $\bar{R}$, $\bar{A}$ are the cross-correlation and amplitude matrices of the virtual-user system.
- We may, in theory, design linear detectors as before based on the above model.
- Of course, it could be prohibitively expensive.

### Decorrelator for the Asynchronous Multiple-access Channel

- When $v_j(t)$ is assumed to be nonzero only in $[iT + \tau_k,\ iT + \tau_k + T]$, which is the interval of the $b_k(i)$ symbol, recall that the matched-filter bank output at time $i$ can be given as
$$\mathbf{y}(i) = R^T[1]\,A\,\mathbf{b}(i-1) + R[0]\,A\,\mathbf{b}(i) + R[1]\,A\,\mathbf{b}(i+1) + \mathbf{n}(i).$$
- The above can be considered as a channel with transfer function $S(z) = R^T[1]\,z + R[0] + R[1]\,z^{-1}$.
- A decorrelating multiuser detector in this case is simply given by $S^{-1}(z) = \left[R^T[1]\,z + R[0] + R[1]\,z^{-1}\right]^{-1}$.

### Linear Multiuser Detectors: Asynchronous Case

- In general, we may write the matched-filter bank output at time $i$ as $\mathbf{y}(i) = [y_1(i), y_2(i), \ldots, y_K(i)]^T$, with $\mathbf{y}(i) = H(z)\,\mathbf{b}(i) + \mathbf{n}(i)$.
- Note that $H(z)$ denotes the transfer function of a multiple-input multiple-output (MIMO) linear time-invariant system.
- The decorrelator (the zero-forcing equalizer) in this case is given by $\hat{b}_k(i) = \operatorname{sgn}\left([H^{-1}(z)\,\mathbf{y}(i)]_k\right)$.
- Similarly, an MMSE detector can also be written. However, the system $H(z)$ does not have a nice structure. Hence a more practical approach is to apply the above ideas over a long window that spans several bits (see the textbook for details).

### Remarks on Linear Multiuser Detectors

In order to demodulate user $k$:

- The conventional detector requires only the knowledge of $f_k$.
- The decorrelator requires the knowledge of $f_1, f_2, \ldots, f_K$.
- The MMSE detector requires the knowledge of $f_1, f_2, \ldots, f_K$ and $\sigma^2$.
- However, the MMSE detector can be sought adaptively via LMS/RLS-type methods; both the decorrelator and the MMSE detector can be obtained adaptively via subspace methods, as we will see later.
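As a concrete illustration of the three synchronous detectors above, the following sketch applies $L = I$, $L = R^{-1}$, and $L = (R + \sigma^2 A^{-2})^{-1}$ to the matched-filter output $\mathbf{y} = RA\mathbf{b}$ for a two-user channel. The cross-correlation, amplitudes, noise variance, and bits are illustrative values, not taken from the lecture, and the noise term is omitted so the near-far effect is easy to see.

```python
def mat_inv_2x2(M):
    """Invert a 2x2 matrix given as [[a, b], [c, d]]."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def sgn(x):
    return 1 if x >= 0 else -1

rho = 0.7                        # signature cross-correlation (illustrative)
R = [[1.0, rho], [rho, 1.0]]     # normalized cross-correlation matrix
A = [2.0, 1.0]                   # user amplitudes: user 1 is the strong user
sigma2 = 0.2                     # noise variance

b = [1, -1]                      # transmitted bits
Ab = [A[k] * b[k] for k in range(2)]
y = mat_vec(R, Ab)               # matched-filter bank output y = R A b

# 1. Conventional detector: L = I
b_conv = [sgn(yk) for yk in y]

# 2. Decorrelator (zero-forcing): L = R^{-1}
b_dec = [sgn(z) for z in mat_vec(mat_inv_2x2(R), y)]

# 3. MMSE detector: L = (R + sigma^2 A^{-2})^{-1}
R_mmse = [[R[i][j] + (sigma2 / A[i] ** 2 if i == j else 0.0) for j in range(2)]
          for i in range(2)]
b_mmse = [sgn(z) for z in mat_vec(mat_inv_2x2(R_mmse), y)]
```

With these values the conventional detector misdetects the weak user's bit (the near-far problem), while both the decorrelator and the MMSE detector recover $[1, -1]$ correctly.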
### BER Performance Comparison of Linear Multiuser Detectors

- Under various conditions, the decorrelator and MMSE detectors satisfy $P_e \approx Q\left(\sqrt{\mathrm{SINR}_O}\right)$, where $\mathrm{SINR}_O$ is the output signal-to-interference-plus-noise ratio.
- The above expression is in fact exact for the decorrelator.
- The MMSE detector maximizes $\mathrm{SINR}_O$ over all linear detectors. Hence we would expect the MMSE detector to have a lower error probability than the decorrelator.
- Although this is not always true, due to the non-Gaussian nature of the noise at the output of the MMSE detector, it can be proven to be true under some reasonable conditions.

### BER Comparison of Linear Multiuser Detectors (ctd.)

- Since the matched filter can be near-far limited, we may expect it to underperform both the MMSE and decorrelating detectors under typical conditions.
- Again, this is usually the case, though not always.
- In fact, none of the three detectors (matched filter, decorrelator, and MMSE detector) outperforms the others uniformly in terms of probability of error, but the MMSE detector generally has the best performance under most conditions.

[Figure: BER comparison of the conventional and decorrelating multiuser detectors.]

[Figure: BER comparison of the conventional and MMSE multiuser detectors.]

[Figure: Bit-error probabilities of the MMSE detector (solid line) and the conventional detector (dashed line) versus the number of active users $K$ (30 to 90), with perfect power control, SNR = 10 dB, and length-127 signature sequences.]
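The approximation $P_e \approx Q(\sqrt{\mathrm{SINR}_O})$ above is easy to evaluate numerically, since $Q(x) = \tfrac{1}{2}\operatorname{erfc}(x/\sqrt{2})$ is available through the standard library. The SINR values below are illustrative, not taken from the lecture's plots.

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Error-probability estimate P_e ~ Q(sqrt(SINR_O)) at a few output SINRs
for sinr_db in (0, 5, 10):
    sinr = 10.0 ** (sinr_db / 10.0)      # dB -> linear
    print(f"SINR_O = {sinr_db:2d} dB  ->  P_e ~ {q_func(math.sqrt(sinr)):.2e}")
```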
### Performance of Linear MUDs: Asymptotic Multiuser Efficiency

- The asymptotic multiuser efficiency (AME) of the decorrelator and the MMSE detector can be shown to be equal.
- Note that the AME of the matched filter (conventional detector) depends on the user amplitudes, and it can be zero.

[Figure: AME of the linear detectors. Note: the AME of the MMSE detector is the same as that of the decorrelator.]

### Performance of Linear MUDs: Near-far Resistance

- Near-far resistance is the minimum AME over all amplitudes.
- Since the AME of the decorrelator and the MMSE detector does not depend on the amplitudes, the near-far resistance of these two detectors is the same as their AME, i.e., the near-far resistance is $\bar{\eta}_k = 1/[R^{-1}]_{kk}$. We have already seen that this is in fact the optimal near-far resistance.
- However, the near-far resistance of the conventional detector is zero.

### A Closer Look at the MMSE Detector

- Recall that the linear MMSE detector quantizes the continuous symbol estimate $\int_{-\infty}^{\infty} w_k(t)\, r(t)\,dt$, where $w_k(t)$ is chosen so that it minimizes the mean square error (MSE)
$$\mathrm{MSE} = \mathbb{E}\left[\left(b_k(i) - \int w_k(t)\, r(t)\,dt\right)^2\right].$$
- Let us assume independent and zero-mean user bits, i.e., $\mathbb{E}[b_k(i)\, b_l(j)] = 0$ for $(k, i) \neq (l, j)$, and $\mathbb{E}[b_k(i)] = 0$.

### A Closer Look at the MMSE Detector (ctd.)

- Then it is easy to see that
$$\mathrm{MSE} = 1 - 2\int w_k(t)\, f_k(t - iT)\,dt + \mathbb{E}\left[\left(\int w_k(t)\, r(t)\,dt\right)^2\right], \qquad (1)$$
where we have used the fact that $\mathbb{E}[b_k(i)\, r(t)] = f_k(t - iT)$.
- The linear MMSE multiuser detector finds the $w_k(t)$ that minimizes the MSE in (1).
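For the two-user case, the near-far resistance $1/[R^{-1}]_{kk}$ quoted above has a simple closed form. With unit-energy signatures and cross-correlation $\rho$, the diagonal entry of $R^{-1}$ is $1/(1-\rho^2)$, so the near-far resistance reduces to $1 - \rho^2$: full resistance for orthogonal signatures, vanishing as the signatures become collinear. A tiny sketch (the $\rho$ values are illustrative):

```python
def near_far_resistance(rho):
    """1 / [R^{-1}]_{kk} for R = [[1, rho], [rho, 1]] (two-user case)."""
    # For this R, [R^{-1}]_{kk} = 1 / (1 - rho^2), so the resistance is 1 - rho^2
    r_inv_kk = 1.0 / (1.0 - rho ** 2)
    return 1.0 / r_inv_kk

print(near_far_resistance(0.0))   # orthogonal signatures: full resistance, 1.0
print(near_far_resistance(0.7))   # strongly correlated signatures: ~0.51
```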
### Solution to the Minimum Mean Squared Error Design

- Suppose that $\bar{w}_k(t)$ denotes this minimizing choice of $w_k(t)$.
- Let
$$\alpha = \int \bar{w}_k(t)\, f_k(t - iT)\,dt > 0. \qquad (2)$$
- Otherwise, choosing $w_k(t) = -\bar{w}_k(t)$ would make the LHS of (2) equal to $-\alpha > 0$, in turn making the second term of (1) negative without changing the other two terms, thereby resulting in a lower MSE and contradicting the optimality of $\bar{w}_k(t)$.

### Constrained Minimum Mean Output Energy (MMOE) Detector

- Note that this same $\bar{w}_k(t)$ also minimizes the mean output energy (MOE)
$$\mathrm{MOE} = \mathbb{E}\left[\left(\int w_k(t)\, r(t)\,dt\right)^2\right]$$
subject to the constraint $\int w_k(t)\, f_k(t - iT)\,dt = \alpha$. Without the constraint, the minimum MOE is zero (set $w_k(t) \equiv 0$).
- i.e., $\bar{w}_k(t)$ can be obtained by solving the constrained optimization problem
$$\arg\min_{w_k} \mathbb{E}\left[\left(\int w_k(t)\, r(t)\,dt\right)^2\right] \quad \text{subject to} \quad \int w_k(t)\, f_k(t - iT)\,dt = \alpha. \qquad (3)$$
- This formulation of the MMSE detector leads to an alternative implementation that is especially useful in adaptive filter design.

### Constrained Minimum Mean Output Energy (MMOE)

- More generally, we can say that minimizing the MOE in (3) subject to any other constraint, e.g., $\int w_k(t)\, f_k(t - iT)\,dt = \beta > 0$, is equivalent to minimizing $\mathbb{E}\left[\left(\gamma\, b_k(i) - \int w_k(t)\, r(t)\,dt\right)^2\right]$ for some $\gamma > 0$.
- To see this, suppose that the MSE in (1) is minimized by $w_k(t) = \bar{w}_k(t)$. Then
$$\mathrm{MSE}_{\min} = 1 - 2\int \bar{w}_k(t)\, f_k(t - iT)\,dt + \mathrm{MOE}.$$

### Constrained Minimum Mean Output Energy Detector

- Consider the following optimization problem for $\gamma > 0$:
$$\arg\min_{w_k} \mathbb{E}\left[\left(\gamma\, b_k(i) - \int w_k(t)\, r(t)\,dt\right)^2\right] = \arg\min_{w_k}\left\{\gamma^2 - 2\gamma \int w_k(t)\, f_k(t - iT)\,dt + \mathbb{E}\left[\left(\int w_k(t)\, r(t)\,dt\right)^2\right]\right\}.$$
- Set $h_k(t) = w_k(t)/\gamma$. Then the problem is equivalent to
$$\arg\min_{h_k}\, \gamma^2\left\{1 - 2\int h_k(t)\, f_k(t - iT)\,dt + \mathbb{E}\left[\left(\int h_k(t)\, r(t)\,dt\right)^2\right]\right\}. \qquad (4)$$
### Constrained Minimum Mean Output Energy Detector (ctd.)

- But (4) is exactly the same as (1), up to the scale factor $\gamma^2$. Hence the solution to (4) is given by $\bar{h}_k(t) = \bar{w}_k(t)$, and problem (4) is equivalent to the constrained optimization problem
$$\text{minimize } \mathbb{E}\left[\left(\int h_k(t)\, r(t)\,dt\right)^2\right] \quad \text{subject to} \quad \int h_k(t)\, f_k(t - iT)\,dt = \alpha.$$
- But the above constraint on $h_k(t)$ is equivalent to $\int w_k(t)\, f_k(t - iT)\,dt = \gamma\alpha$. Therefore, choose $\gamma = \beta/\alpha$.

### Constrained Minimum Mean Output Energy Detector (ctd.)

- Hence the problem is equivalent to
$$\text{minimize } \mathbb{E}\left[\left(\int w_k(t)\, r(t)\,dt\right)^2\right] \quad \text{subject to} \quad \int w_k(t)\, f_k(t - iT)\,dt = \gamma\alpha. \qquad (5)$$
- Hence, if we want to solve the problem: minimize $\mathbb{E}\left[\left(\int w_k(t)\, r(t)\,dt\right)^2\right]$ subject to $\int w_k(t)\, f_k(t - iT)\,dt = \beta$, then we should choose $\gamma = \beta/\alpha$ so that (5) indeed becomes: minimize $\mathbb{E}\left[\left(\int w_k(t)\, r(t)\,dt\right)^2\right]$ subject to $\int w_k(t)\, f_k(t - iT)\,dt = \beta$.

### Constrained MMOE (ctd.)

- Thus, changing $\beta$ (and thus $\gamma$) changes only the scale of the solution, i.e., it just multiplies $\bar{w}_k(t)$ by a positive constant. This does not affect the detector, since it uses only the sign of the decision statistic. Thus all such detectors, including the MMSE detector, are in fact equivalent.

### Output Signal-to-Interference-Plus-Noise Ratio (SINR_O)

- Recall the received signal model
$$r(t) = \sum_{k=1}^{K}\sum_{j=0}^{B-1} b_k(j)\, f_k(t - jT) + n(t) = b_k(i)\, f_k(t - iT) + \text{interference} + \text{noise}.$$
- Consider the linear detector $\int_{-\infty}^{\infty} w_k(t)\, r(t)\,dt$.
- Then the SINR at the output of the linear detector is
$$\mathrm{SINR}_O = \frac{\left(\int_{-\infty}^{\infty} w_k(t)\, f_k(t - iT)\,dt\right)^2}{\mathbb{E}\left[\left(\int_{-\infty}^{\infty} w_k(t)\left(r(t) - b_k(i)\, f_k(t - iT)\right)dt\right)^2\right]}. \qquad (6)$$

### SINR_O (ctd.)

- Note from (6) that $\mathrm{SINR}_O$ is invariant to scaling of $w_k(t)$.
- Writing $X = \int w_k(t)\, r(t)\,dt$ and $c = \int w_k(t)\, f_k(t - iT)\,dt$, the denominator of (6) is
$$\mathbb{E}\left[\left(X - b_k(i)\, c\right)^2\right] = \mathbb{E}[X^2] - 2c\,\mathbb{E}[b_k(i)\, X] + c^2\,\mathbb{E}[b_k^2(i)] = \mathrm{MOE} - c^2,$$
using $\mathbb{E}[b_k(i)\, X] = c$ and $\mathbb{E}[b_k^2(i)] = 1$, where
$$\mathrm{MOE} = \mathbb{E}\left[\left(\int w_k(t)\, r(t)\,dt\right)^2\right].$$
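The decomposition behind the last step, $\mathrm{MOE} = c^2 + (\text{interference} + \text{noise power})$, and the scale invariance of $\mathrm{SINR}_O$ can both be checked numerically on a discrete two-user model $r = b_1 f_1 + b_2 f_2 + n$ (amplitudes absorbed into $f_k$, $\mathbb{E}[b_k^2] = 1$). The chip-level vectors and detector below are illustrative.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

f1 = [1.0, 1.0, 1.0, 1.0]        # effective signature of the desired user
f2 = [1.0, -1.0, 1.0, -1.0]      # effective signature of the interferer
sigma2 = 0.25                     # noise variance per chip

def sinr(w):
    """Output SINR of the linear detector w for user 1."""
    c = dot(w, f1)                                        # desired-signal gain
    interference_plus_noise = dot(w, f2) ** 2 + sigma2 * dot(w, w)
    moe = c ** 2 + interference_plus_noise                # E[(w^T r)^2]
    # the identity used in the notes: denominator of (6) equals MOE - c^2
    assert abs((moe - c ** 2) - interference_plus_noise) < 1e-12
    return c ** 2 / (moe - c ** 2)

w = [0.9, 0.3, 0.8, 0.1]
# SINR_O is invariant to scaling of w
assert abs(sinr(w) - sinr([2.0 * wi for wi in w])) < 1e-9
```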
### SINR_O (ctd.)

- Hence $\mathrm{SINR}_O$ can also be written in the form
$$\mathrm{SINR}_O = \frac{\left(\int_{-\infty}^{\infty} w_k(t)\, f_k(t - iT)\,dt\right)^2}{\mathrm{MOE} - \left(\int_{-\infty}^{\infty} w_k(t)\, f_k(t - iT)\,dt\right)^2},$$
where, as before, $\mathrm{MOE} = \mathbb{E}\left[\left(\int w_k(t)\, r(t)\,dt\right)^2\right]$.

### Constrained MMOE MUDs are Maximum-Output-SINR MUDs

- Now suppose we minimize the MOE, defined as above, subject to the constraint $\int f_k(t - iT)\, w_k(t)\,dt = \beta$.
- Then, from the previous expression for the output SINR, it is clear that the resulting constrained MMOE detector also maximizes the output SINR.
- Hence we conclude that constrained MMOE detectors, including the MMSE detector, are also the maximum-output-SINR detectors.

### References

- Chapter 6 of S. Verdú, *Multiuser Detection*, Cambridge University Press, Cambridge, UK, 1998.
- Handouts:
  - S. Verdú, "Multiuser detection," in *Advances in Statistical Signal Processing*, H. V. Poor and J. B. Thomas, Eds., JAI Press Inc., Greenwich, pp. 369-407, 1993.
  - M. V. Burnashev and H. V. Poor, "On the probability of error in linear multiuser detection," *IEEE Trans. Inform. Theory*, vol. 49, pp. 1922-1941, Sep. 2003.
  - H. V. Poor and S. Verdú, "Probability of error in MMSE multiuser detection," *IEEE Trans. Inform. Theory*, pp. 858-871, May 1997.

### Next Time

- Jump ahead to Part III: Adaptive Linear Filtering.

## ECE 595 Multiuser Communications: Lecture 12

**Dr. Sudharman K. Jayaweera**, Assistant Professor, Department of Electrical and Computer Engineering, University of New Mexico. Tuesday, November 13, Fall 2007.
### Recursive Least Squares (RLS) Parameter-Adaptive Algorithm: Outline

- A new cost function for adaptive algorithm construction
- Minimizing least squares error
- Minimizing least squares error vs. minimizing mean squared error
- Exponentially weighted recursive least squares
- Deterministic normal equations
- Recursive parameter update equations
- RLS algorithm initialization
- LMS vs. RLS: pros and cons
- Sliding window recursive least squares: a two-step algorithm

### Mean Squared Error

- So far we considered gradient descent algorithms for minimizing the mean squared error $\xi(k) = \mathbb{E}\left[|e(k)|^2\right]$.
- A difficulty in this approach is that it requires the knowledge of the ensemble averages $\mathbb{E}\left[\mathbf{x}(k)\mathbf{x}(k)^T\right]$ and $\mathbb{E}\left[d(k)\mathbf{x}(k)\right]$.
- This forced us to introduce the LMS algorithm, which replaces the ensemble averages with their instantaneous values.
- In some applications, the stochastic gradient method may not provide a sufficient rate of convergence or a sufficiently small excess mean squared error.

### Least Squares Error Cost Function

- A perfectly valid cost function that does not require any statistical information about $x(k)$ and $d(k)$ is the least squares error.
- Adaptive filters can be designed so that at each time instant $k$ they update the parameter vector $\boldsymbol{\theta}(k-1)$ in order to minimize this least squares error, in contrast to the mean squared error, as we did earlier.
- The Recursive Least Squares (RLS) algorithm performs this minimization efficiently.

### Mean Squared Error vs. Least Squares Error

- Minimizing the mean squared error $\mathbb{E}\left[|e(k)|^2\right]$ produces the same set of coefficients $\boldsymbol{\theta}(k)$ for all sequences $x(k)$ and $d(k)$ that have the same statistics; i.e., the coefficients do not depend on the particular data but on their statistical averages.
- The least squares approach minimizes $\sum_{i=0}^{k}|e(i)|^2$, which depends on the specific values of the incoming data sequence. The filter coefficients will be optimal only for the given data set, and different realizations of $x(k)$ and $d(k)$ lead to different solutions even if they all have the same statistical properties.
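For comparison with what follows, the standard LMS recursion, $\boldsymbol{\theta}(k) = \boldsymbol{\theta}(k-1) + \mu\, e(k)\,\mathbf{x}(k)$, replaces the ensemble averages with instantaneous values, as described above. A minimal pure-Python sketch identifying a 2-tap filter; the true taps, step size, and noise level are illustrative choices:

```python
import random

random.seed(0)
theta_true = [0.5, -0.3]     # taps of the unknown system (illustrative)
theta = [0.0, 0.0]           # adaptive filter coefficients
mu = 0.05                    # LMS step size

x_prev = 0.0
for _ in range(2000):
    x = random.gauss(0.0, 1.0)                 # white input
    xvec = [x, x_prev]                         # regressor [x(k), x(k-1)]
    # desired output: unknown system plus small measurement noise
    d = theta_true[0] * xvec[0] + theta_true[1] * xvec[1] + random.gauss(0.0, 0.01)
    e = d - (theta[0] * xvec[0] + theta[1] * xvec[1])   # instantaneous error
    # LMS update: theta(k) = theta(k-1) + mu * e(k) * x(k)
    theta = [theta[i] + mu * e * xvec[i] for i in range(2)]
    x_prev = x

print(theta)   # close to [0.5, -0.3]
```

Note that with a stochastic gradient like this, the coefficients only fluctuate around the optimum; the least squares approach below instead solves exactly for the given data record.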
### Exponentially Weighted Recursive Least Squares

- Choose the filter coefficients $\theta_n(k)$ to minimize the weighted least squares error
$$\xi(k) = \sum_{i=0}^{k}\lambda^{k-i}\,|e(i)|^2, \qquad (1)$$
where $0 < \lambda < 1$ is called the exponential weighting factor (or the exponential forgetting factor), and the output error is
$$e(i) = d(i) - \boldsymbol{\theta}^T(k)\,\mathbf{x}(i), \qquad (2)$$
with
$$\boldsymbol{\theta}(k) = \begin{bmatrix}\theta_0(k)\\ \theta_1(k)\\ \vdots\\ \theta_N(k)\end{bmatrix} \quad \text{and} \quad \mathbf{x}(i) = \begin{bmatrix}x(i)\\ x(i-1)\\ \vdots\\ x(i-N)\end{bmatrix}. \qquad (3)$$
- Note that the coefficients $\boldsymbol{\theta}(k)$ are held constant over the entire observation interval $[0, k]$ in computing the cost function, although the true parameter values used at each time $i$ can be different from each other.

### EW-RLS: Minimizing the Least Squares Error

- The coefficients that minimize the least squares error $\xi(k)$ should set the derivative of $\xi(k)$ with respect to each of the $\theta_n(k)$, for $n = 0, \ldots, N$, equal to zero:
$$\frac{\partial\xi(k)}{\partial\theta_n(k)} = \sum_{i=0}^{k}\lambda^{k-i}\,2\,e(i)\,\frac{\partial e(i)}{\partial\theta_n(k)} = 0.$$
- Using (2) and (3):
$$\sum_{i=0}^{k}\lambda^{k-i}\,e(i)\,x(i-n) = 0 \quad \text{for } n = 0, \ldots, N. \qquad (4)$$
- Using (3) again:
$$\sum_{i=0}^{k}\lambda^{k-i}\left[d(i) - \sum_{m=0}^{N}\theta_m(k)\,x(i-m)\right]x(i-n) = 0 \quad \text{for } n = 0, \ldots, N.$$

### Deterministic Normal Equations

- Interchanging the order of summation and rearranging the terms gives
$$\sum_{m=0}^{N}\theta_m(k)\sum_{i=0}^{k}\lambda^{k-i}\,x(i-m)\,x(i-n) = \sum_{i=0}^{k}\lambda^{k-i}\,d(i)\,x(i-n), \quad n = 0, \ldots, N. \qquad (5)$$
- This set of equations is known as the deterministic normal equations.
### Vector Deterministic Normal Equations

- In vector notation, the deterministic normal equations become
$$R_x(k)\,\boldsymbol{\theta}(k) = \mathbf{r}_{dx}(k), \qquad (6)$$
where we have defined the $(N+1)\times(N+1)$ exponentially weighted deterministic autocorrelation matrix of $x(k)$,
$$R_x(k) = \sum_{i=0}^{k}\lambda^{k-i}\,\mathbf{x}(i)\mathbf{x}(i)^T, \qquad (7)$$
and the deterministic cross-correlation between the data $x(k)$ and the desired output $d(k)$, an $(N+1)$-vector,
$$\mathbf{r}_{dx}(k) = \sum_{i=0}^{k}\lambda^{k-i}\,d(i)\,\mathbf{x}(i), \qquad (8)$$
with $\mathbf{x}(i) = [x(i), x(i-1), \ldots, x(i-N)]^T$.

### Least Squares Optimal Coefficient Vector

- From (6), we have the exact solution for the parameter vector that minimizes the least squares error cost function:
$$\boldsymbol{\theta}(k) = R_x(k)^{-1}\,\mathbf{r}_{dx}(k). \qquad (9)$$
- Recall the optimum MMSE parameter set $\boldsymbol{\theta}_{\mathrm{opt}}$:
$$\boldsymbol{\theta}_{\mathrm{opt}} = \left(\mathbb{E}\left[\mathbf{x}(k)\mathbf{x}(k)^T\right]\right)^{-1}\mathbb{E}\left[\mathbf{x}(k)\,d(k)\right]. \qquad (10)$$
- Compare the similarity with the least squares error solution, where $R_x(k) = \sum_i\lambda^{k-i}\mathbf{x}(i)\mathbf{x}(i)^T$ and $\mathbf{r}_{dx}(k) = \sum_i\lambda^{k-i}d(i)\,\mathbf{x}(i)$.

### EW-RLS: Minimum Least Squares Error (1/2)

- From (1) and (2),
$$\xi(k) = \sum_{i=0}^{k}\lambda^{k-i}\,e(i)\left[d(i) - \sum_{m=0}^{N}\theta_m(k)\,x(i-m)\right] = \sum_{i=0}^{k}\lambda^{k-i}\,e(i)\,d(i) - \sum_{m=0}^{N}\theta_m(k)\left[\sum_{i=0}^{k}\lambda^{k-i}\,e(i)\,x(i-m)\right].$$
- If the $\theta_m(k)$'s are chosen to minimize the least squares error, then from (4) the bracketed term is zero for each $m = 0, \ldots, N$. Thus
$$\xi_{\min}(k) = \sum_{i=0}^{k}\lambda^{k-i}\,d(i)\left[d(i) - \sum_{m=0}^{N}\theta_m(k)\,x(i-m)\right] = \sum_{i=0}^{k}\lambda^{k-i}\,d^2(i) - \sum_{m=0}^{N}\theta_m(k)\sum_{i=0}^{k}\lambda^{k-i}\,d(i)\,x(i-m). \qquad (11)$$

### EW-RLS: Minimum Least Squares Error (2/2)

- Using (8), we may write the minimum least squares error $\xi_{\min}(k)$ as
$$\xi_{\min}(k) = \|\mathbf{d}(k)\|_{\lambda}^2 - \mathbf{r}_{dx}(k)^T\boldsymbol{\theta}(k), \qquad (12)$$
where we have defined the weighted norm of the desired output vector,
$$\|\mathbf{d}(k)\|_{\lambda}^2 = \sum_{i=0}^{k}\lambda^{k-i}\,d^2(i), \qquad (13)$$
and the desired output vector, a $(k+1)$-vector,
$$\mathbf{d}(k) = [d(k), d(k-1), \ldots, d(0)]^T. \qquad (14)$$

### Recursive Least Squares

- From (9), the optimal least squares weight vector is
$$\boldsymbol{\theta}(k) = R_x(k)^{-1}\,\mathbf{r}_{dx}(k). \qquad (15)$$
- Since both $R_x(k)$ and $\mathbf{r}_{dx}(k)$ depend on $k$, solving the deterministic normal equations directly as in (15) requires these quantities to be computed again and again at each time instant $k$.
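The batch solution (9) can be sketched directly for a tiny 2-tap case with $\lambda = 1$ and noiseless data, where the least squares solution recovers the true coefficients exactly. The data sequence and true taps are illustrative:

```python
def solve_2x2(M, v):
    """Solve the 2x2 linear system M theta = v by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(M[1][1] * v[0] - M[0][1] * v[1]) / det,
            (M[0][0] * v[1] - M[1][0] * v[0]) / det]

theta_true = [0.8, -0.2]
xs = [1.0, -0.5, 2.0, 0.3, -1.2, 0.7]

# Accumulate R_x(k) and r_dx(k) with lambda = 1 (growing window)
Rx = [[0.0, 0.0], [0.0, 0.0]]
rdx = [0.0, 0.0]
x_prev = 0.0
for x in xs:
    xvec = [x, x_prev]                                   # regressor [x(i), x(i-1)]
    d = theta_true[0] * xvec[0] + theta_true[1] * xvec[1]  # noiseless desired output
    for i in range(2):
        rdx[i] += d * xvec[i]
        for j in range(2):
            Rx[i][j] += xvec[i] * xvec[j]
    x_prev = x

# Solve the deterministic normal equations R_x theta = r_dx
theta = solve_2x2(Rx, rdx)
print(theta)   # recovers [0.8, -0.2]
```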
- The Recursive Least Squares algorithm computes these quantities recursively, thereby minimizing the computational complexity. It also allows the least squares solution (15) to be found recursively as
$$\boldsymbol{\theta}(k) = \boldsymbol{\theta}(k-1) + \Delta\boldsymbol{\theta}(k-1). \qquad (16)$$

### EW-RLS: Recursions for $R_x(k)$ and $\mathbf{r}_{dx}(k)$

- From the exponentially weighted deterministic autocorrelation,
$$R_x(k) = \sum_{i=0}^{k}\lambda^{k-i}\,\mathbf{x}(i)\mathbf{x}(i)^T = \mathbf{x}(k)\mathbf{x}(k)^T + \lambda\sum_{i=0}^{k-1}\lambda^{k-1-i}\,\mathbf{x}(i)\mathbf{x}(i)^T.$$
- Hence $R_x(k)$ can be recursively computed as
$$R_x(k) = \lambda\,R_x(k-1) + \mathbf{x}(k)\mathbf{x}(k)^T. \qquad (17)$$
- Similarly, from the deterministic cross-correlation, $\mathbf{r}_{dx}(k)$ can be recursively computed as
$$\mathbf{r}_{dx}(k) = \sum_{i=0}^{k}\lambda^{k-i}\,d(i)\,\mathbf{x}(i) = \lambda\,\mathbf{r}_{dx}(k-1) + d(k)\,\mathbf{x}(k). \qquad (18)$$

### Matrix Inversion Lemma

- Suppose that $A$, $C$, and $C^{-1} + DA^{-1}B$ are nonsingular square matrices. Then
$$\left(A + BCD\right)^{-1} = A^{-1} - A^{-1}B\left(C^{-1} + DA^{-1}B\right)^{-1}DA^{-1}. \qquad (19)$$
- If $B = \mathbf{b}$ and $D = \mathbf{d}^T$ are vectors, then applying the matrix inversion lemma (in this case $C$ is a scalar, and we take it as $C = 1$) gives
$$\left(A + \mathbf{b}\mathbf{d}^T\right)^{-1} = A^{-1} - \frac{A^{-1}\mathbf{b}\,\mathbf{d}^T A^{-1}}{1 + \mathbf{d}^T A^{-1}\mathbf{b}}. \qquad (20)$$

### EW-RLS: Recursion for $R_x(k)^{-1}$

- Applying the matrix inversion lemma (20) to (17), we may compute $R_x(k)^{-1}$ recursively as
$$R_x(k)^{-1} = \lambda^{-1}R_x(k-1)^{-1} - \frac{\lambda^{-2}\,R_x(k-1)^{-1}\mathbf{x}(k)\mathbf{x}(k)^T R_x(k-1)^{-1}}{1 + \lambda^{-1}\,\mathbf{x}(k)^T R_x(k-1)^{-1}\mathbf{x}(k)}.$$
- Define the inverse autocorrelation matrix
$$P(k) = R_x(k)^{-1} \qquad (21)$$
and the gain vector
$$\mathbf{g}(k) = \frac{\lambda^{-1}P(k-1)\,\mathbf{x}(k)}{1 + \lambda^{-1}\,\mathbf{x}(k)^T P(k-1)\,\mathbf{x}(k)}. \qquad (22)$$
- Then the recursion for $R_x(k)^{-1}$ becomes
$$P(k) = \lambda^{-1}\left[P(k-1) - \mathbf{g}(k)\,\mathbf{x}(k)^T P(k-1)\right]. \qquad (23)$$

### Gain Vector $\mathbf{g}(k)$

- Rewriting (22) as
$$\mathbf{g}(k) + \lambda^{-1}\mathbf{g}(k)\,\mathbf{x}(k)^T P(k-1)\,\mathbf{x}(k) = \lambda^{-1}P(k-1)\,\mathbf{x}(k)$$
gives
$$\mathbf{g}(k) = \lambda^{-1}\left[P(k-1) - \mathbf{g}(k)\,\mathbf{x}(k)^T P(k-1)\right]\mathbf{x}(k). \qquad (24)$$
- Substituting from (23) results in
$$\mathbf{g}(k) = P(k)\,\mathbf{x}(k), \qquad (25)$$
which is the same as the deterministic normal equations (6) for $\boldsymbol{\theta}(k)$, except that $\mathbf{r}_{dx}(k)$ on the RHS of (6) is replaced by the regressor vector $\mathbf{x}(k)$.
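The rank-one form (20) of the matrix inversion lemma, which drives the whole recursion, can be checked numerically on a 2x2 example; the matrix $A$ and vectors $\mathbf{b}$, $\mathbf{d}$ below are illustrative.

```python
def inv2(M):
    """Invert a 2x2 matrix [[a, b], [c, d]]."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2.0, 0.3], [0.3, 1.5]]
b = [1.0, -0.5]
d = [0.7, 0.2]

Ainv = inv2(A)
Ainv_b = [sum(Ainv[i][j] * b[j] for j in range(2)) for i in range(2)]    # A^{-1} b
dT_Ainv = [sum(d[i] * Ainv[i][j] for i in range(2)) for j in range(2)]   # d^T A^{-1}
denom = 1.0 + sum(d[i] * Ainv_b[i] for i in range(2))                    # 1 + d^T A^{-1} b

# RHS of (20): A^{-1} - A^{-1} b d^T A^{-1} / (1 + d^T A^{-1} b)
lemma = [[Ainv[i][j] - Ainv_b[i] * dT_Ainv[j] / denom for j in range(2)]
         for i in range(2)]

# Direct inverse of the rank-one-updated matrix A + b d^T
direct = inv2([[A[i][j] + b[i] * d[j] for j in range(2)] for i in range(2)])

max_err = max(abs(lemma[i][j] - direct[i][j]) for i in range(2) for j in range(2))
```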
### EW-RLS Parameter Update Recursion (1/2)

- From (15) and (21), $\boldsymbol{\theta}(k) = P(k)\,\mathbf{r}_{dx}(k)$. Substituting for $\mathbf{r}_{dx}(k)$ from (18),
$$\begin{aligned}
\boldsymbol{\theta}(k) &= \lambda\,P(k)\,\mathbf{r}_{dx}(k-1) + d(k)\,P(k)\,\mathbf{x}(k)\\
&= \left[P(k-1) - \mathbf{g}(k)\,\mathbf{x}(k)^T P(k-1)\right]\mathbf{r}_{dx}(k-1) + d(k)\,P(k)\,\mathbf{x}(k) \quad \text{(using (23) for } P(k)\text{)}\\
&= P(k-1)\,\mathbf{r}_{dx}(k-1) - \mathbf{g}(k)\,\mathbf{x}(k)^T P(k-1)\,\mathbf{r}_{dx}(k-1) + d(k)\,\mathbf{g}(k) \quad \text{(using (25) for } \mathbf{g}(k)\text{)}\\
&= \boldsymbol{\theta}(k-1) - \mathbf{g}(k)\,\mathbf{x}(k)^T\boldsymbol{\theta}(k-1) + d(k)\,\mathbf{g}(k) \quad \text{(using (15))}\\
&= \boldsymbol{\theta}(k-1) + \mathbf{g}(k)\left[d(k) - \boldsymbol{\theta}(k-1)^T\mathbf{x}(k)\right]. \qquad (26)
\end{aligned}$$

### EW-RLS Parameter Update Recursion (2/2)

- Hence the exponentially weighted recursive least squares parameter update is of the form
$$\boldsymbol{\theta}(k) = \boldsymbol{\theta}(k-1) + \alpha(k)\,\mathbf{g}(k), \qquad (27)$$
where the a priori error $\alpha(k)$ is the error that would occur if the filter coefficients were not updated, i.e., if the old parameters $\boldsymbol{\theta}(k-1)$ were used with the new data $\mathbf{x}(k)$:
$$\alpha(k) = d(k) - \boldsymbol{\theta}(k-1)^T\mathbf{x}(k). \qquad (28)$$

### Exponentially Weighted RLS Algorithm for Parameter Adaptation

- Suppose the update has been done for time $k-1$; hence we have $x(k-1)$, $P(k-1)$, $\mathbf{g}(k-1)$, and $\boldsymbol{\theta}(k-1)$.
- Given the new data point $x(k)$ at time $k$:
  1. Form the updated regressor vector $\mathbf{x}(k)$.
  2. Compute the gain vector $\mathbf{g}(k)$ using (22).
  3. Compute the new parameter vector $\boldsymbol{\theta}(k)$ using (26).
  4. Update $P(k)$ using (23).

### EW-RLS Algorithm Initialization

- The RLS algorithm performs recursions to compute both $\boldsymbol{\theta}(k)$ and the inverse autocorrelation matrix $P(k)$. Both of these need to be initialized with some initial conditions.
- This initialization can be done in two ways.

### RLS Algorithm Initialization: Method 1

- Build up the autocorrelation matrix recursively until it is of full rank $N+1$, and then compute the inverse directly:
$$P(0) = \left[\sum_{i=-N}^{0}\lambda^{-i}\,\mathbf{x}(i)\mathbf{x}(i)^T\right]^{-1}.$$
- The cross-correlation vector $\mathbf{r}_{dx}(0)$ is evaluated in a similar way:
$$\mathbf{r}_{dx}(0) = \sum_{i=-N}^{0}\lambda^{-i}\,d(i)\,\mathbf{x}(i).$$
- Initialize $\boldsymbol{\theta}(0)$ as $\boldsymbol{\theta}(0) = P(0)\,\mathbf{r}_{dx}(0)$.
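The recursion above, steps (22), (26), and (23), can be sketched compactly as follows, using the simple diagonal initialization $P(0) = \delta^{-1} I$, $\boldsymbol{\theta}(0) = \mathbf{0}$. The data model, forgetting factor, and constants are illustrative choices, not from the lecture.

```python
import random

def rls_update(theta, P, xvec, d, lam):
    """One exponentially weighted RLS step: returns updated (theta, P)."""
    n = len(theta)
    # gain (22): g(k) = P(k-1) x(k) / (lambda + x(k)^T P(k-1) x(k))
    Px = [sum(P[i][j] * xvec[j] for j in range(n)) for i in range(n)]
    denom = lam + sum(xvec[i] * Px[i] for i in range(n))
    g = [Px[i] / denom for i in range(n)]
    # a priori error (28) and parameter update (26)/(27)
    alpha = d - sum(theta[i] * xvec[i] for i in range(n))
    theta = [theta[i] + alpha * g[i] for i in range(n)]
    # P update (23); Px = x^T P as well, since P is symmetric
    P = [[(P[i][j] - g[i] * Px[j]) / lam for j in range(n)] for i in range(n)]
    return theta, P

random.seed(1)
theta_true = [0.5, -0.3]        # taps of the unknown system (illustrative)
lam, delta = 0.99, 1e-3
theta = [0.0, 0.0]
P = [[1.0 / delta, 0.0], [0.0, 1.0 / delta]]   # P(0) = delta^{-1} I

x_prev = 0.0
for _ in range(500):
    x = random.gauss(0.0, 1.0)
    xvec = [x, x_prev]
    d = theta_true[0] * xvec[0] + theta_true[1] * xvec[1] + random.gauss(0.0, 0.01)
    theta, P = rls_update(theta, P, xvec, d, lam)
    x_prev = x

print(theta)   # close to [0.5, -0.3]
```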
### Method 1: Pros and Cons

- Advantages: Optimality is preserved at each step, since the RLS algorithm is initialized at $k = 0$ with the vector $\boldsymbol{\theta}(0)$ that minimizes the weighted least squares error $\xi(0)$.
- Disadvantages: There will be a delay of $N+1$ samples before any updates are performed, and the direct inversion of the $(N+1)\times(N+1)$ matrix $R_x(0)$ requires on the order of $(N+1)^3$ operations.

### RLS Algorithm Initialization: Method 2

- Initialize the autocorrelation matrix simply as $R_x(0) = \delta I_{N+1}$, where $\delta$ is a small positive constant and $I_{N+1}$ is the $(N+1)\times(N+1)$ identity matrix.
- The initialization of the recursion for $P(k)$ becomes $P(0) = \delta^{-1}I_{N+1}$.
- Initialize the weight vector to zero: $\boldsymbol{\theta}(0) = \mathbf{0}$.

### Method 2: Pros and Cons

- Advantages: Simple computation; no initial delay.
- Disadvantages: Introduces a bias in the least squares solution of the parameter vector. However, with an exponential forgetting factor $0 < \lambda < 1$, this bias goes to zero as $k$ increases.

### LMS vs. RLS

- RLS requires on the order of $N^2$ operations per update, compared to the order of $N$ operations per update required by LMS: the RLS update recursions require a matrix-vector multiplication ($O(N^2)$), whereas the LMS update recursion requires only a vector-vector multiplication ($O(N)$).
- Generally, RLS converges faster than the LMS algorithm.
- For a stationary process, RLS is less sensitive to eigenvalue disparities in the autocorrelation matrix of $x(k)$.

### LMS vs. RLS (ctd.)

- However, RLS does not perform very well in tracking nonstationary processes unless exponential weighting is used. When $\lambda = 1$, all of the data are equally weighted in estimating the correlation; RLS with $\lambda = 1$ is referred to as growing-window recursive least squares.
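One common heuristic for reasoning about the forgetting factor (a standard rule of thumb, not from the slides): since the weights $\lambda^{k-i}$ sum to at most $1/(1-\lambda)$, exponentially weighted RLS has an effective memory of roughly $1/(1-\lambda)$ samples.

```python
def effective_memory(lam):
    """Approximate number of samples EW-RLS effectively remembers."""
    # sum_{j=0}^{inf} lambda^j = 1 / (1 - lambda)
    return 1.0 / (1.0 - lam)

for lam in (0.9, 0.99, 0.999):
    print(f"lambda = {lam}:  ~{effective_memory(lam):.0f} samples of memory")
```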
- Although exponential weighting improves the tracking performance of RLS, there is no general rule on how to choose the best $\lambda$.
- Thus, in some cases LMS may still have better performance.

### Infinite Memory of the RLS Algorithm

- Recall that RLS minimizes the exponentially weighted least squares error
$$\xi(k) = \sum_{i=0}^{k}\lambda^{k-i}\,|e(i)|^2. \qquad (29)$$
- With the growing-window RLS ($\lambda = 1$) algorithm, each of the $|e(i)|^2$ for $i = 0, \ldots, k$ is equally weighted, while with the exponentially weighted RLS ($\lambda < 1$) algorithm, the squared error $|e(i)|^2$ becomes less important for older samples.
- In either case, the RLS algorithm has infinite memory.

### Sliding Window Recursive Least Squares (1/2)

- In some situations this infinite-memory property may not be desirable, e.g., for a nonstationary process whose statistics are changing rapidly in time.
- In such situations the RLS algorithm may be modified to minimize the least squares error over a finite window. This results in the sliding window recursive least squares algorithm, which can also be implemented with a complexity of $O(N^2)$ operations per update.
- Sliding window RLS can better track nonstationary processes, and it also allows any data outliers to be forgotten after a finite time window.

### Sliding Window Recursive Least Squares (2/2)

- Sliding window RLS minimizes the squared error
$$\xi_L(k) = \sum_{i=k-L}^{k}|e(i)|^2. \qquad (30)$$
- Following the same steps as before, it can be shown that the parameter vector that minimizes (30) is the solution to the set of linear equations
$$R_{x,L}(k)\,\boldsymbol{\theta}(k) = \mathbf{r}_{dx,L}(k), \qquad (31)$$
where
$$R_{x,L}(k) = \sum_{i=k-L}^{k}\mathbf{x}(i)\mathbf{x}(i)^T \quad \text{and} \quad \mathbf{r}_{dx,L}(k) = \sum_{i=k-L}^{k}d(i)\,\mathbf{x}(i). \qquad (32)$$
### SW-RLS: A Two-step Algorithm

1. **Step 1:** Given the solution $\boldsymbol{\theta}(k-1)$ to the normal equations (31) at time $k-1$, which minimizes the squared error $\xi_L(k-1)$, and the new data value, find the weight vector $\tilde{\boldsymbol{\theta}}(k)$ that minimizes the $(L+2)$-length window error
$$\xi_{L+1}(k) = \sum_{i=k-L-1}^{k}|e(i)|^2.$$
2. **Step 2:** Determine the weight vector $\boldsymbol{\theta}(k)$ that minimizes the squared error $\xi_L(k)$ by discarding the last data point $x(k-L-1)$.

### SW-RLS Step 1 (1/2)

- To find the $\tilde{\boldsymbol{\theta}}(k)$ that minimizes the error $\xi_{L+1}(k)$ resulting from the addition of one additional data sample, we use the original RLS with $\lambda = 1$, as below.
- Recall that the gain vector is
$$\tilde{\mathbf{g}}(k) = \frac{P(k-1)\,\mathbf{x}(k)}{1 + \mathbf{x}(k)^T P(k-1)\,\mathbf{x}(k)}. \qquad (33)$$
- Then the parameter vector update is
$$\tilde{\boldsymbol{\theta}}(k) = \boldsymbol{\theta}(k-1) + \tilde{\mathbf{g}}(k)\left[d(k) - \boldsymbol{\theta}(k-1)^T\mathbf{x}(k)\right]. \qquad (34)$$
- The corresponding update of the inverse correlation matrix is
$$\tilde{P}(k) = P(k-1) - \tilde{\mathbf{g}}(k)\,\mathbf{x}(k)^T P(k-1). \qquad (35)$$

### SW-RLS Step 1 (2/2)

- Observe that $\tilde{P}(k)$ is the inverse of the matrix $\tilde{R}_x(k)$, which is based on $L+2$ data samples, given by
$$\tilde{R}_x(k) = \sum_{i=k-L-1}^{k}\mathbf{x}(i)\mathbf{x}(i)^T. \qquad (36)$$
- Similarly, $\tilde{\boldsymbol{\theta}}(k)$ is the parameter vector solution to the set of equations
$$\tilde{R}_x(k)\,\tilde{\boldsymbol{\theta}}(k) = \tilde{\mathbf{r}}_{dx}(k), \qquad (37)$$
where
$$\tilde{\mathbf{r}}_{dx}(k) = \sum_{i=k-L-1}^{k}d(i)\,\mathbf{x}(i). \qquad (38)$$

### SW-RLS Step 2 (1/5)

- Now we need to discard the oldest data sample $x(k-L-1)$ to restore the $(L+1)$-length data window.
- From the definitions in (32) and (36), observe that the $(L+1)$-sample autocorrelation matrix is
$$R_{x,L}(k) = \tilde{R}_x(k) - \mathbf{x}(k-L-1)\,\mathbf{x}(k-L-1)^T. \qquad (39)$$
- Similarly, from (32) and (38), the $(L+1)$-sample cross-correlation vector is
$$\mathbf{r}_{dx,L}(k) = \tilde{\mathbf{r}}_{dx}(k) - d(k-L-1)\,\mathbf{x}(k-L-1). \qquad (40)$$

### SW-RLS Step 2 (2/5)

- Apply the matrix inversion lemma to (39) to obtain
$$\begin{aligned}
P(k) = R_{x,L}^{-1}(k) &= \left[\tilde{R}_x(k) - \mathbf{x}(k-L-1)\,\mathbf{x}(k-L-1)^T\right]^{-1}\\
&= \tilde{R}_x^{-1}(k) + \frac{\tilde{R}_x^{-1}(k)\,\mathbf{x}(k-L-1)\,\mathbf{x}(k-L-1)^T\,\tilde{R}_x^{-1}(k)}{1 - \mathbf{x}(k-L-1)^T\,\tilde{R}_x^{-1}(k)\,\mathbf{x}(k-L-1)}\\
&= \tilde{P}(k) + \mathbf{g}(k)\,\mathbf{x}(k-L-1)^T\,\tilde{P}(k), \qquad (41)
\end{aligned}$$
where the new gain vector $\mathbf{g}(k)$ is defined as
$$\mathbf{g}(k) = \frac{\tilde{P}(k)\,\mathbf{x}(k-L-1)}{1 - \mathbf{x}(k-L-1)^T\,\tilde{P}(k)\,\mathbf{x}(k-L-1)}. \qquad (42)$$
### SW-RLS Step 2 (3/5)

- As we did before, we can manipulate (42) to show that
$$\mathbf{g}(k) - \mathbf{g}(k)\,\mathbf{x}(k-L-1)^T\,\tilde{P}(k)\,\mathbf{x}(k-L-1) = \tilde{P}(k)\,\mathbf{x}(k-L-1).$$
- Hence
$$\mathbf{g}(k) = \left[\tilde{P}(k) + \mathbf{g}(k)\,\mathbf{x}(k-L-1)^T\,\tilde{P}(k)\right]\mathbf{x}(k-L-1). \qquad (43)$$
- Substituting from (41) results in
$$\mathbf{g}(k) = P(k)\,\mathbf{x}(k-L-1). \qquad (44)$$

### SW-RLS Step 2 (4/5)

- Since $\boldsymbol{\theta}(k)$ is
$$\boldsymbol{\theta}(k) = P(k)\,\mathbf{r}_{dx,L}(k), \qquad (45)$$
substituting for $\mathbf{r}_{dx,L}(k)$ from (40) gives
$$\begin{aligned}
\boldsymbol{\theta}(k) &= P(k)\left[\tilde{\mathbf{r}}_{dx}(k) - d(k-L-1)\,\mathbf{x}(k-L-1)\right]\\
&= P(k)\,\tilde{\mathbf{r}}_{dx}(k) - d(k-L-1)\,P(k)\,\mathbf{x}(k-L-1)\\
&= P(k)\,\tilde{\mathbf{r}}_{dx}(k) - d(k-L-1)\,\mathbf{g}(k).
\end{aligned}$$
- Substituting for $P(k)$ from (41) results in
$$\begin{aligned}
\boldsymbol{\theta}(k) &= \left[\tilde{P}(k) + \mathbf{g}(k)\,\mathbf{x}(k-L-1)^T\,\tilde{P}(k)\right]\tilde{\mathbf{r}}_{dx}(k) - d(k-L-1)\,\mathbf{g}(k)\\
&= \tilde{P}(k)\,\tilde{\mathbf{r}}_{dx}(k) + \mathbf{g}(k)\,\mathbf{x}(k-L-1)^T\,\tilde{P}(k)\,\tilde{\mathbf{r}}_{dx}(k) - d(k-L-1)\,\mathbf{g}(k).
\end{aligned}$$
Multiuser Communications f Linear Multiuser Detection and Equalization 0 Examples of Linear Multiuser Detectors 0 Implementation of Linear Detectors 0 Interpretation Correlator Decomposition o Decorrelator for Synchronous CDMA k Dr S K Jayarweera Fall 07 k 39I39HI39 UNIVERSI39H quot NEW MEXICO ECE595 Multiuser Communications f N The Multipleaccess Channel Signal Model 0 Received signal K Bil rr Z bkifkt iTnt kl 10 where Mr Skfhkl o M 2K3 possible data signals K Bil mbU 2 2 bkifkI iT k1 10 k J 3 Dr S K Jayaweera Fall 07
