# RANDOM SIGNALS IN ELECTRICAL ENGINEERING SYSTEMS ELEC 303

Rice University


These 180 pages of class notes were uploaded by Deondre Ullrich on Monday, October 19, 2015. The notes belong to ELEC 303 at Rice University, taught by Farinaz Koushanfar in the Fall. Since upload, they have received 46 views. For similar materials, see /class/224978/elec-303-rice-university in Electrical Engineering & Computer Science at Rice University.


## ELEC 303 Random Signals, Lecture 10: Conditioning; the Continuous Bayes Rule

Farinaz Koushanfar, ECE Dept., Rice University, Sept. 29, 2008

**Lecture outline** (reading: 3.5, 3.6): conditioning; independence; the continuous Bayes rule.

**Properties of the CDF (review).** Defined by $F_X(x) = P(X \le x)$ for all $x$. $F_X$ is monotonically nondecreasing: if $x < y$ then $F_X(x) \le F_X(y)$. $F_X(x)$ tends to 0 as $x \to -\infty$ and to 1 as $x \to \infty$. For discrete $X$, $F_X$ is piecewise constant; for continuous $X$, it is a continuous function. The PMF and PDF are obtained by summing and differentiating, respectively:
$$F_X(k) = \sum_{i \le k} p_X(i), \qquad p_X(k) = P(X \le k) - P(X \le k-1) = F_X(k) - F_X(k-1), \qquad f_X(x) = \frac{dF_X}{dx}(x).$$

**Properties of distributions (review).** Discrete: joint $p_{X,Y}(x,y)$, conditional $p_{X|Y}(x|y)$, marginal $p_X(x) = \sum_y p_{X,Y}(x,y)$. Continuous: joint $f_{X,Y}(x,y)$, conditional $f_{X|Y}(x|y) = f_{X,Y}(x,y)/f_Y(y)$, marginal $f_X(x) = \int f_{X,Y}(x,y)\,dy$, CDF $F_X(x) = P(X \le x)$, plus $E[X]$ and $\mathrm{var}(X)$.

**Conditioning a RV on an event (review).** $P(X \in B \mid A) = \int_B f_{X|A}(x)\,dx$. If we condition on an event of the form $\{X \in A\}$ with $P(X \in A) > 0$, then
$$P(X \in B \mid X \in A) = \frac{P(X \in B,\, X \in A)}{P(X \in A)},$$
and by comparing the two expressions,
$$f_{X \mid \{X \in A\}}(x) = \begin{cases} \dfrac{f_X(x)}{P(X \in A)} & \text{if } x \in A,\\[4pt] 0 & \text{otherwise.} \end{cases}$$

**Example: the exponential RV.** The time $T$ until a light bulb dies is an exponential RV with parameter $\lambda$. One turns on the light, leaves the room, and returns $t$ seconds later ($A = \{T > t\}$). Let $X$ be the additional time until the bulb burns out. What is the conditional CDF of $X$ given $A$?
$$P(X > x \mid A) = P(T > t + x \mid T > t) = \frac{P(T > t + x,\, T > t)}{P(T > t)} = \frac{P(T > t + x)}{P(T > t)} = \frac{e^{-\lambda(t+x)}}{e^{-\lambda t}} = e^{-\lambda x}.$$
This is the memoryless property of the exponential CDF.

**Example: total probability theorem.** A train arrives every 15 minutes starting at 6:00 a.m. You walk to the station at a time uniformly distributed between 7:10 and 7:30 a.m. Find the PDF of the time you wait for the first train. If you arrive in (7:10, 7:15] (probability 1/4) you wait for the 7:15 train, with waiting time uniform on (0, 5); if you arrive in (7:15, 7:30] (probability 3/4) you wait for the 7:30 train, with waiting time uniform on (0, 15). By the total probability theorem, $f_X(x) = \tfrac14 \cdot \tfrac15 + \tfrac34 \cdot \tfrac1{15} = \tfrac1{10}$ for $0 < x \le 5$, and $f_X(x) = \tfrac34 \cdot \tfrac1{15} = \tfrac1{20}$ for $5 < x \le 15$.

**Conditioning a RV on another.** The conditional PDF is
$$f_{X|Y}(x|y) = \frac{f_{X,Y}(x,y)}{f_Y(y)},$$
where the marginal can be computed as $f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx$. Note that $\int_{-\infty}^{\infty} f_{X|Y}(x|y)\,dx = 1$ for any fixed $y$.

**Summary of concepts.** The discrete and continuous formulas above run in parallel: PMFs vs. PDFs, sums vs. integrals.
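The memoryless property derived above lends itself to a quick numerical check. The sketch below (Python; the rate $\lambda = 0.5$ and the helper `survival` are our own illustrative choices, not from the notes) integrates the exponential PDF with the midpoint rule and compares $P(T > t+x \mid T > t)$ against $P(T > x)$:

```python
import math

lam = 0.5          # rate parameter (arbitrary, for illustration)
t, x = 1.0, 2.0    # condition on {T > t}; ask about x extra seconds

def survival(a, lam, upper=80.0, n=200000):
    """P(T > a): midpoint-rule integral of the exponential PDF on [a, upper]."""
    h = (upper - a) / n
    return sum(lam * math.exp(-lam * (a + (i + 0.5) * h)) for i in range(n)) * h

lhs = survival(t + x, lam) / survival(t, lam)   # P(T > t+x | T > t)
rhs = survival(x, lam)                          # P(T > x)
print(abs(lhs - rhs) < 1e-6)                    # memorylessness
```

The truncation at `upper=80` is harmless here because the exponential tail beyond it is astronomically small.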
**Conditional expectation.** Definitions:
$$E[X \mid A] = \int_{-\infty}^{\infty} x f_{X|A}(x)\,dx, \qquad E[X \mid Y = y] = \int_{-\infty}^{\infty} x f_{X|Y}(x|y)\,dx.$$
The expected value rule:
$$E[g(X) \mid A] = \int g(x) f_{X|A}(x)\,dx, \qquad E[g(X) \mid Y = y] = \int_{-\infty}^{\infty} g(x) f_{X|Y}(x|y)\,dx.$$
Total expectation theorem:
$$E[X] = \sum_i P(A_i)\, E[X \mid A_i], \qquad E[X] = \int_{-\infty}^{\infty} E[X \mid Y = y]\, f_Y(y)\,dy.$$

**Mean and variance of a piecewise constant PDF.**
$$f_X(x) = \begin{cases} 1/3 & \text{if } 0 \le x \le 1,\\ 2/3 & \text{if } 1 < x \le 2,\\ 0 & \text{otherwise.} \end{cases}$$
Consider the events $A_1 = \{X$ is in the first interval $[0,1]\}$ and $A_2 = \{X$ is in the second interval $(1,2]\}$. Find $P(A_1)$ and $P(A_2)$, then use the total expectation theorem to find $E[X]$ and $\mathrm{var}(X)$.

**Example: stick breaking (1).** Break a stick of length $\ell$ twice: $X$, the first break point, is chosen uniformly between $0$ and $\ell$; $Y$, the second break point, is chosen (given $X = x$) uniformly from $0$ to $x$.

**Example: stick breaking (2).**
$$f_X(x) = \frac{1}{\ell} \ \ (0 \le x \le \ell), \qquad f_{Y|X}(y|x) = \frac{1}{x} \ \ (0 \le y \le x),$$
so the joint PDF is
$$f_{X,Y}(x,y) = f_X(x)\, f_{Y|X}(y|x) = \frac{1}{\ell x}, \qquad 0 \le y \le x \le \ell.$$

**Example: stick breaking (3).** The conditional expectation of $Y$ given $X$:
$$E[Y \mid X = x] = \int y\, f_{Y|X}(y|x)\,dy = \frac{x}{2}.$$
The marginal of $Y$ is $f_Y(y) = \int_y^{\ell} \frac{1}{\ell x}\,dx = \frac{1}{\ell}\log\frac{\ell}{y}$ for $0 < y \le \ell$, and the expectation of $Y$ is
$$E[Y] = \int_0^{\ell} y\, \frac{1}{\ell}\log\frac{\ell}{y}\,dy = \frac{\ell}{4}.$$

**Independence.** Two RVs $X$ and $Y$ are independent if $f_{X,Y}(x,y) = f_X(x) f_Y(y)$. This is the same as $f_{X|Y}(x|y) = f_X(x)$ for all $y$ with $f_Y(y) > 0$, and it generalizes easily to multiple RVs: $f_{X,Y,Z}(x,y,z) = f_X(x) f_Y(y) f_Z(z)$.

**The continuous Bayes rule.** $X$ is an unobserved RV with PDF $f_X$. A noisy measurement $Y$ is available, related to $X$ by a conditional PDF $f_{Y|X}$. Once $Y$ is measured, what do we know about $X$? The inference problem is to evaluate $f_{X|Y}(x|y)$. From $f_{X,Y} = f_X f_{Y|X} = f_Y f_{X|Y}$ it follows that
$$f_{X|Y}(x|y) = \frac{f_X(x)\, f_{Y|X}(y|x)}{f_Y(y)} = \frac{f_X(x)\, f_{Y|X}(y|x)}{\int_{-\infty}^{\infty} f_X(t)\, f_{Y|X}(y|t)\,dt},$$
where the denominator uses the normalization property $\int f_{X|Y}(x|y)\,dx = 1$.
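The continuous Bayes rule can be exercised numerically. The sketch below (Python; the observed value `y = 2.0` is a hypothetical choice of ours) uses the light-bulb setup of the next example, a uniform prior on $[1, 3/2]$ with an exponential likelihood, and verifies that the computed posterior integrates to one:

```python
import math

y = 2.0  # observed bulb lifetime (hypothetical value)

def prior(lam):                      # f_Lambda: uniform on [1, 3/2], density 2
    return 2.0 if 1.0 <= lam <= 1.5 else 0.0

def likelihood(y, lam):              # exponential lifetime PDF f_{Y|Lambda}
    return lam * math.exp(-lam * y)

# normalizing constant: integral of prior * likelihood (trapezoid rule)
n = 100000
h = 0.5 / n
grid = [1.0 + i * h for i in range(n + 1)]
vals = [prior(l) * likelihood(y, l) for l in grid]
Z = h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

posterior = [v / Z for v in vals]    # f_{Lambda|Y}(lambda | y) on the grid
total = h * (sum(posterior) - 0.5 * (posterior[0] + posterior[-1]))
print(round(total, 6))               # posterior integrates to 1
```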
**Example: light bulbs.** A light bulb has an exponential lifetime $Y$ whose parameter $\Lambda$ is itself a RV, on any given day uniformly distributed over the interval $[1, 3/2]$. We test a light bulb and record its lifetime; what can be said about the parameter $\Lambda$? Solution: model the prior as $f_\Lambda(t) = 2$ for $1 \le t \le 3/2$. Simply write the conditional PDF $f_{Y|\Lambda}(y \mid \lambda) = \lambda e^{-\lambda y}$ and apply the continuous Bayes rule:
$$f_{\Lambda|Y}(\lambda \mid y) = \frac{f_\Lambda(\lambda)\, f_{Y|\Lambda}(y \mid \lambda)}{\int_1^{3/2} f_\Lambda(t)\, f_{Y|\Lambda}(y \mid t)\,dt} = \frac{2\lambda e^{-\lambda y}}{\int_1^{3/2} 2 t e^{-t y}\,dt}, \qquad 1 \le \lambda \le \tfrac{3}{2}.$$

**Conditioning slices the joint PDF.** Recall the stick-breaking example:
$$f_{X,Y}(x,y) = \begin{cases} \dfrac{1}{\ell x} & 0 < y \le x \le \ell,\\[4pt] 0 & \text{otherwise.} \end{cases}$$
Pictorially, conditioning on $X = x$ takes a vertical slice of the joint PDF along the line $X = x$ and renormalizes it into the conditional PDF $f_{Y|X}(\cdot \mid x)$.

**Inference about a discrete RV.** Let $P(A)$ be the probability of event $A$, and suppose the conditional PDFs $f_{Y|A}$ and $f_{Y|A^c}$ are known. Estimate $P(A \mid Y = y)$ given the value $y$ of $Y$. Instead of conditioning on the zero-probability event $\{Y = y\}$, condition on $\{y \le Y \le y + \delta\}$ for a small positive $\delta$ and take the limit as $\delta \to 0$:
$$P(A \mid Y = y) \approx P(A \mid y \le Y \le y + \delta) = \frac{P(A)\, P(y \le Y \le y+\delta \mid A)}{P(y \le Y \le y + \delta)} \approx \frac{P(A)\, f_{Y|A}(y)\,\delta}{f_Y(y)\,\delta} = \frac{P(A)\, f_{Y|A}(y)}{f_Y(y)}.$$

**Inference about a discrete RV (cont'd).** By the total probability theorem, $f_Y(y) = P(A) f_{Y|A}(y) + P(A^c) f_{Y|A^c}(y)$, so
$$P(A \mid Y = y) = \frac{P(A)\, f_{Y|A}(y)}{P(A)\, f_{Y|A}(y) + P(A^c)\, f_{Y|A^c}(y)}.$$
More generally, if $N$ is discrete with PMF $p_N$, then $f_Y(y) = \sum_i p_N(i)\, f_{Y|N}(y \mid i)$, and thus
$$P(N = n \mid Y = y) = \frac{p_N(n)\, f_{Y|N}(y \mid n)}{\sum_i p_N(i)\, f_{Y|N}(y \mid i)}.$$

**Example: signal detection.** For a binary transmitted signal $S$ we are given $P(S = 1) = p$ and $P(S = -1) = 1 - p$. The received signal has noise added: $Y = S + N$, where $N$ is standard normal, $N(0,1)$, independent of $S$. The probability that $S = 1$, as a function of the observed value $y$ of $Y$, is
$$P(S = 1 \mid Y = y) = \frac{p\,e^{-(y-1)^2/2}}{p\,e^{-(y-1)^2/2} + (1-p)\,e^{-(y+1)^2/2}}.$$

**Inference based on discrete observations.** The formula can also be turned around:
$$f_{Y|A}(y) = \frac{f_Y(y)\, P(A \mid Y = y)}{P(A)},$$
and, based on the normalization property, $P(A) = \int_{-\infty}^{\infty} f_Y(t)\, P(A \mid Y = t)\,dt$.

## ELEC 303 Random Signals, Lecture 7: Discrete Random Variables; Conditioning and Independence

Farinaz Koushanfar, ECE Dept., Rice University, Aug. 31, 2008

**Lecture outline** (reading: finish Chapter 2): review; joint PMFs; conditioning; independence.

**Random variables.** A random variable is a real-valued function of an experiment outcome. A function of a random variable defines another random variable.
We associate with each RV certain averages of interest, such as the mean and variance. A random variable can be conditioned on an event or on another random variable, and there is a notion of independence of a random variable from an event or from another RV.

**Discrete random variables.** A discrete RV is a real-valued function of the outcome of the experiment that can take a finite or countably infinite number of values. It has an associated probability mass function (PMF), which gives the probability of each numerical value that the random variable can take. A function of a discrete random variable defines another discrete RV, whose PMF can be found from the PMF of the original RV.

**Probability mass function (PMF).** Notation: $p_X(x) = P(X = x)$. The PMF mathematically defines a probability law; by the probability axioms, $\sum_x p_X(x) = 1$. Example (coin toss): define the indicator RV $X = 1$ if heads, $X = 0$ if tails; then $p_X(1) = p$ and $p_X(0) = 1 - p$.

**Review: discrete RV, PMF, expectation, variance.**
- PMF: $p_X(x) = P(X = x)$.
- Expectation (the center of gravity of the probability mass): $E[X] = \sum_x x\,p_X(x)$.
- Expected value rule: $E[g(X)] = \sum_x g(x)\,p_X(x)$.
- Variance (a measure of the spread around the mean): $\mathrm{var}(X) = E\big[(X - E[X])^2\big]$.

**Expected value for functions of a RV.** Let $X$ be a random variable with PMF $p_X$ and let $g(X)$ be a function of $X$. Then the expected value of the random variable $g(X)$ is
$$E[g(X)] = \sum_x g(x)\,p_X(x), \qquad \mathrm{var}(X) = \sum_x (x - E[X])^2\, p_X(x) = E[X^2] - (E[X])^2.$$
Similarly, the $n$th moment is given by $E[X^n] = \sum_x x^n\, p_X(x)$.

**Properties of mean and variance.**
- $\mathrm{var}(aX) = a^2\,\mathrm{var}(X)$ and $\mathrm{var}(X + a) = \mathrm{var}(X)$.
- $E[aX + b] = a\,E[X] + b$ and $E[X + Y + Z] = E[X] + E[Y] + E[Z]$.
- If $X$ and $Y$ are independent, $E[XY] = E[X]\,E[Y]$ and $E[g(X)h(Y)] = E[g(X)]\,E[h(Y)]$.

**Joint PMFs of multiple random variables.** The joint PMF of two random variables is $p_{X,Y}(x,y) = P(X = x,\, Y = y)$. Calculate the PMFs of $X$ and $Y$ by the formulas
$$p_X(x) = \sum_y p_{X,Y}(x,y), \qquad p_Y(y) = \sum_x p_{X,Y}(x,y).$$
We refer to $p_X$ and $p_Y$ as the marginal PMFs.

**Tabular method.** For a joint PMF given in tabular form, compute the marginals by summation: sum each column to get the column-marginal PMF $p_X(x)$, and each row to get the row-marginal PMF $p_Y(y)$.
(The numerical table on this slide is illegible in the scan; the method survives: sum the table's columns for $p_X$, its rows for $p_Y$, and for $Z = X + 2Y$ compute $E[Z] = \sum_x \sum_y (x + 2y)\, p_{X,Y}(x,y)$.)

**Expectation.** $E[g(X,Y)] = \sum_x \sum_y g(x,y)\, p_{X,Y}(x,y)$. In general, $E[g(X,Y)] \ne g(E[X], E[Y])$. Linearity: $E[aX + b] = a\,E[X] + b$ and $E[X + Y + Z] = E[X] + E[Y] + E[Z]$. If $X$ and $Y$ are independent, $E[XY] = E[X]\,E[Y]$ and $E[g(X)h(Y)] = E[g(X)]\,E[h(Y)]$.

**Variances.** $\mathrm{var}(aX) = a^2\,\mathrm{var}(X)$ and $\mathrm{var}(X + a) = \mathrm{var}(X)$. If $X$ and $Y$ are independent, $\mathrm{var}(X + Y) = \mathrm{var}(X) + \mathrm{var}(Y)$. Examples: if $X = Y$, $\mathrm{var}(X+Y) = 4\,\mathrm{var}(X)$; if $X = -Y$, $\mathrm{var}(X+Y) = 0$; if $X$ and $Y$ are independent and $Z = X + 3Y$, then $\mathrm{var}(Z) = \mathrm{var}(X) + 9\,\mathrm{var}(Y)$.

**Example: binomial mean and variance.** $X$ = number of successes in $n$ independent trials, each with success probability $p$. Write $X = X_1 + \cdots + X_n$, where $X_i = 1$ if trial $i$ is a success and $0$ otherwise. Then $E[X_i] = p$, $E[X_i^2] = p$, $\mathrm{var}(X_i) = p(1-p)$, and therefore $\mathrm{var}(X) = n\,p(1-p)$.

**More than two variables.** $p_{X,Y,Z}(x,y,z) = P(X = x, Y = y, Z = z)$, with marginals such as $p_{X,Y}(x,y) = \sum_z p_{X,Y,Z}(x,y,z)$ and $p_X(x) = \sum_y \sum_z p_{X,Y,Z}(x,y,z)$. The expected value rule: $E[g(X,Y,Z)] = \sum_x \sum_y \sum_z g(x,y,z)\, p_{X,Y,Z}(x,y,z)$.

**Conditioning.** The conditional PMF of a RV on an event $A$:
$$p_{X|A}(x) = P(X = x \mid A) = \frac{P(\{X = x\} \cap A)}{P(A)}, \qquad \sum_x p_{X|A}(x) = 1.$$

**Example.** A student takes a certain test up to a maximum of $n$ times, each time passing with probability $p$, independent of the number of attempts. Find the PMF of the number of attempts, given that the student passes the test. Let $A$ be the event of passing; $X$ is a geometric RV with parameter $p$, and $A = \{X \le n\}$ with $P(A) = \sum_{m=1}^{n} (1-p)^{m-1} p$, so
$$p_{X|A}(k) = \begin{cases} \dfrac{(1-p)^{k-1} p}{\sum_{m=1}^{n} (1-p)^{m-1} p} & \text{if } k = 1, \ldots, n,\\[6pt] 0 & \text{otherwise.} \end{cases}$$

**Conditioning a RV on another.**
$$p_{X|Y}(x|y) = P(X = x \mid Y = y) = \frac{p_{X,Y}(x,y)}{p_Y(y)}.$$
The conditional PMF is often used to build the joint PMF with a sequential approach: $p_{X,Y}(x,y) = p_Y(y)\, p_{X|Y}(x|y)$.
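The test-taking example above can be checked numerically. The sketch below (Python; the pass probability `p = 0.3` and cap `n = 5` are hypothetical values of ours) builds the conditional PMF, confirms it is a valid PMF, and shows that conditioning on passing within $n$ attempts lowers the mean below the unconditional geometric mean $1/p$:

```python
p, n = 0.3, 5   # hypothetical pass probability and maximum attempts

# P(A) = P(X <= n) for a geometric RV with parameter p
pA = sum((1 - p) ** (m - 1) * p for m in range(1, n + 1))

# conditional PMF given the student passes within n attempts
cond = {k: (1 - p) ** (k - 1) * p / pA for k in range(1, n + 1)}

assert abs(sum(cond.values()) - 1.0) < 1e-12   # a valid PMF
e_cond = sum(k * q for k, q in cond.items())   # E[X | X <= n]
print(e_cond < 1 / p)  # conditioning on early success lowers the mean
```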
**Conditional expectation.** The conditional expectation of $X$ given an event $A$ with $P(A) > 0$ is $E[X \mid A] = \sum_x x\, p_{X|A}(x)$, and $E[g(X) \mid A] = \sum_x g(x)\, p_{X|A}(x)$. If $A_1, \ldots, A_n$ are disjoint events partitioning the sample space, then $E[X] = \sum_i P(A_i)\, E[X \mid A_i]$. For any event $B$ with $P(A_i \cap B) > 0$ for all $i$, $E[X \mid B] = \sum_i P(A_i \mid B)\, E[X \mid A_i \cap B]$; in particular, $E[X] = \sum_y p_Y(y)\, E[X \mid Y = y]$.

**Mean and variance of the geometric.** Assume there is a probability $p$ that your program works correctly, independent of how many times you write it. Find the mean and variance of $X$, the number of tries until it works: $p_X(k) = (1-p)^{k-1} p$ for $k = 1, 2, \ldots$, with $E[X] = \sum_k k (1-p)^{k-1} p$ and $\mathrm{var}(X) = \sum_k (k - E[X])^2 (1-p)^{k-1} p$.

**Mean and variance of the geometric (cont'd).** Use memorylessness: $E[X \mid X > 1] = 1 + E[X]$, so
$$E[X] = p \cdot 1 + (1-p)\big(1 + E[X]\big) \implies E[X] = \frac{1}{p}.$$
Similarly, $E[X^2 \mid X > 1] = E[(1+X)^2] = 1 + 2E[X] + E[X^2]$ gives $E[X^2] = \dfrac{2}{p^2} - \dfrac{1}{p}$, and hence $\mathrm{var}(X) = E[X^2] - (E[X])^2 = \dfrac{1-p}{p^2}$.

**Independence.** Independence from an event: $P(X = x \text{ and } A) = P(X = x)\,P(A)$ for all $x$, equivalently $p_{X|A}(x) = p_X(x)$ for all $x$. Independence of random variables: $p_{X,Y}(x,y) = p_X(x)\, p_Y(y)$ for all $x$ and $y$. For two independent RVs, $E[XY] = E[X]\,E[Y]$, and also $E[g(X)h(Y)] = E[g(X)]\,E[h(Y)]$.

**Multiple RVs.** Three RVs $X$, $Y$, and $Z$ are said to be independent if $p_{X,Y,Z}(x,y,z) = p_X(x)\, p_Y(y)\, p_Z(z)$.

## ELEC 303 Random Signals, Lecture 8: Continuous Random Variables; PDFs and CDFs

Farinaz Koushanfar, ECE Dept., Rice University, Sept. 22, 2008

**Lecture outline** (reading: 3.1-3.3): continuous random variables; the probability density function (PDF); the cumulative distribution function (CDF); the normal random variable.

**Continuous random variables.** Random variables with a continuous range of values, e.g., a speedometer reading, people's height, or weight. It is possible to approximate them with discrete RVs, but continuous models are useful: they are finer grained and more accurate, they admit the tools of continuous calculus, and they yield more insight from analysis.

**Probability density functions (PDFs).** A RV is continuous if there is a nonnegative PDF such that, for every subset $B$ of the real numbers, $P(X \in B) = \int_B f_X(x)\,dx$. The probability that the RV falls in an interval is
$$P(a \le X \le b) = \int_a^b f_X(x)\,dx.$$
(Figure courtesy of Bertsekas & Tsitsiklis, Introduction to Probability, 2008.)
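The defining property of a PDF, probability as area under the curve, is easy to check numerically. The sketch below (Python; the density $f(x) = 2x$ on $[0,1]$ is a hypothetical example of ours, not from the notes) verifies that the total area is 1 and evaluates an interval probability:

```python
# a simple example PDF, f(x) = 2x on [0, 1] (hypothetical density)
def f(x):
    return 2.0 * x if 0.0 <= x <= 1.0 else 0.0

def integrate(a, b, n=100000):
    """midpoint-rule integral of f over [a, b]"""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

print(abs(integrate(0.0, 1.0) - 1.0) < 1e-9)    # total probability is 1
print(abs(integrate(0.2, 0.5) - 0.21) < 1e-9)   # P(0.2 <= X <= 0.5) = 0.25 - 0.04
```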
**PDF (cont'd).** Continuous probability is the area under the PDF graph. For any single point, $P(X = a) = \int_a^a f_X(x)\,dx = 0$; consequently,
$$P(a \le X \le b) = P(a < X < b) = P(a \le X < b) = P(a < X \le b).$$
The PDF is nonnegative for every $x$, and the area under the curve must sum to 1:
$$\int_{-\infty}^{\infty} f_X(x)\,dx = P(-\infty < X < \infty) = 1.$$
For small $\delta$, $P(x \le X \le x + \delta) \approx f_X(x)\,\delta$.

**PDF example.** A PDF can take arbitrary (even arbitrarily large) values, as long as it integrates to one over its interval. (The slide's specific example density is illegible in the scan.)

**Mean and variance.** Expectation $E[X]$ and the $n$th moment $E[X^n]$ are defined as in the discrete case, with integrals replacing sums. A real-valued function $Y = g(X)$ of a continuous RV is a RV ($Y$ can be either continuous or discrete):
$$E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx, \quad E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx, \quad \mathrm{var}(X) = \int_{-\infty}^{\infty} (x - E[X])^2 f_X(x)\,dx.$$

**Mean and variance of the uniform RV.** For $X$ uniform on $[a, b]$, $f_X(x) = \frac{1}{b-a}$:
$$E[X] = \frac{a+b}{2}, \qquad E[X^2] = \frac{a^2 + ab + b^2}{3}, \qquad \mathrm{var}(X) = \frac{(b-a)^2}{12}.$$

**Exponential RV.**
$$f_X(x) = \begin{cases} \lambda e^{-\lambda x} & \text{if } x \ge 0,\\ 0 & \text{otherwise,} \end{cases}$$
where $\lambda$ is a positive parameter characterizing the PDF; examples include the time interval between two packet arrivals at a router, or the lifetime of a bulb. The probability that $X$ exceeds a given value decreases exponentially: for any $a \ge 0$,
$$P(X \ge a) = \int_a^{\infty} \lambda e^{-\lambda x}\,dx = e^{-\lambda a}.$$
Mean: $1/\lambda$. Variance: $1/\lambda^2$.

**Cumulative distribution function (CDF).** The CDF of a RV $X$ is denoted by $F_X$ and provides the probability $P(X \le x)$: for every $x$,
$$F_X(x) = P(X \le x) = \int_{-\infty}^{x} f_X(t)\,dt.$$
(Uniform example: $F_X$ rises linearly from 0 at $a$ to 1 at $b$.) The CDF is defined for both continuous and discrete RVs.

**CDF of a discrete RV.** The CDF of a discrete RV $X$ is the staircase function $F_X(x) = P(X \le x) = \sum_{k \le x} p_X(k)$; if $x \le y$ then $F_X(x) \le F_X(y)$.

**Properties of the CDF.** $F_X(x) = P(X \le x)$ for all $x$; monotonically nondecreasing; tends to 0 as $x \to -\infty$ and to 1 as $x \to \infty$; piecewise constant for discrete $X$ and continuous for continuous $X$. The PMF and PDF are obtained by summing and differentiating: $p_X(k) = F_X(k) - F_X(k-1)$ and $f_X(x) = \frac{dF_X}{dx}(x)$.
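The uniform-RV moment formulas quoted in this lecture can be confirmed by direct numerical integration. The sketch below (Python; the interval $[2, 5]$ is an arbitrary choice of ours) integrates $x f(x)$ and $x^2 f(x)$ with the midpoint rule and compares against $(a+b)/2$ and $(b-a)^2/12$:

```python
a, b = 2.0, 5.0          # an arbitrary uniform interval
n = 100000
h = (b - a) / n

xs = [a + (i + 0.5) * h for i in range(n)]   # midpoint grid
f = 1.0 / (b - a)                            # uniform density
mean = sum(x * f for x in xs) * h
second = sum(x * x * f for x in xs) * h
var = second - mean ** 2

print(abs(mean - (a + b) / 2) < 1e-6)        # E[X] = (a+b)/2
print(abs(var - (b - a) ** 2 / 12) < 1e-6)   # var(X) = (b-a)^2 / 12
```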
**Example.** You are allowed to take an exam 3 times, and the final score is the maximum of the three: $X = \max(X_1, X_2, X_3)$, where the scores are independent and uniform on $\{1, \ldots, 10\}$. What is the PMF? Use $p_X(k) = F_X(k) - F_X(k-1)$ for $k = 1, \ldots, 10$, where
$$F_X(k) = P(X \le k) = P(X_1 \le k)\, P(X_2 \le k)\, P(X_3 \le k) = \left(\frac{k}{10}\right)^3,$$
so that $p_X(k) = (k/10)^3 - ((k-1)/10)^3$.

**Geometric and exponential CDFs.** For a geometric RV with parameter $p$ ($X$ = number of trials up to and including the first success):
$$F_{\mathrm{geo}}(n) = \sum_{k=1}^{n} p (1-p)^{k-1} = 1 - (1-p)^n, \qquad n = 1, 2, \ldots$$
For an exponential RV with parameter $\lambda > 0$:
$$F_{\mathrm{exp}}(x) = \int_0^x \lambda e^{-\lambda t}\,dt = 1 - e^{-\lambda x}, \qquad x > 0.$$
The exponential RV can be interpreted as the limit of the geometric RV.

**Standard Gaussian (normal) RV.** A continuous RV is standard normal, or Gaussian $N(0,1)$, if
$$f_X(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}, \qquad E[X] = 0, \quad \mathrm{var}(X) = 1.$$
The standard normal CDF is denoted $\Phi(x)$.

**General Gaussian RV.** A general normal $N(\mu, \sigma^2)$ has
$$f_X(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2 / 2\sigma^2},$$
and it turns out that $E[X] = \mu$ and $\mathrm{var}(X) = \sigma^2$. Let $Y = aX + b$; then $E[Y] = a\mu + b$ and $\mathrm{var}(Y) = a^2\sigma^2$. Fact: $Y \sim N(a\mu + b,\, a^2\sigma^2)$.

**Notes about the normal RV.** Normality is preserved under linear transformations. The PDF is symmetric around the mean. No closed form is available for the CDF, but standard tables are available for $N(0,1)$ (e.g., p. 155). The usual practice is to transform to $N(0,1)$: standardize $X$ (subtract $\mu$ and divide by $\sigma$) to get a standard normal variable $Y$, then read the CDF from the standard normal table:
$$P(X \le x) = P\!\left(Y \le \frac{x - \mu}{\sigma}\right) = \Phi\!\left(\frac{x-\mu}{\sigma}\right).$$
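The standardization recipe above can be reproduced without a printed table, since $\Phi(z) = \tfrac12(1 + \mathrm{erf}(z/\sqrt{2}))$. The sketch below (Python; the $N(70, 10^2)$ score distribution is a hypothetical example of ours) standardizes an observation and evaluates its CDF:

```python
import math

# standard normal CDF via the error function: Phi(z) = (1 + erf(z/sqrt(2))) / 2
def phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, sigma = 70.0, 10.0         # hypothetical N(mu, sigma^2) scores
x = 85.0
p = phi((x - mu) / sigma)      # P(X <= 85) after standardizing

print(abs(phi(0.0) - 0.5) < 1e-12)   # symmetry: Phi(0) = 1/2
print(0.93 < p < 0.94)               # Phi(1.5) is about 0.9332
```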
## ELEC 303 Random Signals, Lecture 21: Random Processes

Dr. Farinaz Koushanfar, ECE Dept., Rice University, Nov. 24, 2008

**Lecture outline**: basic concepts; random processes and linear systems; power spectral density of stationary processes; power spectra in LTI systems; power spectral density of a sum process; Gaussian processes. (The introductory slides on basic concepts are illegible in the scan.)

**The response mean.** Use the convolution integral to relate the output $Y(t)$ to the input $X(t)$ of an LTI system with impulse response $h$:
$$Y(t) = \int_{-\infty}^{\infty} X(\tau)\, h(t - \tau)\,d\tau,$$
$$E[Y(t)] = E\!\left[\int_{-\infty}^{\infty} X(\tau)\, h(t-\tau)\,d\tau\right] = \int_{-\infty}^{\infty} E[X(\tau)]\, h(t-\tau)\,d\tau = m_X \int_{-\infty}^{\infty} h(u)\,du.$$
This proves that $m_Y$ is independent of $t$.

**Cross-correlation.** The cross-correlation function between input and output:
$$R_{XY}(t_1, t_2) = E[X(t_1) Y(t_2)] = E\!\left[X(t_1) \int_{-\infty}^{\infty} X(s)\, h(t_2 - s)\,ds\right] = \int_{-\infty}^{\infty} R_X(t_1 - s)\, h(t_2 - s)\,ds = \int_{-\infty}^{\infty} R_X(\tau + u)\, h(u)\,du,$$
with $\tau = t_1 - t_2$. This shows that $R_{XY}(t_1, t_2)$ depends only on $\tau = t_1 - t_2$.

**Output autocorrelation.** Similarly,
$$R_Y(\tau) = \int_{-\infty}^{\infty} R_{XY}(\tau - u)\, h(u)\,du.$$
This shows that $R_Y$ and $R_{XY}$ depend only on $\tau = t_1 - t_2$: the output process is stationary, and the input and output are jointly stationary.

**Power spectral density of a stationary process.** For a stationary process, the power spectral density (PSD) is the Fourier transform of the autocorrelation function:
$$S_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\, e^{-j 2\pi f \tau}\,d\tau,$$
and the total power is
$$P_X = \int_{-\infty}^{\infty} S_X(f)\,df = R_X(0).$$

**Example.** Find the power in the sinusoidal process of the earlier example. Solution: we can use either the relation $P_X = \int_{-\infty}^{\infty} S_X(f)\,df$ or the relation $P_X = R_X(0)$; for a sinusoid of amplitude $A$ with random phase, both give $A^2/2$.

**Power spectra in LTI systems.** For an LTI system with a stationary input, the frequency-domain input-output relationships follow by computing the Fourier transform of both sides of the time-domain relations:
$$S_Y(f) = |H(f)|^2\, S_X(f), \qquad m_Y = m_X\, H(0),$$
which says the mean of the output is the input mean scaled by the DC value of the frequency response. The phase of $H$ is irrelevant for power; only the magnitude enters through $|H(f)|^2$, so power depends on amplitude, not phase. If $X(t)$ goes through differentiation, then $H(f) = j2\pi f$ and
$$S_Y(f) = 4\pi^2 f^2\, S_X(f),$$
with the cross-spectrum proportional to $j 2\pi f\, S_X(f)$.
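The relation $S_Y(f) = |H(f)|^2 S_X(f)$ and the identity $P_Y = R_Y(0)$ can be illustrated with a discrete-time analogue. The sketch below (Python; the white-noise variance and FIR taps are hypothetical choices of ours) computes the output power of a filtered white process in both the time and frequency domains and checks that they agree:

```python
import cmath
import math

# discrete-time analogue: white input of variance s2 through an FIR filter h
s2 = 2.0
h = [0.5, 0.3, 0.2]            # hypothetical impulse response

# time domain: for white input, R_Y(0) = s2 * sum_n h[n]^2
power_time = s2 * sum(c * c for c in h)

# frequency domain: power = integral over one period of |H(f)|^2 * s2
n = 20000
power_freq = 0.0
for i in range(n):
    f = i / n
    H = sum(c * cmath.exp(-2j * math.pi * f * k) for k, c in enumerate(h))
    power_freq += abs(H) ** 2 * s2 / n

print(abs(power_time - power_freq) < 1e-9)   # both equal R_Y(0)
```

The agreement is a Parseval-type identity: integrating $|H(f)|^2$ over one period recovers the energy $\sum_n h[n]^2$ of the impulse response.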
**Power spectral density of a sum process.** Let $Z(t) = X(t) + Y(t)$, where $X(t)$ and $Y(t)$ are jointly stationary RPs. Then $Z(t)$ is a stationary process with
$$R_Z(\tau) = R_X(\tau) + R_Y(\tau) + R_{XY}(\tau) + R_{YX}(\tau).$$
Taking the Fourier transform of both sides,
$$S_Z(f) = S_X(f) + S_Y(f) + 2\,\mathrm{Re}\big[S_{XY}(f)\big].$$
The PSD of the sum process is the sum of the power spectra of the individual processes plus a term that depends on the cross-correlation. If $X(t)$ and $Y(t)$ are uncorrelated, then $R_{XY}(\tau) = m_X m_Y$; if at least one of the processes is zero mean, $R_{XY}(\tau) = 0$ and we get $S_Z(f) = S_X(f) + S_Y(f)$.

**Example.** Let $Z(t) = X(t) + \frac{d}{dt}X(t)$ for a sinusoidal process $X(t)$ of amplitude $A$, frequency $f_0$, and random phase. Here $S_{XY}(f)$ is purely imaginary (proportional to $j2\pi f$), so $\mathrm{Re}[S_{XY}(f)] = 0$ and
$$S_Z(f) = S_X(f) + S_Y(f) = \frac{A^2}{4}\left(1 + 4\pi^2 f_0^2\right)\big[\delta(f - f_0) + \delta(f + f_0)\big].$$

**Gaussian processes.** Widely used in communication, because thermal noise in electronics is produced by the random movement of electrons and is closely modeled by a Gaussian RP. In a Gaussian RP, if we look at different instants of time, the resulting RVs are jointly Gaussian. Definition 1: a random process $X(t)$ is a Gaussian process if for all $n$ and all $t_1, t_2, \ldots, t_n$, the RVs $X(t_i)$, $i = 1, \ldots, n$, have a jointly Gaussian density function.

**Gaussian processes (cont'd).** Definition 2: the random processes $X(t)$ and $Y(t)$ are jointly Gaussian if for all $n$, $m$ and all $t_1, \ldots, t_n$ and $\tau_1, \ldots, \tau_m$, the random vector $\big(X(t_1), \ldots, X(t_n), Y(\tau_1), \ldots, Y(\tau_m)\big)$ has an $(n+m)$-dimensional jointly Gaussian density function. It is obvious that if $X(t)$ and $Y(t)$ are jointly Gaussian, then each of them is individually Gaussian; the reverse is not always true. Gaussian processes have important and unique properties.

**Important properties of Gaussian processes.** Property 1: if the Gaussian process $X(t)$ is passed through an LTI system, then the output process $Y(t)$ is also a Gaussian process, and $Y(t)$ and $X(t)$ are jointly Gaussian processes. Property 2: for jointly Gaussian processes, uncorrelatedness and independence are equivalent.

## ELEC 303 Random Signals, Lecture 12: More on Conditional Expectation and Variance

Dr. Farinaz Koushanfar, ECE Dept., Rice University, Oct. 6, 2008

**Lecture outline** (reading: 4.3-4.4): law of iterated expectations; law of total variance.
tosses record X the number of heads For any Ye01 9 EXYyny so EXY is a RV Since this is a RV it has an expectation ZEX Y yPVy Y discrete EEXY11 W V Jim EX Y yfvydy Y continuous As long as X has a welldefined expectation Law of Iterated expectations EEXYEX For any function g we have EXgYYgYEXY Hi l ELEC 303 Koushanlar Fall us Example law of iterated expectations We have a stick with length l Select a random point and break it once Keep the left piece and break it again Expected length of the remaining piece 1 E l1L r ELEC 303 Koushanlar Fall Us 1082008 Example forecast review X sales of a company over the entire year Y sales in the first sem of a coming year Assume the joint distribution of XY is known The EX serves as the forecast of the actual sale in the beginning of the year After the mid year Y is known 9 EXIY Forecast revision is EXIYEX Find the expected value of the forecast revision Hi l ELEC 303 Koushanlar Fall us Conditional expectation as an estimator Y can be observations providing info about X The conditional expectation 5EXY The estimation error X EYE5 XYE5Y EXY5 5ltO EYy0 9 EEEYO Thus the estimation error does not have a systematic upward or downward bias 1 E Hi r ELEC 303 KoLishanlar Fall Us 1082008 Conditional expectation as an estimator cont d Is there correlation Between XX E1gtltgt lt1 E1E1gtltSlt 1 Y1 Egt ltEgtlt 1 Y1 0 Thus Cov5lt Elt5 E5E 0 An important property is that varX varXvar Hi i ELEC 303 Koushaniar Faii us Law of total variance varltXYgt EX EX Y2 IY Eigtlt2 Y This is a function on with varXYyEgtlt2Yy Use the mean and the law of iterated means var E2 EE2 1 Y1 EvarX 1 Y Now rewrite varX varXvargtlt Law of total variance varXEvarXYvarEXY 1 E Hi r ELEC 303 Koiishaniar Faii Us 1082008 garlt1 Law of total variance example 1 N independent tosses of a biased coin Probhead0Yquot39U01 X is the number of obtained heads Use law of total variance to find varX i ELEC 303 Koushaniar Faii us Law of total variance example 2 Consider breaking the stick 
**Example 2.** Consider breaking the stick twice, as in the earlier example: $Y$ is the length of the stick after the first break, and $X$ the length after the second break. Use the law of total variance to find $\mathrm{var}(X)$.

**Example 3.** A continuous RV $X$ has the PDF $f_X(x) = 1/2$ for $0 \le x \le 1$ and $f_X(x) = 1/4$ for $1 < x \le 3$. Define $Y = 1$ for $x < 1$ and $Y = 2$ for $x \ge 1$. Use the law of total variance to find the variance of $X$.

## ELEC 303 Random Signals, Lecture 18: Classical Statistical Inference

Dr. Farinaz Koushanfar, ECE Dept., Rice University, Nov. 3, 2008

**Lecture outline** (reading: 9.1-9.2): confidence intervals; the central limit theorem; the Student $t$-distribution; linear regression.

**Confidence intervals.** Consider an estimator for an unknown $\theta$. We fix a confidence level $1 - \alpha$, and for every $\theta$ replace the single point estimator with a lower estimate $\hat{\Theta}^-$ and an upper one $\hat{\Theta}^+$ such that
$$P\big(\hat{\Theta}^- \le \theta \le \hat{\Theta}^+\big) \ge 1 - \alpha.$$
We call $[\hat{\Theta}^-, \hat{\Theta}^+]$ a $(1 - \alpha)$ confidence interval.

**Confidence interval example.** The observations $X_i$ are i.i.d. normal with unknown mean $\theta$ and known variance $v$. Let $\alpha = 0.05$ and find the 95% confidence interval: $\bar{X}_n \pm 1.96\sqrt{v/n}$.

**Interpreting the confidence interval (CI).** Wrong: "the true parameter lies in the CI with 95% probability." Correct: suppose that $\theta$ is fixed. We construct the CI many times, using the same statistical procedure; each time we obtain a collection of $n$ observations and construct the corresponding CI. About 95% of these CIs will include $\theta$.

**A note on the central limit theorem (CLT).** Let $X_1, X_2, \ldots, X_n$ be a sequence of $n$ i.i.d. RVs with finite expectation $\mu$ and variance $\sigma^2 > 0$. CLT: as the sample size $n$ increases, the PDF of the sample average of the RVs approaches $N(\mu, \sigma^2/n)$, irrespective of the shape of the original distribution. (The slide's figure shows the PDF of a sum of two, three, and four such variables progressively approaching a normal shape.)

**CLT.** Let the sum of $n$ random variables be $S_n = X_1 + \cdots + X_n$. Then, defining the new RV
$$Z_n = \frac{S_n - n\mu}{\sigma\sqrt{n}},$$
the distribution of $Z_n$ converges towards $N(0,1)$ as $n \to \infty$. This is convergence in distribution, written in terms of the CDFs as
$$\lim_{n \to \infty} P(Z_n \le z) = \Phi(z) \quad \text{for every } z.$$
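The CLT can be seen without simulation by convolving PMFs. The sketch below (Python; the choice of fair dice and $n = 30$ is ours) builds the exact distribution of a sum of $n$ dice by repeated convolution and checks that the probability of falling at or below the mean is close to $\Phi(0) = 1/2$:

```python
# exact distribution of the sum of n fair dice by repeated convolution
def convolve(p, q):
    r = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

die = [1.0 / 6] * 6              # faces 1..6; index k maps to face k+1
n = 30
dist = die[:]
for _ in range(n - 1):
    dist = convolve(dist, die)   # support n..6n; index k -> sum = n + k

mu = 3.5 * n                     # n * E[one die]
p_half = sum(p for k, p in enumerate(dist) if n + k <= mu)
print(abs(p_half - 0.5) < 0.05)  # close to Phi(0) = 1/2, per the CLT
```

The small residual gap comes from the discreteness of the sum (the exact symmetry point carries positive probability mass), which shrinks as $n$ grows.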
iS39nn X1 X n iLuwZn s i w M 3 S t 1 1132008 arendw hmean Hnmlhenormalahez M varancelmmlhesampe normaLlhevanancesaneslmaleandlheRV 1nbeowsnolnormaydslrbuled Con dence intervaapprommal0n 39EvennlhespecacasewherelheX sared Ihelruevarancesrepacesbylheeslmaled Con dence intervaapprommal0n Irealnglhesumas sanormaRV 39slmalelhemeanandwnhaedvarance 39IwodHerenlapproxmalonsme ecl 39Supposelhallheobservalons tdistribution For normalXi it can be shown thatthe PDFof Tn does not depend on9andu This is called tdistribution with nldegrees of freedom mmmmmmmn 035 F 39 z 001 K K If 39 tdistribution Itsisalsosymmetricand bellshapedlike normal The probabilities of variousintervalsare availableintables When the Xi sare normal and n is relatively smaamore accurate I is 21u2 Sn A 3 8n3938n3quotF v 13 1132008 Example The weight of an object is measured 8 times using an electric scale It reports true weight random error N0G 5547 5404 6364 6438 4917 5674 5564 6066 Compute the 95 confidence interval Using the tdistribution Linear regression Building a model of relation between two or more variables of interest Consider two variables x and y based on a collection of data points xiyi i1n Assume that the scatter plot of these two variables show a systematic approximately linear relationship between xi and yi It is natural to build a model yz9091x hnearregresswn UnemwecannolbquamodeLbulwecan eslwmalelheparamelers QFWIM yf rlii 39Ihewlhreswduahs e h Ill e I m n m 0 m S I e n n U h r C a a n e Ill I e m a r a p e h 39 Given n data pairsxiyitheestimatesthat minimizethesumo hesquaredresidua sare Soiutlon poopo iJip aoooo ii p11111133 11191 o Data Points 4 mEsiimatodLinoarModol 2361974 1976 1979 1999 1992 1994 1996 1998 Justi cation oitie east square iiaximumiieipood Approximation oi Bayesian ineariiiS under apossibipnoninearmodei Approximation oi Bayesian 1115 estimation inearmodei resduas h EV a n It f0 U d n l a l 5 a E t n V ra d 0 u T H 0 l a39 V l d VMHIU r m m an u arHII Nua n V Ta tnIIVS B 
n N m r w a n V 4 an V r l p m b r l m D m J U 0 a U VI S E E E p 55 Xx A5 W l hl AI A Approxmale Bayesan 1M5 WWWWM h WWW WW WWW WWWWW WWWMW 1 w J 43951quot WW Muipenearregresson WWWWW WWWWW NWMWW WWWWWm WWWW 39W W Wmmm MOM Nonlinear regression Sometimes the expression is nonlinear in the unknown parameter Variables x and y obey the form yzhx399 Min 2i Vi hXi 92 The minimization is not typically closedform Assuming Wi s are N002 Yi hxi9 Wi The Mqunction 5 jg i 39 b 1quot in J Practical considerations Heteroskedasticity Nonlinearity Multicollinearity Overfitting Causality 1132008 ELEC 303 Random Signals Lecture 15 Bayesian Statistical Inference Hypothesis testing MAP LIVIS Jr Farina Koushanfar ECE Dept Rice University Oct 22 2008 Lecture outline Reading 8283 Bayesian inference and the posterior distribution Point estimation Hypothesis testing Bayesian least mean square estimator 10282008 Bayesian inference and posterior distribution Unknown quantity of interest 6 Observations or measurements or observation vector of XX1X2Xn We assume that we know A prior distribution p or f A conditional distribution pXI or fXI A complete answer is described by p lx0x Prior p9 Xx Observation Posterior pele l l Process Calculation Conditional pX Point estimates Error analysis etc Four versions of Bayes rule 6 discrete X discrete pe xe x p ele Xle Z PoelleoX l el 9 Poefxl x l e Z Poelfxl x l el 9 6 discrete X continuous Polx 9 l X 6 continuous X discrete folx 9 l X 6 continuous X continuous f ele X l 9 folx9lx fe9fxi xle lfewwpxmx l we lf e39fxi ltx l we 10282008 MAP rule Given the value x of observations select a value for 9 that maximizes the posterior distribution p x9x and f x9x G discrete 6 continuous arg maxefe xw l x Maximum a Posteriori probability rule MAP If takes only a finite number of values the MAP rule minimizes the probability of selecting an incorrect hypothesis 9 arg max 9 Poix 9 l X MAP Rule t maximizes the probability of correct decision for any given value x 
Equivalently it minimizes the probability of incorrect decision It also minimizes the overall error probability Pagerlor Posterior feix 6 l X Peixe l X 10282008 Four versions of MAP rule 6 discrete X discrete po 9pxlo X 399 G discrete X continuous 1 9fxl X 9 6 continuous X discrete fo 9pxl X I 9 6 continuous X continuous fo 9fxl X 399 Example spam filter Email may be spam or legitimate Parameter 9 taking values 12 corresponding to spamlegitimate prob p 1 P 2 given Let 031 can be a collection of special words whose appearance suggests a spam For each i let Xi be the Bernoulli RV that denotes the appearance ofoai in the message Assume that the conditional prob are known Use the MAP rule to decide if spam or not 10282008 Point estimation A point estimate is a single numerical value representing our best guess of 9 An estimator is assumed to be a RV of the form C gag for some function g Different g s corresponds to different estimators An estimate is the value of the estimator determined by the value x of observations X The MAP rule sets the estimate to a value that maximizes the posterior distributions Once values x of X observed the conditional expectation LMS estimator sets thee to E Xx Couple of remarks on estimation If the posterior is symmetric around its conditional mean and unimodal the max occurs at the mean 9 MAP estimate is the same as conditional expectation lf is continuous the actual evaluation of MAP may be derivable analytically eg using derivatives 10282008 a 9 Example cont d Juliet late by a random amount X U00 0 unknown modeled as value of RV G U01 Assuming Juliet was late by x on the first date how does Romeo calculates the posterior From before feixe l X m if X S 9 S1 Since this is decreasing the MAP estimate is equal to x this can be optimistic The conditional expectation estimator 1 1K 1 E X 9 X J 9llogxl llogxl Hypothesis testing Binary hypothesis two cases Once the value x of X is observed Use the Bayes rule to calculate the posterior P X9 Ix 
**Hypothesis testing.** Binary hypotheses: two cases. Once the value $x$ of $X$ is observed, use the Bayes rule to calculate the posterior $P(\Theta = \theta_i \mid X = x)$ and select the hypothesis with the larger posterior. If $g_{\mathrm{MAP}}(x)$ is the selected hypothesis, the probability of a correct decision is $P(\Theta = g_{\mathrm{MAP}}(x) \mid X = x)$. If $S_i$ is the set of all $x$ for which the MAP rule selects $\theta_i$, the overall probability of a correct decision is $\sum_i P(\Theta = \theta_i,\, X \in S_i)$, and the probability of error is $\sum_i P(\Theta \ne \theta_i,\, X \in S_i)$.

**Multiple hypotheses.** The same construction applies with more than two hypotheses: select the $\theta_i$ with the largest posterior. (The remainder of this slide is illegible in the scan.)

**Example: biased coin, single toss.** Two biased coins, with head probabilities $p_1$ and $p_2$. Randomly select a coin and infer its identity based on a single toss. Hypothesis 1: $\Theta = 1$ (coin 1); hypothesis 2: $\Theta = 2$ (coin 2); the outcome is $x = 0$ (tails) or $x = 1$ (heads). MAP compares the posteriors $p_{\Theta|X}(1 \mid x)$ and $p_{\Theta|X}(2 \mid x)$. (The slide's numerical values of $p_1$, $p_2$, and the outcome are illegible in the scan.)

**Example: biased coin, multiple tosses.** Assume that we toss the selected coin $n$ times, and let $X$ be the number of heads obtained; the same MAP comparison applies, now with binomial likelihoods.

**Example: signal detection and matched filter.** A transmitter sends one of two messages: if message 1, the signal is $S = (a_1, a_2, \ldots, a_n)$; if message 2, $S = (b_1, b_2, \ldots, b_n)$. The receiver observes the signal with corrupting noise: $X_i = S_i + W_i$, $i = 1, \ldots, n$, with $W_i \sim N(0,1)$.

**Bayesian least mean squares (LMS) estimation.** Consider first the simple problem of estimating $\Theta$ with a constant $\hat{\theta}$, in the absence of observations:
$$E\big[(\Theta - \hat{\theta})^2\big] = \mathrm{var}(\Theta) + \big(E[\Theta] - \hat{\theta}\big)^2,$$
so the choice minimizing the mean squared error is $\hat{\theta} = E[\Theta]$, with minimum value $\mathrm{var}(\Theta)$. Once the value $x$ of $X$ is observed, the same argument applied to the posterior gives
$$E\big[(\Theta - E[\Theta \mid X = x])^2 \mid X = x\big] \le E\big[(\Theta - \hat{\theta})^2 \mid X = x\big] \quad \text{for all } \hat{\theta}.$$
Generally, the mean squared error associated with an estimator $g(X)$ is $E\big[(\Theta - g(X))^2\big]$, minimized when $g(X) = E[\Theta \mid X]$: out of all estimators of $\Theta$ based on $X$,
$$E\big[(\Theta - E[\Theta \mid X])^2\big] \le E\big[(\Theta - g(X))^2\big].$$

**LMS example 1.** Let $\Theta \sim U(4, 10)$, and suppose we observe $\Theta$ with noise: $X = \Theta + W$, where $W \sim U(-1, 1)$ is independent of $\Theta$. Find the LMS estimate of $\Theta$ given $X$.

**LMS example 2.** Consider the date example, where Juliet is late by $X \sim U(0, \Theta)$ with $\Theta \sim U(0,1)$. The MAP estimate is $x$; the LMS estimate is
$$E[\Theta \mid X = x] = \int_x^1 \theta\, \frac{1}{\theta\, |\log x|}\,d\theta = \frac{1 - x}{|\log x|}.$$
Find the conditional mean squared error for the MAP and the LMS estimates.

**Properties of the estimation error.** The estimation error $\tilde{\Theta} = \hat{\Theta} - \Theta$ is unbiased: it has zero conditional and unconditional mean, $E[\tilde{\Theta}] = 0$ and $E[\tilde{\Theta} \mid X = x] = 0$ for all $x$. The estimation error is uncorrelated with the estimate: $\mathrm{cov}(\tilde{\Theta}, \hat{\Theta}) = 0$. The variance of $\Theta$ can be decomposed as $\mathrm{var}(\Theta) = \mathrm{var}(\hat{\Theta}) + \mathrm{var}(\tilde{\Theta})$.

**Uninformative observations.** We say that the observation $X$ is uninformative if the mean squared error $E[\tilde{\Theta}^2] = \mathrm{var}(\tilde{\Theta})$ is the same as $\mathrm{var}(\Theta)$, the unconditional variance of $\Theta$. When is this the case?
error 3 is uncorrelated with the estimate 3 9 Cox336 0 The variance of G can be decomposed as var 2 var var Uninformative observation Let us say that the observation X is uninformative if the mean squared error E2 Vaf2 is the same as var the unconditional variance of 6 When is this the case 10282008 Case of multiple observations The same discussions apply for a vector multidimensional RV For all estimators gX1Xn E E IX1Xn21 s E gX1Xn21 This is often difficult to implement Thejoint PDF of X1Xn is hard to compute E X1Xn2 can be a very complicated function of X1Xn Case of multiple parameters multiple parameters G1 m need estimation The LMS criterion commonly used is E 1 12E m am2 This is equivalent to finding for each i the estimator i that minimizes E i 3392 so that we are dealing with m decoupled estimation problems one for each unknown Yielding noi Eoi lX1 Xn 10282008 ELEC 303 Random Signals Lecture 16 Bayesian Statistical Inference Linear LMS Di Farina Koushanfar ECE Dept Rice University Oct 22 2008 Lecture outline Reading 8384 Bayesian linear least mean square estimator Uneare hna onand Nonnalva abbs Examples 10282008 10282008 Linear LMS We restrict our LMS estimators to be linear A linear estimator of a RV based on the observations X1 X2 Xn has the form a1X1a2X2 21an b Given a particular choice of scalars a1anb the corresponding mean square error is E ale azXz 21an by Linear LMS based on a single observation Suppose that a is already chosen Then choose b to estimate the RV aX bE aXE aEX It remains to minimize wrt a the expression E aX E aEX2 Linear LMS for single observation var aX G 2126 lt 200v aX 6 a2c5 lt 2acov X Where 6 and 6X are the standard deviations of and X respectively Covariance of and X C0V XE E XEX Set the derivative of var aX to zero and solve for a Linear LMS for single observation Set the derivative of var aX to zero and solve for a a cov X pcazcx p 6703 p cov X 2 6X 6X 6X GXG With this choice the error of the linear LMS is var aX 6 2126 2acov 
(X, Θ). Substituting a = ρσΘ/σX gives
var(aX − Θ) = σΘ² + ρ²σΘ² − 2ρ²σΘ² = (1 − ρ²)σΘ²

Linear LMS estimation formula
The linear LMS estimator of Θ based on X is
Θ̂ = E[Θ] + ρ (σΘ/σX)(X − E[X]), where ρ = cov(Θ, X)/(σΘ σX)
The resulting mean squared estimation error is var(Θ − Θ̂) = (1 − ρ²)σΘ²

Intuitive interpretation
cov(Θ, X) = E[(Θ − E[Θ])(X − E[X])]
The formula only involves the means, variances, and covariance of Θ and X. Suppose ρ > 0. The estimator starts with the baseline E[Θ] for Θ. E.g., when X is larger than its mean, the positive correlation between Θ and X means that Θ is also expected to be larger than its mean, so the estimate is set to a value larger than E[Θ].

Example: back to Late on Date
Recall that Juliet was always late by an amount X ~ U(0, Θ), with Θ ~ U(0, 1). Derive the linear LMS estimator of Θ based on X.

Example: biased coin
The probability of heads is modeled as a RV Θ with prior distribution U(0, 1). The coin is tossed n times, resulting in a random number of heads denoted by X. Thus, if Θ is equal to θ, the RV X has a binomial distribution with parameters n and θ. Calculate the linear LMS estimate of Θ.

The case of multiple observations and multiple parameters
For multiple parameters Θi, we consider E[(Θ1 − Θ̂1)²] + ... + E[(Θm − Θ̂m)²] and minimize it over all estimates that are linear functions of the observations. For multiple observations with a certain independence property, the LMS estimator formula simplifies (next slide).

Independent observations
Let Θ be a RV with mean μ and variance σ0². Let X1, ..., Xn be observations of the form Xi = Θ + Wi, with the Wi normal RVs with mean 0 and variance σi², which represent observation errors. Assuming that Θ, W1, W2, ..., Wn are uncorrelated, the linear LMS of Θ based on X1, ..., Xn is
Θ̂ = (μ/σ0² + Σi Xi/σi²) / (Σi=0..n 1/σi²)

Uncorrelated variables
The previous derivation is based on forming the function
h(a1, a2, ..., an, b) = E[(Θ − a1X1 − ... − anXn − b)²]
and minimizing it by setting the partial derivatives with respect to a1, ..., an, b to zero. The resulting values are
aj = (1/σj²) / (Σi=0..n 1/σi²), b = (μ/σ0²) / (Σi=0..n 1/σi²)

Linear estimation and Normal
If the LMS estimator E[Θ | X1, ..., Xn] is linear in the observations X1, ..., Xn, then it is also the linear LMS. An important case is estimation of a Normal RV from Xi = Θ + Wi. We have already computed the
posterior distribution of Normal The conditional mean is a linear function of X s The linear LMS and regular LMS coincide Linear estimation and Normal f X1Xn are linear functions of a collection of RVs then the LMS and linear LMS coincide They also coincide with the MAP estimator The normal is symmetric and unimodular Two alternative views on linear LMS A computational shortcut for E X As a model simplification replace less tractable distributions by normal ones Choice of variables in linear estimation Consider an unknown RV 9 observations X1Xn and transformed observations YihXi The function h is onetoone The LMS estimator based on YlYn is the same as the one based on X1Xn E hX1hXnE X1Xn Linear estimation is based on assumption that the class of linear functions of observations X1Xn contains reasonably good estimators of This may not always be the case 10282008 Conclusion Bayesian vs statistical inference Bayesian treat parameter 9 as one with known prior distribution The key is the posterior of 9 given the values The MAP rule maximizes the posterior over 6 is for both estimation and hypothesis testing The LMS conditional expectation estimator The linear LMS higher errors simpler to calculate Under the normality assumption of parameters and observationsthe MAP LMS and linear LMS estimates coincide 10282008 ELEC 303 Random Signals Lecture 14 Bayesian Statistical Inference Dr Farirmz Kouslianfar ECE Dept Rice University Oct 20 2008 Lecture outline Reading 8182 Probability versus statistics Model versus variable inference Bayesian inference and the posterior distribution Point estimation Hypothesis testing 102 12008 Probability versus statistics Are they the same In probabilistic reasoning the assumption is a fully specified model that obeys the axioms The model is taken for granted Statistics has an element of art Several reasonable methods yielding different answers Narrow down the search for methods by adding assumptions certain properties constraints Bayesian vs 
classical statistics Fundamentally different Bayesian class of thinking vs classical frequentist thinking Bayesian has RV with known distributions Classics assumes deterministic values that are unknown Bayesian maps the statistics back to probability to address the questions 102 12008 Elements of Bayesian Introduce a rv G characterizing the model Postulate a prior probability distribution P 9 Use the Bayes rule to derive a posterior probability distribution P X9 Ix This captures all the info x provides about 9 Model vs variable inference Model inference construct or validate a model on the basis on available data Variable inference estimate the value of more unknown variables by using some related possibly noisy information Example noisy channel Transmitter sends a sequence of binary messages sie01 and the receiver observes XiasiWi i1n 102 12008 Noisy channel inference Transmitter sends a sequence of binary messages sie01 and the receiver observes XiasiWi i1n Wi N01that models channel noise a scalar representing channel attenuation In model inference a is unknown The transmitter sends a pilot 51sn so the values are known to the receiver Based on observations of Xi s value of a is estimated Noisy channel variable inference Transmitter sends a sequence of binary messages sie01 and the receiver observes XiasiWi i1n Wi N01 that models channel noise a scalar representing channel attenuation In variable inference a is known Based on observations of Xi s the values of 51sn are inferred 102 12008 Hypothesis testing Binary hypothesis testing two hypothesis and use the available data to decide which is true In mary hypothesis testing finite number m of competing hypothesis Choose one of the hypothesis aiming to achieve a small probability of error Principal Bayesian inference methods Maximum a posteriori probability MAP Least mean square LMS Linear least mean square Bayesian inference and posterior distribution Unknown quantity of interest 6 Observations or measurements or 
observation vector of XX1X2Xn We assume that we know A prior distribution p or f A conditional distribution pXI or fXI A complete answer is described by p lx9x Prior p9 Xx Observation Polx I Process Conditiona pX Posterior Calculation Point estimates Error a na lysis etc 102 12008 Four versions of Bayes rule 6 discrete X discrete pe xe x PoePXioXle Z Poe39PxioX l 939 e Poefxiox l e Z Poe39fxiox l 939 9 6 discrete X continuous Poix 9 l X 6 continuous X discrete f e x f ePXioXle e x If eyle X l egdey 6 continuous X continuous fo efxio X l e f e G W lfewwfxiaxiewde39 Posterior calculation example 1 Romeo and Juliet start dating Juliet late by a random amount Xquot39U09 9 unknown modeled as value of RV U01 Assuming Juliet was late by x on the first date how does Romeo calculates the posterior 1ifoses1 19ifoses1 foe fxioXle 0 otherw1s e 0 OthefWISC faxwix f efx x e e e 1 if x g 6 1 lf e39fxi xie39de39 j levdev eilogxi 102 12008 Inference of a common mean of normal RV We observe a collection XX1Xn of RVs An unknown common mean to be inferred Assume that given the common mean the Xi s are normal and independent with known variances of on2 The common mean is RV 9 with a given prior Assume a normal prior Nx0 602 Our model is equivalent to Xi Wi i1n V9 EWiEWi 90 varWivarXi 9 of Posterior vs prior In the previous example the posterior distribution of G is the same family as prior This is not very common also Bernoulli trials and binomial distribution have this property The posterior can be expressed by 2 numbers Possibility for recursive inference We do not need to start from scratch upon new observations 102 12008 Posterior calculation example We wish to estimate the probability of heads for a biased coin denoted by 9 Assume 9 is value of RV G with known prior f Consider n independent tosses X the number of heads observed Posterior calculation Spam filtering example Email may be spam or legitimate Parameter 6 taking values 12 corresponding to spamlegitimate prob p 1 P 2 
given Let coir on be a collection of special words whose appearance suggests a spam For each i let Xi be the Bernoulli RV that denotes the appearance of coi in the message Assume that the conditional prob are known 102 12008 Multiparameter problems The case of multiple unknown parameters is very similar Just replace and compute the Kdimensional objects MAP rule Given the value x of observations select a value for 9 that maximizes the posterior distribution p x9x and f x9x G discrete 6 continuous arg maxefe xw l x Maximum a Posteriori probability rule MAP lf takes only a finite number of values the MAP rule minimizes the probability of selecting an incorrect hypothesis 9 arg max 9 Poix 9 l X 102 12008 102 12008 Point estimation A point estimate is a single numerical value representing our best guess of 9 An estimator is assumed to be a RV of the form C gag for some function g Different g s corresponds to different estimators An estimate is the value of the estimator determined by the value x of observations X The MAP rule sets the estimate to a value that maximizes the posterior distributions Once values x of X observed the conditional expectation LMS estimator sets thee to E Xx ELEC 303 Random Signals lecture 22 Random processes 0 Farinaz Koushonfor IECE Dept Rice University Dec 3 2008 Lecture outline Basic concepts Gaussian processes White processes Filtered noise processes Noise equivalent bandwidth 1272008 Things to remember Stationary A random process is stationary if time shift does not affect its properties For all T and for all sets of sample times t0tn PXt0Sx0Xtnan PXTt0Sx0XTtnan Stationary random processes have constant mean defined as EXt mX For stationary RPs autocorrelation depends on the time difference between the samples RXt1t2EXt1Xt2 RxlTt139t2 Exact definition WSS A process is wide sense stationary if its expected power is finite EX2t ltoo its mean is constant and its autocorrelation depends only on the time difference between samples WSS processes 
are stationary in their 1st and 2nd moments. Stationary processes are WSS, but not vice versa.

Power spectral density (PSD)
Defined only for WSS processes: the Fourier transform of the autocorrelation function. The expected power is the integral of the PSD.

Gaussian processes
Widely used in communication, because thermal noise in electronics is produced by the random movement of electrons and is closely modeled by a Gaussian RP. In a Gaussian RP, if we look at different instances of time, the resulting RVs will be jointly Gaussian.
Definition 1: A random process X(t) is a Gaussian process if for all n and all t1, t2, ..., tn, the RVs X(ti), i = 1, ..., n, have a jointly Gaussian density function.

Gaussian processes (cont'd)
Definition 2: The random processes X(t) and Y(t) are jointly Gaussian if for all n, m and all t1, t2, ..., tn and τ1, τ2, ..., τm, the random vector (X(ti), i = 1, ..., n; Y(τj), j = 1, ..., m) has an (n+m)-dimensional jointly Gaussian density function. It is obvious that if X(t) and Y(t) are jointly Gaussian, then each of them is individually Gaussian; the reverse is not always true. The Gaussian processes have important and unique properties.

Important properties of Gaussian processes
Property 1: If the Gaussian process X(t) is passed through an LTI system, then the output process Y(t) will also be a Gaussian process, and Y(t) and X(t) will be jointly Gaussian processes.
Property 2: For jointly Gaussian processes, uncorrelatedness and independence are equivalent.

White processes
Definition 3: A random process X(t) is called a white process if it has a flat spectral density, i.e., Sx(f) is constant for all f. White processes are those where all frequency components appear with equal power. Thermal noise can be modeled as a white noise over a wide range of frequencies, and a wide range of information sources can be modeled as the output of LTI systems driven by a white process.

Power of a white process
[Slide figure lost in extraction. It shows a flat PSD Sn(f) = N0/2 over all frequencies, so the total power, the integral of the PSD, is infinite: ideal white noise is an idealization that is only meaningful after filtering.]

Properties of thermal noise
Thermal noise is a stationary process, a zero-mean process, a Gaussian process, and a white process with power spectral density Sn(f) = kT/2. Thermal noise increases with increasing ambient temperature, so cooling circuits lowers the noise.

Filtered noise process
In many cases the noise in one stage of the process gets filtered by a bandpass filter whose center frequency fc is away from zero. Bandpass signals can be expressed in terms of in-phase and quadrature components. E.g., a single-frequency signal is an extreme case:
x(t) = A cos(2πfc t + θ) = A cos(θ) cos(2πfc t) − A sin(θ) sin(2πfc t) = xc cos(2πfc t) − xs sin(2πfc t)
Phasor: A e^(jθ) = xc + j xs
More generally, x(t) = xc(t) cos(2πfc t) − xs(t) sin(2πfc t), with in-phase component xc(t) = A(t) cos(θ(t)) and quadrature component xs(t) = A(t) sin(θ(t)).

Bandpass filter
X(t) is the output of an ideal bandpass filter of bandwidth W centered at fc. Example: |H(f)| = 1 for |f − fc| ≤ W/2 (and |f + fc| ≤ W/2), 0 otherwise.

Filtered noise
Filtered thermal noise is Gaussian but not
white. For an ideal filter with |H(f)| = 1 in the passband, the output PSD is
Sx(f) = Sn(f)|H(f)|² = (N0/2)|H(f)|²
For the examples on the last slides, Sx(f) = N0/2 for |f − fc| ≤ W/2 (and |f + fc| ≤ W/2), and 0 otherwise.

Filtered noise components
All filtered bandpass noise signals have in-phase and quadrature components that are lowpass:
X(t) = Xc(t) cos(2πfc t) − Xs(t) sin(2πfc t)
The in-phase and quadrature components Xc(t) and Xs(t) are zero-mean, lowpass, jointly stationary, and jointly Gaussian random processes. If the power in the process X(t) is PX, then the power in each of the processes Xc(t) and Xs(t) is also PX.

Properties of Xc and Xs
Both have a common PSD, obtained by shifting the positive frequencies of Sx(f) to the left by fc and the negative frequencies to the right by fc:
Sxc(f) = Sxs(f) = Sx(f − fc) + Sx(f + fc) for |f| ≤ W/2, and 0 otherwise
For the ideal bandpass example this gives Sxc(f) = Sxs(f) = N0 for |f| ≤ W/2, and the power in each component is N0 W = PX.

Noise equivalent bandwidth
A white Gaussian noise passing through a filter would be Gaussian but not white. We have
Sy(f) = Sx(f)|H(f)|² = (N0/2)|H(f)|²
and we have to integrate Sy(f) to get the power. Define Bneq, the noise equivalent bandwidth, by
Bneq = ∫0→∞ |H(f)|² df / H²max
so that the output noise power is Py = N0 Bneq H²max. Thus, given Bneq, finding the output noise power becomes a simple task. The Bneq of filters and amplifiers is usually given by the manufacturers.

Example
Find the noise equivalent bandwidth of the lowpass filter H(f) = 1/(1 + j2πfRC). Here H²max = 1 and
∫0→∞ |H(f)|² df = ∫0→∞ df/(1 + (2πfRC)²) = 1/(4RC)
so Bneq = 1/(4RC).

Summary: Gaussian processes
X(t) is a Gaussian process if ∫0→T g(t)X(t) dt is Gaussian for any T and function g. Linear filtering of a Gaussian process results in a Gaussian process. Samples of a Gaussian process are jointly Gaussian random variables. Uncorrelated samples of a Gaussian process are independent.

Summary: white noise
White noise is defined as a WSS random process with a flat PSD, Sn(f) = N0/2. The autocorrelation of white noise is (N0/2)δ(τ). White
noise isine mosirandomiorm oinoise since in deeorreaies randomly Sui Hnn 5N n 91 l W0 n 2 mmmm Siii39Hii39 so sun 25snnnsnnn httpwwwstaniordedueiasseeii9muitiieetureiomuitipdi Summary iiltered noise iiitered thermainoise is Gaussian but nonwhite ine bandpassiiiters canbe expressed in terms oi the inpnaseand quadrature components i xiixciios2nici x5iSin2nci nphasecomponeni xciiAiCosii Quadrature component nit Ai Sin Gin Deiine Bneq the noise equivaient bandwidth p w 2 n m a n iHiiiidi B onion is Z n q 2 s m 2 max quot39 quot392 XZBneq Hm nan 2 Bnn quot i M mmm th ELEC 303 Random Signals lecture 2 Conditional probability Fo nazKoushonfor ECE Dept Rice University Aug282008 L ELEC 303 Koushaniar Fall OS ERIN Lecture outline Reading Sections 13 14 Review Conditional probability Multiplication rule Tota probability theorem Bayes rule ELEC 03 Konshamai Fall39nS 82 52008 8252008 Probability theory review Mathematically characterizes random events Defined a sample space of possible outcomes Probability axioms 1 Nonnegativity OSPA51 for every event A 2 Additivity If A and B are two disjoint events then the probability PAUBPAPB 3 Normalization The probability of the entire sample space 2 is equal to 1 Le PQ1 nltr E Discretecontinuous models review Discrete finite number of possible outcomes 57 tie2 Viquot i I Enumerate the possible scenarios and count Continuous the sample space is continuous The probability of a point event is zero Probabilityarea in the sample space ELEC 303 Koushanfar Fall39OS lial Conditional probability example 1 Student stage FR 50 JU SI Student standing probation P acceptance A honor H Probation P Acceptable A 1300 900 730 654 Honors H 100 100 120 96 Fresh FR Soph 80 Junior JU Senior SE 0 Fresh FR Soph 80 Junior JU Senior SE Probation P Acceptable A 026 018 0146 0138 Honors H 002 002 0024 00192 ELEC 303 Koushanlar Fall DS E lei Example cont d What is the probability of a JU student What is the probability of honors standing if the student is a FR What 
is the probability of probation for a SO student? P(A|B) is the probability of A given B; B is the new universe.

Fresh (FR), Soph (SO), Junior (JU), Senior (SE)
Probation (P): 0.26, 0.18, 0.146, 0.138
Honors (H): 0.02, 0.02, 0.024, 0.0192

Conditional probability
Definition: assuming P(B) ≠ 0,
P(A|B) = P(A ∩ B) / P(B)
Consequences: if P(A) ≠ 0 and P(B) ≠ 0, then
P(A ∩ B) = P(B) P(A|B) = P(A) P(B|A)

Conditional probability example 2
1. What is the probability of both dice showing odd numbers, given that their sum is s?
2. Let B be the event {min(X, Y) = 3}, and let M = max(X, Y). What are the conditional probabilities P(M = m | B), over all possible values m?

Conditional probability example 3
Radar reading: Low (0), Medium, High (1)
Airplane Absent: 0.45, 0.20, 0.05
Airplane Present: 0.02, 0.08, 0.20
Radar detection vs. airplane presence: What is the probability of having an airplane? What is the probability of an airplane being there if the radar reads low? When should we decide there is an airplane, and when should we decide there is none?
(Slide courtesy of Prof. Dahleh, MIT)

Sequential description
P(A) = 0.3, P(Ac) = 0.7
A = aircraft present, Ac = aircraft absent; L = low, M = medium, H = high

Multiplication rule
Assuming that all of the conditioning events have positive probability,
P(A1 ∩ A2 ∩ ... ∩ An) = P(A1) P(A2|A1) P(A3|A1 ∩ A2) ... P(An|A1 ∩ ... ∩ An−1)

Conditional probability example 4
Three cards are selected from a deck of 52 cards without replacement. Find the probability that none of the drawn cards is a picture card (i.e., J, Q, K). [By the multiplication rule, with 12 picture cards in the deck: P = (40/52)(39/51)(38/50).]

The Monty Hall problem
You are told that a prize is equally likely behind any of the 3 closed doors. You randomly point to one of the doors. A friend opens one of the remaining 2 doors, after making sure that the prize is not behind it. Consider the following strategies: (a) stick to your initial choice; (b) switch to the other unopened door.

B = (A1 ∩ B) ∪ (A2 ∩ B) ∪ (A3 ∩ B)

Divide and conquer
Partition the sample space into A1, A2, A3. For any event B:
P(B) = P(A1 ∩ B) + P(A2 ∩ B) + P(A3 ∩ B) = P(A1)P(B|A1) + P(A2)P(B|A2) + P(A3)P(B|A3)

Total probability theorem
(Figure courtesy of Bertsekas & Tsitsiklis, Introduction to Probability, 2008)

Radar example 3 (cont'd)
Radar reading: Low (0), Medium, High (1)
Absent: 0.45, 0.20, 0.05; Present: 0.02, 0.08, 0.20
P(Present) = 0.3
P(Medium | Present) = 0.08/0.3
P(Present | Low) = 0.02/0.47
(Example courtesy of Prof. Dahleh, MIT)

Radar example 3 (cont'd)
Given the radar reading, what is the best decision about the plane? Criterion for decision: minimize the "probability of error". Decide absent or present for each reading.

Radar example 3 (cont'd)
Error = (Present and decision is absent) or (Absent and decision is present). These are disjoint events, so with the error-minimizing decisions (absent for low and medium, present for high):
P(error) = 0.02 + 0.08 + 0.05

Extended radar example
A threat alert affects the outcome:
P(airplane, reading | Threat): Absent: 0.1125, 0.05, 0.0125; Present: 0.055, 0.22, 0.55
P(airplane, reading | No threat): Absent: 0.45, 0.20, 0.05; Present: 0.02, 0.08, 0.20
(Example courtesy of Prof. Dahleh, MIT)

P(Threat) = prior probability of threat = p
A = airplane, R = radar reading:
P(A, R) = P(Threat) P(A, R | Threat) + P(No threat) P(A, R | No threat)
If we let p = P(Threat), then we get:

Extended radar example
Absent: 0.45 − 0.3375p, 0.20 − 0.15p, 0.05 − 0.0375p
Present: 0.02 + 0.035p, 0.08 + 0.14p, 0.20 + 0.35p
(Example courtesy of Prof. Dahleh, MIT)

Extended radar example
Given the
radar registered high and a plane was absent what is the probability that there was a threat How does the decision region behave as a function of p ELEC 303 Koushanfar Fall 08 Example courtesy ofProf Dahleh WT k v 39 Extended radar example tee Radar mrp anee LowU Medmm High ti P CAtR Absent Og39g p 020015p O dgg p Present 30223014 008lD14p U2035p a 0125 PTlHIghampAbsent p 05 0375p Check p5 EPICI ELEC 303 Koushanfar Fall 08 Example courtesy ofProf Duheh MIT 8252008 10 82 52008 Bayes rule The total probability theorem is used in conjunction with the Bayes rule REL if ELEC 303 Koushanfar Faii OS 1022008 ELEC 303 Random Signals Lecture 11 Derived distributions covariance correlation and convolution Di Farina Koushanfar ECE Dept Rice University 0c 1 2008 ELECsnA Koushanrav Fall 08 Lecture outline Reading 4142 Derived distributions Sum of independent random variables Covariance and correlations ELECaUa Konshamai all 08 Derived distributions 39 Consider the function YgX of a continuous RV X 39 Given PDF of X we want to compute the PDF on 39 The method Calculate CDF FYy by the formula FYy PY y W dx rxrglxl y Differentiate to find PDF on d dYquot E RlCE Ema Example 1 39 LetX be uniform on 01 39 YsqrtX 39 FYvl PlYSvl PXSv2l v2 39 fylvl dFvldv dv2dv 2v 0 Sv l RlCE ELEC anal Konshanfar Fall 08 1022008 Example 2 John is driving a distance of 180 miles with a constant speed whose value is U3O6O mileshr Find the PDF of the trip duration Plot the PDF and CDFs ELEC 303 Koushanfar Fall 08 Example 3 YgXX2 where X is a RV with known PDF Find the CDF and PDF of Y ELEC 303 Koushanfar Fall 08 1022008 The linear case fYaXb for a and b scalars and a 0 NU ifx yib lal a Example 1 Linear transform of an exponential RV X YaXb fxlx Ae39 for x20 and otherwise fXX0 Example 2 Linear transform of normal RV i73mm m The strictly monotonic case X is a continuous RV and its range in contained in an interval Assume that g is a strictly monotonic function in the interval Thus g can be inverted YgX iff 
X = h(Y). Assume that h is differentiable. The PDF of Y, in the region where fY(y) > 0, is
fY(y) = fX(h(y)) |dh/dy (y)|

More on the strictly monotonic case
Consider a RV X and let Y = g(X), with g strictly monotonic and slope g′(x) at x (slide adapted from the MIT OpenCourseWare 6.041 slides). Let y = g(x); for small δ, the event {x ≤ X ≤ x + δ} is the same as {y ≤ Y ≤ y + δ g′(x)} (for increasing g), so
fX(x) δ ≈ fY(y) δ g′(x), i.e., fY(y) g′(x) = fX(x)

Example 4
Two archers shoot at a target. The distance of each shot from the center is ~ U(0,1), independent of the other shots. What is the PDF of the distance of the losing shot from the center? [The losing shot is the max Z of the two distances: FZ(z) = z², so fZ(z) = 2z for 0 ≤ z ≤ 1.]

Example 5
Let X and Y be independent RVs that are uniformly distributed on the interval (0,1). Find the PDF of the RV Z.

Sum of independent RVs: convolution
Let X, Y be two RVs and let W = X + Y. The points where W = w, for some constant w, lie on the line x + y = w. Idea: in the discrete case, add the probabilities of all points on this line; in the continuous case, integrate the joint density along this line.

X, Y independent, integer-valued
Let X, Y be integer-valued and independent. Then W = X + Y is also integer-valued, and
pW(w) = Σx P(X = x, Y = w − x) = Σx pX(x) pY(w − x)

X, Y independent, continuous
Let X, Y be independent continuous RVs, with fX,Y(x, y) = fX(x) fY(y). Then the density of W = X + Y is given by the convolution
fW(w) = ∫ fX(x) fY(w − x) dx

Example: X, Y independent uniform
Let X, Y be independent uniform on (0,1), and find the density of W = X + Y. The convolution idea applies: the integrand fX(x) fY(w − x) equals 1 exactly when 0 ≤ x ≤ 1 and 0 ≤ w − x ≤ 1, so
fW(w) = w for 0 ≤ w ≤ 1, and fW(w) = 2 − w for 1 ≤ w ≤ 2
a triangular density.
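The triangular density for the sum of two independent U(0,1) RVs can be checked by discretizing the convolution integral. A minimal sketch (the grid resolution and the function names are arbitrary illustration choices, not part of the lecture):

```python
# Sketch: discretize f_W(w) = integral of f_X(x) f_Y(w - x) dx
# for X, Y ~ U(0,1).
def f_uniform01(t):
    return 1.0 if 0.0 <= t <= 1.0 else 0.0

def f_sum(w, steps=10000):
    dx = 1.0 / steps
    # Riemann sum over the support of f_X
    return sum(f_uniform01(i * dx) * f_uniform01(w - i * dx)
               for i in range(steps)) * dx

# Triangular density: rises to 1 at w = 1, falls back toward 0 at w = 2
print(f_sum(0.5), f_sum(1.0), f_sum(1.5))
```

The printed values approximate fW(0.5) = 0.5, fW(1) = 1, fW(1.5) = 0.5 from the closed-form answer above.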
Fall 08 Sum of two independent normal RVs 0 Let XY be independent zeromean normals X N Mo 03 Y N Noa 0 Find the density of W X l Y fitw j moomm agtda 1 30 eivaQagei iia zZagdm 21r017y oo era W 2 Conclusion W is normal uw 0 03 032 a ELEC303 Koushanfar Fall 08 1022008 Covariance Covariance of two RVs is defined as follows Cule Y ELK 7 EXio39 7 El39 CovX Y EXY 7 XlEY 7 HEX 7 EXES 7 EXY 7 EXEY 7 EYEX 7 EXEY ElXY 7 EXEY An alternate formula CovXY EXY EXEY Properties CovXX VarX CovXaYb a CovXY CovXYZ CovXY Cov YZ 5amp4 RICE mm mm W Covariance and correlation IfX and Yare independent 9 EXYEXEY So the covXYO The converse is not generally true The correlation coefficient of two RVs is defined as MACH 7 with UN H 7 T r39 The range of values is between 11 RICE mm lam 1022008 1022008 Variance of the sum of RVs 0 Two RVs VmX 7n 7 EX Y7EX 7H EX 7EX ltY7Emz 7 EX 7 E1ij 7 2EX 7 EX my 7 12y 7 my 7 13139 7 VarX 7 VmY 7 2CovX Y 0 Multiple RVs n Var Z Z Z covltX X1 13971 71 J l RICE chzns Kaushamav Kauai ll ELEC 303 Random Signals Lecuue9 Con nuousRandontVa abms Joint PDFS Conditioning Continuous Bayes FormalGusherer ECE Dept Rice University Sept222008 ELECsnA Koushanrai Fall 08 Lecture outline Reading Reading 3435 Continuous RV PDF CDF review Joint PDF and multiple RVs Conditioning Independence ELECaUa KoLislianiaij all 08 9242008 9242008 PDF review A RV is continuous if there is a nonnegative PDF st for every subset B of real numbers Pwemzbgmm The probability that RV X falls in an interval is Kim a Is Exam aux1 gtb Pusxsw f wm Figure munssy of Bertsekas l TSItSIklis ELEC303 lltoushanlan Fall 08 Introduction to Probability 2008 PDF Cont d Continuous prob area under the PDF graph For any single point PX a IfXxdx 0 Pa XSbPaltXltbPa sXltbPaltXsb The PDF function fx nonnegative for every x Area under the PDF curve should sum up to 1 ljfXxdxPOOltXltoo1 PSXSx6fX5 RICE mm Far 8 Mean and variance review 39 Expectation EX and nth moment EXn are defined similar to discrete 39 A realvalued 
function YgX of a continuous RV is 3 RV Y can be both continous or discrete o fXxda EgX gxgt mm o varX 0 mm x Exi2 fxxd Properties of CDF review Defined by FXX PX x for all x FXX is monotonically nondecreasing f xlty then FXX S FXy FXX tends to 0 as x oo and tends to l as x900 For discrete X FXX is piecewise constant For continuous X FXX is a continuous function PMkF and PDF obtained by summingdifferentiate FXltkgtZpXltigt PXltkgtPltXsk PltXsk 1gtFXk FXltk 1gt x dF FXx fXtdt fXt d x ELEC 303 Koushanfar Fall 08 9242008 Standard normal RV review A continuous RV is standard normal or Gaussian N01 if 1 2 fxc W 2 v2w EX 0 varX 1 NF MK Normal CDF m 1 039 1 2 x QRICE ELEC303 Korislianfal FallOS Notes about normal RV review Normality preserved under linear transform It is symmetric around the mean No closed form is available for CDF Standard tables available for N01 Eg p155 The usual practice is to transform to N01 Standardize X subtract u and divide by G to get a standard normal variable y Read the CDF from the standard normal table PltXSxPsJaggkqx ELEC303 lltousbanfan Fall 08 9242008 Joint PDFs of multiple RV Joint PDF fXY where this is a nonnegative function PM fXY m ydacdy f xydxdy 1 Interpretation Po 3 X Saul6in Y s y6gt WWW62 ELEC303 Kolislianfal FallOS Marginal PDFs Consider the event XEA Px 6 A Px 6 AY e oooo LI nyY xydxdy Compare with the formula Px 6 A L fXxdx Thus the marginal PDFfX is given by 2 x 1 mm my ELEC303 lltoushanfan Fall 08 9242008 Twodimensional Uniform PDF f xy cif0SxS1and0SyS1 XY Otherwise Compute c lllflfxmxymxdy JJCdxcbz 1 ELEC303 Koushanr al FallOS Buffon s needle 2 0 Parallel lines at dlstance d a A needle of length assume I d Removlng it a hw problem a Flnd Pneedle lntersects one of the lines me Maxhwuvldl ELEC303 lltoushanfan Fall 08 9242008 2 Buffon s needle 2 0 Parallel lines at distance d o A needle of length assume l d Removing it a hw problem c Find Pneedle intersects one of the lines d ltme w Dpsnzuurse e 041 slid s a 
Midepointetdnearesteline distance X 9 a 0 d2 o Needlecrosselines acute angle 9 Q A Ow2 o needle intersects one of the lines X g sin e RlCE ELEC ans Kuusmrrar Fail me Buffon s needle 3 0 Parallel lines at distance d a A needle of length I assume 7 I Removing it39 a hw problem 6 Find Pneedle intersects one of the lines me MiT ou nzaursa a D41 shoes 1 o Midpointto nearestline distance X 2 77 012 o Needle cross lines acute angle e 07 072 a needle intersects one of the lines X sin 0 Assume X and e are independent relationship between rvs g RlCF tttcana KnushanfanFallDB 9242008 Buffon s needle 4 e X d ime MIT cpsncoms 5 on slides 9 Joint PDF ofX and e 2 2 d n t UT Ifx e 0 5 and 0 r 0 5 fX eX 9 fXXfe6 0T otherwise ELEC303 Koushanr ai Faii 08 Joint CDF expectation FXYxv m s Xy s Y i foo fXYxydxdy 62F XY x y 0x6y fXYxy Expectation E9X m1 4 rgltm garwoe manly Expectation is additive and linear ELEC303 Koushanfar Faii 08 PDF can be found from CDF by differentiating 9242008 More than two RVs Thejoint PDFfor more RVs is similar PXYZ e B MmefXYzxyZdxdy Marginal fXltxgt J J qu xyzgtdzdy Expectation of sum Ea1X1 a2X2 aan 2 a1EX1 a2EX2 anEXn ELECsoA Koushanrav Faii 08 Conditioning A RV on an event PltX e BIA L inAltxgtdx If we condition on an event of form XeA with PXeAgt0 then we have xdx PXeBXeAL L PX e A By comparing we get fXOC 1f X e A fXX X 6 ADOC PX e A 0 Otherwise ELEC303 KOLiSiiam aij EU 08 9242008 Example the exponential RV The time t until a light bulb dies is an exponential RV with parameter k lfone turns the light leaves the room and return t seconds laterATgtt X is the additional time until bulb is burned What is the conditional CDF of X given A PXgtxAPTgttxTgtt PT gttmegtt PTgttx WW w e PT gt1 PT gt z e Memoryless property of exponential CDF ELECSLB Koushanrav Fall 08 Example total probability theorem Train arrives every 15 mins startng 6am You walk to the station between 710730am Your arrival is uniform random variable Find the PDF of the time you have to wait for 
Train example (continued). Condition on the arrival interval: if you arrive between 7:10 and 7:15 (event A, probability 1/4), the next train is at 7:15 and the wait Y given A is uniform on [0, 5]; if you arrive between 7:15 and 7:30 (event B, probability 3/4), the next train is at 7:30 and Y given B is uniform on [0, 15]. By the total probability theorem,
  f_Y(y) = \frac{1}{4} \cdot \frac{1}{5} + \frac{3}{4} \cdot \frac{1}{15} = \frac{1}{10} for 0 \le y \le 5,  and  f_Y(y) = \frac{3}{4} \cdot \frac{1}{15} = \frac{1}{20} for 5 < y \le 15.

Conditioning a RV on another
The conditional PDF: f_{X|Y}(x|y) = \frac{f_{X,Y}(x,y)}{f_Y(y)}
The marginal can be used to compute it: f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y) dx
Note that \int_{-\infty}^{\infty} f_{X|Y}(x|y) dx = 1.

Summary of concepts: the discrete objects p_X(x), p_{X,Y}(x,y), p_{X|Y}(x|y) have continuous counterparts f_X(x), f_{X,Y}(x,y), f_{X|Y}(x|y), with sums replaced by integrals in the CDF, expectation, and variance.

Conditional expectation
Definitions: E[X|A] = \int x f_{X|A}(x) dx;   E[X|Y=y] = \int x f_{X|Y}(x|y) dx
The expected value rule: E[g(X)|A] = \int g(x) f_{X|A}(x) dx;   E[g(X)|Y=y] = \int g(x) f_{X|Y}(x|y) dx
Total expectation theorem: E[X] = \sum_i P(A_i) E[X|A_i];   E[X] = \int E[X|Y=y] f_Y(y) dy

Mean and variance of a piecewise constant PDF
  f_X(x) = 1/3 if 0 \le x \le 1;  2/3 if 1 < x \le 2;  0 otherwise
Consider the events A_1 = {X is in the first interval [0,1]} and A_2 = {X is in the second interval (1,2]}. Find P(A_1) and P(A_2), and use the total expectation theorem to find E[X] and var(X).

Example: stick breaking
Break a stick of length \ell twice: X = first break point, chosen uniformly between 0 and \ell; Y = second break point, chosen (given X = x) uniformly from 0 to x.
Joint PDF: f_{X,Y}(x,y) = f_X(x) f_{Y|X}(y|x) = \frac{1}{\ell} \cdot \frac{1}{x} for 0 \le y < x \le \ell
Conditional expectation of Y given X = x: E[Y | X = x] = \int y f_{Y|X}(y|x) dy = \frac{x}{2}
Expectation of Y: E[Y] = \int E[Y|X=x] f_X(x) dx = \int_0^{\ell} \frac{x}{2} \cdot \frac{1}{\ell} dx = \frac{\ell}{4}

Independence
Two RVs X and Y are independent if f_{X,Y}(x,y) = f_X(x) f_Y(y). This is the same as f_{X|Y}(x|y) = f_X(x) for all y with f_Y(y) > 0. It is easily generalized to multiple RVs: f_{X,Y,Z}(x,y,z) = f_X(x) f_Y(y) f_Z(z).

ELEC 303 Random Signals, Lecture 13: Transforms
Dr. Farinaz Koushanfar, ECE Dept., Rice University, Oct 8, 2008

Lecture outline (Reading: 4.4-4.5): definition and usage of transforms; moment generating property; inversion property; examples; sum of independent random variables.

Transforms
The transform of a RV X (a.k.a. its moment generating function) is a function M_X(s) with parameter s, defined by M_X(s) = E[e^{sX}].
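The stick-breaking conclusion E[Y] = \ell/4 above can be spot-checked by simulation. A Python sketch (taking \ell = 6; the value and seed are arbitrary choices):

```python
import random

def stick_breaking_mean(ell, trials, seed=1):
    """Simulate the stick-breaking example: X ~ U(0, ell),
    then Y | X = x ~ U(0, x); estimate E[Y] (analysis gives ell/4)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x = rng.uniform(0.0, ell)   # first break point
        y = rng.uniform(0.0, x)     # second break, uniform on (0, x)
        total += y
    return total / trials

ell = 6.0
est = stick_breaking_mean(ell, 200_000)
print(est, ell / 4)   # estimate should land near 1.5
```

This is the total expectation theorem in executable form: averaging y over the simulation is averaging E[Y | X = x] = x/2 against f_X.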
For a discrete RV (PMF): M_X(s) = E[e^{sX}] = \sum_x e^{sx} p_X(x)
For a continuous RV (PDF): M_X(s) = E[e^{sX}] = \int_{-\infty}^{\infty} e^{sx} f_X(x) dx

Why transforms? They are a new representation of the distribution, with usages in calculations (e.g., moment generation), theorem proving, and analytical derivations.

Examples: find the transforms associated with common RVs, and with a linear function of a random variable: if Y = aX + b, then M_Y(s) = E[e^{s(aX+b)}] = e^{sb} M_X(sa).

Moment generating property
To get moments we would need to integrate; we can instead differentiate the transform:
  \frac{d}{ds} M_X(s) \Big|_{s=0} = E[X],  and in general  \frac{d^n}{ds^n} M_X(s) \Big|_{s=0} = E[X^n]

Example: use the moment generating property to find the mean and variance of the exponential RV, f_X(x) = \lambda e^{-\lambda x} over x \ge 0, \lambda > 0. (Here M_X(s) = \frac{\lambda}{\lambda - s} for s < \lambda, which gives E[X] = 1/\lambda and var(X) = 1/\lambda^2.)

Inverse of transforms
The transform M_X(s) is invertible, which is important: the transform associated with a RV X uniquely determines the CDF of X, assuming that M_X(s) is finite for all s in some interval [-a, a], a > 0. Explicit formulas that recover the PDF/PMF from the associated transform are difficult to use; in practice, transforms are often inverted by pattern matching.

Inverse transform example 1
The transform associated with a RV X is
  M_X(s) = \frac{1}{4} e^{-s} + \frac{1}{2} + \frac{1}{8} e^{4s} + \frac{1}{8} e^{5s}
We can compare this with the general formula M_X(s) = \sum_x e^{sx} p_X(x): the values of X are -1, 0, 4, 5, and the probability of each value is its coefficient in the transform (the PMF):
  P(X = -1) = 1/4,  P(X = 0) = 1/2,  P(X = 4) = 1/8,  P(X = 5) = 1/8

Inverse transform example 2
We know X takes nonnegative integer values, and M_X(s) = \frac{p e^s}{1 - (1-p)e^s}. Expanding in powers of e^s (valid when (1-p)e^s < 1) gives the coefficient p(1-p)^{k-1} for e^{sk}, k = 1, 2, ... . We recognize this as the geometric PMF.

Mixture of two distributions
Example: f_X(x) = \frac{2}{3} \cdot 6e^{-6x} + \frac{1}{3} \cdot 4e^{-4x}, x \ge 0.
More generally, let X_1, ..., X_n be continuous RVs with PDFs f_{X_1}, ..., f_{X_n}, and generate values of a RV Y as follows: an index i is chosen with corresponding probability p_i, and the value y is taken to be equal to X_i. Then
  f_Y(y) = p_1 f_{X_1}(y) + p_2 f_{X_2}(y) + ... + p_n f_{X_n}(y)
  M_Y(s) = p_1 M_{X_1}(s) + p_2 M_{X_2}(s) + ... + p_n M_{X_n}(s)
The steps in the problem can also be reversed (recognizing a mixture from its transform).

Sum of independent RVs
X and Y independent RVs, Z = X + Y:
  M_Z(s) = E[e^{sZ}] = E[e^{s(X+Y)}] = E[e^{sX} e^{sY}] = E[e^{sX}] E[e^{sY}] = M_X(s) M_Y(s)
Similarly, for Z = X_1 + X_2 + ... + X_n: M_Z(s) = M_{X_1}(s) M_{X_2}(s) \cdots M_{X_n}(s)

Sum of independent RVs, example 1
X_1, ..., X_n independent Bernoulli RVs with parameter p. Find the transform of Z = X_1 + X_2 + ... + X_n.
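The moment generating property can be exercised numerically on the exponential example above: differentiating the closed form M_X(s) = \lambda/(\lambda - s) at s = 0 by finite differences should reproduce E[X] = 1/\lambda and E[X^2] = 2/\lambda^2. A Python sketch (helper names are illustrative):

```python
def mgf_exponential(lam):
    """Closed-form transform of an exponential RV:
    M_X(s) = lam / (lam - s), valid for s < lam."""
    return lambda s: lam / (lam - s)

def moment_from_mgf(M, n, h=1e-4):
    """n-th derivative of M at s = 0 via central finite differences;
    by the moment generating property this equals E[X^n]."""
    if n == 1:
        return (M(h) - M(-h)) / (2 * h)
    if n == 2:
        return (M(h) - 2 * M(0.0) + M(-h)) / (h * h)
    raise ValueError("only n = 1, 2 implemented in this sketch")

lam = 2.0
M = mgf_exponential(lam)
m1 = moment_from_mgf(M, 1)     # E[X]   = 1/lam   = 0.5
m2 = moment_from_mgf(M, 2)     # E[X^2] = 2/lam^2 = 0.5
var = m2 - m1 * m1             # var(X) = 1/lam^2 = 0.25
print(m1, m2, var)
```

Differentiating the transform is usually far easier than integrating x^n against the PDF, which is the whole point of the property.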
Z = X_1 + ... + X_n has M_Z(s) = (1 - p + p e^s)^n, since each M_{X_i}(s) = 1 - p + p e^s; this is the transform of a binomial RV.

Sum of independent RVs, example 2
X and Y independent Poisson RVs with means \lambda and \mu respectively. Find the transform for Z = X + Y and the distribution of Z. (M_X(s) = e^{\lambda(e^s - 1)}, so M_Z(s) = e^{(\lambda + \mu)(e^s - 1)}: Z is Poisson with mean \lambda + \mu.)

Sum of independent RVs, example 3
X and Y independent normal RVs, X ~ N(\mu_x, \sigma_x^2) and Y ~ N(\mu_y, \sigma_y^2). Find the transform for Z = X + Y and the distribution of Z. (Z ~ N(\mu_x + \mu_y, \sigma_x^2 + \sigma_y^2).)

Review of transforms so far
- Definitions: M_X(s) = E[e^{sX}] = \sum_x e^{sx} p_X(x)  or  \int_{-\infty}^{\infty} e^{sx} f_X(x) dx
- Moment generating property: \frac{d^n}{ds^n} M_X(s) \Big|_{s=0} = E[X^n]
- Transform of a sum of independent RVs: X, Y independent, W = X + Y: M_W(s) = M_X(s) M_Y(s)

Bookstore example (1)
George visits a number of bookstores looking for the Great Book. The time George spends in each bookstore is exponentially distributed with parameter \lambda; a bookstore carries such a book with probability p, and George visits bookstores until he finds the book. We want to find the PDF, mean, and variance of the total time he spends in bookstores: Y = X_1 + X_2 + ... + X_N.

Sum of a random number of independent RVs
N: nonnegative integer-valued RV; X_1, X_2, ... i.i.d. and independent of N. Let Y = X_1 + ... + X_N. Then:
- Mean: E[Y] = E[E[Y|N]] = E[N E[X]] = E[N] E[X]
- Variance: var(Y) = E[var(Y|N)] + var(E[Y|N]) = E[N] var(X) + (E[X])^2 var(N)

Bookstore example (2)
- Number of bookstores N: geometric PMF p_N(n) = (1-p)^{n-1} p, n = 1, 2, ...; mean 1/p, variance (1-p)/p^2
- Time in each bookstore X: i.i.d., independent of N; PDF f_X(x) = \lambda e^{-\lambda x}; mean 1/\lambda, variance 1/\lambda^2
- Total time Y: mean E[Y] = E[N] E[X] = \frac{1}{p\lambda}; variance var(Y) = E[N] var(X) + (E[X])^2 var(N) = \frac{1}{p\lambda^2} + \frac{1-p}{p^2\lambda^2} = \frac{1}{(p\lambda)^2}

Transform of a random sum
- N: nonnegative integer-valued RV; X_1, ..., X_N i.i.d. RVs, independent of N
- If Y = X_1 + ... + X_N, we have
  M_Y(s) = E[e^{sY}] = E[ E[e^{sY} | N] ] = E[ (M_X(s))^N ]
- Compare with M_N(s) = E[e^{sN}] = E[(e^s)^N]
- Thus, to get M_Y(s), start with M_N(s) and replace each occurrence of e^s by M_X(s)

Bookstore example (3)
- Number of bookstores: transform M_N(s) = \frac{p e^s}{1 - (1-p) e^s}
- Time in each bookstore: transform M_X(s) = \frac{\lambda}{\lambda - s}
- Total time: replacing e^s by M_X(s) gives M_Y(s) = \frac{p\lambda}{p\lambda - s}, the transform of an exponential RV with parameter p\lambda

More examples
A village with several gas stations, each open daily with an independent probability 0.5. The amount of gas in each open station is uniformly distributed. Characterize the probability law of the total amount of available gas in the village.
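The random-sum formulas above can be checked against the bookstore example by simulation: with N ~ geometric(p) stores and exponential(\lambda) visit times, E[Y] = E[N] E[X] = 1/(p\lambda). A Python sketch (parameter values and seed are arbitrary):

```python
import random

def total_search_time(p, lam, trials, seed=2):
    """Bookstore sketch: visit stores, spending exponential(lam) time in
    each, until one carries the book (probability p per store), so the
    number of stores N is geometric(p). Estimate E[Y] for the total time
    Y = X_1 + ... + X_N; the formula gives E[N]E[X] = 1/(p*lam)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        y = 0.0
        while True:
            y += rng.expovariate(lam)   # time spent in this store
            if rng.random() < p:        # this store carries the book
                break
        total += y
    return total / trials

p, lam = 0.25, 2.0
est = total_search_time(p, lam, 100_000)
print(est, 1 / (p * lam))   # estimate should be near 2.0
```

The transform argument goes further: M_Y(s) = p\lambda/(p\lambda - s) says Y is itself exponential with parameter p\lambda, not merely a RV with that mean.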
More examples
Let N be geometric with parameter p, and let the X_i's be geometric with common parameter q, independent of N and of each other. Find the distribution of X_1 + X_2 + ... + X_N.

ELEC 303 Random Signals, Lecture 6: More Discrete Random Variables
Farinaz Koushanfar, ECE Dept., Rice University, Sept 3, 2008

Lecture outline (Reading: Sections 2.1-2.4, 2.6): review of discrete random variables; examples of PMFs (binomial, geometric); expectation (mean) and variance; conditioning.

Review: random variable
A random variable is defined by a deterministic function that maps from the sample space to the real numbers: X(\omega): \Omega \to R.

Review: discrete random variable, PMF, expectation, variance
- Probability mass function (PMF): p_X(x) = P(X = x), with \sum_x p_X(x) = 1
- Expectation, the center of gravity of the probability mass: E[X] = \sum_x x p_X(x). Expectation is linear, but in general E[g(X)] \ne g(E[X]).
- Variance, how much the probability mass concentrates around E[X]:
  var(X) = E[(X - E[X])^2] = \sum_x (x - E[X])^2 p_X(x)

Some properties of expectation
Let X be a RV and Y = g(X). We want to compute E[Y]:
  E[Y] = \sum_y y p_Y(y) = \sum_y y \sum_{x: g(x)=y} p_X(x) = \sum_x g(x) p_X(x)
Variance of a RV X: var(X) = E[(X - E[X])^2] = \sum_x (x - E[X])^2 p_X(x) = E[X^2] - (E[X])^2
Standard deviation of a RV X: \sigma_X = \sqrt{var(X)}

Bernoulli (indicator) RV
Define a Bernoulli RV on a sample space: X = 1 if a particular event A happens, and 0 otherwise. The PMF is p_X(x) = p if x = 1, and 1 - p else.

Binomial RV
X is a binomial RV following a binomial distribution B(n, p):
  p_X(k) = P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}
The expectation of a binomial RV with parameters n and p:
  E[X] = \sum_{k=0}^{n} k \binom{n}{k} p^k (1-p)^{n-k} = np \sum_{k=1}^{n} \binom{n-1}{k-1} p^{k-1} (1-p)^{n-k} = np,
i.e., n times the mean of a Bernoulli(p) RV.

Conditional PMF and expectation
- Conditional PMF given an event A with P(A) > 0: p_{X|A}(x) = P(X = x | A)
- Conditional expectation: E[X|A] = \sum_x x p_{X|A}(x)

Geometric PMF
X = waiting time for the first bus at the MIT stop:
  p_X(k) = (1-p)^{k-1} p, k = 1, 2, ...
What is the expected waiting time? What is the expected waiting time conditioned on the fact that you have already waited 2 minutes?
Expected time: E[X] = \sum_{k=1}^{\infty} k p_X(k) = \frac{1}{p}
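The geometric facts just stated, E[X] = 1/p and memorylessness, can be verified directly from the PMF. A Python sketch (the truncation point 500 is an arbitrary cutoff; the tail beyond it is negligible):

```python
import math

def geometric_pmf(p, k):
    """p_X(k) = (1-p)^(k-1) * p, k = 1, 2, ..."""
    return (1 - p) ** (k - 1) * p

p = 0.3
# E[X] = sum_k k * p_X(k) = 1/p; the series converges fast, so truncate.
mean = sum(k * geometric_pmf(p, k) for k in range(1, 500))
print(mean)   # ~ 1/0.3 = 3.333...

# Memorylessness: P(X - m = k | X > m) = p_X(k + m) / P(X > m) = p_X(k),
# using P(X > m) = (1-p)^m.
m = 2
for k in range(1, 10):
    cond = geometric_pmf(p, k + m) / (1 - p) ** m
    assert math.isclose(cond, geometric_pmf(p, k))
print("memoryless check passed")
```

Having already waited 2 minutes therefore does not change the expected remaining wait: it is still 1/p.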
Memoryless property: given that X > 2, the RV X - 2 has the same geometric PMF as X:
  p_{X-2 | X>2}(k) = p_X(k), k = 1, 2, ...

Total expectation theorem
For a partition of the sample space into disjoint events A_1, A_2, ..., A_n:
  P(B) = P(A_1) P(B|A_1) + ... + P(A_n) P(B|A_n)
  E[X] = P(A_1) E[X|A_1] + ... + P(A_n) E[X|A_n]
Geometric example: A_1 = {X = 1}, A_2 = {X > 1}.
  E[X] = P(X = 1) E[X | X = 1] + P(X > 1) E[X | X > 1] = p \cdot 1 + (1-p)(1 + E[X])
Solve to get E[X] = 1/p.

Geometric random variable
Experiment: flip a coin until you see a HEAD, assuming P(H) = p. Define a geometric RV Z as the number of tosses needed:
  P(Z = k) = (1-p)^{k-1} p, k = 1, 2, ...
Memoryless property: given Z > m, the RV Z - m has the same geometric PMF as Z:
  P(Z - m = k | Z > m) = P(Z = k + m | Z > m) = (1-p)^{k-1} p if k \ge 1, and 0 otherwise
(Figure: the PMF of Z, of Z given Z > m, and of Z - m given Z > m, all with the same geometric shape.)

ELEC 303 Random Signals, Lecture 4: Conditional probability
Farinaz Koushanfar, ECE Dept., Rice University, Aug 31, 2008

Lecture outline (Reading: Section 1.6): review; counting principle; permutations; combinations; partitions.

Independence summary
- Two events A and B are independent if P(A \cap B) = P(A) P(B)
- If also P(B) > 0, independence is equivalent to P(A|B) = P(A)
- A and B are conditionally independent if, given C with P(C) > 0, P(A \cap B | C) = P(A|C) P(B|C)
- Independence does not imply conditional independence, and vice versa
- Independence is symmetric

Review: independent trials and the binomial probabilities
Independent trials: a sequence of independent but identical stages. Bernoulli: there are 2 possible results at each stage. For three tosses:
  P(HHH) = p^3;  P(HHT) = P(HTH) = P(THH) = p^2(1-p);  P(HTT) = P(THT) = P(TTH) = p(1-p)^2;  P(TTT) = (1-p)^3
(Figure courtesy of Bertsekas & Tsitsiklis, Introduction to Probability, 2008.)

Review: Bernoulli trial
P(k) = P(k heads in an n-toss sequence). From the previous page, the probability of any given sequence having k heads is p^k (1-p)^{n-k}. The total number of such sequences is \binom{n}{k}, so
  P(k) = \binom{n}{k} p^k (1-p)^{n-k},
where we define the notation
  \binom{n}{k} = \frac{n!}{k!(n-k)!}, k = 0, 1, ..., n

Counting: discrete uniform law
Let all sample points be equally likely. Then
  P(A) = \frac{\text{number of elements in } A}{\text{number of points in the sample space } \Omega}
Counting is not always simple; it can be challenging. Part of counting has to do with combinatorics. We will learn some simple rules to help us count better.

Counting principle
r steps; at step i there are n_i choices. The total number of choices is n_1 n_2 \cdots n_r. (Figure courtesy of Bertsekas & Tsitsiklis, Introduction to Probability, 2008.)

Two examples: the number of telephone numbers; the number of subsets of an n-element set. Both are handled with permutations and combinations.

k-permutations
The order of selection matters. Assume that we have n distinct objects; k-permutations count the ways that we can pick k out of these objects and arrange them in a certain order:
  n(n-1)(n-2) \cdots (n-k+1) = \frac{n!}{(n-k)!}
Permutations: the k = n case, giving n!.
Example: the probability that 6 rolls of a six-sided die give different numbers.

Combinations
There is no particular order of the events. Combinations: the number of k-element subsets of a given n-element set. Two ways of making an ordered sequence of k distinct items:
- Choose the k items one at a time: n!/(n-k)! ways
- Choose the k items as a subset, then order them: \binom{n}{k} \cdot k! possibilities
Thus
  \binom{n}{k} = \frac{n!}{k!(n-k)!}, k = 0, 1, ..., n
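The die example above (6 rolls, all different) is a k = n permutation count: 6! favorable ordered outcomes out of 6^6 equally likely roll sequences. A Python sketch that also brute-forces the count by enumeration:

```python
import math
from itertools import product

# Formula: 6! ways to arrange the six faces in order, out of 6^6 sequences.
p_formula = math.factorial(6) / 6 ** 6

# Brute-force check: enumerate all 6^6 = 46656 roll sequences.
favorable = sum(1 for rolls in product(range(1, 7), repeat=6)
                if len(set(rolls)) == 6)
p_enum = favorable / 6 ** 6

print(p_formula, p_enum)   # both 720/46656 = 0.0154...
```

The agreement of the enumeration with 6!/6^6 is the discrete uniform law in action: count favorable outcomes, divide by the size of the sample space.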
are dealt to 4 players Find the probability that each one gets an ace Count the size of the sample space playerD Count the number of possible mm ways for Ace distribution gm 0 Count the number of ways tommgl gum Distribute the rest of 48 cards we RICE ELECBEB Koushanlav Fall08 Mall Summary of counting results Permutations of n objects n kPermutations of n objects nlnk Combinations of k out of n objects 7 n k kn k Partitions of n objects into r groups with the ith group having ni objects n J nu nln2 nr nllnzln ELEC 303 Koushanf v Fall08 ELEC 303 Random Signals lecture 20 Random processes Dr Farinaz Kousho nfar ECE Dept Rice University Nov 24 2008 Lecture outline Basic concepts Statistical averages Autocorrelation function Wide sense stationary WSS Multiple random processes 11252008 Random processes A random process RP is an extension of a RV Applied to random time varying signals Example quotthermal noise in circuits caused by the random movement of electrons RP is a natural way to model info sources RP is a set of possible realizations of signal waveforms governed by probabilistic laws RP instance is a signal and notjust one number like the case of RV Example 1 A signal generator generates six possible sinusoids with amplitude one and phase zero We throw a die corresponding to the value F the sinusoid frequency 100F Thus each of the possible six signals would be realized with equal probability The random process is Xtcos27 gtlt 100F t 11252008 11252008 Example 2 Randomly choose a phase 9 UO2n Generate a sinusoid with fixed amplitude A and fixed freq to but a random phase 9 The RP is Xt A cos2nf0t Xt A cos2nf0t A 7 C A 6n394 V0 V 0 37r4 Vims 11252008 Example 3 o 03 0 Random variable XU 11 06 Ra ndom processes Corresponding to each i in the sample space Q there is a signal xt Di called a sample function or a realization of the RP For the different ool s at a fixed time to the number xtO Di constitutes a RV XtO In other words at any time instant the value of a random 
(Figure: sample functions x(t, \omega_1), x(t, \omega_2), x(t, \omega_3) of a random process.)

Example 4
We throw a die; the outcome F sets the sinusoid frequency to 100F, so each of the six possible signals is realized with equal probability: X(t) = cos(2\pi \cdot 100F t). Determine the values of the RV X(0.001): the possible values are cos(0.2\pi), cos(0.4\pi), ..., cos(1.2\pi), each with probability 1/6.

Example 5
\Omega is the sample space for throwing a die. For each outcome \omega_i = i, let x(t, \omega_i) = i e^{-t}. At t = 1, X(1) is a RV taking the values e^{-1}, 2e^{-1}, ..., 6e^{-1}, each with probability 1/6.

Example 6
Example of a discrete-time random process: let \omega_n denote the outcome of a random experiment of independent drawings from N(0,1). The discrete-time RP is X_n, n = 1 to \infty, with X_0 = 0 and X_n = X_{n-1} + \omega_n for all n \ge 1.

Statistical averages
m_X(t) = E[X(t)] is the mean of the random process X(t): at each time instant t, it is the expectation of the RV X(t). (Worked example on the slides: for the random-phase sinusoid of Example 2, m_X(t) = \int_0^{2\pi} \frac{1}{2\pi} A cos(2\pi f_0 t + \theta) d\theta = 0.)

Autocorrelation function
  R_X(t_1, t_2) = E[X(t_1) X(t_2)]
Example 8: the autocorrelation of the RP in Example 2 is
  R_X(t_1, t_2) = E[A cos(2\pi f_0 t_1 + \Theta) \cdot A cos(2\pi f_0 t_2 + \Theta)]
               = \frac{A^2}{2} E[cos(2\pi f_0 (t_1 - t_2)) + cos(2\pi f_0 (t_1 + t_2) + 2\Theta)]
               = \frac{A^2}{2} cos(2\pi f_0 (t_1 - t_2))
Example 9: take X(t) = X for all t, where X is a uniform random variable; find the autocorrelation function.

Wide-sense stationary process
A process is wide-sense stationary (WSS) if its mean and autocorrelation do not depend on the choice of the time origin. For a WSS RP, the following two conditions hold:
- m_X(t) = E[X(t)] is independent of t
- R_X(t_1, t_2) depends only on the time difference \tau = t_1 - t_2, and not on t_1 and t_2 individually
From the definition, R_X(t_1, t_2) = R_X(t_2, t_1); if the RP is WSS, then R_X(-\tau) = R_X(\tau).

Example 8 (continued)
The autocorrelation of the RP in Example 2 is R_X(t_1, t_2) = \frac{A^2}{2} cos(2\pi f_0 (t_1 - t_2)), and we also saw that m_X(t) = 0. Thus this process is WSS.
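The autocorrelation R_X(t_1, t_2) = (A^2/2) cos(2\pi f_0 (t_1 - t_2)) of the random-phase sinusoid can be estimated by averaging the product X(t_1) X(t_2) over draws of \Theta. A Python sketch (parameter values and seed are arbitrary):

```python
import math
import random

def autocorr_estimate(A, f0, t1, t2, trials, seed=4):
    """Estimate R_X(t1, t2) = E[X(t1) X(t2)] for the random-phase
    sinusoid X(t) = A*cos(2*pi*f0*t + Theta), Theta ~ U[0, 2*pi]."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(trials):
        theta = rng.uniform(0.0, 2 * math.pi)
        acc += (A * math.cos(2 * math.pi * f0 * t1 + theta)
                * A * math.cos(2 * math.pi * f0 * t2 + theta))
    return acc / trials

A, f0 = 2.0, 1.0
t1, t2 = 0.3, 0.1
est = autocorr_estimate(A, f0, t1, t2, 200_000)
exact = (A ** 2 / 2) * math.cos(2 * math.pi * f0 * (t1 - t2))
print(est, exact)
```

Shifting both t_1 and t_2 by the same amount leaves the estimate unchanged (only \tau = t_1 - t_2 matters), which is the WSS property in numerical form.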
Example 10
Randomly choose a phase \Theta ~ U[0, T], and generate a sinusoid with fixed amplitude A and fixed frequency f_0 but the random phase \Theta. The new RP is Y(t) = A cos(2\pi f_0 t + \Theta). We can compute the mean: with f_\Theta(\theta) = 1/T for 0 \le \theta \le T and zero otherwise,
  m_Y(t) = E[Y(t)] = \int_0^T \frac{1}{T} A cos(2\pi f_0 t + \theta) d\theta = \frac{A}{T} [ sin(2\pi f_0 t + T) - sin(2\pi f_0 t) ]
Since m_Y(t) is not independent of t, Y(t) is a nonstationary RP.

Multiple RPs
Two RPs X(t) and Y(t) are independent if for all t_1 and t_2 the RVs X(t_1) and Y(t_2) are independent. Similarly, X(t) and Y(t) are uncorrelated if for all t_1 and t_2 the RVs X(t_1) and Y(t_2) are uncorrelated. Recall that independence implies uncorrelatedness, but the reverse relationship is not generally true. The only exception is Gaussian processes (to be discussed next time), where the two are equivalent.

Cross-correlation and joint stationarity
The cross-correlation between two RPs X(t) and Y(t) is defined as
  R_{XY}(t_1, t_2) = E[X(t_1) Y(t_2)]
Clearly, R_{XY}(t_1, t_2) = R_{YX}(t_2, t_1). Two RPs X(t) and Y(t) are jointly WSS if both are individually stationary and the cross-correlation depends only on \tau = t_1 - t_2. For X and Y jointly stationary, R_{XY}(-\tau) = R_{YX}(\tau).
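The nonstationarity claim in Example 10 can be checked numerically: integrating m_Y(t) for two different values of t should give different means. A Python sketch (midpoint-rule integration; parameter values are arbitrary):

```python
import math

def mean_Y(A, f0, T, t, steps=10_000):
    """Numerically integrate m_Y(t) = (1/T) * integral_0^T of
    A*cos(2*pi*f0*t + theta) d(theta) for Theta ~ U[0, T], using the
    midpoint rule; closed form is (A/T)*[sin(2*pi*f0*t + T) - sin(2*pi*f0*t)]."""
    h = T / steps
    acc = 0.0
    for i in range(steps):
        theta = (i + 0.5) * h
        acc += A * math.cos(2 * math.pi * f0 * t + theta)
    return acc * h / T

A, f0, T = 1.0, 1.0, 1.0
closed = lambda t: (A / T) * (math.sin(2 * math.pi * f0 * t + T)
                              - math.sin(2 * math.pi * f0 * t))
m0 = mean_Y(A, f0, T, 0.0)
m1 = mean_Y(A, f0, T, 0.25)
print(m0, m1)   # clearly different values: the mean depends on t
```

Contrast with Example 2, where \Theta ~ U[0, 2\pi] averages the cosine over a full period and the mean is 0 for every t.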
