Special Topics

by: Cassidy Effertz


About this Document


These Class Notes were uploaded by Cassidy Effertz on Monday, November 2, 2015. They belong to ECE 8873 at Georgia Institute of Technology - Main Campus, taught by Staff in Fall. Since the upload they have received 12 views. For similar materials see /class/233917/ece-8873-georgia-institute-of-technology-main-campus in ELECTRICAL AND COMPUTER ENGINEERING at Georgia Institute of Technology - Main Campus.


Date Created: 11/02/15
ECE 8873 Data Compression & Modeling
Review of Probability and Random Processes
School of Electrical and Computer Engineering, Georgia Institute of Technology
Spring 2004, B. H. Juang, Copyright 2004

Approaches to the Theory of Probability (Slides 1-2)

- Relative frequency approach: an event that occurs more frequently has higher probability, and vice versa. Intuitive, but difficult to generalize.
- Axiomatic approach: probability is a real number between 0 and 1, built upon set theory.

A probability space consists of three components: the observation (or sample) space S, whose elements are all the outcomes of an experiment; the events, which are subsets of the observation space; and the probability measure Pr, the assignment of probability, which satisfies a set of axioms:

1. Non-negativity: Pr(A) ≥ 0.
2. Sure event and total probability: Pr(S) = 1.
3. Exclusivity: if A ∩ B = ∅, then Pr(A ∪ B) = Pr(A) + Pr(B).

Important Corollaries (Slide 3)

- Zero-measure events: Pr(∅) = 0. Theoretically there could be zero-measure events other than the empty set (the axioms allow that), but in the relative frequency approach they are usually either not treated or treated differently, as "unseen events".
- Complementary events: since A ∪ Ā = S and A ∩ Ā = ∅, Pr(Ā) = 1 − Pr(A).
- For arbitrary A and B: Pr(A ∪ B) = Pr(A) + Pr(B) − Pr(A ∩ B) ≤ Pr(A) + Pr(B).

Conditional Probability (Slide 4)

If event B is assumed to have nonzero probability, the conditional probability of A given B is

  Pr(A | B) = Pr(A ∩ B) / Pr(B),  Pr(B) > 0.

We say the joint event A ∩ B occurs if the outcome of a trial satisfies the definitions of both events, in other words, when any of the common elements of A and B appears as the outcome of the trial. Note that Pr(B | B) = 1, so Pr(A | B) can be considered the probability of A when B is taken as the observation space: since (A ∩ B) ∪ (Ā ∩ B) = S ∩ B = B and (A ∩ B) ∩ (Ā ∩ B) = ∅, we have Pr(A | B) + Pr(Ā | B) = 1.
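The axioms and corollaries above can be checked mechanically on a finite sample space. A minimal sketch in Python (not from the slides; the die and the events A and B are illustrative assumptions), using exact rational arithmetic:

```python
from fractions import Fraction

# Check the corollaries on a finite sample space (one fair die),
# where Pr(E) = |E| / |S| for any event E, a subset of S.
S = set(range(1, 7))                 # sample space: faces of a fair die
prob = lambda E: Fraction(len(E & S), len(S))

A = {2, 4, 6}                        # event "even"
B = {4, 5, 6}                        # event "greater than 3"

# Union: Pr(A u B) = Pr(A) + Pr(B) - Pr(A n B)
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)

# Complementary events: Pr(A') = 1 - Pr(A)
assert prob(S - A) == 1 - prob(A)

# Conditional probability: Pr(A | B) = Pr(A n B) / Pr(B)
p_A_given_B = prob(A & B) / prob(B)
print(p_A_given_B)                   # 2/3: of {4,5,6}, the even ones are {4,6}
```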
Probability and Information Transmission (Slides 5-6)

Consider a binary channel. At the source, Pr(x = 0) = p and Pr(x = 1) = 1 − p; Pr(x = 0) and Pr(x = 1) are called a priori probabilities. The channel, being non-ideal and causing confusions, is characterized by the four conditional probabilities P(0|0), P(0|1), P(1|0), and P(1|1), where

  P(j|i) = Pr(y = j | x = i)

is the probability that j is received at the destination when i was actually sent by the source.

Bayes' formula is used to find the a posteriori probability, i.e., the probability that i was sent at the source given that j is received at the destination:

  Pr(x = i | y = j) = Pr(y = j | x = i) Pr(x = i) / Pr(y = j),

where

  Pr(y = j) = Σ_i Pr(y = j | x = i) Pr(x = i)

sums the source contributions that led to the reception of j.

Probability Distribution Function (Slide 7)

F_X(x) = Pr(X ≤ x) is the probability distribution function, defined over all x, −∞ < x < ∞.

1. {X ≤ ∞} is always true (a sure event), thus F_X(∞) = 1; likewise F_X(−∞) = 0.
2. 0 ≤ F_X(x) ≤ 1.
3. If x₁ < x₂, then {X ≤ x₁} ⊂ {X ≤ x₂}. Since {X ≤ x₂} = {X ≤ x₁} ∪ {x₁ < X ≤ x₂}, with the two events on the right mutually exclusive,

  Pr(x₁ < X ≤ x₂) = F_X(x₂) − F_X(x₁) ≥ 0,

so F_X(x) is a non-decreasing function of x.

Probability Density Functions (Slide 8)

The slope of the probability distribution function at x represents the incremental probability at that point, and thus gives a sense of how likely X ≈ x might be:

  f_X(x) = dF_X(x)/dx = lim_{ε→0} [F_X(x + ε) − F_X(x)] / ε.

Pr(x < X ≤ x + dx) = f_X(x) dx is the probability mass at x. The derivative f_X is called the probability density function (pdf), and it is non-negative. In the case of discrete distributions, the pdf consists of delta functions at the realizable values, each having an area equal to the corresponding magnitude of probability.
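The a posteriori computation for a binary channel like the one above can be sketched as follows. The numbers (p = 0.6 and the four conditionals) are illustrative assumptions, not values from the slides:

```python
# A posteriori probabilities for a binary channel via Bayes' formula.
# All numeric values below are illustrative assumptions.
p = 0.6                              # a priori Pr(x=0); Pr(x=1) = 1 - p
P = {(0, 0): 0.9, (1, 0): 0.1,       # P[(j, i)] = Pr(y=j | x=i)
     (0, 1): 0.2, (1, 1): 0.8}
prior = {0: p, 1: 1 - p}

def posterior(i, j):
    """Bayes: Pr(x=i|y=j) = Pr(y=j|x=i)Pr(x=i) / sum_k Pr(y=j|x=k)Pr(x=k)."""
    total = sum(P[(j, k)] * prior[k] for k in (0, 1))   # Pr(y=j)
    return P[(j, i)] * prior[i] / total

# For any received j, the posteriors over the sent symbols must sum to one.
assert abs(posterior(0, 1) + posterior(1, 1) - 1.0) < 1e-12
```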
Functions of a Random Variable (Slide 9)

Let X be a random variable with pdf f_X(x), and let Y = g(X) be a monotonic function of X. To find f_Y(y), express x = g⁻¹(y) in terms of y:

  f_Y(y) = f_X(g⁻¹(y)) |d g⁻¹(y)/dy|.

If Y is a non-monotonic function of X, sum over all roots gᵢ⁻¹(y):

  f_Y(y) = Σᵢ f_X(gᵢ⁻¹(y)) |d gᵢ⁻¹(y)/dy|.

Example: Y = X², so X = ±√y. For any y > 0, |dx/dy| = 1/(2√y); therefore

  f_Y(y) = [f_X(√y) + f_X(−√y)] / (2√y) for y > 0, and f_Y(y) = 0 for y < 0.

Mean Values and Moments (Slide 10)

Mean (expected value) of a random variable: X̄ = E[X] = ∫ x f_X(x) dx.
Mean of a function of a random variable: E[g(X)] = ∫ g(x) f_X(x) dx.
n-th moment of a rv: E[Xⁿ] = ∫ xⁿ f_X(x) dx; n-th central moment: E[(X − X̄)ⁿ] = ∫ (x − X̄)ⁿ f_X(x) dx.
When n = 2, E[X²] = ∫ x² f_X(x) dx is called the mean-square value, and

  σ² = E[(X − X̄)²] = ∫ (x − X̄)² f_X(x) dx

is the variance. Expanding,

  σ² = E[X² − 2X̄X + X̄²] = E[X²] − 2X̄² + X̄² = E[X²] − X̄².

Conditional Probability Distribution (Slide 11)

We define the conditional probability distribution the same as before:

  F_X(x | M) = Pr(X ≤ x | M) = Pr(X ≤ x, M) / Pr(M),  Pr(M) > 0.

In the event-mapping view, Pr(X ≤ x, M) is the probability of all outcomes that realize both the event {X ≤ x} and the event M. If M = {X ≤ m}:

  F_X(x | X ≤ m) = F_X(x)/F_X(m) for x ≤ m, and F_X(x | X ≤ m) = 1 for x > m.

The conditional probability density function f_X(x | M) = dF_X(x | M)/dx has all the properties of a usual pdf.

Characteristic Function (Slide 12)

Definition:

  Φ_X(u) = E[e^{juX}] = ∫ e^{jux} f_X(x) dx,

that is, the characteristic function of a random variable can be viewed as the Fourier transform of its probability density function. The pdf is then the inverse Fourier transform of the characteristic function:

  f_X(x) = (1/2π) ∫ e^{−jux} Φ_X(u) du.

Differentiating under the integral,

  dΦ_X(u)/du = ∫ jx e^{jux} f_X(x) dx, so dΦ_X/du |_{u=0} = j ∫ x f_X(x) dx = j E[X],

and in general E[Xⁿ] = (−j)ⁿ dⁿΦ_X/duⁿ |_{u=0}.

Joint Probability Density Functions (Slide 13)

Definition:

  f_{XY}(x, y) = ∂²F_{XY}(x, y)/∂x∂y,  f_{XY}(x, y) dx dy = Pr(x < X ≤ x + dx, y < Y ≤ y + dy).

1. f_{XY}(x, y) ≥ 0, −∞ < x < ∞, −∞ < y < ∞.
2. ∫∫ f_{XY}(x, y) dx dy = 1.
3. F_{XY}(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f_{XY}(u, v) du dv.
4. Marginals: f_X(x) = ∫ f_{XY}(x, y) dy and f_Y(y) = ∫ f_{XY}(x, y) dx.
5. Pr(x₁ < X ≤ x₂, y₁ < Y ≤ y₂) = ∫_{y₁}^{y₂} ∫_{x₁}^{x₂} f_{XY}(x, y) dx dy.
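The moment identity E[Xⁿ] = (−j)ⁿ dⁿΦ_X/duⁿ |₀ can be sanity-checked numerically. A sketch assuming X ~ Uniform(0,1), chosen purely for illustration (its characteristic function (e^{ju} − 1)/(ju) is standard), with a central finite difference approximating the derivative at u = 0:

```python
import cmath

# E[X] = -j * dPhi/du at u = 0, checked for X ~ Uniform(0, 1),
# an illustrative choice of distribution.
def phi_uniform01(u):
    """Characteristic function of Uniform(0,1): (e^{ju} - 1)/(ju)."""
    if u == 0:
        return 1.0 + 0.0j
    return (cmath.exp(1j * u) - 1) / (1j * u)

h = 1e-5                                              # finite-difference step
dphi = (phi_uniform01(h) - phi_uniform01(-h)) / (2 * h)
mean = (-1j * dphi).real                              # E[X] = -j * Phi'(0)
assert abs(mean - 0.5) < 1e-6                         # E[X] = 1/2 for Uniform(0,1)
```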
Expectation of Functions of Two RVs (Slide 14)

Similar definition as in the single random variable case:

  E[g(X, Y)] = ∫∫ g(x, y) f_{XY}(x, y) dx dy.

When g(X, Y) = Xⁿ Yᵏ,

  E[Xⁿ Yᵏ] = ∫∫ xⁿ yᵏ f_{XY}(x, y) dx dy

is called the joint moment of X and Y; n + k is the order of the moment. When n = k = 1 it is called the correlation, E[XY] = ∫∫ x y f_{XY}(x, y) dx dy. The covariance is

  E[(X − X̄)(Y − Ȳ)] = ∫∫ (x − X̄)(y − Ȳ) f_{XY}(x, y) dx dy = E[XY] − X̄Ȳ.

Probability Distributions of Functions of Several Random Variables (Slide 15)

The probability space is defined on a hyperspace that contains the random variables. For a function Y = g(X₁, X₂, …, X_N),

  F_Y(y) = Pr(Y ≤ y) = Pr(g(X₁, X₂, …, X_N) ≤ y)
    = ∫…∫_{g(x₁,…,x_N) ≤ y} f_{X₁X₂…X_N}(x₁, x₂, …, x_N) dx₁ dx₂ … dx_N,

by direct integration over the region where g ≤ y. Another method is through transformation of variables.

Sum of Two Random Variables (Slide 16)

Z = X + Y. At any y, a thin horizontal stripe has probability mass dy ∫_{−∞}^{z−y} f_{XY}(x, y) dx; the shaded region {x + y ≤ z} therefore gives

  F_Z(z) = Pr(Z ≤ z) = ∫_{−∞}^{∞} ∫_{−∞}^{z−y} f_{XY}(x, y) dx dy.

If X and Y are independent, f_{XY}(x, y) = f_X(x) f_Y(y); using Leibniz's rule,

  f_Z(z) = dF_Z(z)/dz = ∫_{−∞}^{∞} f_Y(y) f_X(z − y) dy.

The pdf of the sum of two statistically independent random variables is the convolution of their individual pdfs.

Probability Density Function of a Quotient of Two RVs (Slide 17)

Z = X/Y, i.e. X = YZ. Given z, the equation x = yz is a line going through the origin:

  F_Z(z) = Pr(Z ≤ z) = ∬_{x/y ≤ z} f_{XY}(x, y) dx dy,

because if y > 0 then x ≤ yz, while if y < 0 then x ≥ yz:

  F_Z(z) = ∫_0^∞ ∫_{−∞}^{yz} f_{XY}(x, y) dx dy + ∫_{−∞}^0 ∫_{yz}^{∞} f_{XY}(x, y) dx dy.

Differentiating with respect to z,

  f_Z(z) = ∫_0^∞ y f_{XY}(yz, y) dy − ∫_{−∞}^0 y f_{XY}(yz, y) dy = ∫_{−∞}^{∞} |y| f_{XY}(yz, y) dy.

Transformation of Multiple Random Variables (Slide 18)

Let Z = g₁(X, Y) and W = g₂(X, Y) be two functions of the rvs X and Y, where g₁ and g₂ are continuous with corresponding inverse functions X = h₁(Z, W) and Y = h₂(Z, W). Since all the events that map to {x₁ < X ≤ x₂, y₁ < Y ≤ y₂} would also map to {z₁ < Z ≤ z₂, w₁ < W ≤ w₂}, we have

  ∫_{x₁}^{x₂} ∫_{y₁}^{y₂} f_{XY}(x, y) dy dx = ∫_{z₁}^{z₂} ∫_{w₁}^{w₂} f_{ZW}(z, w) dw dz
    = ∫_{z₁}^{z₂} ∫_{w₁}^{w₂} f_{XY}(h₁(z, w), h₂(z, w)) |J| dw dz

by way of change of variables, where

  J = det [ ∂x/∂z  ∂x/∂w ; ∂y/∂z  ∂y/∂w ]

is the Jacobian that relates the incremental area dz dw to dx dy. (If Y = g(X), the integral for F_Z(z) becomes a line integral; a special case is Yᵢ = Xᵢ.)
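The convolution result has an exact discrete analogue that is easy to verify: the pmf of the sum of two independent discrete RVs is the convolution of their pmfs. A sketch with two fair dice (an illustrative choice, not from the slides), using exact arithmetic:

```python
from fractions import Fraction
from itertools import product

# Pmf of Z = X + Y for independent X, Y (two fair dice here) as the
# convolution of the individual pmfs: Pr(Z=z) = sum_x Pr(X=x) Pr(Y=z-x).
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

conv = {}
for x, y in product(pmf, pmf):
    conv[x + y] = conv.get(x + y, 0) + pmf[x] * pmf[y]

assert conv[7] == Fraction(6, 36)    # 7 is the most likely total
assert sum(conv.values()) == 1       # a valid pmf sums to one
```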
Transformation of Multiple RVs, cont'd (Slide 19)

Therefore

  f_{ZW}(z, w) = |J| f_{XY}(h₁(z, w), h₂(z, w)).

Example: Z = X + Y, W = X, so X = W and Y = Z − W. Here

  J = det [ ∂x/∂z  ∂x/∂w ; ∂y/∂z  ∂y/∂w ] = det [ 0  1 ; 1  −1 ] = −1,  |J| = 1.

Thus f_{ZW}(z, w) = f_{XY}(w, z − w), and the marginals are

  f_Z(z) = ∫ f_{XY}(w, z − w) dw,  f_W(w) = ∫ f_{XY}(w, z − w) dz.

Characteristic Function of a Sum of RVs (Slide 20)

The characteristic function of a random variable is the Fourier transform of its pdf, and is very useful in deriving the moments of the rv:

  Φ_X(u) = ∫ f_X(x) e^{jux} dx,  f_X(x) = (1/2π) ∫ Φ_X(u) e^{−jux} du,

and similarly for Y. Now let Z = X + Y, where X and Y are independent, so that f_Z is the convolution of f_X and f_Y. Taking transforms,

  Φ_Z(u) = Φ_X(u) Φ_Y(u),

and the pdf of Z is recovered as

  f_Z(z) = (1/2π) ∫ Φ_Z(u) e^{−juz} du = (1/2π) ∫ Φ_X(u) Φ_Y(u) e^{−juz} du.
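The factorization Φ_Z(u) = Φ_X(u)Φ_Y(u) can be checked by Monte Carlo with empirical characteristic functions. A sketch assuming X ~ Uniform(0,1) and Y ~ N(0,1) independent (illustrative choices); the two sides agree only up to sampling noise:

```python
import cmath
import random

# Monte Carlo check that the characteristic function of a sum of
# independent RVs factors: Phi_Z(u) = Phi_X(u) * Phi_Y(u).
random.seed(0)
N = 50_000
xs = [random.random() for _ in range(N)]            # X ~ Uniform(0,1)
ys = [random.gauss(0.0, 1.0) for _ in range(N)]     # Y ~ N(0,1), independent

def emp_phi(samples, u):
    """Empirical characteristic function: (1/N) sum_k e^{j u x_k}."""
    return sum(cmath.exp(1j * u * s) for s in samples) / len(samples)

u = 0.7
lhs = emp_phi([x + y for x, y in zip(xs, ys)], u)   # Phi_Z(u), estimated
rhs = emp_phi(xs, u) * emp_phi(ys, u)               # Phi_X(u) * Phi_Y(u)
assert abs(lhs - rhs) < 0.05                        # equal up to Monte Carlo noise
```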
instances of a wide sense stationary process their correlation is only a function of their time difference no matter where they are In subsequent discussions wide sense stationarity is always assumed spring 2004 ECEVEE73 a H Juana CupynghtZUUA Review Slide 21 Spring 2004 ECEVEE73 a H Juana CupyrightZUUA Review Slide 22 Properties of Autocorrelation Functions I RX0 7 2 o Symmetry RX1 RX7139 RAT EXIXl T EXl TXl R2201 lRXT lS RX0 EX1i X22 EX12 i 2X1X2 X 2 0 Ein X22 2RX0 2 E2X1X2 l 2 l R20 l fXt has a constant component say Xt AVt where Vt has zero mean the autocorrelation function has a constant component EA VtA Vt 1 EAZ AVt AVt 1 VtVt 1 AZ AEVt AEVt 1 EVtVt 1 AZ RV7 Properties of Autocorrelation Functions II fXt has a periodic component then the autocorrelation function has a periodic component Xl A cosal whereA and a are constant and 9 a rv uniformly distributed over 0211 ie f 6 270quot 0 g 6 lt 27r 0 elsewhere RX1 EA cosal A cosal an 6 A2 A2 E7cos2wt an 2 Toos or A2 A2 2 1 A2 cos an cos 2501 wr2 9 d6 cos an 2 2 0 2 2 fXt is ergodic and zeromean and has no periodic components That is time samples far apart tend to lim RX1 0 meow behave statistically independently spring 2004 ECEVEE73 a H Juana CupynghtZUUA Review Slide 23 Spring 2004 ECEVEE73 a H Juana CupyrightZUUA Review Slide 24 Properties of Autocorrelation Functions Ill Autocorrelation functions cannot have arbitrary shape they must correspond to some power spectrum which must be non negative over the entire frequency range More discussions later 0 F RX0 J RX0e mdr power spectrum of X0 F 11220 SXagt 2 0 forall 0 Example An ergodic random process has an autocorrelation function of the form R 412 6 T X 42 1 Find the meansquare value mean value and variance of the process 2 RXT4TZ64 22 X2RX06 7 1 7 1 X24 X2 0F7172642 Sinusoid Plus Noise Let X0 Acosal where 8 is a random variable uniformly distributed over 027r RAT AZ cosm Let V0 be a zero mean noise process statistically independent of X0 the signal with 
autocorrelation function Rm BZe W The observed process is Z0 Acosat 9 V0 which has an autocorrelation function R2 239 lAZ cos a Bze39W Note that Rm BZe W s 0 as 239 gt 00 It is thus possible to recover a sinusoid from noise contamination as long as we measure the autocorrelation at sufficiently long time lags spring 2004 ECEEE73 a H Juana cepyiigmznm Review Slide 25 spring 2004 ECEEE73 a H Juana cepyiigmznm Review Slide 26 Spectral Density E F a Z sXm 12 X2 RX0ISXwdw Since SX00 is an average overtime it is thus usually called a power density spectrum When SX00 is integrated over the entire frequency range we obtain the average power of the signal which is equal to the meansquare value of the wide sense stationary process 2a 2a SX0 02 a2 1 1 2a 1 2a a 2 II II X2 S d d t quot 1 271 Aw w 27 mwzaz w 27a an a im 272 2 Useful integral 1 2a 1 2a 1 55 w1a1dw m gamma Grim a Example 5300 72 wzaZ a Spectral Density of Constant or Periodic Signals Consider a process X0 A Bcos27y 0t where A B andf0 are constant and 9 is a random variable uniformly distributed over 021 Let XT0 be a truncated version of X0 over T T FX f I A Bcos27y n enemy FX0 jA Boos27 exam1d A5f JR216000618 avefne l SXf Azww 32 4gtaquotltff 6mm sxm 27542 w7rBz2 ww wwn SXf SM m El FX 0 l 1 M 2T The spectral density thus consists of three spikes delta function at DC with height A1 324 and at if with height 324 respectively a in radians f in cycles or Hz 0 fn f f Spring 2004 ECEVEEH E H Juana cupyiiamznm Review Slide 27 Spring 2004 ECEVEEH E H Juana cupyiiamznm Review Slide 28 MeanSquare Value and Total Power Xz A Bcosaol uniformly distributed in 021 SXa 2m25m 7132 26a me 50 7 w0 Total power imam imam Bz25w on 5wewndw AZB Z AZ MeanSquare Value of the process EOX2 1 E A B coswht Z Ea A2 ZAB cosmy 6 92 cos2 cont A2 E9 ZAB cosay BZE coszqt Ea ZAB c0105 0 for 6 U02r E coszt G E cos2ayz 2 1 Autocorrelation of Bandlimited White Noise A more useful concept is the bandlimited white noise whose spectral density is a 
constant over a finite bandwidth and zero outside the frequency range For example SE S la is 27rW S I Am 0 lwlgt27rW W W f RXT F 1sXf F 1S rectj 2WSUsinc2Wr RX7 0 at r n2Wn 1 2 Random variables from a band limited white noise are uncorrelated T if they are separated in time by any multiple of 12W seconds Therefore if a continuous time bandlimited white noise process is 1 sam led at twice the maXImum fre uenc limit 2W then the Therefore E X2t X2 A2 54 1 f SXwdw p q y 2 2n 7w resultant samples of the discrete time sequence are uncorrelated spring 2004 ECEVEE73 a H Juana CupyrighlZUUA Review Slide 29 spring 2004 ECEVEE73 a H Juana CupyrighlZUUA Review Slide 30 Spectral Density of Binary Processes 7 EiiFXwl1Alepwlz SXw 1 2T 7 1 For PC 100 rectt ti Fpwtlsinctlw27r and thus lFPwlztlzsincztlw27r Z 2 Therefore SXwMAztlsinczzlw2n l ii 2 7 1 1 Forpt7pct72lcos t1 ltlg2 0 ltlgt 2 r 39 Z FPwiI n 1cosz mWe dett Smwt 2 2 2 2 W tl 2 mil2 7r 7wt12 Sxw 5inw 1 HWY 7 2 I 4 curl2 Megan2 Notethat in both 2 Random Input to A System xt Xt yt 3 Yt The input is no longer a fixed function The same linear system concept applies 4t Example W Se 2 0 0 tlt 0 Xt M 4cos21 where M is a random variable and 9 is an independent random variable uniformly distributed in 021 Yt M 4cos2l 5e393quot d 5 3M g 3 cos2t 6 2 sin2t 69 Azfl 2 1 08888 max 5X00 Yt is also a random process whose statistical properties can be Sxf Tsmc fif W occurs at w 0 derived from the distributions of the random variables Mand spring 2004 ECEVEE73 a H Juana CupyrighlZUUA Review Slide 31 spring 2004 ECEVEE73 a H Juana CupyrighlZUUA Review Slide 32

