
# Stochastic Modeling STAT 416

Penn State



These class notes for STAT 416 at Pennsylvania State University (taught by Staff, Fall term) were uploaded by Hilbert Denesik on Sunday, November 1, 2015, and have received 360 views. For similar materials see /class/233134/stat-416-pennsylvania-state-university in Statistics at Pennsylvania State University.


Date Created: 11/01/15

## Joint Distributions

- We are often interested in probability statements involving several random variables.
- Example: two people, A and B, each flip a coin twice. Let $X$ = number of heads obtained by A and $Y$ = number of heads obtained by B. Find $P(X > Y)$.
- Discrete case: the joint probability mass function is $p(x, y) = P(X = x, Y = y)$.
  - Example: two coins, one fair and one two-headed. A randomly chooses one and B takes the other. Let $X = 1$ if A gets a head, $0$ if a tail, and $Y = 1$ if B gets a head, $0$ if a tail. Find $P(X \ge Y)$.
- The marginal pmf of $X$ can be obtained from the joint pmf:
  $$p_X(x) = \sum_{y:\,p(x,y)>0} p(x, y), \qquad p_Y(y) = \sum_{x:\,p(x,y)>0} p(x, y).$$
- Continuous case: the joint probability density function $f(x, y)$ satisfies $P((X, Y) \in R) = \iint_R f(x, y)\,dx\,dy$.
- Marginal pdfs: $f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy$, $f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx$.
- Joint cumulative distribution function of $X$ and $Y$: $F(a, b) = P(X \le a, Y \le b)$, $-\infty < a, b < \infty$.
- Marginal cdfs: $F_X(a) = F(a, \infty)$, $F_Y(b) = F(\infty, b)$.
- Expectation:
  $$E[g(X, Y)] = \sum_y \sum_x g(x, y)\,p(x, y) \ \text{(discrete)}, \qquad \iint g(x, y)\,f(x, y)\,dx\,dy \ \text{(continuous)}.$$
- From the joint distribution we can derive $E[aX + bY] = aE[X] + bE[Y]$; extension: $E[a_1 X_1 + a_2 X_2 + \cdots + a_n X_n] = a_1 E[X_1] + a_2 E[X_2] + \cdots + a_n E[X_n]$.
- Example: $E[X]$ for $X$ binomial $(n, p)$. Let $X_i = 1$ if the $i$th flip is a head and $0$ if a tail; then $X = \sum_{i=1}^n X_i$ and $E[X] = \sum_i E[X_i] = np$.
- Example: there are $n$ students in a class, each equally likely to be born in any month. What is the expected number of months in which at least one student was born? Let $X_i$ be the indicator that some student is born in month $i$, and $X = \sum_{i=1}^{12} X_i$. Then
  $$E[X] = 12\,E[X_1] = 12\left[1 - \left(\tfrac{11}{12}\right)^n\right].$$

## Independent Random Variables

- $X$ and $Y$ are independent if $P(X \le a, Y \le b) = P(X \le a)\,P(Y \le b)$; equivalently $F(a, b) = F_X(a)F_Y(b)$; discrete: $p(x, y) = p_X(x)p_Y(y)$; continuous: $f(x, y) = f_X(x)f_Y(y)$.
- Proposition 2.3: if $X$ and $Y$ are independent, then for functions $h$ and $g$, $E[g(X)h(Y)] = E[g(X)]\,E[h(Y)]$.

## Covariance

- Definition: $\mathrm{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y]$.
- $\mathrm{Cov}(X, X) = E[(X - E[X])^2] = \mathrm{Var}(X)$.
- If $X$ and $Y$ are independent, $\mathrm{Cov}(X, Y) = 0$.
- Properties:
  1. $\mathrm{Cov}(X, X) = \mathrm{Var}(X)$
  2. $\mathrm{Cov}(X, Y) = \mathrm{Cov}(Y, X)$
  3. $\mathrm{Cov}(cX, Y) = c\,\mathrm{Cov}(X, Y)$
  4. $\mathrm{Cov}(X, Y + Z) = \mathrm{Cov}(X, Y) + \mathrm{Cov}(X, Z)$

## Sums of Random Variables

- If the $X_i$ are independent,
  $$\mathrm{Var}\Big(\sum_{i=1}^n X_i\Big) = \sum_{i=1}^n \mathrm{Var}(X_i), \qquad \mathrm{Var}\Big(\sum_{i=1}^n a_i X_i\Big) = \sum_{i=1}^n a_i^2\,\mathrm{Var}(X_i).$$
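The indicator-variable calculation for the birthday-months example lends itself to a quick numerical check. The sketch below (function names are my own, not from the notes) compares the exact formula $12[1 - (11/12)^n]$ against a Monte Carlo estimate:

```python
import random

def expected_birth_months(n):
    # E[X] = 12 * P(a fixed month gets at least one of the n birthdays)
    #      = 12 * (1 - (11/12)**n), by linearity over the 12 month indicators.
    return 12 * (1 - (11 / 12) ** n)

def simulated_birth_months(n, trials=100_000, seed=1):
    # Draw n birth months uniformly at random and count distinct months hit.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += len({rng.randrange(12) for _ in range(n)})
    return total / trials
```

For a class of $n = 20$ students the exact value is about 9.89 months, and the simulation agrees closely.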
0 Example Variance of Binomial RV sum of indepen dent Bernoulli RVs VarX np1 p Moment Generating Functions 0 Moment generating function of a RV X is Wt 505 Elem i Zx1pmgt0 txpltx Xdiscrete i ff 6 f ak X continuous o Moment of X the nth moment of X is E X n EX We t 0 where We is the nth order derivative 0 Example 1 Bernoulli with parameter p Mt pet l p for any t 2 Poisson with parameter A Mt amt 1 for any t 0 Property 1 Moment generation function of the sum of independent RVs sz l n are independent Z X1X2 X717 205 905 0 Property 2 Moment generating function uniquely de termines the distribution 0 Example 1 Sum of independent Binomial RVs 2 Sum of independent Poisson RVs 3 Joint distribution of the sample mean and sample variance from a normal porpulation Important Inequalities 0 Markov Inequality If X is a RV that takes only non negative values then for any a gt 0 PX 2a 3 a o Chebyshev s Inequality If X is a RV with mean u and variance 02 then for any value 19 gt O 2 039 PX uZ Sg 0 Examples obtaining bounds on probabilities Strong Law of Large Numbers 0 Theorem 21 Strong Law of Large Numbers Let X1 X2 be a sequence of independent random variables having a common distribution Let EXZ u Then with probability 1 X1X2H39Xn n gtJ as n gtoo Central Limit Theorem 0 Theorem 22 Central Limit Theorem Let X1 X2 be a sequence of independent random variables having a common distribution Let E Xi u Va Xi 02 Then the distribution of X1X2Xn nl IxE tends to the standard normal as n gt 00 That is X1X2Xn nl IxE 1 Z 2 WQd cp gt 2WOoe 1 0 Example estimate probability P 2 1 Let X be the number of times that a fair coin ipped 40 times lands heads Find PX 20 2 Suppose that orders at a restaurant are iid random variables with mean u 8 dollars and standard deviation 0 2 dollars Estimate the probability that the rst 100 customers spend a total of more than 840 Estimate the probability that the rst 100 customers spend a total of between 780 and 820 Sample Space and Events 0 Sample space The set 
of all possible outcomes 1 Roll 21 die 12 3456 2 Flip a coin twice H H H T T H T 0 Event A subset of the sample space 1 Roll 21 die the outcome is even 2 4 6 2 Flip a coin twice and the two results are different H T T H 0 Set operations 1 Union EUE an outcome is in E UF if it is either in E or in F 2 Intersection E H F EF an outcome is in EF if it is in both E and F 3 Mutually exclusive E and F are mutually exclu sive if EF gb a Roll a die E1 outcome is below 3 1 2 E2 outcome is above 4 5 6 E1 E2 4 Complement EC outcome that is not in E Probabilities 0 Sample space S event E o The probability of event E is a number PE assigned to E that satis es the following conditions 1 0 g PE g 1 2 PS 1 3 For any sequence of events E1 E2 which are mutually exclusive that is EnE b for any n y ma PUO1En 2201 0 Compute Probabilities 1 Roll a fair die 6 equally likely outcomes 1 2 3 4 5 6 P1 16 P2 16 P6 16 2 E the outcome is even PE PE 136247 6 P2P4P6 12 0 Properties 1 PE PEC l 2 Any event E and F may not be mutually exclu sive PE U F PE PF PEF Proof LetA EFB E ACF A Then EBUAFCUAEUFBUCUA Note A B C are mutually exclusive PE U F PB PC PA 133 PUD 130 PUD PM PE PF PEF Conditional Probability 0 Given one event has occurred what is the probability that another event occurs Let F be given then the conditional probability of E is PE F 0 Example Flip a coin twice S Hv H H7 T T7 H T7 F the rst ip is H F HH H PF 12 E the two ips are not both H E HT T T T PE 34 If F occurs in order for E to occur the second ip has to be T Since the coin is fair PEF 12 0 De nition PEF PF Example the above coin ip setup PEF PH7T 7 E 7 PF PF 12T HEW PEF Independent Events 0 De nition Two events E and F are independent if PEF PEPF o Equivalently PElF PE PFlE PF o Distinguish independent and mutually exclusive Independent PEF E and F mutually exclusive EF 5 PEF 0 PE U F PE PF 0 Extension to n events E1 E2 En are independent if n PE1E2E3 En H PE 21 0 Independent trials A sequence of experiments with results being 
either a success or a failure and the experiments are independent Bayes Formula 0 Total probability formula Suppose events F1 F2 Fn are mutually exclusive and U1F S Given any event E we have PE PFZE PEPE Fl 0 Bayes formula Suppose events F1 F2 Fn are mutually exclusive and U1F S Given any event E i pEPE Fi Pm E 21PFjPE Fj Exercises for Chapter 4 Markov Chain 1 A particle moves on a circle through points which have been marked 0 l 2 3 4 in a clockwise or der At each step it has a probability p of moving to the right and l p to the left Let X denote its location on the circle after the nth step The process Xm n 2 0 is a Markov chain a Find the transition probability matrix b Is this MC irreducible Find all the recurrent states and transient states 2 Each of 2 switches is either on or off during a day On day n each switch will independently be on wit probability 1 on sw1tchis in day n 1 What frac tion of days are both switches on both off 3 Three out of every 4 trucks on the road are followed by a car while only 1 out of every 5 cars is followed by a truck What fraction of vehicles on the road are trucks 4 Each morning an individual leaves his house and goes for a run He is equally likely to leave either from his front or back door Upon leaving the house he 1 U1 chooses a pair of running shoes or goes running bare foot if there are no shoes at the door from which he departed On his return he is equally likely to en ter and leave his running shoes either by the front or back door If he owns a total of 19 pairs of running shoes what proportion of the time does he run bare footed Suppose Max and Patty decide to ip pennies The one coming closest to the wall wins Patty being the better player has a probability of 06 of winnig on each ip If Patty starts with 5 pennies and Max with 10 then what is the probability that Patty will wipe Max out A total of m white and m black balls are distributed among two urns with each urn containing m balls At each stage a ball is randomly 
selected from each urn and the two selected balls are interchanged Let X n denote the number of black balls in urn 1 after the nth interchange a Give the transition probabilities of the Markov chain X n 2 0 b Without any computations what do you think are the limiting probabilities of this chain c Find the limiting probabilities and show that the stationary chain is time reversible 2 7 At all times an urn contains N balls some white balls and some black balls At each stage a coin hav ing probability p 0 lt p lt l of landing heads is ipped If heads appears then a ball is chosen at ran dom from the urn and is replaced by a white ball if tails appears then a ball is chosen from the urn and is replaced by a black ball Let X denote the number of white balls in the urn after the nth stage a Is an 2 0 a Markov chain If so explain why b What are its classes What are their periods Are they transient or recurrent c Compute the transition probabilities Pm d Let N 2 Find the proportion of time in each state e Based on your answer in part d and your intu ition guess the answer for the limiting probability in the general case i Prove your guess in part e g If p 1 what is the expected time until there are only white balls in the urn if initially there are i 1 white and N z39 4 l 3 black What is the expected time for a general N and 2 Random Variables 0 Functions of random outcomes 1 Flip a cointwice H H H T T H T Let the number of heads in the two ips be X X 0 HTTHX1 HHX 2 0 Random variables realvalued functions de ned on the sample space Discrete The RV takes value on either a nite or countable number of possible values Example Toss a coin until the rst head appears Let N be the number oftosses N 1 2 Continuous The RV takes a continuum of possi ble values Example The lifetime of a car Cumulative Distribution Functions 0 Let X be a RV The cumulative distribution function cd ofX is Fb PX g b Toss a coin until the rst head appears N is the number of tosses PN g b PN1PN2PN W pp1 pp1 
pLbJ1 1 1 29W 0 Properties of Fb 1 Fb is nondecreasing function of b 2 hrnbOO Fb Foo 1 3 hrnbOO Fb F oo 0 4 Fb is rightcontinuous Discrete Random Variables o X take at most countably many possible values 0 Probability massfunctl39on pa pa PX 0 Let the possible values be 51 239 1 2 2 0 Mr 0 for all other x 1 0 Important discrete random variables 1 Bernoulli random variable 0 X PX1p PXO1 p 2 Binomial RV Flip an unfair coin n times Assume the experi ments are independent trials Let X be the number of heads in the n trials The range ofX 0 1 PXk Zpk1 pnkkO1n If de ne X as the indicator for whether the ith ip is a head X 1 for a head and X 0 for a tail X is a Bernoulli RV X iXZ39 11 3 Geometric RV An unfair coin has probability p of coming up heads Flip the coin until the rst head appears Let X be the number of ips needed X l 2 P X n PThe rst n l ips are tails the nth ip is head 1 p 1p 4 Poisson RV X 0 12 Continuous Random Variables o X takes on a continuum of possible values 0 Probability density function pd9 f PX e B mm B 0 Properties 1 f 2 0 2 ff fxdx 1 0 Calculate probability b pm g X g b fcdr 0 Relationship with F a and fa Fm PX e ooa L mm dFa da fa 0 Important continous RVs 1 Uniform lOltxltl x 7 0 otherwise More generally lb a altxltb 0 otherwise 2 Exponential for any gt 0 Ae 1 Z 0 f 0 otherwise 3 Gamma RV for any gt 0 o gt 0 AeiAx ail x Z 0 0 otherwise 4 Normal RV for any 0 gt 0 and any u l 2 2 i 95 M 20 xi mltxltm f g XNNw l If X N N M 02 then for any constants a and b aX b N Nau b 202 Expectation 0 Discrete case EX Zwmw l Bernoulli EX p 2 Binomial EX np 3 Geometric EX 1 4 Poisson with parameter oz EX oz 0 Continuous case EX ff xf 1 Uniform on a b EX 2 Exponential with parameter A EX l 3 Gamma with A oz EX oz 4 Gaussian NLz 02 EX Lz o Expectaion of a function of a RV 0 Proposition 21 a If X is a discrete RV with pmf Mm then for any realvalued function g E 906 Z 956p lt gtgt0 b If X is a continuous RV with pdf f x for any realvalued function 9 EM 00 
gltxgtfltxgtdx 0 Corollary 22 If a and b are constants then E aX b aE X b Variance 0 De nition VarX EX o VarX EX2 0 Discrete l Bernoulli VarX pl p 2 Binomial VarX npl p 3 Geometric VarX 2 2 4 Poisson VarX or same as EX 0 Continuous 1 Uniform on 01 b VarX b12602 2 Exponential VarX 3 Gaussian VarX 02 4 Gamma VarX oz2 Time Reversible Markov Chain 0 Consider a stationary ergodic irreducible Markov chain 0 Let the limiting probabilities be 7n 0 The original MC 7Xn 27Xn 17Xn7 0 Trace the MC backwards 7Xmn17Xn27 o Xnz39 0 l is also a Markov chain 0 Why Given the present the past and future are in dependent past gt future future gt past Given the present the future and past are in dependent Hence the reverse process is also a MC o What are the transition probabilities of the reversed MC Qij PiXmjiXm1i PXmj7Xm1i PXm1 2 i PX jPXm1 z Xm j PXm1 2 7Tij39 7Tz39 0 Time reversible MC A Markov chain is time reversible if QM 1 that is the reverse MC has the same tran sition probability matrix as the original MC QM PM is equivalent to 791 1317 0 Proposition Suppose an ergodic irreducible MC have transition probabilities PM If we can nd nonnegatz39ve num bers 511 summing to one 511 l and satisfying equation xZPZijP for all 239 j then the MC is time reversible and x is the limiting probability 7T1 Proof Since xiPZj CUijZ39 sum over 239 E ij I jg Pj j Hence for any j we have 512739 E Jl z39Pz39j i In addition 2 51 1 By Theorem 41 51 is the limiting probability 7n 0 Interpretation for 79PM MPH Pseeing a transition from j to Pseeing jPtransit toz j ij 0 7rijZ MPH means the probability of seeing a tran sition from j to 239 is the same as seeing a transition from 239 to j o A transition from j to 2 for the original MC is a tran sition from 239 to j for the reversed MC 0 Example Consider a random walk with states 0 1 M and PM Pm1 05 1 P 7 1Z1M 1 1301 060 1 1300 PMM OdM 1 PALM 1 0 0 0L1 0 2 0 3 0 4 0L5 0030930 93 1 0L0 1 0L1 1 012 1 0L3 10L4 10L5 o This MC is time reversible In between every 
2 transitions from 239 to 21 there has to be a transition from Z 1 to z In between every 2 transitions from 21 to 239 there has to be a transition from 239 to Z 1 The transitions from 2 to 2 1 and from 2 1 to 2 are sandwiched Hence 7Tthi 7T 1P i If1j z 1 gt 1132 P 0 hence 7T P j WijZ39 0 Limiting probabilities W00 7T11 051 711061 TWO 052 7120 7TH11 244 0717quot397M 1 Solve in terms of 7m 060 7T1 0 1 051 051 051050 7Q 7r 7r0 1 0421 1 0421 041 01239710123972m010 Ingeneral 7TZ39 WWI0 fOI39Z 12 Subject to 7n 1 we have 1 M Hi 01239 il gia 7T0 Odi 1Odi 2 39 39 39Oto mea m4 u mgt for z 12M 7Q 7T0 0 Example Consider an arbitrarily connected graph A link between vertices 239 and j has weight wij or w wij z j E 1 2 A Markov chain is de ned by the transition probabilities PM 2125 Show that this Markov chain is time reversible For instance P12 P14 1 Proof 2k wz39k Cons1der 51 m z 12 M It is easy to check that 1 51 i M i 3 i l 3l2 7 2 i Zle wlk i mp 2k wz39k wzj W Z 1 3sz wlk 2k wz39k Xsz wlk39 Similarly mp xp W Zle wlk Xsz 101k Z Z By the proposition the MC is time reversible with x being the limiting probabilities Markov Chain 0 Stochastic process discrete time X17 X2 0 Markov chain Consider a discrete time stochastic process with discrete space Xn E 0 12 Markovian property PXn1 17an z39Xn1 in1 X0 to PXn1 j Xn Z Pm PM is the transition probability the probability of making a transition from 239 to j Transition probability matrix Pop 1301 P02 quot39 Pro 1311 P12 quot39 P s s s s Pip Pm Pm quot39 Example 0 Suppose whether it will rain tomorrow depends on past weather condition ony through whether it rains today Consider the stochastic process Xn7 n 1 2 0 rain on day n Xn 1 not rain on day n PXn1an7Xn 177X1 PXni i Xn a State space 0 1 0 Transition matrix 1300 1301 Pro 1311 0 P070 Ptomorrow rainitoday rain 04 Then P071 1 oz 0 P170 Ptomorrow rainitoday not rain 6 Then P1711 051 05 l 0 Transition matrix Transforming into a Markov Chain 0 Suppose whether it will rain tomorrow 
depends on whether it rained today and yesterday 0 PltXn1 Xn Xn1 X1 PltXn1 Xn Xn1 The process is not a rst order Markov chain oDe neYn 0 Xn0Xn10 RR Y i 1Xn0Xn1 NR i 2Xn1Xn170 RN 3 Xn1Xn171 NN PYn1YnYn1 PXn1XnXan1 PXn17Xn Xan 1 mam o Yn7 n 1 2 is a Markov chain 1300 1301 P02 P03 P170 1311 P12 P13 1320 1321 P22 P23 1330 1331 P32 P33 13071 Pm1 HY 0 PXn1 0Xn 1an 0 Xn1 0 0 P073 PltYn1 PXn1 17Xn 1XnOXn1O 0 O P171 P173 0 P270 P272 0 P370 Pg2 0 0 Transition matrix P070 0 P072 0 P070 0 1 P070 0 P170 0 P172 0 i P170 0 1 P170 0 0 P271 0 P273 0 Pg1 0 1 le 0 Pg1 0 Pg3 0 13371 0 1 13371 o The Markov chain is speci ed by P070 P170 P271 P371 1 P070 P tomorrow will rain today rain yesterday rain 2 P170 P tomorrow will rain today rain yesterday not rain 3 P271 P tomorrow will rain today not rain yesterday rain 4 P371 P tomorrow will rain today not rain yesterday not rain ChapmanKolmogorov Equations 0 Transition after nth steps Pg PXnm j Xm 2 o ChapmanKolmogorov Equations 1213 2 Eggs n m 2 0 for all z j k0 0 Proof by Total probability formula m Paw mw ZPXnm ijn k0 ZPQQL MXO z k0 PXnm Xn kX0 00 i n m iZa k0 o nstep transition matrix TL TL TL P00 1301 P02 TL TL TL P10 P171 P12 Pm s s s 3 TL TL TL Pm Pm Pm o ChapmanKolmogorov Equations Pnm P01 PUD7 P01 PH 0 Weather example Pi 051 05 i 0703 i 1 i 04 06 Find P rain on Tuesday rain on Sunday and P rain on Tuesday and rain on Wednesday rain on Sunday Solution P rain on Tuesday rain on Sunday P0270 2 07 03 07 03 P P P 04 00 X 04 00 7 001039 052 048 P rain on Tuesday rain on Sunday 061 P rain on Tuesday and rain on Wednesday rain on Sunday 07Xn1 0 Xn Z PXn O Xn2 OPXn1 0 Xn 0 Xn2 0 PXn O Xn Z 0PXn1 O Xn 0 P391309 061 X 07 0427 Classi cation of States 0 Accessible State j is accessible from state 239 if Pg gt 0 for some n 2 0 i gt j Equivalent to Pltever enter j start in gt 0 Pltever enter j start in PUZ 0Xn jXo Z ZPltXnj X0 2 710 00 i n i 2 PM 710 Hence if 0 for all n Pltever enter j start in 0 On the other hand I Pltever enter j start in 
PUZoXn jXo 2 PXn jHXO for any n TL 17 39 If Pin gt 0 for some n Pltever enterj startin 2 Pg gt 0 Examples 0 Communicate State 239 and j communicate if they are accessible from each other i lt gt j Properties 1 PX0 z iXO 1 Any state 239 communicates with itself 21fz lt gtjthenjlt gtz 31fz lt gtjandjlt gt kthenz39 lt gt k Proof 293 a7jgt0andpfggt0 jlt gtk gt P gt0andB 3gtO 00 m Z ChapmanKolmogorov Eq 0 n m gt Pmquot M gt 0 Similarl we can show P m gt 0 Hence 2 lt gt y kn k2 0 Class Two states that communciate are said to be in the same class A class is a subset of states that communicate with each other Different classes do NOT overlap Classes form a partition of states 0 Irreducible A Markov chain is irreducible if there is only one class Consider the Markov chain with transition proba bility matrix P OMIWIH OJIHoPIHMIH WINJle O The MC is irreducible MC With transition probability matrix 00 P OHMlele OHMlele lt3an 3 r HNH 0 Three classes 01 2 Recurrent and Transient States 0 fi probability that starting in state 239 the MC will ever reenter state 2 o Recurrent If fz 1 state 239 is recurrent A recurrent states will be Visited in nitely many times by the process starting from 2 0 Transient If fi lt 1 state 239 is transient Starting from 239 the MC will be in state 239 for ex actly n times including the starting state is ff l fi 71 12 This is a geometric distribution with parameter 1 fi The expected number of times spent in state 239 i811 fr o A state is recurrent if and only if the expected number of time periods that the process is in state 239 starting from state 239 is in nite Recurrent ltgt Enumber of Visits to ZX0 oo Transient ltgt Enumber of Visits to ZX0 lt oo 0 Compute Enumber of Visits to ZX0 De ne I i 1 ianz i 0 iany z Then the number of Visits to 239 is 20 In Enumber of Visits to ZX0 Z EIniX0 2 710 0 Proposition 41 State 239 is recurrent if 270 00 and transient if 20 PM lt oo 0 Corollary 42 If state 239 is recurrent and state 239 com municates with state j 
then state j is recurrent 0 Corollary 43 A nite state Markov chain cannot have all transient states For any irreducible and nitestate Markov chain all states are recurrent 0 Consider a MC with 00 1000 Pi 0100 0100 The MC is irreducible and nite state hence all states are recurrent 0 Consider a MC with pd H NH O ONIWIH QMIWIH O O NIHO O O O NH O ONIWIH QMIWIH O 3 Three classes 01 2 3 State 0 12 3 are recurrent and state 4 is transient Random Walk 0 A Markov chain with state space 239 0 i1 i2 0 Transition probability PM p 1 131714 At every step move either 1 step forward or 1 step backward 0 Example a gambler either wins a dollar or loses a dollar at every game Xn is the number of dollars he has when starting the nth game 0 For anyz lt j pH gt 013 1 19 gt 0 The MC is irreducible 0 Hence either all the states are transient or all the states are recurrent 0 Under which condition are the states transient or re current Consider State 0 f P 7 oo recurrent 1 070 7 nite transient n Only for even m 1357170 gt 0 By Stirling s approximation 71 N nn126 27F 211 N MPG 19 00 m Whenp 124pl pl 0 4p1 p 7 0 L 1 W 1 7m7 The summation diverges Hence all the states are recurrent When 29 12 4291 p lt 1 270 W93 converges All the states are transient Limiting Probabilities 0 Weather example 07 03 P P 04 06 4 4 05749 04251 P P 05668 04332 0572 0428 lt8 lt4 lt4 P P P 0570 0430 0 PM and P are close The rows in P8 are close 0 Limiting probabilities 0 Period d For state 239 if 0 whenever it is not divisible by d and d is the largest integer with this property dis the period of state 2 Period d is the greatest common divisor of all the m such that gt 0 o Aperiodic State 239 is aperiodic if its period is l 0 Positive recurrent If a state 239 is recurrent and the ex pected time until the process returns to state 239 is nite If 239 lt gt j and 239 is positive recurrent then j is posi tive recurrent For a nitestate MC a recurrent state is also pos itive recurrent A nitestate irreducible MC contains 
all positive recurrent states 0 Ergodic A positive recurrent and aperiodic state is an ergodic state 0 A Markov chain is ergodic if all its states are ergodic 0 Theorem 41 For an irreducible ergodic Markov chain lirnnOO Pg exists and is independent of 2 Let 7r lirnnOO Pf j 2 0 then 7r is the unique nonnegative solution of 7Tj ZTI39Z39PW39 0 2390 i791 j0 o The Weather Example 05 l or P 1 The MC is irreducible and ergodic 7T07T1 1 7T0 7T0P007T1Pi07T0057T15 7T1 7T0P071 7T1P171 7T0ltl Or 7T1ltl l a Solve the linear equations 7m m 5 7 1 0z 7T1 7 Gambler s Ruin Problem 0 At each play a gambler either Wins a unit with prob ability or loses a unite with probability q 1 p Suppose the gambler starts with 239 units what is the probability that the gambler s fortune will reach N before reaching 0 broke 0 Solution Let X be the number of units the gambler has at time t th 0 1 2 is a Markov chain Transition probabilities 1300 PNN 1 HM pBz1 1 pq Denote the probability that starting from 239 units the gambler s fortune will reach N before reaching Oby B 2 01N Condition on the result of the rst game and apply the total probability formula B me qPi l 20 Changing forms 29 QP 29PM qB1 610 Pi l MPH B PM B gm PH Recursion P0 P2 P1 2 g ng HOl 39 i l B PH gPH PM 9 P1 19 p Add up the equations B P11 i 1 i 11 1 Ma 1 We know PN 1 Q 29 1 239 V UIQ BIQ P N i PN 1 17 p NH 1 21 Hence 1 1N ifp 12 In summary Hy 39f 12 Hi W1P2 7 iN if p 12 0 Note ifN gt 00 B 1 l ifpgt 12 0 if g 12 c When p gt 1 2 there is a positive probability that the gambler will Win in nitely many units 0 When p g 12 the gambler will surely go broke with probability 1 if not stop at a nite fortune as suming the opponent is in nitely rich 22 Exponential Distribution 0 De nition Exponential distribution with parameter A x Ae 1 Z 0 0 1 lt 0 o The cdf Fltxfltxdx M x20 Ilt0 0 Mean EX1 o Moment generating function y W EequotX tlt t gt1 EX2 g gq5tt0 2 w moor EX2 EX2 12 0 Properties 1 Memoryless PX gt s t X gt t PX gt s PX gt sth gt t 
PX gt stX gt t PX gt15 PX gt s t PX gt15 e Ms f e At e As PX gt s Example Suppose that the amount of time one spends in a bank is exponentially distributed with mean 10 minutes llO What is the prob ability that a customer will spend more than 15 minutes in the bank What is the probability that a customer will spend more than 15 min utes in the bank given that he is still in the bank after 10 minutes Solution PX gt 15 e W 6 32 022 PX gt15lX gt10 PX gt 5 6 12 0604 Failure rate hazard rate function Mt f t t N 1 Ft PX E tt dtX gt t rtdt For exponential distribution Mt A t gt 0 Failure rate function uniquely determines Ft Ft 1 6 f5 W 2 If X z 12 n are iid exponential RVs With mean l the pdf of 2211 X is Atyl l leX2 Xnt A6 Alton 7 gamma distribution with parameters n and A 3 If X1 and X2 are independent exponential RVs with mean l1 l2 PX1 lt X2 AIEAQ 4 If X 2 12 n are independent exponential RVs with rate m Let Z minX1 Xn and Y maxltX1 X Find distribution of Z and Y Z is an exponential RV with rate 2211 m PZ gt 51 PminX1 Xn gt L gt X2 gt L Xn gt 1 PX1 gt 51PX2 gt x PXn gt x n H e Mfr e ZQQMW Z39l was m lt as 111711 Poisson Process 0 Counting process Stochastic process Ntt Z 0 is a counting process if N t represents the total num ber of events that have occurred up to time t N t 2 0 and are of integer values N t is nondecreasing in t 0 Independent increments the numbers of events oc curred in disjoint time intervals are independent 0 Stationary increments the distribution of the number of events occurred in a time interval only depends on the length of the interval and does not depend on the position o A counting process N tt 2 0 is a Poisson pro cess with rate A A gt 0 if 1 N 0 0 2 The process has independent increments 3 The process has staionary increments and N ts N 3 follows a Poisson distribution with parameter At PNts Ns n 6 2 n 0 1 0 Note ENt 8 Ns At ENt ENt 0 N0 At Interarrival and Waiting Time 0 De ne Tn as the elapsed time between 71 lst and the nth event Tm n 
l 2 is a sequence of inierarrival times 0 Proposition 51 Tn n 12 are independent identically distributed exponential random variables with mean l 0 De ne S as the waiting time for the nth event ie the arrival time of the nth event 11 0 Distribution of Sn At MW 1 71 l gamma distribution with parameters n and A ESn 2211 ET nA 9105 0 Example Suppose that people immigrate into a terri tory at a Poisson rate l per day a What is the expected time until the tenth immigrant arrives b What is the probability that the elapsed time between the tenth and the eleventh arrival exceeds 2 days Solution Time until the 10th immigrant arrives is Sm 13510 lO 10 13ml gt 2 e 2A 0133 Further Properties 0 Consider a Poisson process N t t 2 0 with rate A Each event belongs to two types I and II The type of an event is independent of everything else The probability of being in type I is p 0 Examples female vs male customers good emails vs spams 0 Let N1t be the number of type 1 events up to time t 0 Let N2t be the number of type 11 events up to time t O NOE N105 N205 0 Proposition 52 N1tt 2 0 and N2tt 2 0 are both Poisson processes having respective rates A and 1 p Furthermore the two processes are in dependent Example If immigrants to area A arrive at a Poisson rate of 10 per week and if each immigrant is of En glish descent with probability 1 12 then what is the probability that no people of English descent will im migrate to area A during the month of February Solution The number of English descent immigrants arrived up to time t is N1t which is a Poisson process with mean 12 1012 PltN1lt4gt o eW 0 Conversely Suppose N1tt 2 0 and N2tt 2 0 are independent Poisson processes having respec tive rates 1 and A2 Then Nt N1t N2t is a Poisson process with rate 1 2 For any event occurred with unknown type independent of every thing else the probability of being type I is p and type II is 1 p Example On a road cars pass according to a Poisson process with rate 5 per minute Trucks pass accord ing to 
a Poisson process with rate 1 per minute The two processes are indepdendent If in 3 minutes 10 veicles passed by What is the probability that 2 of them are trucks Solution Each veicle is independently a car with probability g and a truck with probability The probabil ity that 2 out of 10 veicles are trucks is given by the binomial distribution 12 28 Conditional Distribution of Arrival Times 0 Consider a Poisson process N tt 2 0 with rate A Up to t there is exactly one event occurred What is the conditional distribution of T1 0 Under the condition T1 uniformly distributes on 0 t 0 Proof PT1 lt sNt 1 PT1 lt sNt 1 WNW PNs 7 1 N05 Ns 0 PNt1 PNs PNt Ns0 PN t 1 Ase As e Mt s Ate At g Note cdf of a uniform o If N t n what is the joint conditional distribution of the arrival times 31 SQ Sn 0 1 SQ Sn is the ordered statistics of n independent random variables uniformly distributed on 0 t 0 Let Y1 Y2 Yn be n RVs 31l2Yn is the ordered statistics of Y1 Y2 Yn if 300 is the kth smallest value among them 0 If Yi 2 l n are iid continuous RVs with pdf f then the joint density of the ordered statistics Ya Y9 Ym lS fi17Y277Yny17 327 quot7 311 quotlH1fy 21 lt 22 lt lt Km 0 otherwise 0 We can show that 71 TL 0lt31lt32ltsnltt Proof f3132 sn l Nt 71 sh 32 smn POW 71 Ae AslAe MSQ sl Ae Asn sn1 t sn e Mt nl n 7 0lt31ltltsnltt o For n independent uniformly distributed RVs on 0 t Y1 Y 1 m1 22 yn t n Proposition 54 Given Sn t the arrival times 31 SQ SW1 has the distribution of the ordered statis tics of a set n 1 independent uniform 0 t random variables Generalization of Poisson Process 0 Nonhomogeneous Poisson process The counting pro cess N tt 2 0 is said to be a nonhomogeneous Poisson process with intensity function t t 2 0 if 1 N 0 0 2 The process has independent increments 3 The distribution of N t s N t is Poisson with mean given by mt s mt where 0 We call mt mean value function 0 Poisson process is a special case where t A a constant 0 Compound Poisson process A stochastic process X tt 
2 0 is said to be a compound Poisson pro cess if it can be represented as Ne Xt Zn t 2 0 21 where N tt 2 0 is a Poisson process and YM 2 0 is a family of independent and identically distributed random variables which are also indepen dent ofNtt 2 0 o The random variable X t is said to be a compound Poisson random variable 0 Example Suppose customers leave a supermarket in accordance with a Poisson process If E the amount spent by the 2th customer 239 12 are indepen dent and identically distributed then Xt 3 Y1 the total amount of money spent by customers by time t is a compound Poisson process c Find E X 75 and VarX o EXt MEG1 o VarXt tVa7 Y1 a Proof N05 EltXlttgtwNlttgt n Ep wot n HEBMW n m Y2 nEY1 PNt nEXtNt ZPNt nnEY1 Egg 1 nPNt n EYEENt VarXt Nt n gtgtgt Va7 Xt gXm mXam2 Z paw nEltX2lttNltt n ltEltXltt2 Z PNt nnVarY1 n2E2Y1 MEG12 v armmma E2Y1EN2t MEG1 VarOl tE2Y1 t NVWODEK t E Mean Time Spent in Transient States 0 Consider a nite state Markov Chain with a set of transient states T 1 2 t o Gambler s ruin problem states 07 17 N The transient states are l 2 N l T 12N 1 e Let the transition probability matrix be P o A part of P formed by probabilities from transient states to transient states P11 P12 P12 P21 P22 P22 PT P251 1312 PttJ 0 Examples Gamblers ruin problem P PT P00 P01 P10 P11 PN0 PNl PN 11 PN 12 Pow 1 Pow PLN l PLN PNN 1 PNN PLN l P2N 1 PN LN 1 Suppose a four state MC with states 0 1 2 3 has two transient states 0 and 3 Then P00 P10 P20 P30 P P01 P11 P21 P31 P02 P12 P22 P32 P03 P13 P23 P33J PT P00 P03 P30 P33 o For transient states 239 and j 8M expected number of time periods the MC is in state j given that it starts in state 2 Special case 3M starting from 239 the number of time periods in z Transient states fi lt 1 Recall that fi is the prob ability of ever revisit state 239 starting from state 2 De ne fZj the probability that the MC ever visits state j given that it starts in 2 Special case f fl Compute sZj by conditioning If 2 j Sz39j l E visits to 
- If $i \ne j$,
$$s_{ij} = E[\text{visits to } j \text{ after the initial state}]. \tag{2}$$
- Note that
$$E[\text{visits to } j \text{ after the initial state}] = \sum_{k} P(\text{the first transition is to state } k) \times E[\text{visits to } j \text{ starting from } k] = \sum_{k} P_{ik}\, s_{kj}.$$
- If $k$ is recurrent, then $P^n_{kj} = 0$ for all $n$, and hence $s_{kj} = 0$. To see that $P^n_{kj} = 0$ for all $n$, suppose $P^n_{kj} > 0$ for some $n$. Then $P^m_{jk} = 0$ for all $m$; otherwise $k$ and $j$ would communicate, contradicting the fact that $j$ is transient. But since $P^m_{jk} = 0$ for all $m$, with positive probability state $k$ transits to state $j$ and never comes back to $k$, contradicting the fact that $k$ is recurrent ($k$ must be revisited infinitely many times). Hence $P^n_{kj} > 0$ for some $n$ is impossible.
- The sum can therefore be restricted to the transient states:
$$E[\text{visits to } j \text{ after the initial state}] = \sum_{k=1}^{t} P_{ik}\, s_{kj}.$$
- Combining (1) and (2) into a unified form,
$$s_{ij} = \delta_{ij} + \sum_{k=1}^{t} P_{ik}\, s_{kj} \tag{3}$$
for all $i, j \in \{1, 2, \ldots, t\}$, where $\delta_{ij} = 1$ if $i = j$ and $0$ otherwise.
- In matrix notation, with
$$S = \begin{pmatrix} s_{11} & s_{12} & \cdots & s_{1t} \\ s_{21} & s_{22} & \cdots & s_{2t} \\ \vdots & & & \vdots \\ s_{t1} & s_{t2} & \cdots & s_{tt} \end{pmatrix},$$
equation (3) is equivalent to
$$S = I + P_T S, \qquad \text{i.e.,} \qquad S = (I - P_T)^{-1}.$$
- To obtain $f_{ij}$, the probability of ever transiting to $j$ starting in $i$, compute the expectation by conditioning:
$$s_{ij} = P(\text{ever transit to } j \mid \text{start in } i) \times E[\text{time spent in } j \mid \text{start in } i \text{ and transit to } j] + P(\text{never transit to } j \mid \text{start in } i) \times E[\text{time spent in } j \mid \text{start in } i, \text{ never transit to } j]$$
$$= f_{ij}\,(\delta_{ij} + s_{jj}) + (1 - f_{ij})\,\delta_{ij} = \delta_{ij} + f_{ij}\, s_{jj}.$$
Hence
$$f_{ij} = \frac{s_{ij} - \delta_{ij}}{s_{jj}}. \tag{4}$$
- Example: Gambler's ruin problem with $p = 0.4$, $N = 7$, starting with 3 units. Find (a) the expected amount of time the gambler has 5 units, and 2 units; (b) the probability that the gambler ever has a fortune of 1.
- Solution: The states are $0, 1, \ldots, 7$, and the transient states are $1, 2, \ldots, 6$.
$$P_T = \begin{pmatrix} 0 & 0.4 & 0 & 0 & 0 & 0 \\ 0.6 & 0 & 0.4 & 0 & 0 & 0 \\ 0 & 0.6 & 0 & 0.4 & 0 & 0 \\ 0 & 0 & 0.6 & 0 & 0.4 & 0 \\ 0 & 0 & 0 & 0.6 & 0 & 0.4 \\ 0 & 0 & 0 & 0 & 0.6 & 0 \end{pmatrix}.$$
Solving $S = (I - P_T)^{-1}$ gives $s_{3,5} = 0.9228$ and $s_{3,2} = 2.3677$: the expected number of time periods in which the gambler has 5 units is 0.9228, and 2 units is 2.3677. By (4), $f_{3,1} = s_{3,1}/s_{1,1} = 0.8797$, so the probability that the gambler ever has a fortune of 1 is 0.8797.
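The numbers in the gambler's ruin example can be reproduced by computing $S = (I - P_T)^{-1}$ directly. A pure-Python sketch (the `invert` helper is written inline for self-containment; any linear-algebra routine would do the same job):

```python
def invert(a):
    """Invert a square matrix (list of row lists) by Gauss-Jordan elimination."""
    n = len(a)
    # augment with the identity matrix
    aug = [row[:] + [float(i == j) for j in range(n)] for i, row in enumerate(a)]
    for col in range(n):
        # partial pivoting: bring the largest entry in this column to the diagonal
        piv = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[piv] = aug[piv], aug[col]
        d = aug[col][col]
        aug[col] = [x / d for x in aug[col]]
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [x - f * y for x, y in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

p, q = 0.4, 0.6                     # win / lose probabilities
t = 6                               # transient states 1..6 of the N = 7 chain
PT = [[0.0] * t for _ in range(t)]
for i in range(t):
    if i + 1 < t:
        PT[i][i + 1] = p            # move up one unit
    if i - 1 >= 0:
        PT[i][i - 1] = q            # move down one unit

S = invert([[float(i == j) - PT[i][j] for j in range(t)] for i in range(t)])

# S[i][j] holds s_{i+1, j+1}; the gambler starts with 3 units (row index 2)
print(round(S[2][4], 4))            # s_{3,5} = 0.9228
print(round(S[2][1], 4))            # s_{3,2} = 2.3677
print(round(S[2][0] / S[0][0], 4))  # f_{3,1} = s_{3,1}/s_{1,1} = 0.8797
```

Note that transitions leaving the transient set (state 1 down to 0, state 6 up to 7) are simply absent from $P_T$, which is what makes $I - P_T$ invertible.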
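Returning to the compound Poisson process, the moment formulas $E[X(t)] = \lambda t\, E[Y_1]$ and $\mathrm{Var}(X(t)) = \lambda t\, E[Y_1^2]$ can be sanity-checked by simulation. A minimal sketch; the rate $\lambda = 2$, horizon $t = 5$, and uniform jump distribution are arbitrary choices for illustration:

```python
import random
import statistics

def compound_poisson_sample(lam, t, sample_y):
    """One draw of X(t) = sum_{i=1}^{N(t)} Y_i for a rate-lam Poisson process."""
    total, clock = 0.0, 0.0
    while True:
        clock += random.expovariate(lam)  # exponential interarrival times
        if clock > t:
            return total
        total += sample_y()

random.seed(1)
lam, t = 2.0, 5.0
# Y_i uniform on [0, 10]: E[Y_1] = 5, E[Y_1^2] = 100/3
draws = [compound_poisson_sample(lam, t, lambda: random.uniform(0, 10))
         for _ in range(20000)]

mean_est = statistics.fmean(draws)
var_est = statistics.pvariance(draws)
print(mean_est)  # theory: lam * t * E[Y_1] = 50
print(var_est)   # theory: lam * t * E[Y_1^2] = 1000/3
```

With 20,000 draws the estimates land close to the theoretical values of 50 and 333.3; note the variance formula uses the second moment $E[Y_1^2]$, not $\mathrm{Var}(Y_1)$.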
