# 382 Class Note for STAT 416 at PSU

These 22-page class notes were uploaded by an elite notetaker on Friday, February 6, 2015. They belong to a course at Pennsylvania State University taught by a professor in the Fall. Since their upload, they have received 22 views.



## Markov Chains

- **Stochastic process** (discrete time): $X_1, X_2, \dots$
- **Markov chain:** a discrete-time stochastic process with discrete state space $X_n \in \{0, 1, 2, \dots\}$ satisfying the Markovian property
  $$P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i) = P_{ij}.$$
  $P_{ij}$ is the transition probability: the probability of making a transition from $i$ to $j$.
- **Transition probability matrix:**
  $$P = \begin{pmatrix} P_{00} & P_{01} & P_{02} & \cdots \\ P_{10} & P_{11} & P_{12} & \cdots \\ \vdots & \vdots & \vdots & \end{pmatrix}$$

### Example

- Suppose whether it will rain tomorrow depends on past weather conditions only through whether it rains today. Consider the stochastic process $X_n$, $n = 1, 2, \dots$:
  $$X_n = \begin{cases} 0 & \text{rain on day } n \\ 1 & \text{no rain on day } n \end{cases}$$
  $$P(X_{n+1} \mid X_n, X_{n-1}, \dots, X_1) = P(X_{n+1} \mid X_n)$$
- State space: $\{0, 1\}$
- Transition matrix: $\begin{pmatrix} P_{00} & P_{01} \\ P_{10} & P_{11} \end{pmatrix}$
- $P_{00} = P(\text{tomorrow rain} \mid \text{today rain}) = \alpha$; then $P_{01} = 1 - \alpha$.
- $P_{10} = P(\text{tomorrow rain} \mid \text{today no rain}) = \beta$; then $P_{11} = 1 - \beta$.
- Transition matrix:
  $$P = \begin{pmatrix} \alpha & 1 - \alpha \\ \beta & 1 - \beta \end{pmatrix}$$

### Transforming into a Markov Chain

- Suppose whether it will rain tomorrow depends on whether it rained today **and** yesterday:
  $$P(X_{n+1} \mid X_n, X_{n-1}, \dots, X_1) = P(X_{n+1} \mid X_n, X_{n-1}).$$
  The process is not a first-order Markov chain.
- Define
  $$Y_n = \begin{cases} 0 & X_n = 0,\ X_{n-1} = 0 \quad (RR) \\ 1 & X_n = 0,\ X_{n-1} = 1 \quad (NR) \\ 2 & X_n = 1,\ X_{n-1} = 0 \quad (RN) \\ 3 & X_n = 1,\ X_{n-1} = 1 \quad (NN) \end{cases}$$
- Some transitions are impossible, for example
  $$P_{0,1} = P(Y_{n+1} = 1 \mid Y_n = 0) = P(X_{n+1} = 0, X_n = 1 \mid X_n = 0, X_{n-1} = 0) = 0,$$
  and likewise $P_{0,3} = P_{1,1} = P_{1,3} = P_{2,0} = P_{2,2} = P_{3,0} = P_{3,2} = 0$.
- Transition matrix:
  $$\begin{pmatrix} P_{0,0} & 0 & 1 - P_{0,0} & 0 \\ P_{1,0} & 0 & 1 - P_{1,0} & 0 \\ 0 & P_{2,1} & 0 & 1 - P_{2,1} \\ 0 & P_{3,1} & 0 & 1 - P_{3,1} \end{pmatrix}$$
- The Markov chain is specified by:
  1. $P_{0,0} = P(\text{tomorrow will rain} \mid \text{today rain, yesterday rain})$
  2. $P_{1,0} = P(\text{tomorrow will rain} \mid \text{today rain, yesterday no rain})$
  3. $P_{2,1} = P(\text{tomorrow will rain} \mid \text{today no rain, yesterday rain})$
  4. $P_{3,1} = P(\text{tomorrow will rain} \mid \text{today no rain, yesterday no rain})$

### Chapman-Kolmogorov Equations

- $n$-step transition probability: $P^n_{ij} = P(X_{n+m} = j \mid X_m = i)$, $n \ge 0$.
- Chapman-Kolmogorov equations:
  $$P^{n+m}_{ij} = \sum_{k=0}^{\infty} P^n_{ik} P^m_{kj}, \qquad n, m \ge 0, \text{ for all } i, j.$$
- Proof, by the total probability formula:
  $$P^{n+m}_{ij} = P(X_{n+m} = j \mid X_0 = i) = \sum_{k=0}^{\infty} P(X_{n+m} = j \mid X_n = k, X_0 = i)\, P(X_n = k \mid X_0 = i) = \sum_{k=0}^{\infty} P^m_{kj}\, P^n_{ik}.$$
- $n$-step transition matrix:
  $$P^{(n)} = \begin{pmatrix} P^n_{00} & P^n_{01} & P^n_{02} & \cdots \\ P^n_{10} & P^n_{11} & P^n_{12} & \cdots \\ \vdots & \vdots & \vdots & \end{pmatrix}$$
- Chapman-Kolmogorov equations in matrix form: $P^{(n+m)} = P^{(n)} P^{(m)}$, hence $P^{(n)} = P^n$.
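The matrix form of the Chapman-Kolmogorov equations can be sketched in plain Python for the two-state weather chain: squaring $P$ gives the two-step transition matrix. The numeric values of $\alpha$ and $\beta$ below are illustrative choices (they happen to match the numeric example that follows), not part of the general derivation:

```python
# Two-state weather chain: state 0 = rain, state 1 = no rain.
# alpha, beta are illustrative values for P(rain tomorrow | rain today)
# and P(rain tomorrow | no rain today).
alpha, beta = 0.7, 0.4

P = [[alpha, 1 - alpha],
     [beta,  1 - beta]]

def matmul(A, B):
    """Multiply two square matrices stored as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Chapman-Kolmogorov in matrix form: P^(2) = P * P.
P2 = matmul(P, P)
print(P2[0][0])  # P(rain two days ahead | rain today), approx. 0.61
```

Each row of `P2` still sums to 1, as it must for a transition matrix.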
### Weather Example

Let
$$P = \begin{pmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{pmatrix}.$$
Find $P(\text{rain on Tuesday} \mid \text{rain on Sunday})$ and $P(\text{rain on Tuesday and rain on Wednesday} \mid \text{rain on Sunday})$.

**Solution.** $P(\text{rain on Tuesday} \mid \text{rain on Sunday}) = P^2_{0,0}$:
$$P^{(2)} = P \cdot P = \begin{pmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{pmatrix}\begin{pmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{pmatrix} = \begin{pmatrix} 0.61 & 0.39 \\ 0.52 & 0.48 \end{pmatrix},$$
so $P(\text{rain on Tuesday} \mid \text{rain on Sunday}) = 0.61$. Next,
$$P(\text{rain Tue, rain Wed} \mid \text{rain Sun}) = P(X_{n+1} = 0, X_n = 0 \mid X_{n-2} = 0) = P(X_n = 0 \mid X_{n-2} = 0)\,P(X_{n+1} = 0 \mid X_n = 0) = P^2_{0,0}\,P_{0,0} = 0.61 \times 0.7 = 0.427.$$

## Classification of States

- **Accessible.** State $j$ is accessible from state $i$ if $P^n_{ij} > 0$ for some $n \ge 0$, written $i \to j$. Equivalent to $P(\text{ever enter } j \mid \text{start in } i) > 0$:
  $$P(\text{ever enter } j \mid \text{start in } i) = P\Big(\bigcup_{n=0}^{\infty} \{X_n = j\} \,\Big|\, X_0 = i\Big) \le \sum_{n=0}^{\infty} P(X_n = j \mid X_0 = i) = \sum_{n=0}^{\infty} P^n_{ij}.$$
  Hence if $P^n_{ij} = 0$ for all $n$, then $P(\text{ever enter } j \mid \text{start in } i) = 0$. On the other hand,
  $$P(\text{ever enter } j \mid \text{start in } i) \ge P(X_n = j \mid X_0 = i) = P^n_{ij} \quad \text{for any } n,$$
  so if $P^n_{ij} > 0$ for some $n$, then $P(\text{ever enter } j \mid \text{start in } i) \ge P^n_{ij} > 0$.
- **Communicate.** States $i$ and $j$ communicate if they are accessible from each other, written $i \leftrightarrow j$. Properties:
  1. $P^0_{ii} = P(X_0 = i \mid X_0 = i) = 1$: any state $i$ communicates with itself.
  2. If $i \leftrightarrow j$, then $j \leftrightarrow i$.
  3. If $i \leftrightarrow j$ and $j \leftrightarrow k$, then $i \leftrightarrow k$.

  Proof of 3: $i \leftrightarrow j$ gives $P^n_{ij} > 0$ and $P^{n'}_{ji} > 0$ for some $n, n'$; $j \leftrightarrow k$ gives $P^m_{jk} > 0$ and $P^{m'}_{kj} > 0$ for some $m, m'$. By the Chapman-Kolmogorov equations,
  $$P^{n+m}_{ik} = \sum_{r=0}^{\infty} P^n_{ir} P^m_{rk} \ge P^n_{ij} P^m_{jk} > 0.$$
  Similarly we can show $P^{m'+n'}_{ki} > 0$. Hence $i \leftrightarrow k$.
- **Class.** Two states that communicate are said to be in the same class. A class is a subset of states that communicate with each other. Different classes do NOT overlap: classes form a partition of the states.
- **Irreducible.** A Markov chain is irreducible if there is only one class.

### Examples

- Consider the Markov chain with transition probability matrix
  $$P = \begin{pmatrix} \tfrac12 & \tfrac12 & 0 \\ \tfrac12 & \tfrac14 & \tfrac14 \\ 0 & \tfrac13 & \tfrac23 \end{pmatrix}.$$
  The MC is irreducible.
- MC with transition probability matrix
  $$P = \begin{pmatrix} \tfrac12 & \tfrac12 & 0 & 0 \\ \tfrac12 & \tfrac12 & 0 & 0 \\ \tfrac14 & \tfrac14 & \tfrac14 & \tfrac14 \\ 0 & 0 & 0 & 1 \end{pmatrix}.$$
  Three classes: $\{0, 1\}$, $\{2\}$, $\{3\}$.
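The class structure can be computed mechanically: $j$ is accessible from $i$ exactly when $j$ is reachable in the directed graph with an edge $i \to j$ whenever $P_{ij} > 0$, and $i \leftrightarrow j$ when each is reachable from the other. A minimal sketch; the 4-state matrix at the end is an illustrative chain with classes $\{0,1\}$, $\{2\}$, $\{3\}$, chosen for this example rather than taken from the notes:

```python
def reachable(P, i):
    """Set of states accessible from i (including i itself, n = 0 steps)."""
    seen, stack = {i}, [i]
    while stack:
        u = stack.pop()
        for v, prob in enumerate(P[u]):
            if prob > 0 and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def classes(P):
    """Partition the states of a finite chain into communicating classes."""
    n = len(P)
    reach = [reachable(P, i) for i in range(n)]
    out, assigned = [], set()
    for i in range(n):
        if i in assigned:
            continue
        # i and j communicate iff each is reachable from the other.
        cls = {j for j in range(n) if j in reach[i] and i in reach[j]}
        out.append(sorted(cls))
        assigned |= cls
    return out

# Hypothetical 4-state chain: {0,1} closed, 2 leaks out, 3 absorbing.
P = [[0.5, 0.5, 0.0, 0.0],
     [0.5, 0.5, 0.0, 0.0],
     [0.25, 0.25, 0.25, 0.25],
     [0.0, 0.0, 0.0, 1.0]]
print(classes(P))  # [[0, 1], [2], [3]]
```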
## Recurrent and Transient States

- $f_i$: the probability that, starting in state $i$, the MC will ever re-enter state $i$.
- **Recurrent.** If $f_i = 1$, state $i$ is recurrent. A recurrent state will be visited infinitely many times by the process starting from $i$.
- **Transient.** If $f_i < 1$, state $i$ is transient. Starting from $i$, the probability that the MC is in state $i$ for exactly $n$ times (including the starting state) is
  $$f_i^{\,n-1}(1 - f_i), \qquad n = 1, 2, \dots$$
  This is a geometric distribution with parameter $1 - f_i$. The expected number of times spent in state $i$ is $1/(1 - f_i)$.
- A state is recurrent if and only if the expected number of time periods that the process is in state $i$, starting from state $i$, is infinite:
  $$\text{Recurrent} \iff E[\text{number of visits to } i \mid X_0 = i] = \infty$$
  $$\text{Transient} \iff E[\text{number of visits to } i \mid X_0 = i] < \infty$$
- Compute $E[\text{number of visits to } i \mid X_0 = i]$. Define
  $$I_n = \begin{cases} 1 & \text{if } X_n = i \\ 0 & \text{if } X_n \ne i \end{cases}$$
  Then the number of visits to $i$ is $\sum_{n=0}^{\infty} I_n$, and
  $$E[\text{number of visits to } i \mid X_0 = i] = \sum_{n=0}^{\infty} E[I_n \mid X_0 = i] = \sum_{n=0}^{\infty} P(X_n = i \mid X_0 = i) = \sum_{n=0}^{\infty} P^n_{ii}.$$
- **Proposition 4.1.** State $i$ is recurrent if $\sum_{n=1}^{\infty} P^n_{ii} = \infty$ and transient if $\sum_{n=1}^{\infty} P^n_{ii} < \infty$.
- **Corollary 4.2.** If state $i$ is recurrent and state $i$ communicates with state $j$, then state $j$ is recurrent.
- **Corollary 4.3.** A finite-state Markov chain cannot have all transient states. For any irreducible, finite-state Markov chain, all states are recurrent.
- Consider a MC with
  $$P = \begin{pmatrix} 0 & 0 & \tfrac12 & \tfrac12 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix}.$$
  The MC is irreducible and finite-state; hence all states are recurrent.
- Consider a MC with
  $$P = \begin{pmatrix} \tfrac12 & \tfrac12 & 0 & 0 & 0 \\ \tfrac12 & \tfrac12 & 0 & 0 & 0 \\ 0 & 0 & \tfrac12 & \tfrac12 & 0 \\ 0 & 0 & \tfrac12 & \tfrac12 & 0 \\ \tfrac14 & \tfrac14 & 0 & 0 & \tfrac12 \end{pmatrix}.$$
  Three classes: $\{0, 1\}$, $\{2, 3\}$, $\{4\}$. States 0, 1, 2, 3 are recurrent and state 4 is transient.

## Random Walk

- A Markov chain with state space $i = 0, \pm 1, \pm 2, \dots$
- Transition probability $P_{i,i+1} = p = 1 - P_{i,i-1}$: at every step, move either one step forward or one step backward.
- Example: a gambler either wins a dollar or loses a dollar at every game; $X_n$ is the number of dollars he has when starting the $n$th game.
- For any $i < j$: $P^{j-i}_{ij} \ge p^{\,j-i} > 0$ and $P^{j-i}_{ji} \ge (1-p)^{\,j-i} > 0$. The MC is irreducible.
- Hence either all the states are transient or all the states are recurrent.
- Under which condition are the states transient or recurrent? Consider state 0:
  $$\sum_{n=1}^{\infty} P^n_{00} \begin{cases} = \infty & \text{recurrent} \\ < \infty & \text{transient} \end{cases}$$
  Only for even $n$ is $P^n_{00} > 0$:
  $$P^{2n}_{00} = \binom{2n}{n} p^n (1-p)^n = \frac{(2n)!}{n!\,n!}\, p^n (1-p)^n.$$
  By Stirling's approximation, $n! \sim n^{n+1/2}\, e^{-n} \sqrt{2\pi}$, so
  $$P^{2n}_{00} \sim \frac{\big(4p(1-p)\big)^n}{\sqrt{\pi n}}.$$
  When $p = 1/2$: $4p(1-p) = 1$ and $P^{2n}_{00} \sim 1/\sqrt{\pi n}$; the summation diverges. Hence all the states are recurrent. When $p \ne 1/2$: $4p(1-p) < 1$, so $\sum_n P^{2n}_{00}$ converges, and all the states are transient.
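The divergence-versus-convergence behavior of $\sum_n P^{2n}_{00}$ can be seen numerically. The sketch below accumulates the exact terms $\binom{2n}{n} p^n (1-p)^n$ using the ratio of consecutive binomial coefficients (so no huge factorials are ever formed): for $p = 1/2$ the partial sums keep growing like $2\sqrt{n/\pi}$, while for $p \ne 1/2$ they approach a finite limit (for $p = 0.6$, the generating function $\sum_{n \ge 0} \binom{2n}{n} x^n = 1/\sqrt{1-4x}$ gives the limit $1/\sqrt{0.04} - 1 = 4$):

```python
# Partial sums of sum_{n>=1} P^{2n}_{00} = sum_{n>=1} C(2n, n) p^n (1-p)^n
# for the simple random walk, using the term ratio
# C(2n+2, n+1) / C(2n, n) = 2(2n+1)/(n+1) to stay within float range.

def partial_sum(p, terms):
    """Sum of P^{2n}_{00} for n = 1 .. terms."""
    x = p * (1 - p)
    term = 2 * x            # n = 1: C(2, 1) * x = 2x
    total = term
    for n in range(1, terms):
        term *= 2 * (2 * n + 1) / (n + 1) * x
        total += term
    return total

print(partial_sum(0.5, 10_000))   # keeps growing without bound (recurrent)
print(partial_sum(0.6, 10_000))   # approx. 4.0, a finite limit (transient)
```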
## Limiting Probabilities

- Weather example:
  $$P = \begin{pmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{pmatrix}, \qquad P^{(4)} = \begin{pmatrix} 0.5749 & 0.4251 \\ 0.5668 & 0.4332 \end{pmatrix}, \qquad P^{(8)} = P^{(4)} P^{(4)} = \begin{pmatrix} 0.572 & 0.428 \\ 0.570 & 0.430 \end{pmatrix}.$$
- $P^{(4)}$ and $P^{(8)}$ are close, and the rows in $P^{(8)}$ are close to each other: the chain has limiting probabilities.
- **Period $d$.** For state $i$, if $P^n_{ii} = 0$ whenever $n$ is not divisible by $d$, and $d$ is the largest integer with this property, then $d$ is the period of state $i$. The period $d$ is the greatest common divisor of all the $n$ such that $P^n_{ii} > 0$.
- **Aperiodic.** State $i$ is aperiodic if its period is 1.
- **Positive recurrent.** A state $i$ is positive recurrent if it is recurrent and the expected time until the process returns to state $i$ is finite.
  - If $i \leftrightarrow j$ and $i$ is positive recurrent, then $j$ is positive recurrent.
  - For a finite-state MC, a recurrent state is also positive recurrent. A finite-state irreducible MC contains all positive recurrent states.
- **Ergodic.** A positive recurrent and aperiodic state is an ergodic state. A Markov chain is ergodic if all its states are ergodic.
- **Theorem 4.1.** For an irreducible ergodic Markov chain, $\lim_{n\to\infty} P^n_{ij}$ exists and is independent of $i$. Let $\pi_j = \lim_{n\to\infty} P^n_{ij}$, $j \ge 0$; then $(\pi_j)$ is the unique nonnegative solution of
  $$\pi_j = \sum_{i=0}^{\infty} \pi_i P_{ij}, \quad j \ge 0, \qquad \sum_{j=0}^{\infty} \pi_j = 1.$$
- The weather example: with
  $$P = \begin{pmatrix} \alpha & 1 - \alpha \\ \beta & 1 - \beta \end{pmatrix},$$
  the MC is irreducible and ergodic. The equations are $\pi_0 + \pi_1 = 1$ and
  $$\pi_0 = \pi_0 P_{00} + \pi_1 P_{10} = \alpha\pi_0 + \beta\pi_1, \qquad \pi_1 = \pi_0 P_{01} + \pi_1 P_{11} = (1-\alpha)\pi_0 + (1-\beta)\pi_1.$$
  Solving the linear equations:
  $$\pi_0 = \frac{\beta}{1 - \alpha + \beta}, \qquad \pi_1 = \frac{1 - \alpha}{1 - \alpha + \beta}.$$

## Gambler's Ruin Problem

- At each play, a gambler either wins a unit with probability $p$ or loses a unit with probability $q = 1 - p$. Suppose the gambler starts with $i$ units. What is the probability that the gambler's fortune will reach $N$ before reaching 0 (going broke)?
- **Solution.** Let $X_t$ be the number of units the gambler has at time $t$; $\{X_t,\ t = 0, 1, 2, \dots\}$ is a Markov chain with transition probabilities
  $$P_{00} = P_{NN} = 1, \qquad P_{i,i+1} = p = 1 - P_{i,i-1}, \quad i = 1, \dots, N-1.$$
  Denote by $P_i$ the probability that, starting from $i$ units, the gambler's fortune will reach $N$ before reaching 0, $i = 0, 1, \dots, N$. Condition on the result of the first game and apply the total probability formula:
  $$P_i = p\,P_{i+1} + q\,P_{i-1}.$$
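Theorem 4.1 can be checked numerically on the weather chain: a small sketch solving the two-state stationary equations in closed form and comparing against the distribution after many steps (the 0.7/0.4 entries are the values from the weather example above):

```python
# Stationary distribution of the two-state weather chain from the example.
P = [[0.7, 0.3],
     [0.4, 0.6]]

# Closed form for a chain [[a, 1-a], [b, 1-b]]:
# pi_0 = b / (1 - a + b), pi_1 = (1 - a) / (1 - a + b).
a, b = P[0][0], P[1][0]
pi0 = b / (1 - a + b)
pi1 = (1 - a) / (1 - a + b)

# Power iteration: the distribution after n steps, started in state 0
# (rain), converges to (pi_0, pi_1) regardless of the starting state.
row = [1.0, 0.0]
for _ in range(50):
    row = [row[0] * P[0][0] + row[1] * P[1][0],
           row[0] * P[0][1] + row[1] * P[1][1]]

print(pi0, pi1)   # approx. 0.571 and 0.429: long-run fractions of rain / no rain
```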
- Changing forms: since $p + q = 1$,
  $$(p + q)P_i = p\,P_{i+1} + q\,P_{i-1} \implies P_{i+1} - P_i = \frac{q}{p}\,(P_i - P_{i-1}).$$
- Recursion, with $P_0 = 0$:
  $$P_2 - P_1 = \frac{q}{p}(P_1 - P_0) = \frac{q}{p}\,P_1$$
  $$P_3 - P_2 = \frac{q}{p}(P_2 - P_1) = \left(\frac{q}{p}\right)^2 P_1$$
  $$\vdots$$
  $$P_{i+1} - P_i = \frac{q}{p}(P_i - P_{i-1}) = \left(\frac{q}{p}\right)^i P_1$$
- Add up the equations:
  $$P_{i+1} - P_1 = P_1\left[\frac{q}{p} + \left(\frac{q}{p}\right)^2 + \cdots + \left(\frac{q}{p}\right)^i\right],$$
  so
  $$P_i = \begin{cases} \dfrac{1 - (q/p)^i}{1 - q/p}\,P_1 & \text{if } \dfrac{q}{p} \ne 1 \\[2ex] i\,P_1 & \text{if } \dfrac{q}{p} = 1 \ (p = \tfrac12) \end{cases}$$
- We know $P_N = 1$; hence
  $$P_1 = \begin{cases} \dfrac{1 - q/p}{1 - (q/p)^N} & \text{if } p \ne \tfrac12 \\[2ex] \dfrac{1}{N} & \text{if } p = \tfrac12 \end{cases}$$
- In summary:
  $$P_i = \begin{cases} \dfrac{1 - (q/p)^i}{1 - (q/p)^N} & \text{if } p \ne \tfrac12 \\[2ex] \dfrac{i}{N} & \text{if } p = \tfrac12 \end{cases}$$
- Note: if $N \to \infty$,
  $$P_i \to \begin{cases} 1 - \left(\dfrac{q}{p}\right)^i & \text{if } p > \tfrac12 \\[2ex] 0 & \text{if } p \le \tfrac12 \end{cases}$$
- When $p > 1/2$, there is a positive probability that the gambler will win infinitely many units.
- When $p \le 1/2$, the gambler will surely go broke (with probability 1) if he does not stop at a finite fortune, assuming the opponent is infinitely rich.
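The closed form for $P_i$ can be checked against a direct numerical solve of the recursion $P_i = p\,P_{i+1} + q\,P_{i-1}$ with boundary values $P_0 = 0$ and $P_N = 1$. A sketch (the sweep count is an arbitrary choice, large enough for convergence at these sizes):

```python
def win_prob(i, N, p):
    """Closed form: probability of reaching N before 0 from i units."""
    if p == 0.5:
        return i / N
    r = (1 - p) / p                       # q/p
    return (1 - r ** i) / (1 - r ** N)

def win_prob_iterative(i, N, p, sweeps=20_000):
    """Repeatedly apply P_j = p*P_{j+1} + q*P_{j-1} until it stabilizes."""
    q = 1 - p
    P = [0.0] * (N + 1)                   # P[0] = 0 (broke) stays fixed
    P[N] = 1.0                            # P[N] = 1 (goal) stays fixed
    for _ in range(sweeps):
        for j in range(1, N):
            P[j] = p * P[j + 1] + q * P[j - 1]
    return P[i]

print(win_prob(3, 10, 0.5))               # 0.3 = i/N for a fair game
print(win_prob(3, 10, 0.6))               # approx. 0.716 for a favorable game
```

The agreement between the two functions is a useful sanity check on the algebra of the derivation above.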
