Stochastic Processes STAT 150
This 73-page set of class notes was uploaded by Floy Kub on Thursday, October 22, 2015. The notes are for STAT 150 (Stochastic Processes) at the University of California, Berkeley, taught by Staff in Fall.
1 Stationary distributions and the limit theorem

Definition 1.1. The vector $\pi$ is called a stationary distribution of a Markov chain with matrix of transition probabilities $P$ if $\pi$ has entries $(\pi_j : j \in S)$ such that
(a) $\pi_j \ge 0$ for all $j$, and $\sum_j \pi_j = 1$, and
(b) $\pi = \pi P$, which is to say that $\pi_j = \sum_i \pi_i p_{ij}$ for all $j$ (the balance equations).

Note: this implies that $\pi = \pi P^n$ for all $n \ge 0$; e.g., if $X_0$ has distribution $\pi$, then $X_n$ has distribution $\pi$ for all $n$.

Proposition 1.2. An irreducible chain has a stationary distribution $\pi$ if and only if all the states are non-null persistent; in this case, $\pi$ is the unique stationary distribution and is given by $\pi_i = 1/\mu_i$ for each $i \in S$, where $\mu_i$ is the mean recurrence time of $i$.

We will carry out the proof of this in several steps. Fix a state $k$ and let $\rho_i(k)$ be the mean number of visits of the chain to the state $i$ between two successive visits to state $k$:
$$\rho_i(k) = E(N_i \mid X_0 = k), \qquad \text{where } N_i = \sum_{n=1}^{\infty} 1\{X_n = i,\ T_k \ge n\}$$
and $T_k$ is the time of the first return to state $k$. We write $\rho(k)$ for the vector $(\rho_i(k) : i \in S)$. Clearly $T_k = \sum_{i \in S} N_i$, and hence
$$\mu_k = E(T_k \mid X_0 = k) = \sum_{i \in S} \rho_i(k).$$

Lemma 1.3. For any state $k$ of an irreducible persistent chain, the vector $\rho(k)$ satisfies $\rho_i(k) < \infty$ for all $i$, and furthermore $\rho(k) = \rho(k) P$.

Proof. We show first that $\rho_i(k) < \infty$ when $i \ne k$. Observe that $\rho_k(k) = 1$. We write $l_{ki}(n) = P(X_n = i,\ T_k \ge n \mid X_0 = k)$, so that $\rho_i(k) = \sum_{n \ge 1} l_{ki}(n)$. Clearly $f_{kk}(m+n) \ge l_{ki}(m) f_{ik}(n)$ for $m, n \ge 1$. By irreducibility of the chain, there exists $n$ such that $f_{ik}(n) > 0$. For such an $n$,
$$\rho_i(k) = \sum_{m=1}^{\infty} l_{ki}(m) \le \frac{1}{f_{ik}(n)} \sum_{m=1}^{\infty} f_{kk}(m+n) \le \frac{1}{f_{ik}(n)} < \infty,$$
as required. Next observe that $l_{ki}(1) = p_{ki}$, and for $n \ge 2$,
$$l_{ki}(n) = \sum_{j : j \ne k} P(X_n = i,\ X_{n-1} = j,\ T_k \ge n \mid X_0 = k) = \sum_{j : j \ne k} l_{kj}(n-1)\, p_{ji}.$$
Summing over $n \ge 2$, we obtain
$$\rho_i(k) = p_{ki} + \sum_{j : j \ne k} \sum_{n \ge 2} l_{kj}(n-1)\, p_{ji} = \rho_k(k)\, p_{ki} + \sum_{j : j \ne k} \rho_j(k)\, p_{ji} = \sum_j \rho_j(k)\, p_{ji},$$
since $\rho_k(k) = 1$. $\square$

We have shown that for any irreducible persistent chain, the vector $\rho(k)$ satisfies $\rho(k) = \rho(k) P$, and furthermore that the components of $\rho(k)$ are non-negative with sum $\mu_k$. Hence, if $\mu_k < \infty$, the vector $\pi$ with entries $\pi_i = \rho_i(k)/\mu_k$ is a distribution satisfying $\pi = \pi P$. Therefore every non-null persistent irreducible chain has a stationary distribution.

Proposition 1.4. For any irreducible persistent Markov chain with matrix of transition probabilities $P$, there exists
a positive solution $x$ to the equation $x = xP$, which is unique up to a multiplicative constant. The chain is non-null if $\sum_i x_i < \infty$ and null if $\sum_i x_i = \infty$.

We've seen all of this except for the uniqueness claim, which we won't establish, although it isn't difficult.

Proof of Proposition 1.2. Suppose that $\pi$ is a stationary distribution of the chain. If all states are transient, then $p_{ij}(n) \to 0$ as $n \to \infty$ for all $i$ and $j$. So
$$\pi_j = \sum_i \pi_i\, p_{ij}(n) \to 0 \quad \text{as } n \to \infty, \text{ for all } j$$
(by bounded convergence), which contradicts $\sum_j \pi_j = 1$. Hence all states are persistent.

We show next that the existence of $\pi$ implies that all states are non-null and that $\pi_i = 1/\mu_i$ for each $i$. Suppose that $X_0$ has distribution $\pi$, so that $P(X_0 = i) = \pi_i$ for each $i$. Then
$$\pi_j \mu_j = P(X_0 = j)\, E(T_j \mid X_0 = j) = \sum_{n=1}^{\infty} P(T_j \ge n,\ X_0 = j).$$
However, $P(T_j \ge 1,\ X_0 = j) = P(X_0 = j)$, and for $n \ge 2$,
$$\begin{aligned}
P(T_j \ge n,\ X_0 = j) &= P(X_0 = j,\ X_m \ne j \text{ for } 1 \le m \le n-1) \\
&= P(X_m \ne j \text{ for } 1 \le m \le n-1) - P(X_m \ne j \text{ for } 0 \le m \le n-1) \\
&= P(X_m \ne j \text{ for } 0 \le m \le n-2) - P(X_m \ne j \text{ for } 0 \le m \le n-1) \quad \text{(by stationarity)} \\
&= a_{n-2} - a_{n-1},
\end{aligned}$$
where $a_n = P(X_m \ne j \text{ for } 0 \le m \le n)$. Sum over $n$ to obtain
$$\pi_j \mu_j = P(X_0 = j) + P(X_0 \ne j) - \lim_{n \to \infty} a_n = 1 - \lim_{n \to \infty} a_n.$$
However, $a_n \to P(X_m \ne j \text{ for all } m) = 0$ as $n \to \infty$, by the persistence of $j$. We have shown that $\pi_j \mu_j = 1$, so that $\mu_j < \infty$ if $\pi_j > 0$. To see that $\pi_j > 0$ for all $j$, suppose on the contrary that $\pi_j = 0$ for some $j$. Then
$$0 = \pi_j = \sum_i \pi_i\, p_{ij}(n) \ge \pi_i\, p_{ij}(n) \quad \text{for all } i \text{ and } n,$$
yielding that $\pi_i = 0$ whenever $p_{ij}(n) > 0$ for some $n$, i.e. whenever $i \to j$. The chain is assumed irreducible, so that $\pi_i = 0$ for all $i$, in contradiction of the fact that the $\pi_i$ sum to 1. Hence $\mu_j < \infty$ and all states of the chain are non-null. Furthermore, from $\pi_j \mu_j = 1$ we see that the $\pi_j$ are specified uniquely as $\pi_j = 1/\mu_j$.

Thus, if $\pi$ exists, then it is unique, and all the states of the chain are non-null persistent. Conversely, if the states of the chain are non-null persistent, then the chain has a stationary distribution, given by Lemma 1.3. $\square$

Proposition 1.5. If $i \leftrightarrow j$, then $i$ is null persistent if and only if $j$ is null persistent.

Proof. Let $C(i)$ be the irreducible closed equivalence class of states which contains the non-null persistent state $i$. Suppose that $X_0 \in C(i)$. Then $X_n \in C(i)$ for all $n$, and Lemma 1.3 and Proposition 1.2 combine to tell us that all the states in $C(i)$ are non-null. $\square$
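As a quick numerical sanity check of Proposition 1.2 (not part of the original notes), the relation $\pi_i = 1/\mu_i$ can be verified on a small, hypothetical two-state chain; the transition matrix and both helper functions below are illustrative choices, not anything defined in the text:

```python
# Sketch: check pi_i = 1/mu_i (Proposition 1.2) on a hypothetical
# two-state chain with transition matrix P = [[0.9, 0.1], [0.2, 0.8]].

def stationary(P, iters=10_000):
    """Approximate the stationary distribution by power iteration pi <- pi P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def mean_recurrence_time(P, k):
    """mu_k by first-step analysis, specialized to a two-state chain."""
    j = 1 - k
    h = 1.0 / (1.0 - P[j][j])      # expected hitting time of k from j
    return 1.0 + P[k][j] * h       # mu_k = 1 + p_{kj} h

P = [[0.9, 0.1], [0.2, 0.8]]
pi = stationary(P)
mu0 = mean_recurrence_time(P, 0)
print(pi[0], 1.0 / mu0)            # both are approximately 2/3
```

Here $\mu_0 = 1 + p_{01}/(1 - p_{11}) = 1.5$, matching $\pi_0 = 2/3$ as the proposition predicts.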
Proposition 1.6. Let $s \in S$ be any state of an irreducible chain. The chain is transient if and only if there exists a nonzero solution $(y_i : i \ne s)$, satisfying $|y_i| \le 1$ for all $i$, to the equations
$$y_i = \sum_{j : j \ne s} p_{ij}\, y_j, \qquad i \ne s.$$

Example (Random walk with retaining barrier). A particle performs a random walk on the non-negative integers with a retaining barrier at 0. The transition probabilities are
$$p_{00} = q, \qquad p_{i,i+1} = p \ \text{ for } i \ge 0, \qquad p_{i,i-1} = q = 1 - p \ \text{ for } i \ge 1.$$
Let $\rho = p/q$.
(a) If $q < p$, take $s = 0$ to see that $y_i = 1 - (q/p)^i$ satisfies the equations in Proposition 1.6, and so the chain is transient.
(b) Solve the equation $\pi = \pi P$ to find that there exists a stationary distribution, with $\pi_j = \rho^j (1 - \rho)$, if and only if $q > p$. Thus the chain is non-null persistent if and only if $q > p$.
(c) If $q = p = \tfrac12$, the chain is persistent, since the symmetric random walk is persistent (just reflect negative excursions of a symmetric random walk into the positive half-line). Solve the equation $x = xP$ to find that $x_i = 1$ for all $i$ is the solution, unique up to a multiplicative constant. However, $\sum_i x_i = \infty$, so that the chain is null, by Proposition 1.4.

Theorem 1.7. For an irreducible aperiodic chain, we have that
$$p_{ij}(n) \to \frac{1}{\mu_j} \quad \text{as } n \to \infty, \text{ for all } i \text{ and } j.$$

Proof. If the chain is transient, then the result is trivial. Suppose $X$ is an irreducible aperiodic non-null persistent Markov chain. Construct the "coupled chain" $Z = (X, Y)$ as an ordered pair $X = \{X_n : n \ge 0\}$, $Y = \{Y_n : n \ge 0\}$ of independent Markov chains, each having state space $S$ and transition matrix $P$. Then $Z = \{Z_n = (X_n, Y_n) : n \ge 0\}$ takes values in $S \times S$, and it is easy to check that $Z$ is a Markov chain with transition probabilities
$$p_{(i,j),(k,l)} = P(Z_{n+1} = (k,l) \mid Z_n = (i,j)) = P(X_{n+1} = k \mid X_n = i)\, P(Y_{n+1} = l \mid Y_n = j) = p_{ik}\, p_{jl},$$
by independence. Since $X$ is irreducible and aperiodic, $Z$ is also irreducible. Since $X$ is non-null persistent, it has a unique stationary distribution $\pi$, and it is easy to see that $Z$ has a stationary distribution $\nu = (\nu_{ij} : i, j \in S)$ given by $\nu_{ij} = \pi_i \pi_j$; thus $Z$ is also non-null persistent.

Now suppose that $X_0 = i$ and $Y_0 = j$, so that $Z_0 = (i,j)$. Choose any state $s \in S$ and let $T = \min\{n \ge 1 : Z_n = (s, s)\}$
denote the time that $Z$ first hits $(s, s)$. Note that $P(T < \infty) = 1$, by the persistence of $Z$. Starting from $Z_0 = (X_0, Y_0) = (i, j)$,
$$p_{ik}(n) = P(X_n = k) = P(X_n = k,\ T \le n) + P(X_n = k,\ T > n) = P(Y_n = k,\ T \le n) + P(X_n = k,\ T > n),$$
since, given $T \le n$, $X_n$ and $Y_n$ are identically distributed. Also,
$$p_{ik}(n) \le P(Y_n = k) + P(T > n) = p_{jk}(n) + P(T > n).$$
This and the related inequality with $i$ and $j$ interchanged yield
$$|p_{ik}(n) - p_{jk}(n)| \le P(T > n) \to 0 \quad \text{as } n \to \infty,$$
therefore $p_{ik}(n) - p_{jk}(n) \to 0$ as $n \to \infty$, for all $i$, $j$, and $k$. Thus, if $\lim_{n \to \infty} p_{ik}(n)$ exists, then it does not depend on $i$. To show that it exists, write
$$\pi_k - p_{jk}(n) = \sum_i \pi_i \left( p_{ik}(n) - p_{jk}(n) \right) \to 0 \quad \text{as } n \to \infty,$$
by bounded convergence. $\square$

1.1 Examples

1.1.1 Example: The age of a renewal process. Initially an item is put into use, and when it fails, it is replaced at the beginning of the next time period by a new item. Suppose that the lives of the items are independent, and each will fail in its $i$th period of use with probability $P_i$, $i \ge 1$, where the distribution $\{P_i\}$ is aperiodic and $\sum_i i P_i < \infty$. Let $X_n$ denote the age of the item in use at time $n$; that is, the number of periods (including the $n$th) it has been in use. Then, if we let
$$\lambda_i = \frac{P_i}{\sum_{j \ge i} P_j}$$
denote the probability that a unit that has lived for $i - 1$ time units fails on its $i$th time unit, then $\{X_n : n \ge 0\}$ is a Markov chain with transition probabilities given by
$$p_{i,1} = \lambda_i, \qquad p_{i,i+1} = 1 - \lambda_i, \qquad i \ge 1.$$
Hence the limiting probabilities are such that
$$\pi_1 = \sum_i \pi_i \lambda_i, \qquad \pi_{i+1} = \pi_i (1 - \lambda_i), \quad i \ge 1.$$
Iterating yields
$$\pi_{i+1} = \pi_1 (1 - \lambda_1)(1 - \lambda_2) \cdots (1 - \lambda_i) = \pi_1 P(X \ge i + 1),$$
where $X$ is the life of an item. Using $\sum_i \pi_i = 1$ yields
$$1 = \pi_1 \sum_{i=1}^{\infty} P(X \ge i) = \pi_1 E(X),$$
and hence
$$\pi_i = \frac{P(X \ge i)}{E(X)}, \qquad i \ge 1.$$
This is an example of a size-biased distribution.

1.1.2 Example: Poisson births. Suppose that during each time period, each member of a population dies independently of the others with probability $p$, and that a Poisson($\lambda$) number of new members joins the population each time period. If we let $X_n$ denote the number of members of the population at the beginning of period $n$, then it is easy to see that $\{X_n : n \ge 1\}$ is a Markov chain. To find the stationary distribution of this chain, suppose that $X_0$ has a Poisson distribution with mean $\alpha$. Since each of
these $X_0$ individuals will independently be alive at the beginning of the next period with probability $1 - p$, by the Poisson marking theorem the number of them that are still in the population at time 1 is a Poisson random variable with mean $\alpha(1-p)$. As the number of new members that join the population at time 1 is an independent Poisson random variable with mean $\lambda$, it thus follows that $X_1$ is a Poisson random variable with mean $\alpha(1-p) + \lambda$. Hence, if $\alpha = \alpha(1-p) + \lambda$, that is, if $\alpha = \lambda/p$, then the chain is stationary. By uniqueness of the stationary distribution, we can conclude that the stationary distribution is Poisson with mean $\lambda/p$. That is,
$$\pi_j = e^{-\lambda/p}\, \frac{(\lambda/p)^j}{j!}, \qquad j = 0, 1, \ldots$$

1.1.3 Example: the Gibbs sampler. Let $p(x_1, \ldots, x_n)$ be the joint probability mass function of the random vector $(X_1, \ldots, X_n)$. In some cases it may be difficult to sample directly from such a distribution, but relatively easy to sample from the conditional distributions of each coordinate $X_i$ given the values of all of the other coordinates $X_j$, $j \ne i$. In this case we can generate a random vector whose probability mass function is approximately $p(x_1, \ldots, x_n)$ by constructing a Markov chain whose stationary distribution is $p$, as follows.

Let $\mathbf{X}^0 = (X_1^0, \ldots, X_n^0)$ be any vector for which $p(X_1^0, \ldots, X_n^0) > 0$. First, generate a random variable whose distribution is the conditional distribution of the first coordinate $X_1$ given that $X_j = X_j^0$, $j = 2, \ldots, n$, and call its value $X_1^1$. Next, generate a random variable whose distribution is the conditional distribution of $X_2$ given that $X_1 = X_1^1$ and $X_j = X_j^0$, $j = 3, \ldots, n$, and call its value $X_2^1$. Continue in this fashion until we have a whole new vector $\mathbf{X}^1 = (X_1^1, \ldots, X_n^1)$. Then repeat the process, this time starting with $\mathbf{X}^1$ in place of $\mathbf{X}^0$, to obtain the new vector $\mathbf{X}^2$, and so on. It is easy to see that the sequence of vectors $\{\mathbf{X}^j : j \ge 0\}$ is a Markov chain. We claim that its stationary distribution is $p(x_1, \ldots, x_n)$.

To verify the claim, suppose that $\mathbf{X}^0$ has probability mass function $p(x_1, \ldots, x_n)$. Then it is easy to see that at any point in this algorithm, the vector $(X_1^1, \ldots, X_i^1, X_{i+1}^0, \ldots, X_n^0)$ will be the value of a random variable with mass
function $p(x_1, \ldots, x_n)$. For instance, letting $\tilde{X}_1$ be the random variable that takes on the value denoted by $X_1^1$,
$$\begin{aligned}
P(\tilde{X}_1 = x_1,\ X_j^0 = x_j,\ j \ge 2)
&= P(\tilde{X}_1 = x_1 \mid X_j^0 = x_j,\ j \ge 2) \times P(X_j^0 = x_j,\ j \ge 2) \\
&= P(X_1 = x_1 \mid X_j = x_j,\ j \ge 2) \times P(X_j = x_j,\ j \ge 2) \\
&= p(x_1, \ldots, x_n).
\end{aligned}$$
Therefore $p(x_1, \ldots, x_n)$ is a stationary probability distribution, so, provided that the Markov chain is irreducible and aperiodic, we can conclude that it is the limiting probability vector for the Gibbs sampler. It also follows from the preceding that $p(x_1, \ldots, x_n)$ would be the limiting probability vector even if the Gibbs sampler were not systematic in first changing the value of $X_1$, then $X_2$, and so on. Indeed, even if the component whose value was to be changed was always randomly determined, then $p(x_1, \ldots, x_n)$ would remain a stationary distribution, and would thus be the limiting probability mass function, provided that the resulting chain is aperiodic and irreducible.

2 Exercises

1. Each day, one of $n$ possible elements is requested: the $i$th one with probability $P_i$, $i \ge 1$, $\sum_{i=1}^n P_i = 1$. These elements are at all times arranged in an ordered list that is revised as follows: the element selected is moved to the front of the list, and the relative positions of all of the other elements remain unchanged. Define the state at any time to be the ordering of the list at that time.
(a) Argue that the above is a Markov chain.
(b) For any state $(i_1, \ldots, i_n)$, which is a permutation of $1, 2, \ldots, n$, let $\pi(i_1, \ldots, i_n)$ denote the limiting probability. Argue that
$$\pi(i_1, \ldots, i_n) = P_{i_1} \cdot \frac{P_{i_2}}{1 - P_{i_1}} \cdot \frac{P_{i_3}}{1 - P_{i_1} - P_{i_2}} \cdots \frac{P_{i_n}}{1 - P_{i_1} - \cdots - P_{i_{n-1}}}.$$

2. Let $\{X_n : n \ge 0\}$ be a Markov chain with stationary probabilities $\pi_j$, $j \ge 0$. Suppose that $X_0 = 0$, and define $T = \min\{n > 0 : X_n = 0\}$. Let $Y_j = X_{T-j}$, $j = 0, 1, \ldots, T$. Show that $(Y_j : j = 0, \ldots, T)$ is distributed as the states visited by a Markov chain (the reversed Markov chain) with transition probabilities $P^*_{ij} = \pi_j P_{ji} / \pi_i$, started in state 0 and watched until it returns to 0.

3. Consider a finite Markov chain on the state space $\{0, 1, 2, \ldots, N\}$ with transition probability matrix $P = (P_{ij})$, and divide the state space into the three classes $\{0\}$, $\{1, 2, \ldots, N-1\}$, and $\{N\}$. Let 0 and $N$ be absorbing states, both accessible from all states in $1, \ldots, N-1$, and let
$\{1, 2, \ldots, N-1\}$ be a transient class. Let $k$ be a transient state. Define an auxiliary process, the "return process," with transition matrix $\tilde{P}$, by altering the first and last rows of $P$ so that $\tilde{P}_{0k} = \tilde{P}_{Nk} = 1$, and leave the other rows unchanged. The return process is clearly irreducible. Prove that the expected time until absorption $u_k$, with initial state $k$, in the original process equals $1/(\tilde{\pi}_0 + \tilde{\pi}_N) - 1$, where $\tilde{\pi}_0 + \tilde{\pi}_N$ is the stationary probability of being in state 0 or $N$ for the return process. (Hint: use the relation between stationary probabilities and expected recurrence times to states.)

3 Reversibility

Suppose that $\{X_n : 0 \le n \le N\}$ is an irreducible non-null persistent Markov chain with transition matrix $P$ and stationary distribution $\pi$. Suppose further that $X_n$ has distribution $\pi$ for every $n$. Define the "reversed chain" $Y$ by $Y_n = X_{N-n}$ for $0 \le n \le N$.

Proposition 3.1. The sequence $Y$ is a Markov chain with $P(Y_0 = i) = \pi_i$ and
$$P(Y_{n+1} = j \mid Y_n = i) = \frac{\pi_j}{\pi_i}\, p_{ji}.$$
We call the chain $Y$ the time reversal of the chain $X$, and we say that $X$ is reversible if $X$ and $Y$ have the same transition probabilities.

Proof. The crucial step is the stationarity of $X$:
$$\begin{aligned}
P(Y_{n+1} = i_{n+1} \mid Y_n = i_n, \ldots, Y_0 = i_0)
&= \frac{P(Y_k = i_k,\ 0 \le k \le n+1)}{P(Y_k = i_k,\ 0 \le k \le n)} \\
&= \frac{P(X_{N-n-1} = i_{n+1},\ X_{N-n} = i_n, \ldots, X_N = i_0)}{P(X_{N-n} = i_n, \ldots, X_N = i_0)} \\
&= \frac{\pi_{i_{n+1}}\, p_{i_{n+1} i_n}\, p_{i_n i_{n-1}} \cdots p_{i_1 i_0}}{\pi_{i_n}\, p_{i_n i_{n-1}} \cdots p_{i_1 i_0}}
= \frac{\pi_{i_{n+1}}\, p_{i_{n+1} i_n}}{\pi_{i_n}}. \qquad \square
\end{aligned}$$

Let $X = \{X_n : 0 \le n \le N\}$ be an irreducible Markov chain such that $X_n$ has the stationary distribution $\pi$ for all $n$. The chain is called reversible if the transition matrices of $X$ and its time reversal $Y$ are the same, which is to say that
$$\pi_i\, p_{ij} = \pi_j\, p_{ji} \quad \text{for all } i, j.$$
These equations are called the detailed balance equations.

Proposition 3.2. Let $P$ be the transition matrix of an irreducible chain $X$, and suppose that there exists a distribution $\pi$ such that $\pi_i p_{ij} = \pi_j p_{ji}$ for all $i, j \in S$. Then $\pi$ is a stationary distribution of the chain; furthermore, $X$ is reversible.

Proof. Suppose that $\pi$ satisfies the conditions above. Then
$$\sum_i \pi_i\, p_{ij} = \sum_i \pi_j\, p_{ji} = \pi_j \sum_i p_{ji} = \pi_j,$$
and so $\pi = \pi P$, whence $\pi$ is stationary. $\square$
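Proposition 3.2 is easy to check numerically. The following sketch is not from the notes; the weight matrix and target distribution are hypothetical choices. It builds a chain satisfying detailed balance and confirms that $\pi = \pi P$:

```python
# Sketch: a hypothetical 3-state chain built to satisfy detailed balance
# pi_i p_ij = pi_j p_ji, used to confirm Proposition 3.2: pi = pi P.

pi = [0.5, 0.3, 0.2]
# Symmetric "edge weights" s_ij give p_ij = s_ij * pi_j off the diagonal,
# which forces pi_i p_ij = s_ij pi_i pi_j = pi_j p_ji.
s = [[0.0, 1.0, 0.5],
     [1.0, 0.0, 1.0],
     [0.5, 1.0, 0.0]]
P = [[s[i][j] * pi[j] for j in range(3)] for i in range(3)]
for i in range(3):                      # put the leftover mass on the diagonal
    P[i][i] = 1.0 - sum(P[i][j] for j in range(3) if j != i)

# Detailed balance holds by construction ...
assert all(abs(pi[i] * P[i][j] - pi[j] * P[j][i]) < 1e-12
           for i in range(3) for j in range(3))
# ... and therefore pi is stationary: pi P = pi.
piP = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]
print(piP)                              # approximately [0.5, 0.3, 0.2]
```

Note the design choice: any symmetric weight matrix with small enough entries yields valid (non-negative) diagonal terms, so detailed balance can be arranged without ever solving $\pi = \pi P$ directly.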
3.1 Reversible Examples

3.1.1 Example: Ehrenfest model of diffusion. Two containers, A and B, are placed adjacent to each other, and gas is allowed to pass through a small aperture joining them. A total of $m$ gas molecules is distributed between the containers. We assume that at each epoch of time, one molecule, picked uniformly at random from the $m$ available, passes through this aperture. Let $X_n$ be the number of molecules in container A after $n$ units of time have passed. Clearly $\{X_n\}$ is a Markov chain with transition matrix
$$p_{i,i+1} = 1 - \frac{i}{m}, \qquad p_{i,i-1} = \frac{i}{m}, \qquad 0 \le i \le m.$$
Rather than solve the equation $\pi = \pi P$ to find the stationary distribution, we look for solutions of the detailed balance equations $\pi_i p_{i,i+1} = \pi_{i+1} p_{i+1,i}$. These are solved by
$$\pi_i = \binom{m}{i} \left(\frac{1}{2}\right)^m,$$
which is therefore the stationary distribution.

3.1.2 Example: the Metropolis algorithm. Let $a_j$, $j = 1, \ldots, m$, be positive numbers, and let $A = \sum_{j=1}^m a_j$. Suppose that $m$ is large and that $A$ is difficult to compute, and suppose we ideally want to simulate the values of a sequence of independent random variables whose probabilities are $p_j = a_j / A$, $j = 1, \ldots, m$. Similar to the Gibbs sampler, one way of simulating a sequence of random variables whose distributions converge to $\{p_j\}$ is to find a Markov chain that is both easy to simulate and whose limiting probabilities are the $p_j$. The Metropolis algorithm provides an approach for accomplishing this task.

Let $Q$ be any irreducible symmetric transition probability matrix on the integers $1, \ldots, m$; that is, $q_{ij} = q_{ji}$ for all $i$ and $j$. Now define a Markov chain $\{X_n : n \ge 0\}$ as follows. If $X_n = i$, then generate a random variable that is equal to $j$ with probability $q_{ij}$, $j = 1, \ldots, m$. If this random variable takes on the value $j$, then set $X_{n+1}$ equal to $j$ with probability $\min(1, a_j/a_i)$, and set it equal to $i$ otherwise. That is, the transition probabilities of $\{X_n : n \ge 0\}$ are
$$P_{ij} = q_{ij} \min(1, a_j/a_i) \ \text{ if } j \ne i, \qquad P_{ii} = q_{ii} + \sum_{j \ne i} q_{ij} \left( 1 - \min(1, a_j/a_i) \right).$$

We will now show that the stationary distribution of this Markov chain is given by the $p_j$. We will first show that the chain is reversible with stationary probabilities $p_j$, $j = 1, \ldots, m$, by showing that $p_i P_{ij} = p_j P_{ji}$. To show this, we must verify that $p_i q_{ij} \min(1, a_j/a_i) = p_j q_{ji} \min(1,$
$a_i/a_j)$. Since $q_{ij} = q_{ji}$ and $a_j/a_i = p_j/p_i$, we must verify that
$$p_i \min(1, p_j/p_i) = p_j \min(1, p_i/p_j).$$
This is immediate, since both sides of the equation are equal to $\min(p_i, p_j)$.

That these stationary probabilities are also limiting probabilities follows from the fact that, since $Q$ is an irreducible transition probability matrix, $\{X_n\}$ will also be irreducible; and, as $P_{ii} > 0$ for some $i$ (except in the trivial case where $p_j \equiv 1/m$), it is also aperiodic. By choosing a transition probability matrix $Q$ that is easy to simulate — that is, for each $i$ it is easy to generate the value of a random variable that is equal to $j$ with probability $q_{ij}$, $j = 1, \ldots, m$ — we can use the preceding to generate a Markov chain whose limiting probabilities are $a_j/A$, $j = 1, \ldots, m$, without computing $A$.

3.1.3 Example: Random walk on a graph. Consider a graph having a positive number $w_{ij}$ associated with each edge $(i, j)$, and suppose that a particle moves from vertex to vertex in the following manner: if the particle is at vertex $i$, then it will move to vertex $j$ with probability proportional to the edge weights,
$$P_{ij} = \frac{w_{ij}}{\sum_k w_{ik}},$$
where $w_{ij}$ is 0 if $(i,j)$ is not an edge of the graph. The Markov chain describing the sequence of vertices visited by the particle is called a random walk on an edge-weighted graph.

Proposition 3.3. Consider a random walk on an edge-weighted graph with a finite number of vertices. If this Markov chain is irreducible, then, in steady state, it is time reversible, with stationary probabilities given by
$$\pi_i = \frac{\sum_j w_{ij}}{\sum_i \sum_j w_{ij}}.$$

Proof. The time reversibility equations $\pi_i P_{ij} = \pi_j P_{ji}$ reduce to
$$\frac{\pi_i w_{ij}}{\sum_k w_{ik}} = \frac{\pi_j w_{ji}}{\sum_k w_{jk}},$$
or, equivalently, since $w_{ij} = w_{ji}$,
$$\frac{\pi_i}{\sum_k w_{ik}} = \frac{\pi_j}{\sum_k w_{jk}},$$
implying that $\pi_i = C \sum_k w_{ik}$. Since $\sum_i \pi_i = 1$, we are done. $\square$

4 Exercises

1. Consider a time-reversible Markov chain on the state space $\{0, 1, 2, \ldots\}$ with transition probabilities $P_{ij}$ and limiting probabilities $\pi_i$. Suppose we truncate the chain to the states $0, 1, \ldots, M$ by defining the transition probabilities
$$\tilde{P}_{ij} =
\begin{cases}
P_{ij} + \sum_{k > M} P_{ik}, & 0 \le i \le M,\ j = i, \\
P_{ij}, & 0 \le i, j \le M,\ j \ne i, \\
0, & \text{otherwise}.
\end{cases}$$
Show that the truncated chain is also time reversible and has limiting probabilities given by
$$\tilde{\pi}_i = \frac{\pi_i}{\sum_{j=0}^{M} \pi_j}, \qquad 0 \le i \le M.$$
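A numerical sketch for this exercise (not part of the notes; the chain and its parameters are hypothetical): take a reversible birth-death chain on $\{0, \ldots, 4\}$, truncate it to $\{0, 1, 2\}$ as described, and check that the claimed $\tilde{\pi}$ satisfies detailed balance for the truncated chain:

```python
# Sketch: truncating a hypothetical reversible birth-death chain on
# {0,...,4} (up-probability 0.3, down-probability 0.2, self-loop otherwise)
# to the states {0, 1, 2}, then checking time reversibility of the result.

N, M, up, down = 5, 2, 0.3, 0.2

def p(i, j):
    """Transition probabilities of the original (reversible) chain."""
    if j == i + 1 and i < N - 1: return up
    if j == i - 1 and i > 0:     return down
    if j == i: return 1.0 - (up if i < N - 1 else 0.0) - (down if i > 0 else 0.0)
    return 0.0

# Truncated chain: push all mass beyond M back onto the diagonal.
Pt = [[p(i, j) if j != i else p(i, i) + sum(p(i, k) for k in range(M + 1, N))
       for j in range(M + 1)] for i in range(M + 1)]

# Claimed limiting probabilities: original pi restricted and renormalised.
r = up / down                        # pi_{i+1} / pi_i for the original chain
pi = [r**i for i in range(M + 1)]
pi = [x / sum(pi) for x in pi]

# Time reversibility of the truncated chain, hence pi is its stationary law.
assert all(abs(pi[i] * Pt[i][j] - pi[j] * Pt[j][i]) < 1e-12
           for i in range(M + 1) for j in range(M + 1))
print([round(x, 4) for x in pi])     # [0.2105, 0.3158, 0.4737]
```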
2. Suppose $M$ balls are initially distributed among $m$ urns. At each stage, one of the balls is selected at random, taken from whichever urn it is in, and placed, at random, in one of the other $m - 1$ urns. Consider the Markov chain whose state at any time is the vector $(n_1, \ldots, n_m)$, where $n_i$ denotes the number of balls in urn $i$. Guess at the limiting probabilities for this Markov chain, and verify your guess, showing at the same time that the Markov chain is time reversible.

Statistics 150, Spring 2007 — April 23, 2008

1 Limiting Probabilities

If the embedded discrete-time Markov chain with transition probabilities $p_{ij}$ is irreducible and positive recurrent, then the limiting probabilities $P_j = \lim_{t \to \infty} P_{ij}(t)$ of the continuous-time chain are given by
$$P_j = \frac{\pi_j / \nu_j}{\sum_i \pi_i / \nu_i},$$
where the $\pi_j$ are the unique non-negative solution of
$$\pi_j = \sum_i \pi_i\, p_{ij}, \qquad \sum_i \pi_i = 1.$$
We see that the $P_j$ are the unique non-negative solution of
$$\nu_j P_j = \sum_{i \ne j} P_i\, \nu_i\, p_{ij}, \qquad \sum_j P_j = 1,$$
or equivalently, using $q_{ij} = \nu_i p_{ij}$,
$$\nu_j P_j = \sum_{i \ne j} P_i\, q_{ij}, \qquad \sum_j P_j = 1.$$
Another way of obtaining the equations for the $P_j$ is by way of the forward equations
$$P'_{ij}(t) = \sum_{k \ne j} q_{kj} P_{ik}(t) - \nu_j P_{ij}(t).$$
If we assume that the limiting probabilities $P_j = \lim_{t \to \infty} P_{ij}(t)$ exist, then $P'_{ij}(t)$ would necessarily converge to 0 as $t \to \infty$. Hence, assuming that we can interchange limit and summation in the above, we obtain, upon letting $t \to \infty$,
$$0 = \sum_{k \ne j} q_{kj} P_k - \nu_j P_j.$$

Let us now determine the limiting probabilities for a birth and death process. The relevant equations are
$$\lambda_0 p_0 = \mu_1 p_1, \qquad (\lambda_n + \mu_n) p_n = \mu_{n+1} p_{n+1} + \lambda_{n-1} p_{n-1}, \quad n \ge 1,$$
or equivalently, adding these equations successively,
$$\lambda_0 p_0 = \mu_1 p_1, \qquad \lambda_1 p_1 = \mu_2 p_2, \qquad \lambda_2 p_2 = \mu_3 p_3, \qquad \ldots, \qquad \lambda_n p_n = \mu_{n+1} p_{n+1}.$$
Solving in terms of $p_0$ yields
$$p_1 = \frac{\lambda_0}{\mu_1}\, p_0, \qquad
p_2 = \frac{\lambda_1}{\mu_2}\, p_1 = \frac{\lambda_1 \lambda_0}{\mu_2 \mu_1}\, p_0, \qquad
p_3 = \frac{\lambda_2}{\mu_3}\, p_2 = \frac{\lambda_2 \lambda_1 \lambda_0}{\mu_3 \mu_2 \mu_1}\, p_0,$$
and in general
$$p_n = \frac{\lambda_{n-1}}{\mu_n}\, p_{n-1} = \frac{\lambda_{n-1} \lambda_{n-2} \cdots \lambda_1 \lambda_0}{\mu_n \mu_{n-1} \cdots \mu_2 \mu_1}\, p_0.$$
Using $\sum_{n=0}^{\infty} p_n = 1$, we obtain
$$1 = p_0 + p_0 \sum_{n=1}^{\infty} \frac{\lambda_{n-1} \cdots \lambda_0}{\mu_n \cdots \mu_1},$$
or
$$p_0 = \left[ 1 + \sum_{n=1}^{\infty} \frac{\lambda_0 \lambda_1 \cdots \lambda_{n-1}}{\mu_1 \mu_2 \cdots \mu_n} \right]^{-1},$$
and hence
$$p_n = \frac{\lambda_0 \lambda_1 \cdots \lambda_{n-1}}{\mu_1 \mu_2 \cdots \mu_n}\, p_0, \qquad n \ge 1.$$
The above equations also show us what condition is needed for the limiting probabilities to exist; namely,
$$\sum_{n=1}^{\infty} \frac{\lambda_0 \lambda_1 \cdots \lambda_{n-1}}{\mu_1 \mu_2 \cdots \mu_n} < \infty.$$
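The product formula is easy to evaluate numerically. As an illustrative sketch (not part of the notes), the rates below are a hypothetical choice, $\lambda_n = 1$ and $\mu_n = n$ (an infinite-server, M/M/$\infty$-type process), for which the formula gives $p_n = e^{-1}/n!$:

```python
# Sketch: birth-death limiting probabilities from the product formula,
# truncating the infinite sum at nmax.  Rates are a hypothetical choice:
# lambda_n = 1.0 and mu_n = n (an infinite-server process).

lam = lambda n: 1.0
mu = lambda n: float(n)

def limiting_probs(nmax=100):
    prods = [1.0]                 # lambda_0 ... lambda_{n-1} / (mu_1 ... mu_n)
    for n in range(1, nmax):
        prods.append(prods[-1] * lam(n - 1) / mu(n))
    p0 = 1.0 / sum(prods)         # p_0 = [1 + sum ...]^{-1}
    return [p0 * c for c in prods]

p = limiting_probs()
print(p[0])                       # approximately e^{-1} = 0.3679...
```

Swapping in other rate functions (e.g. constant rates for a queue with a single server) reuses the same loop unchanged; only the convergence condition above needs to be rechecked.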
Example 1.1 (The M/M/1 queue). In the M/M/1 queue, $\lambda_n = \lambda$ and $\mu_n = \mu$, and thus
$$p_n = \left( \frac{\lambda}{\mu} \right)^n \left( 1 - \frac{\lambda}{\mu} \right), \qquad n \ge 0,$$
provided that $\lambda/\mu < 1$. Customers arrive at rate $\lambda$ and are served at rate $\mu$; thus, if $\lambda > \mu$, they arrive at a faster rate than they can be served, and the queue size will go to infinity. The case $\lambda = \mu$ is null recurrent and thus has no limiting probabilities.

2 Time Reversibility

Consider an ergodic continuous-time Markov chain, and suppose that it has been in operation an infinitely long time; that is, suppose that it started at time $-\infty$. Such a process will be stationary, and we say that it is in steady state. Let us consider this process going backwards in time. Since the forward process is a continuous-time Markov chain, it follows that, given the present state, call it $X(t)$, the past state $X(t-s)$ and the future states $X(y)$, $y > t$, are independent. Therefore,
$$P(X(t-s) = j \mid X(t) = i,\ X(y),\ y > t) = P(X(t-s) = j \mid X(t) = i),$$
and so we can conclude that the reverse process is also a continuous-time Markov chain. Also, since the amount of time spent in a state is the same whether one is going forward or backward in time, it follows that the amount of time the reverse chain spends in state $i$ on a visit is exponential with the same rate $\nu_i$ as in the forward process.

The sequence of states visited by the reverse process constitutes a discrete-time Markov chain with transition probabilities $p^*_{ij}$ given by
$$p^*_{ij} = \frac{\pi_j\, p_{ji}}{\pi_i},$$
where $(\pi_j : j \ge 0)$ are the stationary probabilities of the embedded discrete-time Markov chain with transition probabilities $p_{ij}$. Let $q^*_{ij} = \nu_i p^*_{ij}$ denote the infinitesimal rates of the reverse chain. We see that
$$q^*_{ij} = \nu_i\, \frac{\pi_j\, p_{ji}}{\pi_i}.$$
Recalling that $P_k = (\pi_k/\nu_k)/C$, where $C = \sum_m \pi_m/\nu_m$, we see that $\pi_k = C\, \nu_k P_k$, and so
$$q^*_{ij} = \nu_i\, \frac{C\, \nu_j P_j\, p_{ji}}{C\, \nu_i P_i} = \frac{P_j\, \nu_j\, p_{ji}}{P_i} = \frac{P_j\, q_{ji}}{P_i}.$$

Definition 2.1 (Reversibility). A stationary continuous-time Markov chain is said to be time reversible if the reverse process follows the same probabilistic law as the original process. That is, it is time reversible if, for all $i$ and $j$, $q^*_{ij} = q_{ij}$, which is equivalent to
$$P_i\, q_{ij} = P_j\, q_{ji} \quad \text{for all } i, j.$$
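The condition in Definition 2.1 can be checked directly on a small example. The sketch below is not from the notes; the finite birth-death chain and its rates are hypothetical choices:

```python
# Sketch: checking P_i q_ij = P_j q_ji (Definition 2.1) for a hypothetical
# finite birth-death chain on {0, 1, 2, 3} with birth rate 1.0 and death
# rate 2.0.

lam, mu, N = 1.0, 2.0, 4
q = [[0.0] * N for _ in range(N)]
for n in range(N - 1):
    q[n][n + 1] = lam             # birth rate
    q[n + 1][n] = mu              # death rate

# Limiting probabilities via the product formula: P_n proportional to (lam/mu)^n.
w = [(lam / mu) ** n for n in range(N)]
P = [x / sum(w) for x in w]

ok = all(abs(P[i] * q[i][j] - P[j] * q[j][i]) < 1e-12
         for i in range(N) for j in range(N))
print(ok)                          # True
```

That the check succeeds for any choice of birth and death rates is exactly the content of the proposition that follows.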
Proposition 2.2. Any ergodic birth and death process in steady state is time reversible.

Corollary 2.3. Consider an M/M/s queue in which customers arrive in accordance with a Poisson process having rate $\lambda$ and are served by any one of $s$ servers, each having exponentially distributed service times with rate $\mu$. If $\lambda < s\mu$, then the output process of departing customers is, in steady state, a Poisson process with rate $\lambda$.

Proof. Let $X(t)$ denote the number of customers in the system at time $t$. Since the M/M/s process is a birth and death process, it follows that $\{X(t) : t \ge 0\}$ is time reversible. Going forward in time, the time points at which $X(t)$ increases by 1 constitute a Poisson process, since these are just the arrival times of customers. Hence, by time reversibility, the time points at which $X(t)$ increases by 1 when we go backwards in time also constitute a Poisson process. But these latter points are exactly the points of time when customers depart. Hence, the departure times constitute a Poisson process with rate $\lambda$. $\square$

3 Exercises

1. Suppose that the state of a system can be modeled as a two-state continuous-time Markov chain with transition rates $q_{01} = \lambda$ and $q_{10} = \mu$. When the state of the system is $i$, events occur in accordance with a Poisson process with rate $\alpha_i$, $i = 0, 1$. Let $N(t)$ denote the number of events in $(0, t)$.
(a) Find $\lim_{t \to \infty} N(t)/t$.
(b) If the initial state is state 0, find $E[N(t)]$.

2. Consider a continuous-time Markov chain with stationary probabilities $(p_i : i \ge 0)$, and let $T$ denote the first time that the chain has been in state 0 for $t$ consecutive time units. Find $E[T \mid X(0) = 0]$.

3. Find the limiting probabilities for the M/M/s system, and determine the condition needed for these to exist.

4. Consider a time-reversible continuous-time Markov chain having infinitesimal rates $q_{ij}$ and limiting probabilities $(p_j : j \ge 0)$. Choose some state, say state 0, and consider the new Markov chain which makes state 0 an absorbing state; that is, reset $q_{0j}$ to equal 0 for all $j$. Suppose now that, at time points chosen according to a Poisson process with rate $\lambda$, Markov chains (all of the above type,
having 0 as an absorbing state) are started, with the initial states chosen to be $j$ with probabilities $p_{0j}$. All of the existing chains are assumed to be independent. Let $N_j(t)$ denote the number of chains in state $j$, $j > 0$, at time $t$.
(a) Argue that, if there are no chains alive at time 0, then $N_j(t)$, $j > 0$, are independent Poisson random variables.
(b) Argue that the vector process $\{(N_1(t), N_2(t), \ldots) : t \ge 0\}$ is time reversible in steady state, with stationary probabilities
$$P(\mathbf{n}) = \prod_{j=1}^{\infty} e^{-\alpha_j}\, \frac{\alpha_j^{n_j}}{n_j!}, \qquad \text{for } \mathbf{n} = (n_1, n_2, \ldots),$$
where $\alpha_j = \lambda p_j / (\nu_0 p_0)$.
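Related to Exercise 3, here is a numerical sketch (not part of the notes) evaluating M/M/s limiting probabilities from the birth-death product formula of the previous section, with $\lambda_n = \lambda$ and $\mu_n = \min(n, s)\,\mu$; the parameter values are hypothetical:

```python
# Sketch: M/M/s limiting probabilities via the birth-death product formula
# with birth rates lam and death rates min(n, s) * mu; the infinite sum is
# truncated at nmax (valid when lam < s * mu, so the tail is negligible).

def mms_probs(lam, mu, s, nmax=500):
    prods = [1.0]
    for n in range(1, nmax):
        prods.append(prods[-1] * lam / (min(n, s) * mu))
    p0 = 1.0 / sum(prods)
    return [p0 * c for c in prods]

p = mms_probs(lam=2.0, mu=1.5, s=2)    # stable, since lam < s * mu = 3
print(round(p[0], 6))                  # 0.2
```

For these rates the closed form gives $p_0 = [1 + \lambda/\mu + (\lambda/\mu)^2/(2(1 - \lambda/2\mu))]^{-1} = 0.2$, agreeing with the truncated sum; the existence condition is $\lambda < s\mu$.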