Stochastic Processes & Simulation I (MATH 5040)
These 21 pages of class notes for MATH 5040 at the University of Utah, taught by Staff in Fall, were uploaded by Miss Noel Mertz on Monday, October 26, 2015.
Math 5040-1: Stochastic Processes & Simulation
Davar Khoshnevisan, University of Utah

Module 1: Generation of Discrete Random Variables

Just about every programming language and environment has a "random number" generator. This means that it has a routine for simulating a Unif(0,1) random variable; that is, a continuous random variable with density function f(x) = 1 if 0 ≤ x ≤ 1, and f(x) = 0 for other values of x. Throughout, I assume that we know how to generate a Unif(0,1) random variable U using an existing computer program.(*) Our present goal is to use this knowledge to simulate a discrete random variable with a prescribed mass function f.

(*) One might ask, how is that done? That is a good question, which is the subject for a quite different course.

1 Bernoulli samples

Recall that a Bernoulli(p) random variable X takes the values 1 and 0 with respective probabilities p and 1 − p. We can think of this as the outcome of a coin toss: X = 1 means that we tossed heads, and X = 0 refers to tails. Let U be a Unif(0,1) random variable, and define X := 1_{U ≤ p}, where 1_A denotes the indicator of an event A; i.e., it is one if A occurs and zero otherwise. The possible values of X are zero and one. Because X = 1 if and only if U ≤ p,

    P{X = 1} = P{U ≤ p} = ∫_0^p dx = p.

In other words, X is a Bernoulli(p) sample. In algorithmic terms, this method translates to the following: if U ≤ p, then X = 1; else if p < U < 1, then X = 0.

2 More general discrete distributions

Now let us try to simulate a random variable with mass function

    f(x) = p_0 if x = x_0,
           p_1 if x = x_1,
           ...

where 0 ≤ p_j < 1 for all j ≥ 0 and p_0 + p_1 + ⋯ = 1. That is, the possible values of X ought to be x_0, x_1, … with respective probabilities p_0, p_1, …. We extend the algorithm of the previous subsection as follows: if U ≤ p_0, then X = x_0; else if p_0 < U ≤ p_0 + p_1, then X = x_1; else if p_0 + p_1 < U ≤ p_0 + p_1 + p_2, then X = x_2; etc. At the kth stage, where k ≥ 1 is integral, this algorithm performs the following query: Is p_0 + ⋯ + p_{k−1} < U ≤ p_0 + ⋯ + p_k? If so, then set X = x_k; else proceed to stage k + 1. Here is the pseudo-code:

1. Generate a Unif(0,1) random variable U.
2. If U ≤ p_0, then set X = x_0 and stop.
3. Else set k = 1, PrevSum = p_0, and Sum = p_0 + p_1.
4. If PrevSum < U ≤ Sum, then set X = x_k and stop.
5. Else set k = k + 1, PrevSum = Sum, and Sum = Sum + p_k; then go to Step 4.

You should check that this does the job.

Question 1. Could it be that this algorithm never stops? Fortunately, the answer is "no." In order to see this, let N denote the time that the algorithm stops; N = ∞ is a distinct possibility at this point. Now we can observe that, for all integers k ≥ 1,

    P{N > k} = P{the algorithm does not stop after the kth stage}
             = P{U > p_0 + ⋯ + p_{k−1}}
             = 1 − P{U ≤ p_0 + ⋯ + p_{k−1}}
             = 1 − ∑_{j=0}^{k−1} p_j = ∑_{j=k}^∞ p_j.    (1)

And so P{N = ∞} = lim_{k→∞} P{N > k} = 0.

2.1 Discrete-uniform samples. Suppose we wish to simulate a random variable X that is distributed uniformly on {1, …, n}, for a positive integer n that is fixed. For all j = 1, …, n, the jth stage of our algorithm tells us to set

    X = j   if (j − 1)/n < U ≤ j/n.

The condition on U is equivalent to j − 1 < nU ≤ j, and this means that j = 1 + ⌊nU⌋, where ⌊x⌋ denotes the greatest integer ≤ x. The preceding can be streamlined into a one-line description as follows: X := 1 + ⌊nU⌋ is distributed uniformly on {1, …, n}.

Exercise 1. Recall that the Geometric(p) mass function is given by the following: f(j) = p(1 − p)^{j−1} if j = 1, 2, …, and f(j) = 0 otherwise. Prove the following: X := 1 + ⌊ln U / ln(1 − p)⌋ has the Geometric(p) mass function.

2.2 Binomial samples. The Binomial(n, p) mass function is given by

    f(x) = C(n, x) p^x (1 − p)^{n−x}   if x = 0, …, n,
    f(x) = 0 otherwise.

In the jargon of our algorithm, x_0 = 0, x_1 = 1, …, x_n = n, and p_j = C(n, j) p^j (1 − p)^{n−j}. Note that p_0 = (1 − p)^n and

    p_{j+1} = ((n − j)/(j + 1)) · (p/(1 − p)) · p_j.

Using this, check that the following algorithm produces a Binomial(n, p) simulation:

1. Generate a Unif(0,1) random variable U.
2. Set M = p/(1 − p), j = 0, P = (1 − p)^n, and Sum = P.
3. If U ≤ Sum, then set X = j and stop.
4. Else set j = j + 1, P = ((n − j + 1)/j) · M · P, and Sum = Sum + P; then go to Step 3.

Question 2. At most how long does this algorithm take before it stops?

Exercise 2. Prove that if X = Binomial(n, p), then X = I_1 + ⋯ + I_n, where the I_j's are independent Bernoulli(p)'s. Use this to prescribe an alternative algorithm for simulating X.
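The sequential-search algorithm for Binomial(n, p) can be sketched in Python (a sketch: the function name and the explicit guard on j are mine; the recursion is the one just derived):

```python
import random

def binomial_sample(n, p):
    """Sequential search through the Binomial(n, p) mass function,
    using the recursion p_{j+1} = ((n - j)/(j + 1)) * (p/(1 - p)) * p_j."""
    u = random.random()            # Unif(0,1)
    m = p / (1 - p)
    j = 0
    prob = (1 - p) ** n            # p_0 = (1 - p)^n
    total = prob                   # running sum p_0 + ... + p_j
    while u > total and j < n:     # at most n + 1 stages (cf. Question 2)
        prob *= (n - j) / (j + 1) * m
        j += 1
        total += prob
    return j
```

The guard `j < n` only protects against floating-point round-off in the running sum; mathematically the loop stops by stage n + 1, which answers Question 2.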
2.3 Poisson samples. The Poisson(λ) mass function is given by

    f(x) = e^{−λ} λ^x / x!   if x = 0, 1, …,
    f(x) = 0 otherwise.

In the jargon of our algorithm, x_0 = 0, x_1 = 1, …, and p_j = e^{−λ} λ^j / j!. Note that p_0 = e^{−λ} and

    p_{j+1} = (λ/(j + 1)) · p_j.

Using this, check that the following algorithm produces a Poisson(λ) simulation:

1. Generate a Unif(0,1) random variable U.
2. Set j = 0, P = e^{−λ}, and Sum = P.
3. If U ≤ Sum, then set X = j and stop.
4. Else set P = (λ/(j + 1)) · P, j = j + 1, and Sum = Sum + P; then go to Step 3.

Question 3. How long before this algorithm stops? The answer requires first the following result.

Exercise 3. Prove that if V ≥ 1 is integral, then E V = ∑_{k=1}^∞ P{V ≥ k}. Hint: Start by writing the sum as ∑_{k=1}^∞ ∑_{j=k}^∞ P{V = j}, and then interchange the double sum.

In order to answer Question 3, recall the definition of N, and apply Exercise 3 to find that

    E N = ∑_{k=1}^∞ P{N ≥ k} = ∑_{k=1}^∞ P{N > k − 1} = ∑_{k=0}^∞ P{N > k}.

Thanks to this and equation (1) on page 2,

    E N = ∑_{k=0}^∞ ∑_{j=k}^∞ e^{−λ} λ^j / j! = ∑_{j=0}^∞ (j + 1) e^{−λ} λ^j / j! = λ + 1.

I have used the fact that the expectation of a Poisson(λ) is λ. In summary, we expect to stop after λ + 1 steps.

3 Monte Carlo simulation

Now I describe the "Monte Carlo method," which is a popular application of simulation. Suppose we need to know the value of E h(X), where h is a function and X a random variable. There are many complicated examples wherein this expectation cannot be computed analytically. Then we can sometimes resort to simulation, as follows: Simulate X_1, X_2, …, X_N, independent random variables from the same distribution as X. By the law of large numbers,

    (1/N) ∑_{j=1}^N h(X_j) ≈ E h(X)   with high probability, if N is large.

Therefore, if we want to simulate P{X ∈ A} for a set A, then we simulate independent random variables X_1, X_2, …, X_N from the same distribution as that of X, and then use the preceding with h(x) = 1_{x ∈ A} to find that

    #{1 ≤ i ≤ N : X_i ∈ A} / N ≈ P{X ∈ A}   with high probability, if N is large.

Exercise 4. Let X = Binomial(10, p). Simulate, using N = 1000 random variables, P{X = k} for all k = 0, …, 10. Verify the accuracy of your simulation by comparing your simulation results to the exact results.
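Exercise 4 can be set up along the following lines, simulating X as a sum of Bernoulli(p)'s as in Exercise 2 (a sketch; the function names are mine, and the exercise leaves p unspecified):

```python
import math
import random

def binomial_via_bernoulli(n, p):
    # Exercise 2: a Binomial(n, p) is a sum of n independent Bernoulli(p)'s,
    # each simulated as the indicator 1_{U <= p}.
    return sum(1 for _ in range(n) if random.random() <= p)

def monte_carlo_pmf(n, p, N):
    # Estimate P{X = k}, k = 0, ..., n, from N independent copies of X.
    counts = [0] * (n + 1)
    for _ in range(N):
        counts[binomial_via_bernoulli(n, p)] += 1
    return [c / N for c in counts]

def exact_pmf(n, p):
    # The exact mass function C(n, k) p^k (1 - p)^(n - k), for comparison.
    return [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
```

With N = 1000 the estimated and exact mass functions typically agree to within a few percentage points per value of k, consistent with the law of large numbers.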
Exercise 5. Simulate P{20 ≤ X ≤ 24}, where X = Poisson(4).

Exercise 6 (The CLT). Let {X_i^j} be a double array of independent random variables, all with common mass function

    f(x) = 1/2 if x = 1 or x = −1,
    f(x) = 0 otherwise.

Let S_n^j := X_1^j + ⋯ + X_n^j, where n is large. Suppose m is another large integer. Explain, in clear heuristic terms, why

    #{1 ≤ j ≤ m : 0 ≤ S_n^j ≤ b√n} / m ≈ (1/√(2π)) ∫_0^b e^{−x²/2} dx   with high probability.

Use this simulation method to estimate (1/√(2π)) ∫_0^b e^{−x²/2} dx.

Module 2: Discrete Markov Chains

1 Introduction

A stochastic (or random) process is a collection X = {X_n}_{n=0}^∞ of random variables. They might take values in very abstract spaces; e.g., P{X_1 = cow}, P{X_1 = horse}, …. Let S denote a finite or countable set of distinct elements, and let X = {X_n}_{n=0}^∞ be a stochastic process with values in S. Frequently, we think of n as "time," and of X_n as the state of the process at time n.

Definition 1. We say that X is a Markov chain on S if

    P{X_{n+1} = x_{n+1} | X_0 = x_0, …, X_n = x_n} = P{X_1 = x_{n+1} | X_0 = x_n}    (2)

for all integers n ≥ 0 and every x_0, …, x_{n+1} ∈ S.

Equation (2) is called the Markov property. In order to understand what this means, we should think about how we would simulate the time evolution of the random process X:

• We start with some initial random variable X_0.
• Given that X_0 = x_0, we simulate X_1 according to the mass function f(x) = P{X_1 = x | X_0 = x_0}.
• Suppose we have simulated X_0, …, X_n. Given that X_0 = x_0, …, X_n = x_n, we then simulate X_{n+1} according to the same mass function f that was used earlier, with x_0 replaced by the current state x_n.

It should be clear that we ought to try to understand a Markov chain based on the properties of its transition probabilities; these are the collection of probabilities

    P(x, y) := P{X_1 = y | X_0 = x}   for all x, y ∈ S.

We can think of the collection of these transition probabilities as a matrix P whose (x, y) coordinate is P(x, y). We call P the transition matrix of the Markov chain X. This definition makes sense also when S is infinite, but one has to be much more careful when one deals with infinite matrices.

1.1 A simple weather-forecasting model (Ross). Consider the following overly simplified model for weather changes.
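The bullet-point simulation recipe above can be sketched generically before specializing to the weather model (a sketch; the helper names are mine, and the transition matrix is assumed to be stored as a nested dict):

```python
import random

def markov_step(P, x):
    """Draw the next state from row x of the transition matrix P,
    stored as a dict mapping state -> {state: probability}."""
    u = random.random()
    total = 0.0
    for y, prob in P[x].items():
        total += prob
        if u <= total:
            return y
    return y  # guard against floating-point round-off in the row sum

def simulate_chain(P, x0, n):
    """Simulate X_0 = x0, X_1, ..., X_n; by the Markov property (2),
    each step uses only the current state."""
    path = [x0]
    for _ in range(n):
        path.append(markov_step(P, path[-1]))
    return path
```

For the weather chain with α = 1/2 and β = 1/4 (the values used later in §2.1), the transition matrix is `{0: {0: 0.5, 1: 0.5}, 1: {0: 0.25, 1: 0.75}}`, and the fraction of rainy days along a long simulated path comes out near 1/3.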
If it rains today, then regardless of the weather of the previous days, tomorrow will be rainy with probability α; if it is dry today, then regardless of the weather of the previous days, tomorrow will be rainy with probability β. Here S := {0, 1}, where state "0" refers to "rain" and "1" to "dry." Our Markov chain will take the values zero and one, according to the transition matrix

    P = ( α   1 − α )
        ( β   1 − β ).

Exercise 1. Simulate this Markov chain. We will soon see that the following limits exist:

    π_y := lim_{n→∞} P{X_n = y | X_0 = x}   for y, x ∈ {0, 1},

and that π_y does not depend on x. Apply this fact to compute π_0 and π_1 via Monte Carlo simulation. These are the respective probabilities of rainy versus dry days in the "long run."

1.2 The simple walk. The simple random walk on Z is the Markov chain whose transitions are described as

    P(x, y) = 1/2 if y = x + 1 or y = x − 1,
    P(x, y) = 0 otherwise.

That is, at any given time we go to the left or right of our present position with probability 1/2, independently of how we arrived at our present position. More generally, a simple walk on the vertices of a graph G moves to each of its current neighbors with equal probability at all times.

Exercise 2. Let X = {X_n}_{n=0}^∞ be the simple walk on Z, and suppose X_0 = 0 with probability one. For every integer x, let T_x denote the first time n such that X_n = x. Use Monte Carlo simulation to convince yourself that P{T_1 < T_{−1}} = 1/2. Can you prove that this actually holds? It does.

Exercise 3. Let X denote the simple walk on Z, with X_0 = 0.
1. Prove that Y_n := X_n − X_{n−1} (n ≥ 1) defines a collection of independent random variables with P{Y_n = 1} = P{Y_n = −1} = 1/2.
2. Prove that X_n/n goes to zero in probability; that is, for all ε > 0,

    lim_{n→∞} P{|X_n/n| > ε} = 0.

3. Use Monte Carlo simulation to verify this phenomenon.

2 The Chapman–Kolmogorov equations

By using the transition matrix P of the Markov chain X, we can compute all sorts of probabilities. For instance, suppose we want the distribution of X_2, given that the initial value of the chain is X_0 = a, for some constant a ∈ S such that P{X_0 = a} > 0.
We apply the law of total probabilities to find this as follows:

    P{X_2 = b | X_0 = a} = P{X_0 = a, X_2 = b} / P{X_0 = a}
      = (1/P{X_0 = a}) ∑_{c∈S} P{X_0 = a, X_1 = c, X_2 = b}
      = (1/P{X_0 = a}) ∑_{c∈S} P{X_2 = b | X_0 = a, X_1 = c} × P{X_0 = a, X_1 = c}
      = (1/P{X_0 = a}) ∑_{c∈S} P(c, b) × P{X_0 = a, X_1 = c}.

Another round of conditioning shows us that P{X_0 = a, X_1 = c} = P{X_0 = a} × P(a, c). Thus,

    P{X_2 = b | X_0 = a} = ∑_{c∈S} P(a, c) P(c, b) = P^2(a, b),

where P^2(a, b) denotes the (a, b) element of the matrix P^2 = P·P. The following is only slightly more general, and is proved similarly.

Theorem 1. For all integers n ≥ 0 and all a, b ∈ S,

    P{X_n = b | X_0 = a} = P^n(a, b),

where P^n(a, b) represents the (a, b) coordinate of the matrix P^n = P ⋯ P (n times), and P^0 := I denotes the identity matrix.

In summary, P^n(a, b) is the probability that the Markov chain X goes from a to b in n time steps.

Exercise 4. Prove that, for all integers n, k ≥ 0 and a, b ∈ S,

    P{X_{n+k} = b | X_k = a} = P{X_n = b | X_0 = a} = P^n(a, b).

Here is a consequence of all of this:

    P^{n+m}(a, b) = ∑_{c∈S} P^n(a, c) P^m(c, b).

The collection of all these equations, as we vary n, m, a, and b, is called the Chapman–Kolmogorov equations; it turns out to have many useful consequences.

2.1 Simplified weather forecasting. Consider the simplified weather-forecasting problem (§1.1, page 8) with α = 1/2 and β = 1/4. The transition matrix is given by the following:

    P = ( 1/2   1/2 )
        ( 1/4   3/4 ).

Our present goal is to compute lim_{n→∞} P^n(x, y); that is, the long-run probability of making a transition from state x to state y. One can easily compute

    P^2 = ( 3/8    5/8   )
          ( 5/16   11/16 ).

And, with enough patience, P^3, etc. But how does one compute lim_{n→∞} P^n? Or is it even clear that the limit exists?

Exercise 5. Verify numerically that

    P^n → ( 1/3   2/3 )   as n → ∞.    (3)
          ( 1/3   2/3 )

Additionally, make sure that you understand the following interpretation of (3): regardless of whether we start with a rainy or a dry day, after a very long time any given day is rainy with probability ≈ 1/3 and dry with probability ≈ 2/3. A heuristic interpretation of the latter remark is that it is going to rain about one-third of the time.

And now we return to do some honest mathematics. The remainder of this discussion requires a little bit of linear algebra.
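Before the linear algebra, Exercise 5 can be checked numerically by brute-force matrix powering (a dependency-free sketch; the helper names are mine):

```python
def mat_mul(A, B):
    # Multiply two square matrices given as lists of rows.
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    # P^n by repeated multiplication, for integers n >= 1.
    result = P
    for _ in range(n - 1):
        result = mat_mul(result, P)
    return result

P = [[1/2, 1/2], [1/4, 3/4]]
P20 = mat_pow(P, 20)
# Both rows of P20 are numerically very close to (1/3, 2/3).
```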
Here is a method that works quite generally. Let v := (1, 2) and w := (1, −1). Then vP = v and wP = (1/4)w. That is, v and w are the two left eigenvectors of P, and they correspond, respectively, to the eigenvalues 1 and 1/4. (Note that v and w have not been normalized to have unit length.) More significantly, v and w together span all of R²; that is, every 2-dimensional vector u can be written as follows:

    u = a v + b w,    (4)

with a = (u_1 + u_2)/3 and b = (2u_1 − u_2)/3. (You need to check all of these computations!) Now we can right-multiply both sides of (4) by P^n to find that

    uP^n = a vP^n + b wP^n.

Induction reveals that vP^n = v and wP^n = (1/4)^n w for all integers n ≥ 1. Consequently, uP^n = a v + (1/4)^n b w, whence it follows that

    lim_{n→∞} uP^n = a v = ((u_1 + u_2)/3) (1, 2) = (u_1 + u_2) (1/3, 2/3).

Check, by plugging in u = (1, 0) and then u = (0, 1), that this verifies (3) (p. 10).

Exercise 6 (Harder). Prove that if

    P = ( α   1 − α )
        ( β   1 − β )

then

    lim_{n→∞} P^n(0, 0) = lim_{n→∞} P^n(1, 0) = β / (1 − α + β).

Conclude, in words, that the long-run proportion of rainy days is equal to β/(1 − α + β).

Exercise 7. Let P denote the transition matrix of a Markov chain on a finite state space S.
1. Prove that P1 = 1, where 1 is a vector of all ones. That is, λ = 1 is a right eigenvalue of P, with right eigenvector 1.
2. Use the preceding to prove that λ = 1 is also a left eigenvalue of P; the corresponding left eigenvector v solves vP = v.
3. Suppose that there exists a unique left eigenvector v for P such that v_j > 0 for all j. Prove that, in that case, lim_{n→∞} P^n(a, b) exists for all b ∈ S and does not depend on a.

Exercise 8.
1. Prove that the following is the transition matrix of a Markov chain with values in S = {0, 1, 2}:

    P = ( α   1 − α   0 )
        ( β   1 − β   0 )
        ( 0   0       1 )

where 0 < α, β < 1.
2. How many left eigenvectors correspond to the left eigenvalue λ = 1? Do any of these eigenvectors have positive coordinates?
3. Does lim_{n→∞} P^n(a, b) exist? If so, then can you compute it for all a, b ∈ {0, 1, 2}?

Exercise 9 (Computer exercise). Consider the transition matrix

    P = ( 0     1/2   1/2 )
        ( 1/2   0     1/2 )
        ( 1/2   1/2   0   ).

Describe the behavior of P^n for large values of n.
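For Exercise 9, one way to explore the question is to propagate an initial distribution one Chapman–Kolmogorov step at a time (a sketch; the helper name is mine):

```python
def step_distribution(mu, P):
    # One step of the chain at the level of distributions:
    # (mu P)_b = sum_a mu_a P(a, b), a Chapman-Kolmogorov step.
    n = len(P)
    return [sum(mu[a] * P[a][b] for a in range(n)) for b in range(n)]

P = [[0.0, 0.5, 0.5],
     [0.5, 0.0, 0.5],
     [0.5, 0.5, 0.0]]

mu = [1.0, 0.0, 0.0]  # start the chain in state 0
for _ in range(30):
    mu = step_distribution(mu, P)
# mu is now numerically close to (1/3, 1/3, 1/3); the same happens
# starting from either of the other two states.
```

This numerical experiment only suggests the answer to "describe the behavior of P^n"; by the symmetry of the chain, the guess (1/3, 1/3, 1/3) for each row is a natural one to then verify.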
Module 3: Generation of Continuous Random Variables

1 The inverse-transform method

This is the most basic method for generating continuous random variables that have "nice" densities. Suppose F is the distribution function of a continuous random variable, with a proper inverse F⁻¹.

Theorem 1. If U = Unif(0,1), then X := F⁻¹(U) is a sample from F.

Proof. We compute the distribution function of X:

    P{X ≤ x} = P{F⁻¹(U) ≤ x} = P{U ≤ F(x)},

since F is increasing. Note that y := F(x) is a number between zero and one. Since P{U ≤ y} = y for such y's, it follows that P{X ≤ x} = F(x) for all x. This has the desired effect. □

1.1 Exponential samples. Recall that X is Exponential(λ) if its density is given by

    f(x) = λ e^{−λx}   if x ≥ 0,
    f(x) = 0 otherwise.

The distribution function is given, for all x ≥ 0, as follows:

    F(x) = ∫_0^x λ e^{−λy} dy = 1 − e^{−λx}.

If 0 < y < 1 and y = F(x), then y = 1 − e^{−λx}. Therefore, we solve for x to find that x = −(1/λ) ln(1 − y). That is,

    F⁻¹(y) = −(1/λ) ln(1 − y)   if 0 < y < 1.

In particular, if U = Unif(0,1), then X := −(1/λ) ln(1 − U) is an Exponential(λ) random variable. This can be simplified further slightly: note that V := 1 − U is also a Unif(0,1) sample. So in order to generate an Exponential(λ) random variable X, we first generate a Unif(0,1) random variable V, and then set X := −(1/λ) ln V. Our discussion implies that X has the Exponential(λ) distribution.

Exercise 1 (Gamma densities). If α and λ are positive and fixed, then the following is a probability density:

    f(x) = (λ^α / Γ(α)) x^{α−1} e^{−λx}   if x > 0,
    f(x) = 0 otherwise,

where Γ denotes the Gamma function, which I recall is

    Γ(t) = ∫_0^∞ x^{t−1} e^{−x} dx   for all t > 0.    (5)

Show that if X = Gamma(k, λ) for an integer k ≥ 1, then X = T_1 + ⋯ + T_k, where the T_j's are independent Exponential(λ)'s. Use this to describe, carefully and in detail, a method for simulating Gamma(α, λ) in the case that α ≥ 1 is integral. Hint: Moment generating functions.

1.2 Pareto samples. A random variable X is said to have a Pareto density with parameter α > 0 if

    P{X > x} = x^{−α} for all x ≥ 1,   and   P{X > x} = 1 if x < 1.

Exercise 2 (Easy). Compute the Pareto(α) density function.
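The inverse transforms worked out in §1.1 translate directly to code (a sketch; the function names are mine, and `gamma_sample` uses the sum-of-exponentials representation from Exercise 1 for integral shape parameter):

```python
import math
import random

def exponential_sample(lam):
    # Inverse transform from Section 1.1: if V = Unif(0,1),
    # then -ln(V)/lam is Exponential(lam).  Using 1 - random.random()
    # keeps the argument of the logarithm strictly positive.
    return -math.log(1.0 - random.random()) / lam

def gamma_sample(k, lam):
    # Exercise 1: for integral k >= 1, a Gamma(k, lam) random variable
    # is a sum of k independent Exponential(lam) random variables.
    return sum(exponential_sample(lam) for _ in range(k))
```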
The distribution function F of X is described as follows: F(x) = 1 − x^{−α} if x ≥ 1, and F(x) = 0 if x < 1. If 0 < y < 1 satisfies y = F(x) for some x, then x has to be at least 1 (why?), and y = 1 − x^{−α}. Solve for x to find that x = (1 − y)^{−1/α}. That is,

    F⁻¹(y) = (1 − y)^{−1/α}   if 0 < y < 1.

Because 1 − U = Unif(0,1), it follows that U^{−1/α} is Pareto(α).

Exercise 3 (Weibull density). Compute the density of X := T^{1/α}, where T = Exponential(λ) and α, λ > 0. Explain how you would simulate X.

Exercise 4 (Cauchy density). Show that if U = Unif(0,1), then C := tan(π(U − 1/2)) has the following density function:

    f(x) = 1/(π(1 + x²))   for −∞ < x < ∞.

Exercise 5 (Rayleigh density). A random variable X is said to have a Rayleigh density if X ≥ 0 and P{X > x} = e^{−x²/2} for x > 0. Explain, in detail, how you would simulate from a Rayleigh density.

Exercise 6 (Beta density). Consider the density function

    f(x) = (1/π) x^{−1/2} (1 − x)^{−1/2}   if 0 < x < 1,
    f(x) = 0 otherwise.

Show that if U = Unif(0,1), then Y := sin²(π(U − 1/2)) has density f. Suggestion: First consider X := sin(π(U − 1/2)), and then Y := X².

2 The acceptance-rejection method

Sometimes the distribution function, and hence its inverse, are hard to compute. In some such cases one can employ the so-called acceptance-rejection method. This is an iterative method which frequently converges rapidly. Consider two density functions f and g that satisfy the following "domination condition," for a finite constant c:

    f(x) ≤ c g(x)   for all x.    (6)

We suppose we know how to sample from g, and propose to use that fact in order to simulate from f.

Exercise 7 (Easy). Prove that c ≥ 1.

Let Y_1, Y_2, … be an independent sequence of random variables with density function g, and U_1, U_2, … an independent sequence of Unif(0,1) random variables, totally independent of the Y's. Define a random variable N to be the smallest k ≥ 1 such that

    U_k ≤ f(Y_k) / (c g(Y_k)).    (7)

That is, if the preceding condition holds, then we accept Y_k as the simulated value of X.

Theorem 2. We have P{N < ∞} = 1 and E N = c. Moreover, X := Y_N is a continuous random variable with density function f.

Proof. First, we make a few observations.
• Let A denote the collection of all points y such that g(y) = 0. Then

    P{g(Y_k) = 0} = P{Y_k ∈ A} = ∫_A g(z) dz = 0.

Therefore, (7) is well defined with probability one.

• For every integer k ≥ 1,

    P{U_k ≤ f(Y_k)/(c g(Y_k))} = ∫_{−∞}^∞ P{U_k ≤ f(z)/(c g(z))} g(z) dz
                               = ∫ (f(z)/(c g(z))) g(z) dz = (1/c) ∫ f(z) dz = 1/c.

Therefore, N = Geometric(1/c). In particular, P{N < ∞} = 1. In fact, we also have E N = c.

In order to conclude the proof, it suffices to show that P{Y_N ≤ x} = ∫_{−∞}^x f(z) dz for all x. Note that we have established already that Y_N is well defined, since N < ∞. We compute:

    P{Y_N ≤ x} = ∑_{k=1}^∞ P{Y_k ≤ x, N = k} = ∑_{k=1}^∞ ∫_{−∞}^x P{N = k | Y_k = z} g(z) dz.

Given that Y_k = z, the conditional probability of {N = k} is

    (1 − 1/c)^{k−1} · f(z)/(c g(z)).

Therefore,

    P{Y_N ≤ x} = ∑_{k=1}^∞ (1 − 1/c)^{k−1} ∫_{−∞}^x (f(z)/(c g(z))) g(z) dz = ∫_{−∞}^x f(z) dz,

because ∑_{k=1}^∞ r^{k−1} = 1/(1 − r) for 0 < r < 1, here with r = 1 − 1/c. Differentiate both sides of the preceding to find that the density of Y_N is f. □

Check that the following produces a simulation of a random variable from f, using random variables from g and some extra uniforms:

1. Generate a Unif(0,1) random variable U.
2. Generate a random variable Y from g.
3. If U ≤ f(Y)/(c g(Y)), then set X = Y and stop.
4. Else go to Step 1.

2.1 Normal samples. Suppose we wish to simulate from the standard normal density

    f(x) = (1/√(2π)) e^{−x²/2}.

Consider the "double exponential" density

    g(x) = (1/2) e^{−|x|}.

Then

    f(x)/g(x) = √(2/π) e^{|x| − x²/2} = √(2/π) e^{h(x)},   where h(x) := |x| − x²/2.

The function h is clearly symmetric about zero. Moreover, if x > 0 then h′(x) = 1 − x and h″(x) = −1. Therefore, h is maximized on the positive axis at x = 1, and max_{x>0} h(x) = h(1) = 1/2. By symmetry, max_{x<0} h(x) = h(−1) = 1/2 as well. Because h(0) = 0, we find that max_x h(x) = 1/2, and hence

    f(x) ≤ √(2e/π) g(x)   for all x.

That is, (6) holds with c = √(2e/π) ≈ 1.32. It remains to know how one can sample from g. But that is not hard. Let G(x) := ∫_{−∞}^x g(z) dz define the distribution function for g. If x ≤ 0, then

    G(x) = (1/2) ∫_{−∞}^x e^z dz = (1/2) e^x.

And if x > 0, then, by the symmetry of g,

    G(x) = 1/2 + (1/2) ∫_0^x e^{−z} dz = 1 − (1/2) e^{−x}.

Note:

• If 0 < y ≤ 1/2 and G(x) = y, then e^x = 2y, whence x = ln(2y). That is, G⁻¹(y) = ln(2y) if 0 < y ≤ 1/2.
• If 1/2 < y < 1 and G(x) = y, then 1 − (1/2)e^{−x} = y, whence G⁻¹(y) = −ln(2(1 − y)).
To summarize,

    G⁻¹(y) = ln(2y)            if 0 < y ≤ 1/2,
    G⁻¹(y) = −ln(2(1 − y))     if 1/2 < y < 1.

Therefore, in order to generate a random variable Y from g, we first generate a Unif(0,1) random variable U, and then define Y := G⁻¹(U). The acceptance-rejection method can now be applied; this yields a sample from the standard normal density f.

Exercise 8. Prove that if Z = N(0,1), a ∈ R, and b > 0, then X := a + b^{1/2} Z is N(a, b). Apply this to describe, in detail and carefully, how one can simulate an N(μ, σ²).

2.2 Beta samples. Choose and fix two constants α, β > 0. The beta density with parameters α and β is

    f(x) = (Γ(α + β)/(Γ(α)Γ(β))) x^{α−1} (1 − x)^{β−1}   if 0 < x < 1,
    f(x) = 0 otherwise,

where Γ is the Gamma function; see (5) on page 14. I will take for granted that ∫ f(x) dx = 1, though proving that requires work. You should check that the Beta(1,1) density is the same thing as Unif(0,1). Now suppose α, β ≥ 1, and define g to be the Unif(0,1) density. Then

    f(x) ≤ (Γ(α + β)/(Γ(α)Γ(β))) g(x)   if 0 < x < 1.

That is, equation (6) on page 15 holds with c = Γ(α + β)/(Γ(α)Γ(β)).

Exercise 9 (Beta density). Let f denote the Beta(α, β) density with 1 > α, β > 1/2. Let g denote the Beta(1/2, 1/2) density of Exercise 6, g(x) = π⁻¹ x^{−1/2}(1 − x)^{−1/2}. First verify that (6) on page 15 holds with c = π Γ(α + β)/(Γ(α)Γ(β)). Then explain, clearly and in detail, how one can simulate from f. Hint: Consult Exercise 6 on page 15.
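Putting §2.1 together in code: sample Y from g by the inverse transform G⁻¹, then accept Y with probability f(Y)/(c g(Y)) = exp(−(|Y| − 1)²/2) (a sketch; the function names are mine):

```python
import math
import random

def double_exp_sample():
    # Inverse transform for g(x) = (1/2) e^{-|x|}, using G^{-1} above.
    u = random.random()
    if 0.0 < u <= 0.5:
        return math.log(2.0 * u)
    return -math.log(2.0 * (1.0 - u))

def normal_sample():
    # Acceptance-rejection with c = sqrt(2e/pi): accept Y when
    # U <= f(Y)/(c g(Y)) = exp(-(|Y| - 1)^2 / 2).
    while True:
        u = random.random()
        y = double_exp_sample()
        if u <= math.exp(-((abs(y) - 1.0) ** 2) / 2.0):
            return y
```

By Theorem 2, the expected number of proposals per accepted sample is c = √(2e/π) ≈ 1.32, so the loop is short on average; combined with Exercise 8, `a + math.sqrt(b) * normal_sample()` would give an N(a, b) sample.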