Class notes for BMED 8813 (Special Topics) at the Georgia Institute of Technology, taught by Garrett Stanley.
Georgia Institute of Technology & Emory University
Coulter Department of Biomedical Engineering
BMED 8813 NEUROPHYSIOLOGY, Fall 2008

Point Processes

A random point process is a mathematical model for a physical phenomenon characterized by highly localized events distributed randomly in a continuum. It is often of interest to count the numbers of points in subsets of the continuum and to describe the statistics of their locations. This is of particular importance in neuronal coding, where information is transmitted as sequences of neuronal events in time. Under certain conditions, which will be described in the following sections, a point process may be a Poisson process. The Poisson process is a commonly utilized process, perhaps analogous to the Gaussian distribution for continuous random variables. The defining characteristic of the Poisson process is that the numbers of events in disjoint intervals are independent, a condition which we will subsequently relax in the next collection of notes. Much of the material presented here has been adapted from Random Point Processes by Snyder [1], which contains a significantly more detailed description of the statistical properties of point processes.

1 Point Process Notation

Before providing the conditions under which a random point process is a Poisson process, we must first introduce some general notation for describing stochastic point processes. Consider a space $\Omega$ on which a point process is defined. A realization $w$ is a denumerable point set on $\Omega$, and can therefore be expressed as $w = \{w_1, w_2, \ldots\}$, where each $w_i$ denotes the coordinate of a point in $\Omega$. Let $A$ be a subset of $\Omega$, and denote by $N_A(w)$ the number of points in $w$ that lie in $A$:
$$N_A(w) = \sum_i 1_A(w_i),$$
where $1_A(w)$ denotes an indicator function:
$$1_A(w_i) = \begin{cases} 1, & w_i \in A \\ 0, & w_i \notin A. \end{cases}$$
We will suppress the dependence on $w$ in further notation, allowing $N_A$ to represent the counting process. As an example, suppose that $\Omega = [t_0, \infty)$ and $A$ is an interval of time, $A = \{t : s \le t < u\}$, where $t_0 \le s$.
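The indicator-sum notation above translates directly into code. The following is a minimal sketch (the realization values are made up for illustration):

```python
# Sketch of the counting notation: N_A(w) = sum_i 1_A(w_i), where A is
# an interval A = {t : s <= t < u}. The realization w below is hypothetical.

def indicator(wi, s, u):
    """1_A(w_i) for the interval A = {t : s <= t < u}."""
    return 1 if s <= wi < u else 0

def count(w, s, u):
    """N_A: the number of points of the realization w that lie in [s, u)."""
    return sum(indicator(wi, s, u) for wi in w)

w = [0.7, 1.3, 2.0, 3.6, 4.1]    # hypothetical realization {w_1, ..., w_5}
print(count(w, 1.0, 4.0))         # 3 points fall in [1, 4)
```

Note that `count` is additive over disjoint sets, which is the set-function property used throughout these notes.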
For the set $\{t : s \le t < u\}$, the following notation is introduced:
$$N_{s,u} \equiv N(\{t : s \le t < u\}),$$
so $N_{s,u}$ is the number of points in the interval from $s$ to $u$. A stochastic process is introduced, $\{N_t, t \ge t_0\}$, where $N_{t_0}$ is a non-negative integer that will typically be assumed zero. As a function of $t$, $N_t$ has a unit jump at the occurrence of each point and is left-continuous at each jump. Note that $N$ is an additive set function: the number of points in a union of disjoint sets is the sum of the numbers in each set. We therefore have that $N_{s,t} = N_t - N_s$, where $N_{s,t}$ is the random variable denoting the increment of the counting process $N$ on the interval $[s, t)$.

2 Properties of a Point Process

Orderliness. A counting process $\{N_t, t \ge t_0\}$ is called orderly at time $t \ge t_0$ if $\forall \epsilon > 0$, $\exists \delta \equiv \delta(t, \epsilon) > 0$ such that
$$\Pr[N_{t,t+\Delta} > 1] \le \epsilon \Pr[N_{t,t+\Delta} = 1] \quad \forall \Delta \in (0, \delta).$$
A point process is orderly on an interval of time if it is orderly for each time in the interval, and it is uniformly orderly on the interval if a single quantity $\delta \equiv \delta(\epsilon)$ can be chosen independently of $t$ so that orderliness holds simultaneously for all times in the interval.

Conditional orderliness. A counting process $\{N_t, t \ge t_0\}$ is conditionally orderly at time $t \ge t_0$ if $\forall P$ and $\forall \epsilon > 0$, $\exists \delta \equiv \delta(t, \epsilon) > 0$ such that
$$\Pr[N_{t,t+\Delta} > 1 \mid P] \le \epsilon \Pr[N_{t,t+\Delta} = 1 \mid P] \quad \forall \Delta \in (0, \delta),$$
where $P$ is any event.

Evolution without aftereffects. A point process on $[t_0, \infty)$ is said to evolve without aftereffects if, for any $t \ge t_0$, the realization of points during the interval $[t, \infty)$ does not depend in any way on the sequence of events that have occurred in the interval $[t_0, t)$. This implies independence of the past and future of the point process. Formally, we can express this condition in the following manner. Let $P$ be any event that can be associated with the random variables $\{N_\sigma, t_0 \le \sigma < t\}$. For example, $P$ could be that the number of points on $[t_0, t)$ is $n$; or that the number of points on $[t_0, t)$ is $n$ and they are located at times $W_1, W_2, \ldots, W_n$; or that the first point to occur in $[t_0, t)$ is at $W_1$; etc. Let $F$ be any event associated with the random variables $\{N_\sigma, \sigma \ge t\}$. If $t$ denotes the
present time, then $P$ and $F$ are arbitrary events in the past and future of the process, respectively. A point process evolves without aftereffects if the conditional probability of $F$ given $P$ equals the unconditional probability of $F$, $\forall t \ge t_0$.

3 Definition of a Poisson Process

A Poisson counting process for times $t \ge t_0$ is, by definition, a counting process $\{N_t, t \ge t_0\}$ with the following properties:

(i) $\Pr[N_{t_0} = 0] = 1$;

(ii) for $t_0 \le s < t$, the increment $N_{s,t} = N_t - N_s$ is Poisson distributed with parameter $\Lambda(t) - \Lambda(s)$:
$$\Pr[N_{s,t} = n] = \frac{[\Lambda(t) - \Lambda(s)]^n}{n!}\,\exp\{-[\Lambda(t) - \Lambda(s)]\}$$
for $n = 0, 1, 2, \ldots$, where $\Lambda$ is a non-negative, non-decreasing function of $t$;

(iii) $\{N_t, t \ge t_0\}$ has independent increments.

The third property is the distinguishing property. It implies that the numbers of points in nonoverlapping intervals are statistically independent, no matter how large or small the intervals. Precisely, if $[t_i, u_i)$ for $i = 1, 2, \ldots, k$ are $k$ disjoint intervals on $[t_0, \infty)$, then
$$\Pr[N_{t_1,u_1} = n_1, N_{t_2,u_2} = n_2, \ldots, N_{t_k,u_k} = n_k] = \prod_{i=1}^k \Pr[N_{t_i,u_i} = n_i].$$
Also note that these properties provide a complete statistical characterization of the Poisson counting process. The joint counting probability $\Pr[N_{t_1} = n_1, N_{t_2} = n_2, \ldots, N_{t_k} = n_k]$ can be determined for any collection of times $t_1 < t_2 < \cdots < t_k$ in $[t_0, \infty)$ and any collection of non-negative integers $n_1, n_2, \ldots, n_k$, where $k = 1, 2, \ldots$. This can be accomplished in the following manner:
$$\Pr[N_{t_1} = n_1, \ldots, N_{t_k} = n_k] = \Pr[N_{t_0,t_1} = n_1, N_{t_1,t_2} = n_2 - n_1, \ldots, N_{t_{k-1},t_k} = n_k - n_{k-1}] = \prod_{i=1}^k \Pr[N_{t_{i-1},t_i} = n_i - n_{i-1}].$$

The parameter function. The term $\Lambda$ is called the parameter function of the Poisson counting process. This function need only be non-negative and non-decreasing; in general it does not have to be continuous or differentiable. However, for the purposes of our discussion, we will assume that the function is both continuous and differentiable.
Let us now introduce a more formal set of criteria for a point process to be a Poisson point process.

Theorem (Conditions for a Poisson process). Let $\{N_t, t \ge t_0\}$ be the counting process associated with a point process on $[t_0, \infty)$. Suppose that:

1. the point process is conditionally orderly;

2. $\forall t \ge t_0$ and for an arbitrary event $P$ associated with the random variables $\{N_\sigma, t_0 \le \sigma < t\}$, the limit
$$\lambda(t) \equiv \lim_{\Delta \to 0} \frac{1}{\Delta}\,\Pr[N_{t,t+\Delta} = 1 \mid P]$$
exists, and $\int_s^t \lambda(\sigma)\,d\sigma$ exists and is finite for all finite intervals $[s, t)$, $t_0 \le s \le t$;

3. $\Pr[N_{t_0} = 0] = 1$.

Then $\{N_t, t \ge t_0\}$ is a Poisson counting process with an absolutely continuous parameter function $\Lambda(t) \equiv \int_{t_0}^t \lambda(\sigma)\,d\sigma$.

Proof. We need to show both that the increment $N_{s,t} = N_t - N_s$ is Poisson distributed with parameter $\int_s^t \lambda(\sigma)\,d\sigma$, and that $\{N_t, t \ge t_0\}$ has independent increments. The first part of the proof seeks to derive a differential equation for $\Pr[N_{s,t} = n]$ and then demonstrate that the solution to the differential equation is the Poisson distribution. For $n \ge 1$, the event $\{N_{s,t+\Delta} = n\}$ can occur in $n+1$ mutually exclusive ways:
$$\Pr[N_{s,t+\Delta} = n] = \sum_{k=0}^n \Pr[N_{t,t+\Delta} = k,\, N_{s,t} = n-k]$$
$$= \Pr[N_{t,t+\Delta} = 0 \mid N_{s,t} = n]\Pr[N_{s,t} = n] + \Pr[N_{t,t+\Delta} = 1 \mid N_{s,t} = n-1]\Pr[N_{s,t} = n-1] + \sum_{k=2}^n \Pr[N_{t,t+\Delta} = k \mid N_{s,t} = n-k]\Pr[N_{s,t} = n-k].$$
From this basic relationship, we can find the derivative of the counting probability $\Pr[N_{s,t} = n]$ with respect to $t$, which exists and satisfies
$$\frac{d\Pr[N_{s,t} = n]}{dt} = -\lambda(t)\Pr[N_{s,t} = n] + \lambda(t)\Pr[N_{s,t} = n-1]$$
for $n \ge 1$ and $t_0 \le s \le t$ (see Appendix Section 6.1.1 for details of the derivation). Following a similar argument,
$$\frac{d\Pr[N_{s,t} = 0]}{dt} = -\lambda(t)\Pr[N_{s,t} = 0]$$
for $t_0 \le s \le t$. The initial conditions are given by
$$\lim_{t \downarrow s} \Pr[N_{s,t} = n] = \delta_{n0}$$
for $n = 0, 1, 2, \ldots$, where $\delta_{n0}$ is the Kronecker delta. This differential-difference equation can be solved sequentially, resulting in
$$\Pr[N_{s,t} = n] = \frac{\left[\int_s^t \lambda(\sigma)\,d\sigma\right]^n}{n!}\exp\left(-\int_s^t \lambda(\sigma)\,d\sigma\right) \qquad (1)$$
for $n = 0, 1, 2, \ldots$ and $t_0 \le s \le t$. The proof that Equation (1) is a solution to the differential equation is left as an exercise.

By the same argument used to derive (1), we can show that $\{N_t, t \ge t_0\}$ has independent increments. Note that the derivation of (1) is unchanged if $\Pr[N_{s,t} = n]$ is replaced by $\Pr[N_{s,t} = n \mid P]$ throughout, where $P$ is any event determined by the random variables $\{N_\sigma, t_0 \le \sigma < s\}$. We find that
$$\Pr[N_{s,t} = n \mid P] = \frac{\left[\int_s^t \lambda(\sigma)\,d\sigma\right]^n}{n!}\exp\left(-\int_s^t \lambda(\sigma)\,d\sigma\right), \qquad (2)$$
which implies that $\Pr[N_{s,t} = n \mid P] = \Pr[N_{s,t} = n]$. So the number of points in
$[s, t)$ is independent of any events that occur in the interval $[t_0, s)$, establishing the theorem. □

Properties. Consider the characteristic function $\Phi_{s,t}(jv) \equiv E[\exp(jv N_{s,t})]$, where $N_{s,t}$ is the number of points on $[s, t)$. This gives
$$\Phi_{s,t}(jv) = \sum_{n=0}^\infty e^{jvn}\,\frac{[\Lambda(t)-\Lambda(s)]^n}{n!}\,\exp\{-[\Lambda(t)-\Lambda(s)]\} = \exp\{-[\Lambda(t)-\Lambda(s)]\}\sum_{n=0}^\infty \frac{\{[\Lambda(t)-\Lambda(s)]\,e^{jv}\}^n}{n!} = \exp\{[\Lambda(t)-\Lambda(s)](e^{jv}-1)\}.$$
We therefore have
$$E[N_{s,t}] = j^{-1}\Phi^{(1)}_{s,t}(j0) = \Lambda(t) - \Lambda(s)$$
and
$$E[N_{s,t}^2] = j^{-2}\Phi^{(2)}_{s,t}(j0) = \Lambda(t) - \Lambda(s) + [\Lambda(t) - \Lambda(s)]^2,$$
where $\Phi^{(k)}_{s,t}(j0)$ denotes the $k$th derivative of $\Phi_{s,t}(jv)$ with respect to $v$, evaluated at $v = 0$. Note that the variance of $N_{s,t}$ is
$$\mathrm{var}(N_{s,t}) = E[N_{s,t}^2] - E[N_{s,t}]^2 = \Lambda(t) - \Lambda(s) + [\Lambda(t)-\Lambda(s)]^2 - [\Lambda(t)-\Lambda(s)]^2 = \Lambda(t) - \Lambda(s) = E[N_{s,t}].$$
So, for a Poisson process, the variance of the number of points on the interval $[s, t)$ equals the expected number of points on the interval.

4 Time statistics of the Poisson counting process

Associated with the Poisson counting process are several well-defined statistical measures. We have already discussed the statistics of the number of points occurring in an arbitrary interval. Two further measures have to do with the statistics of the intervals between events and the statistics of the actual point locations, collectively described as the time statistics of the process. Let $\{W_n\}$ denote the times of the events, often called occurrence times, and let the sequence $\{T_n\}$ denote the intervals between events, often called inter-arrival times. By convention, $T_n$ is the $n$th inter-arrival time, between the $(n-1)$st and $n$th occurrence times: $T_n \equiv W_n - W_{n-1}$. Let $\{N_t, t \ge t_0\}$ be a Poisson counting process with intensity function $\lambda(t)$ for all $t \ge t_0$; the parameter function is thus $\Lambda(t) = \int_{t_0}^t \lambda(\sigma)\,d\sigma$ for $t \ge t_0$.

Occurrence times. Denote the joint probability density for the first $n$ occurrence times as $f_W(w)$, with occurrence times $W = (W_1, W_2, \ldots, W_n)$. In order to determine the form of this density, we first consider a partitioning into disjoint intervals. Note that $W_i \in [w_i, w_i + \Delta w_i)$ for $i = 1, 2, \ldots, n$. We therefore consider the probability of the event
$$\{N_{t_0,w_1} = 0,\; N_{w_1,w_1+\Delta w_1} = 1,\; N_{w_1+\Delta w_1,w_2} = 0,\; N_{w_2,w_2+\Delta w_2} = 1,\; \ldots,\; N_{w_n,w_n+\Delta w_n} = 1\}.$$
Using the
independence of disjoint increments, we have
$$\Pr[W_i \in [w_i, w_i + \Delta w_i),\; i = 1, \ldots, n] = \left[\prod_{i=1}^n \int_{w_i}^{w_i+\Delta w_i} \lambda(\sigma)\,d\sigma\right]\exp\left(-\int_{t_0}^{w_n+\Delta w_n} \lambda(\sigma)\,d\sigma\right).$$
We therefore have that
$$f_W(w) = \lim_{\max_i \Delta w_i \to 0} \frac{\Pr[W_i \in [w_i, w_i+\Delta w_i),\; i = 1, \ldots, n]}{\prod_{i=1}^n \Delta w_i},$$
which reduces to
$$f_W(w) = \left[\prod_{i=1}^n \lambda(w_i)\right]\exp\left(-\int_{t_0}^{w_n} \lambda(\sigma)\,d\sigma\right) \qquad (3)$$
for $t_0 \le w_1 \le w_2 \le \cdots \le w_n$, and 0 otherwise. The joint occurrence density for a homogeneous Poisson process with constant intensity $\lambda$ is therefore
$$f_W(w) = \lambda^n \exp[-\lambda(w_n - t_0)]$$
for $t_0 \le w_1 \le w_2 \le \cdots \le w_n$, and 0 otherwise.

We can now easily demonstrate that the occurrence time sequence $W_1, W_2, \ldots, W_n$ for an inhomogeneous Poisson process is a Markov sequence. The process is called a Markov process if for any $k \ge 2$ there holds
$$\Pr[W_k \le w_k \mid W_1 \le w_1, \ldots, W_{k-1} \le w_{k-1}] = \Pr[W_k \le w_k \mid W_{k-1} \le w_{k-1}].$$
This can be interpreted qualitatively as a dependence of $W_k$ only on the most recent past, $W_{k-1}$. Using (3), we can express the conditional density as
$$f_{W_n|W_{n-1},\ldots,W_1}(w_n \mid w_{n-1}, \ldots, w_1) = \frac{f_W(w)}{f_{W_1,\ldots,W_{n-1}}(w_1, \ldots, w_{n-1})} = \lambda(w_n)\exp\{-[\Lambda(w_n) - \Lambda(w_{n-1})]\} = f_{W_n|W_{n-1}}(w_n \mid w_{n-1}),$$
which implies that the inhomogeneous Poisson process is indeed Markov. It will be left as an exercise to show that the occurrence times for a homogeneous Poisson process are gamma distributed.

Inter-arrival times. Note that the occurrence time can be written as $W_n = W_{n-1} + T_n$. The further implication is that
$$f_{T_n|W_{n-1},\ldots,W_1}(t_n \mid w_{n-1}, \ldots, w_1) = f_{W_n|W_{n-1},\ldots,W_1}(w_{n-1} + t_n \mid w_{n-1}, \ldots, w_1) = \lambda(w_{n-1} + t_n)\exp\{-[\Lambda(w_{n-1} + t_n) - \Lambda(w_{n-1})]\} \qquad (4)$$
for $n = 2, 3, \ldots$ and $t_n \ge 0$. For a homogeneous Poisson process, we find that $\Lambda(t) = \lambda(t - t_0)$. For this case, the expression in (4) becomes
$$f_{T_n|W_{n-1},\ldots,W_1}(t_n \mid w_{n-1}, \ldots, w_1) = \lambda\exp(-\lambda t_n).$$
So the inter-arrival times for a homogeneous Poisson process with intensity $\lambda$ are identically distributed, with the exponential distribution with parameter $\lambda$. It can also be shown that
$$f_{T_n|T_{n-1},\ldots,T_1}(t_n \mid t_{n-1}, \ldots, t_1) = \lambda(t_0 + t_1 + \cdots + t_{n-1} + t_n)\exp\{-[\Lambda(t_0 + t_1 + \cdots + t_{n-1} + t_n) - \Lambda(t_0 + t_1 + \cdots + t_{n-1})]\}.$$
From this we have that the inter-arrivals are independent for a homogeneous Poisson process, stated more formally in the following statement: for a homogeneous Poisson counting process with intensity $\lambda$, the inter-arrival times $T_1, T_2, \ldots, T_n$ are independent and identically distributed as exponential with parameter $\lambda$. This can easily be shown by examining the joint density of the inter-arrival times.
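This i.i.d. exponential structure can also be checked by simulation, as in the minimal sketch below (the rate and time horizon are arbitrary choices):

```python
import random, statistics

def homogeneous_poisson(lam, t0, t_end, seed=0):
    """Occurrence times on [t0, t_end), built by accumulating i.i.d.
    exponential(lam) inter-arrival times T_n = W_n - W_{n-1}."""
    rng = random.Random(seed)
    times, t = [], t0
    while True:
        t += rng.expovariate(lam)
        if t >= t_end:
            return times
        times.append(t)

w = homogeneous_poisson(lam=5.0, t0=0.0, t_end=2000.0)
gaps = [b - a for a, b in zip(w, w[1:])]
print(len(w), statistics.mean(gaps))   # mean gap approaches 1/lam = 0.2
```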
The density of the inter-arrival times factors as
$$f_T(t) = f_{T_n|T_{n-1},\ldots,T_1}(t_n \mid t_{n-1}, \ldots, t_1)\, f_{T_{n-1}|T_{n-2},\ldots,T_1}(t_{n-1} \mid t_{n-2}, \ldots, t_1)\cdots f_{T_1}(t_1),$$
which becomes
$$f_T(t) = \lambda^n \exp\left(-\lambda\sum_{i=1}^n t_i\right), \qquad t_i \ge 0.$$
It is clear, however, that the inter-arrival times for an inhomogeneous Poisson process are not independent.

5 Simulation

In this section, the simulation of the Poisson counting process will be described.

Homogeneous Poisson. Suppose that $\{U_i\}$ is a sequence of independent, uniformly distributed random variables on $(0, 1)$. Then a sequence of independent exponentially distributed random variables can be obtained from the transformation $T_i = -\frac{1}{\lambda}\ln U_i$, resulting in a probability density for $T_i$ that is $\lambda e^{-\lambda t}$. A homogeneous Poisson process can therefore be simulated by generating a realization of $U_1$, transforming to $T_1$, and assigning $T_1$ as the time from $t_0$ to the first occurrence time; generating a realization of $U_2$, transforming to $T_2$, and assigning $T_2$ as the time from the first to the second occurrence time; and so on.

Inhomogeneous Poisson. Suppose that we wish to generate a realization of a random variable $N_{s,t}$ that is Poisson distributed with parameter $\int_s^t \lambda(\sigma)\,d\sigma$. Define $\{M_t, t \ge 0\}$ to be a homogeneous Poisson process with unit intensity. Then $M_\tau$ is Poisson distributed with parameter $\tau$, and $M_\tau$ and $N_{s,t}$ have the same distribution for $\tau = \int_s^t \lambda(\sigma)\,d\sigma$. The procedure for generating this realization is to first produce a sequence of independent unit-parameter exponentially distributed variables to obtain a realization of $\{M_t, t \ge 0\}$; the number of points in $[0, \tau)$ is then a realization of $N_{s,t}$.
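The inhomogeneous recipe can be sketched as follows; the sinusoidally modulated intensity is a hypothetical choice (e.g., a periodically driven neuron), and the integral defining $\tau$ is approximated by a trapezoidal rule:

```python
import math, random

def integral(rate, s, t, n=10_000):
    """Trapezoidal approximation of tau = int_s^t lambda(sigma) d sigma."""
    h = (t - s) / n
    return sum(0.5 * (rate(s + i * h) + rate(s + (i + 1) * h)) * h
               for i in range(n))

def inhomogeneous_count(tau, rng):
    """Realization of N_{s,t}: count unit-intensity events on [0, tau)."""
    count, u = 0, rng.expovariate(1.0)
    while u < tau:
        count += 1
        u += rng.expovariate(1.0)
    return count

rate = lambda t: 10.0 * (1.0 + math.sin(t))   # hypothetical intensity
tau = integral(rate, 0.0, 2.0)                 # = 10*(3 - cos 2) ~ 34.16
rng = random.Random(7)
samples = [inhomogeneous_count(tau, rng) for _ in range(2000)]
print(tau, sum(samples) / len(samples))        # sample mean approaches tau
```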
6 The Sample Function Density

The sample function density is an important quantity for the Poisson counting process, as well as for the more general counting processes we will encounter. The sample function density for the counting process $N$ on the interval $[t_0, t)$ is defined as
$$p(\{N_\sigma, t_0 \le \sigma < t\}) = \begin{cases} \Pr[N_t = 0], & N_t = 0 \\ f_{W,N_t}(w, n), & N_t = n \ge 1, \end{cases} \qquad (5)$$
where
$$f_{W,N_t}(w, n) \equiv \Pr[N_t = n \mid W_1 = w_1, W_2 = w_2, \ldots, W_n = w_n]\, f_W(w),$$
and where again $f_W(w)$ represents the probability density for the first $n$ occurrence times. In words, the sample function density is the probability of obtaining a particular realization of the point process on $[t_0, t)$, with $N_t = n$ points located at times $W_1 = w_1, W_2 = w_2, \ldots, W_n = w_n$.

Suppose that the intensity function is again represented by $\lambda(t)$. The sample function density can be expressed as
$$p(\{N_\sigma, t_0 \le \sigma < t\}) = \begin{cases} \exp\left(-\int_{t_0}^t \lambda(\sigma)\,d\sigma\right), & N_t = 0 \\[4pt] \left[\prod_{i=1}^n \lambda(w_i)\right]\exp\left(-\int_{t_0}^t \lambda(\sigma)\,d\sigma\right), & N_t = n \ge 1. \end{cases} \qquad (6)$$
The expression in Equation (6) can be proven in the following manner. Recall that the occurrence density is
$$f_W(w) = \left[\prod_{i=1}^n \lambda(w_i)\right]\exp\left(-\int_{t_0}^{w_n} \lambda(\sigma)\,d\sigma\right). \qquad (7)$$
Further, we have
$$\Pr[N_t = n \mid W_1 = w_1, W_2 = w_2, \ldots, W_n = w_n] = \Pr[N_{w_n,t} = 0] = \exp\left(-\int_{w_n}^t \lambda(\sigma)\,d\sigma\right), \qquad (8)$$
so that
$$f_{W,N_t}(w, n) = \exp\left(-\int_{w_n}^t \lambda(\sigma)\,d\sigma\right)\left[\prod_{i=1}^n \lambda(w_i)\right]\exp\left(-\int_{t_0}^{w_n} \lambda(\sigma)\,d\sigma\right) = \left[\prod_{i=1}^n \lambda(w_i)\right]\exp\left(-\int_{t_0}^t \lambda(\sigma)\,d\sigma\right).$$
The expression in Equation (5) thus reduces to the expression in Equation (6). We will also express the sample function density in the following form:
$$p(\{N_\sigma, t_0 \le \sigma < t\}) = \exp\left(-\int_{t_0}^t \lambda(\sigma)\,d\sigma + \int_{t_0}^t \ln\lambda(\sigma)\,dN_\sigma\right),$$
where the last term can be interpreted as a Riemann–Stieltjes integral,
$$\int_{t_0}^t \ln\lambda(\sigma)\,dN_\sigma = \sum_{i=1}^{N_t} \ln\lambda(w_i).$$
See Section 6.1.2 in the Appendix for more detail on the Riemann–Stieltjes integral. Integrals of this type are called counting integrals.

6.1 Appendix

6.1.1 Derivation of the differential equation for the counting probability

Here, we seek to establish the differential equation describing the derivative of $\Pr[N_{s,t} = n]$ with respect to time. Again, we can write
$$\Pr[N_{s,t+\Delta} = n] = \sum_{k=0}^n \Pr[N_{t,t+\Delta} = k,\, N_{s,t} = n-k] = \Pr[N_{t,t+\Delta} = 0 \mid N_{s,t} = n]\Pr[N_{s,t} = n] + \Pr[N_{t,t+\Delta} = 1 \mid N_{s,t} = n-1]\Pr[N_{s,t} = n-1] + \sum_{k=2}^n \Pr[N_{t,t+\Delta} = k \mid N_{s,t} = n-k]\Pr[N_{s,t} = n-k]. \qquad (9)$$
We first must establish two properties that will be valuable. First, from conditional orderliness, we have
$$\Pr[N_{t,t+\Delta} > 1 \mid N_{s,t} = n-k] \le \epsilon\,\Pr[N_{t,t+\Delta} = 1 \mid N_{s,t} = n-k]. \qquad (10)$$
Furthermore, the limit defining $\lambda(t)$,
$$\lambda(t) \equiv \lim_{\Delta \to 0}\frac{1}{\Delta}\Pr[N_{t,t+\Delta} = 1 \mid P],$$
where $P$ is any event, implies that
$$-\epsilon \le \frac{1}{\Delta}\Pr[N_{t,t+\Delta} = 1 \mid P] - \lambda(t) \le \epsilon, \qquad (11)$$
where $\epsilon > 0$ and $\Delta > 0$ are selected so that Equations (10) and (11) are both satisfied simultaneously. We will now use Equations (10) and (11) to upper-bound the final sum in Equation (9):
$$\sum_{k=2}^n \Pr[N_{t,t+\Delta} = k \mid N_{s,t} = n-k]\Pr[N_{s,t} = n-k] \le \sum_{k=2}^n \Pr[N_{t,t+\Delta} > 1 \mid N_{s,t} = n-k]\Pr[N_{s,t} = n-k] \le \epsilon\sum_{k=2}^n \Pr[N_{t,t+\Delta} = 1 \mid N_{s,t} = n-k]\Pr[N_{s,t} = n-k]$$
$$\le \epsilon\sum_{k=0}^n \Pr[N_{t,t+\Delta} = 1 \mid N_{s,t} = n-k]\Pr[N_{s,t} = n-k] = \epsilon\,\Pr[N_{t,t+\Delta} = 1]. \qquad (12)$$
Since Equation (11) holds for any conditioning event $P$, we also have $\Pr[N_{t,t+\Delta} = 1] \le \Delta[\lambda(t) + \epsilon]$, so that
$$\sum_{k=2}^n \Pr[N_{t,t+\Delta} = k \mid N_{s,t} = n-k]\Pr[N_{s,t} = n-k] \le \epsilon\Delta[\lambda(t) + \epsilon] = \epsilon\Delta\lambda(t) + \epsilon^2\Delta. \qquad (13)$$
Now consider the following relationship:
$$\Pr[N_{t,t+\Delta} = 0 \mid N_{s,t} = n] = 1 - \Pr[N_{t,t+\Delta} = 1 \mid N_{s,t} = n] - \Pr[N_{t,t+\Delta} > 1 \mid N_{s,t} = n]. \qquad (14)$$
We can obtain a lower bound on Equation (14) using Equation (10) and the upper bound in Equation (11):
$$\Pr[N_{t,t+\Delta} = 0 \mid N_{s,t} = n] \ge 1 - (1+\epsilon)\Pr[N_{t,t+\Delta} = 1 \mid N_{s,t} = n] \ge 1 - (1+\epsilon)[\lambda(t)+\epsilon]\Delta.$$
Similarly, we can obtain an upper bound using the lower bound of Equation (11):
$$\Pr[N_{t,t+\Delta} = 0 \mid N_{s,t} = n] \le 1 - \Pr[N_{t,t+\Delta} = 1 \mid N_{s,t} = n] \le 1 - [\lambda(t)-\epsilon]\Delta.$$
Combining this upper and lower bound, we have
$$1 - (1+\epsilon)[\lambda(t)+\epsilon]\Delta \;\le\; \Pr[N_{t,t+\Delta} = 0 \mid N_{s,t} = n] \;\le\; 1 - [\lambda(t)-\epsilon]\Delta.$$
Now we are ready to put it all together. Rearranging Equation (9), we have
$$0 = \Pr[N_{s,t+\Delta} = n] - \Pr[N_{t,t+\Delta} = 0 \mid N_{s,t} = n]\Pr[N_{s,t} = n] - \Pr[N_{t,t+\Delta} = 1 \mid N_{s,t} = n-1]\Pr[N_{s,t} = n-1] - \sum_{k=2}^n \Pr[N_{t,t+\Delta} = k \mid N_{s,t} = n-k]\Pr[N_{s,t} = n-k]. \qquad (15)$$
Substituting the bounds in Equations (11), (13), and (14) into Equation (15) and dividing through by $\Delta$, each error term is of order $\epsilon$, yielding
$$\left|\frac{\Pr[N_{s,t+\Delta} = n] - \Pr[N_{s,t} = n]}{\Delta} + \lambda(t)\Pr[N_{s,t} = n] - \lambda(t)\Pr[N_{s,t} = n-1]\right| \le \epsilon\,[2 + 2\lambda(t) + 2\epsilon].$$
The bound on the right can be made arbitrarily small, giving us the derivative of the counting probability:
$$\frac{d\Pr[N_{s,t} = n]}{dt} = -\lambda(t)\Pr[N_{s,t} = n] + \lambda(t)\Pr[N_{s,t} = n-1],$$
which completes the proof. □

6.1.2 The Riemann–Stieltjes Integral

The following is a discussion of the difference between the Riemann integral and the Riemann–Stieltjes integral used in the preceding derivation. Let $[a, b]$ be a given interval. By a partition $P$ of $[a, b]$ we mean a finite set of points $x_0, x_1, \ldots, x_n$, where
$$a = x_0 \le x_1 \le x_2 \le \cdots \le x_{n-1} \le x_n = b. \qquad (16)$$
We write $\Delta x_i = x_i - x_{i-1}$, $i = 1, \ldots, n$. Now
suppose that $f$ is a bounded real function defined on $[a, b]$. Corresponding to each partition $P$ of $[a, b]$, we put
$$M_i = \sup f(x) \quad (x_{i-1} \le x \le x_i), \qquad m_i = \inf f(x) \quad (x_{i-1} \le x \le x_i),$$
$$U(P, f) = \sum_{i=1}^n M_i\,\Delta x_i, \qquad L(P, f) = \sum_{i=1}^n m_i\,\Delta x_i,$$
and finally
$$\overline{\int_a^b} f\,dx = \inf U(P, f), \qquad (17)$$
$$\underline{\int_a^b} f\,dx = \sup L(P, f), \qquad (18)$$
where the inf and the sup are taken over all partitions $P$ of $[a, b]$. The left sides of (17) and (18) are called the upper and lower Riemann integrals of $f$ over $[a, b]$, respectively. If the upper and lower integrals are equal, we say that $f$ is Riemann integrable on $[a, b]$, and we write $f \in \mathcal{R}$, where $\mathcal{R}$ denotes the set of Riemann integrable functions. We denote the common value of (17) and (18) by
$$\int_a^b f\,dx,$$
the Riemann integral of $f$ over $[a, b]$. Since $f$ is bounded, there exist two numbers $m$ and $M$ such that $m \le f(x) \le M$ for $a \le x \le b$.

Georgia Institute of Technology & Emory University
Coulter Department of Biomedical Engineering
BMED 8813 NEUROPHYSIOLOGY, Fall 2008

Probability and Statistics

1 Probability Spaces

Consider an experiment $\mathcal{E}$ that can be conducted many times. Each trial of $\mathcal{E}$ results in a (possibly different) outcome, or realization, $w$. The sample space $\Omega$ is the set of all possible outcomes $w$. We declare certain subsets of $\Omega$ to be events, and we will assign probabilities to these special subsets. The set of events is $\mathcal{A}$. A probability space is a triple $(\Omega, \mathcal{A}, \Pr)$, where $\Pr: \mathcal{A} \to \mathbb{R}$ is a function, called the probability, that satisfies the following axioms:

(i) $0 \le \Pr(A) \le 1 \;\; \forall A \in \mathcal{A}$;
(ii) $\Pr(\Omega) = 1$;
(iii) if $A \cap B = \emptyset$, then $\Pr(A \cup B) = \Pr(A) + \Pr(B)$;
(iv) if $\{A_n\}$ is a sequence of events $A_n \in \mathcal{A}$ such that $A_n \to A$, then $\Pr(A_n) \to \Pr(A)$.

It is easy to show that $\Pr(A \cup B) = \Pr(A) + \Pr(B) - \Pr(A \cap B)$.

2 Independence and Conditioning

Let $(\Omega, \mathcal{A}, \Pr)$ be a probability space and let $A, B \in \mathcal{A}$ be two events with $\Pr(B) \neq 0$. We say that $A$ and $B$ are independent events if
$$\Pr(A \cap B) = \Pr(A)\Pr(B).$$
This can be thought of as: the information that $B$ has occurred does not alter the chance that $A$ has occurred. A collection of events $A_1, A_2, \ldots, A_n$ is called independent if for every finite sub-collection we have
$$\Pr\left(\bigcap_{i=1}^m A_{k_i}\right) = \prod_{i=1}^m \Pr(A_{k_i}),$$
where $k_i \in \{1, 2, \ldots, n\} \;\; \forall i \in \{1, \ldots, m\}$, and $k_i \neq k_j$ for $i \neq j$.
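Independence can be checked empirically by comparing relative frequencies. A quick Monte Carlo sketch with two dice, whose outcomes are independent by construction:

```python
import random

# A = "first die is even", B = "second die shows six". The dice are
# independent, so Pr(A and B) should match Pr(A)*Pr(B) = (1/2)*(1/6).
rng = random.Random(42)
trials = 200_000
a = b = ab = 0
for _ in range(trials):
    d1, d2 = rng.randint(1, 6), rng.randint(1, 6)
    a += (d1 % 2 == 0)
    b += (d2 == 6)
    ab += (d1 % 2 == 0) and (d2 == 6)
print(ab / trials, (a / trials) * (b / trials))   # both approach 1/12 ~ 0.083
```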
We define the conditional probability of $A$, given that $B$ has occurred, as
$$\Pr(A \mid B) = \frac{\Pr(A \cap B)}{\Pr(B)}.$$
This is the definition of conditional probability (Bayes' rule follows from it), and $\Pr(A \mid B)$ has the interpretation of the chance that event $A$ has occurred given the information that $B$ has occurred. Note that if $A$ and $B$ are independent events, then $\Pr(A \mid B) = \Pr(A)$.

3 Random Variables

We are particularly interested in experiments whose outcome is a real number, for example, as in the measurement of a voltage. For experiments whose outcomes are not real numbers, like coin tossing, we can assign real numbers to outcomes. Let $(\Omega, \mathcal{A}, \Pr)$ be a probability space. A random variable $X$ is a mapping $X: \Omega \to \mathbb{R}$. For example, in the two-coin-toss experiment we can declare $HH \mapsto 1$, $HT \mapsto 2$, $TH \mapsto 3$, and $TT \mapsto 4$.

4 The Distribution Function

Let $X$ be a random variable on the probability space $(\Omega, \mathcal{A}, \Pr)$. The probability distribution function $P_X(x)$ associated with $X$ is
$$P_X(x) = \Pr(\{w : X(w) < x\}) = \Pr(X < x).$$
It is easy to show that:

(i) $\lim_{x \to -\infty} P_X(x) = 0$;
(ii) $\lim_{x \to +\infty} P_X(x) = 1$;
(iii) $P_X$ is a monotonic, non-decreasing function of $x$;
(iv) $P_X$ is left-continuous in $x$.

5 Continuous Random Variables

A continuous random variable can assume all values in an interval, or in several distinct intervals. The intervals may be unbounded, as in the positive part of the real axis or the whole real axis. The possible outcomes are "infinitely close" to each other, and no single outcome has a positive probability. In the case that $P_X(x)$ is differentiable everywhere, we define the probability density function $f_X(x)$ associated with $X$ as
$$f_X(x) = \frac{d}{dx}P_X(x).$$
The interpretation of the density function is
$$f_X(x)\,\Delta x \approx \Pr[X \in [x, x + \Delta x)].$$
This approximation becomes exact in the limit as $\Delta x \to 0$. It is easy to verify that $f_X(x) \ge 0$ and $\int_{-\infty}^x f_X(\sigma)\,d\sigma = P_X(x)$.
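The interpretation $f_X(x)\,\Delta x \approx \Pr[X \in [x, x+\Delta x)]$ can be verified directly for any differentiable distribution; here with the exponential distribution as an example:

```python
import math

# Exponential distribution with parameter m: P_X(x) = 1 - exp(-x/m),
# f_X(x) = (1/m) exp(-x/m). Compare f_X(x)*dx with P_X(x+dx) - P_X(x).
m = 2.0
P = lambda x: 1.0 - math.exp(-x / m)
f = lambda x: math.exp(-x / m) / m
x, dx = 1.5, 1e-6
print(abs(f(x) * dx - (P(x + dx) - P(x))) < 1e-10)   # True
```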
6 Discrete Random Variables

A random variable is discrete if it can assume a finite or a denumerably infinite number of different values. This is a common situation, for example, where counting is involved. Let $X$ be a random variable on the probability space $(\Omega, \mathcal{A}, \Pr)$. The probability function, analogous to the density function for continuous random variables, is
$$p_X(k) = \Pr(X = k).$$
In general $k \in \mathbb{Z}$ (i.e., an integer), but typically the process involves counting, and we thus restrict to non-negative $k$. We obviously have that
$$\sum_k p_X(k) = 1, \qquad P_X(k) = \sum_{j < k} p_X(j).$$
Conversely, $p_X(k) = P_X(k+1) - P_X(k)$.

7 Expectation

Statistical expectation will be discussed for continuous random variables, but identical results exist for discrete random variables, where the integrals become sums. Let $X$ be a continuous random variable on the probability space $(\Omega, \mathcal{A}, \Pr)$. The expected value of $X$ is defined as
$$E[X] = \int_{-\infty}^{\infty} x\,f_X(x)\,dx,$$
often referred to as the mean; it has the interpretation of the average value of the random variable $X$. Let us denote this quantity $\mu_X$. Let $g$ be a real-valued function and $X$ a random variable, and consider the random variable $Y = g(X)$. We can write
$$E[Y] = \int_{-\infty}^{\infty} y\,f_Y(y)\,dy,$$
which can be shown to be
$$E[Y] = \int_{-\infty}^{\infty} g(x)\,f_X(x)\,dx.$$
The $m$th moment of $X$ is
$$E[X^m] = \int_{-\infty}^{\infty} x^m f_X(x)\,dx.$$
Of particular importance are the 1st moment, or mean, and the 2nd moment, related to the variance. The expectation is a linear operator: if $X$ and $Y$ are two random variables, then we have
$$E[\alpha X + \beta Y] = \alpha E[X] + \beta E[Y],$$
where $\alpha, \beta \in \mathbb{R}$ are arbitrary constants.

8 Covariance and Correlation

Let $X$ and $Y$ be random variables on the same probability space $(\Omega, \mathcal{A}, \Pr)$. The covariance between $X$ and $Y$ is defined as
$$\mathrm{Cov}(X, Y) \equiv C_{XY} = E[(X - \mu_X)(Y - \mu_Y)].$$
The variance of $X$ is defined as
$$\mathrm{Var}(X) = \mathrm{Cov}(X, X) = C_{XX} = E[(X - \mu_X)(X - \mu_X)] = E[(X - \mu_X)^2] \ge 0;$$
i.e., the variance of $X$ is just the covariance of $X$ with itself. The correlation is related to these covariance measures in the following way:
$$\mathrm{Corr}(X, Y) = \frac{C_{XY}}{\sqrt{\mathrm{Cov}(X, X)\,\mathrm{Cov}(Y, Y)}} = \frac{C_{XY}}{\sqrt{\sigma_X^2\,\sigma_Y^2}},$$
which is a measure that lies between $-1$ and $1$, where a value of $1$ implies perfect correlation ($X = Y$), $-1$ implies perfect anti-correlation ($X = -Y$), and a value of $0$ implies no correlation at all. Note that
$$\mathrm{Corr}(X, X) = \frac{\mathrm{Cov}(X, X)}{\sqrt{\mathrm{Cov}(X, X)\,\mathrm{Cov}(X, X)}} = 1.$$

9 Common Distributions

9.1 Continuous Distributions

For each of the following, let $X$ be a continuous random variable on the probability space $(\Omega, \mathcal{A}, \Pr)$. Several continuous distributions will be presented here; see [1] or any introductory text on probability and statistics for reference.

Single-point distribution. The total mass of the random variable $X$ is concentrated at one point, $a$, yielding $f_X(x) = \delta(x - a)$.

Uniform distribution. If the random variable $X$ has the density function
$$f_X(x) = \begin{cases} 1/(b-a), & a < x < b \\ 0, & \text{otherwise}, \end{cases}$$
then $X$ is said to have a uniform distribution on $(a, b)$. Note that
$$P_X(x) = \begin{cases} 0, & x < a \\ (x-a)/(b-a), & a \le x \le b \\ 1, & x > b. \end{cases}$$

Exponential distribution. If the random variable $X$ has the density function
$$f_X(x) = \begin{cases} \frac{1}{m}e^{-x/m}, & x \ge 0 \\ 0, & x < 0, \end{cases}$$
where $m > 0$, then $X$ is said to have an exponential distribution with parameter $m$, or $X \sim \exp(m)$.

Normal distribution. If the random variable $X$ has the density function
$$f_X(x) = \frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{(x-m)^2}{2\sigma^2}\right), \quad -\infty < x < \infty,$$
where $m$ and $\sigma > 0$ are given quantities, then $X$ is said to have a normal, or Gaussian, distribution with parameters $m$ and $\sigma$, or $X \sim \mathcal{N}(m, \sigma^2)$.

Gamma distribution. If the random variable $X$ has the density function
$$f_X(x) = \begin{cases} \dfrac{x^{p-1}e^{-x/a}}{\Gamma(p)\,a^p}, & x > 0 \\ 0, & x < 0, \end{cases}$$
where $p > 0$ and $a > 0$, then $X$ is said to have a gamma distribution with parameters $p, a$. The gamma function is defined as
$$\Gamma(p) \equiv \int_0^\infty x^{p-1}e^{-x}\,dx, \quad p > 0.$$
Note that for $p = 1$, the gamma distribution reduces to the exponential distribution.

9.2 Discrete Distributions

For each of the following, let $X$ be a discrete random variable on the probability space $(\Omega, \mathcal{A}, \Pr)$.

Single-point distribution. The total mass of the random variable $X$ is concentrated at one point, $a$, yielding $p_X(a) = 1$.

Bernoulli distribution. Suppose that $X$ assumes only two values, $a$ and $b$, with probability $p$ and $q = 1 - p$, respectively. Then $X$ is said to have a Bernoulli distribution. This situation occurs during a coin-tossing experiment where you receive $a$ dollars when you flip a heads (with probability $p$) and $b$ dollars when you flip a tails (with probability $q$).

Uniform distribution. Suppose that $X$ assumes the values $1, 2, \ldots, m$ with the same probability $1/m$. Then $X$ is said to have a uniform distribution, and the probability function is
$$p_X(k) = 1/m, \quad k = 1, 2, \ldots, m.$$

Geometric distribution. If a random variable $X$ has the probability function
$$p_X(k) = q^k p, \quad k = 0, 1, 2, \ldots,$$
where $q = 1 - p$, then $X$ is said to have a geometric distribution. Consider a coin for which there is probability $p$ of heads and $q$ of tails. The geometric distribution described above represents the probability that, in a sequence of independent coin tosses, there are $k$ tails before the first heads.

Binomial distribution. If $X$ has the probability function
$$p_X(k) = \binom{n}{k}p^k q^{n-k}, \quad k = 0, 1, \ldots, n,$$
where $q = 1 - p$, then $X$ is said to have a binomial distribution. Again consider a coin for which there is probability $p$ of heads and $q$ of tails. The binomial distribution represents the probability that in a sequence of $n$ independent coin tosses there are $k$ heads (and, obviously, $n-k$ tails). The factors $p^k$ and $q^{n-k}$ represent the probability of a particular sequence of $k$ heads and $n-k$ tails. The first factor represents the number of possible ways in which this can be achieved in $n$ tosses: for example, we could have $k$ heads followed by $n-k$ tails, or $n-k$ tails followed by $k$ heads, and so on. The expression $\binom{n}{k}$ is "$n$ choose $k$" and is $\frac{n!}{k!(n-k)!}$.

Poisson distribution. If the random variable $X$ has the probability function
$$p_X(k) = \frac{m^k}{k!}e^{-m}, \quad k = 0, 1, 2, \ldots,$$
with $m > 0$, then $X$ is said to have a Poisson distribution with parameter $m$.

10 Random Vectors

Let $(\Omega, \mathcal{A}, \Pr)$ be a probability space and let $X_1, \ldots, X_n$ be continuous random variables over this space, i.e., $X_k: \Omega \to \mathbb{R}$ for $k = 1, \ldots, n$. Define the random vector $X = (X_1, \ldots, X_n)^T$ and let $x = (x_1, \ldots, x_n)^T \in \mathbb{R}^n$. The joint distribution function of $X$ is
$$P_X(x) = \Pr(X_1 < x_1 \;\&\; X_2 < x_2 \;\&\; \cdots \;\&\; X_n < x_n),$$
and the joint density function of $X$ is
$$f_X(x) = \frac{\partial^n}{\partial x_1 \cdots \partial x_n}P_X(x).$$
These functions have the usual interpretations, and we can introduce the notion of expectation in the obvious manner. In particular, if $X$ is a random $n$-vector and $Y$ is a random $m$-vector, then
$$f_X(x) = \int f_{XY}(x, y)\,dy.$$
For discrete random variables there are analogous expressions.
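The marginalization identity has an obvious discrete analogue: summing a joint probability function over the other variable recovers the marginal. A toy sketch with a hand-picked joint pmf:

```python
# Toy joint probability function for a pair (X, Y) of 0/1-valued
# variables (the four masses are made up); summing over y gives p_X.
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

def marginal_x(joint):
    px = {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    return px

px = marginal_x(joint)
print(round(px[0], 6), round(px[1], 6))   # 0.4 0.6
```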
The covariance matrix of $X$, and the cross-covariance matrix (assuming $n = m$) of $X$ and $Y$, are defined respectively as
$$\mathrm{Cov}(X, X) = C_{XX} = E[(X - \mu_X)(X - \mu_X)^T] \in \mathbb{R}^{n \times n},$$
$$\mathrm{Cov}(X, Y) = C_{XY} = E[(X - \mu_X)(Y - \mu_Y)^T] \in \mathbb{R}^{n \times n},$$
where $T$ represents matrix transpose. The random variables $X$ and $Y$ are called uncorrelated if
$$C_{XY} = E[(X - \mu_X)(Y - \mu_Y)^T] = 0,$$
and the collection of random variables $X_1, \ldots, X_n$ is called uncorrelated if the collection is pairwise uncorrelated ($C_{X_i X_j} = 0$ for $i \neq j$), or equivalently, if $C_{XX}$ is diagonal.

11 Functions of a Random Variable

Let $X$ be a random variable over the probability space $(\Omega, \mathcal{A}, \Pr)$ and let $g: \mathbb{R} \to \mathbb{R}$. Then $Y = g(X)$ is also a random variable over the same probability space $(\Omega, \mathcal{A}, \Pr)$. We can write
$$P_Y(y) = \Pr(Y < y) = \Pr(g(X) < y) = \int_S f_X(\sigma)\,d\sigma,$$
where $S = \{x : g(x) < y\}$. Now suppose that in the neighborhood of some point $y_0$ (i.e., in a sphere in $\mathbb{R}$ with infinitesimal radius), the function $g$ is invertible with a differentiable inverse. More precisely, there exists an open neighborhood containing $y_0$ on which the function $h(y) = x$ is well defined, continuous, and differentiable, where $y = g(x)$. Next observe that
$$f_Y(y_0)\,\Delta y \approx \Pr[Y \in [y_0, y_0 + \Delta y)] = \Pr[X \in [h(y_0), h(y_0 + \Delta y))] \approx f_X(h(y_0))\,|h'(y_0)|\,\Delta y + O(\Delta y^2).$$
Dividing through by $\Delta y$ and taking the limit as $\Delta y \to 0$ gives us
$$f_Y(y_0) = f_X(h(y_0))\,|h'(y_0)|.$$

Example. Consider the random variable $X \sim \mathcal{N}(0, 1)$ and let $Y = g(X) = \exp(X)$. Observe that $g(x) = \exp(x)$ is invertible for all $y > 0$, and its inverse is $h(y) = \ln y$. Next, for $y_0 > 0$ we have
$$f_Y(y_0) = f_X(h(y_0))\,|h'(y_0)| = \frac{1}{y_0\sqrt{2\pi}}\exp\left(-\frac{(\ln y_0)^2}{2}\right).$$
Also, since the random variable $Y$ is always positive, $f_Y(y) = 0 \;\; \forall y \le 0$.

Now let $X$ be a random $N$-vector and let $Y = g(X)$, where $g: \mathbb{R}^N \to \mathbb{R}^N$ is some function. Suppose $h = g^{-1}$ exists on some neighborhood of $y_0 \in \mathbb{R}^N$, and suppose that $h$ has continuous first partial derivatives. Let us write
$$h(y) = h(y_1, y_2, \ldots, y_N) = (x_1, x_2, \ldots, x_N)^T = (h_1(y), h_2(y), \ldots, h_N(y))^T.$$
The Jacobian of $h$ at $y_0$ is the $N \times N$ matrix
$$J_h(y_0) \equiv \begin{pmatrix} \dfrac{\partial h_1}{\partial y_1} & \cdots & \dfrac{\partial h_1}{\partial y_N} \\ \vdots & & \vdots \\ \dfrac{\partial h_N}{\partial y_1} & \cdots & \dfrac{\partial h_N}{\partial y_N} \end{pmatrix}\Bigg|_{y_0}.$$
Using the Taylor series expansion for vector-valued functions, we can write
$$h(y_0 + \Delta y) = h(y_0) + J_h(y_0)\,\Delta y + \text{higher-order terms}.$$
We can approximate $h$ by neglecting the higher-order terms, and this approximation is good for small $\Delta y$. Using several such approximations and taking the limit as $\Delta y$ goes to zero, we
have
$$f_Y(y_0) = f_X(h(y_0))\,\lvert\det J_h(y_0)\rvert.$$

12 Statistical Independence

A collection of random variables $X_1, X_2, \ldots, X_N$ is called statistically independent if
$$P_X(x) = \prod_{k=1}^N P_{X_k}(x_k), \quad \text{or equivalently} \quad f_X(x) = \prod_{k=1}^N f_{X_k}(x_k).$$
If $X_1, X_2, \ldots, X_N$ are independent, then $g_1(X_1), \ldots, g_N(X_N)$ are also independent, and
$$E\left[\prod_{k=1}^N g_k(X_k)\right] = \prod_{k=1}^N E[g_k(X_k)].$$
Note that if $X_1, X_2, \ldots, X_N$ are independent, then
$$C_{X_i X_j} = E[(X_i - \mu_{X_i})(X_j - \mu_{X_j})^T] = E[X_i - \mu_{X_i}]\,E[(X_j - \mu_{X_j})^T] = 0, \quad i \neq j.$$
In other words, independence implies uncorrelated. The converse of this is not true in general.

13 Moments and Characteristic Functions

Let $X$ be a random variable. Recall that the $n$th moment of $X$ is
$$E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx.$$
For the random variable $X$, define its characteristic function (or moment generating function) by
$$\Phi_X(\omega) \equiv E[\exp(j\omega X)] = \int_{-\infty}^{\infty} e^{j\omega x} f_X(x)\,dx.$$
Note that the characteristic function is simply the Fourier transform of the density function. Using the inverse Fourier transform, we have
$$f_X(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} \Phi_X(\omega)\,e^{-j\omega x}\,d\omega.$$
The moments of $X$ can be readily found from $\Phi_X(\omega)$ as follows.

Theorem. Let $X$ be a random variable with characteristic function $\Phi_X$. Then $E[X^n] = j^{-n}\Phi_X^{(n)}(0)$, where the superscript $(n)$ denotes the $n$th derivative with respect to $\omega$.

Proof. We make the following calculation:
$$\Phi_X^{(n)}(\omega) = \frac{d^n}{d\omega^n}\int_{-\infty}^{\infty} e^{j\omega x} f_X(x)\,dx = \int_{-\infty}^{\infty} (jx)^n e^{j\omega x} f_X(x)\,dx,$$
so that
$$\Phi_X^{(n)}(0) = j^n\int_{-\infty}^{\infty} x^n f_X(x)\,dx = j^n E[X^n],$$
and therefore $E[X^n] = j^{-n}\Phi_X^{(n)}(0)$. □
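This theorem can be checked numerically by finite-differencing the characteristic function of a simple discrete random variable (the pmf below is made up):

```python
import cmath

# X takes value 1 w.p. 0.3 and 2 w.p. 0.7, so
# Phi_X(w) = 0.3 e^{jw} + 0.7 e^{2jw}; E[X] = 1.7 and E[X^2] = 3.1.
pmf = {1: 0.3, 2: 0.7}
Phi = lambda w: sum(p * cmath.exp(1j * w * x) for x, p in pmf.items())

h = 1e-5
d1 = (Phi(h) - Phi(-h)) / (2 * h)              # Phi'(0), central difference
d2 = (Phi(h) - 2 * Phi(0) + Phi(-h)) / h ** 2  # Phi''(0)
m1 = (d1 / 1j).real         # E[X]   = j^{-1} Phi'(0)
m2 = (d2 / 1j ** 2).real    # E[X^2] = j^{-2} Phi''(0)
print(round(m1, 4), round(m2, 3))   # 1.7 3.1
```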