MODELS IN BIOLOGY BIOL 429
These 5 pages of class notes were uploaded by Renee Lehner on September 9, 2015. The notes are for BIOL 429 at the University of Washington, taught by Carl Bergstrom in Fall.
Lecture 2: Discrete probability
Biology 497
Carl Bergstrom
October 5, 2006

Sources: These lecture notes draw heavily on material from Taylor and Karlin, An Introduction to Stochastic Modeling, 3rd Edition (1998); Parzen (1962), Stochastic Processes; Pitman (1993), Probability; and van Kampen (1997), Stochastic Processes in Physics and Chemistry. In places, definitions and statements of theorems may be taken directly from these sources.

Many phenomena in biology are fundamentally stochastic: we cannot understand how they work if we view them as fully deterministic. In this course we will investigate the mathematical tools and techniques of stochastic processes, with an aim toward being able to model and understand biological processes and phenomena.

Definition 1. A stochastic process is a dynamical system which unfolds probabilistically.

As such, the field of stochastic processes is grounded upon probability theory. In the first few lectures I will present a brief refresher on probability theory. This should not be taken in lieu of a proper course or text in probability theory, but rather is intended simply to sketch out the key points that hopefully everyone has seen already.

Probability theory can be developed in a completely formal, rigorous fashion grounded in set theory and measure theory; this is known as axiomatic probability theory. Alternatively, one can develop probability theory in a common-sense fashion, allowing our basic intuitions about chance to replace much of the formalism. I will take the latter approach throughout.

Random Variables

Definition 2. A random variable is a variable that takes on its value probabilistically.

A random variable X is defined by a set of possible values {x} and a distribution of probabilities p(x) over this set, such that (1) p(x) >= 0 for all x, and (2) Σ_x p(x) = 1.

For example, if X is the random variable representing the outcome of rolling a fair die, X has the set of possible values {1, 2, 3, 4, 5, 6}, each with probability p(x) = 1/6. If X is drawn from distribution F, we write X ~ F.
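The defining properties of a discrete distribution, and the meaning of X ~ F, can be illustrated with a short simulation. This is a sketch of my own (not from the notes), using only Python's standard library to model the fair-die example:

```python
import random
from collections import Counter

# A fair die as a discrete random variable: possible values and probabilities.
values = [1, 2, 3, 4, 5, 6]
p = {x: 1 / 6 for x in values}

# The two defining properties of a discrete probability distribution:
assert all(p[x] >= 0 for x in values)       # (1) p(x) >= 0 for all x
assert abs(sum(p.values()) - 1.0) < 1e-12   # (2) probabilities sum to 1

# Drawing from the distribution (X ~ F): empirical frequencies of a large
# sample approach p(x).
random.seed(0)
draws = [random.choice(values) for _ in range(100_000)]
freq = Counter(draws)
for x in values:
    print(x, freq[x] / len(draws))  # each close to 1/6 ~ 0.1667
```

Uniform sampling with `random.choice` works here because every value has equal probability; for an arbitrary p(x), `random.choices(values, weights=...)` would serve the same role.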
In this die example, X is a discrete random variable because it takes one of a set of discrete values. Random variables can also be continuous: they can take on any of a set of continuous values. Here we will treat discrete random variables first and then move on to continuous random variables.

Events

[For future revisions, this would be a good place to introduce the Venn diagram / set theory analogy for dealing with probabilities of events. One could then continue to use that analogy throughout to explain most of the other formulae here.]

Closely related to the notion of a random variable is the concept of an event.

Definition 3. An event is the case that a random variable takes on a value within a described subset of possible values.

For example, the event that I roll a die and get an odd number can be represented as the event X ∈ {1, 3, 5}. Events can include single values of random variables (e.g., the event that X = 4); they can include all possible values (X ∈ {1, 2, ..., 6}); and they can include no possible values (X ∈ ∅).

If events A1, A2, ..., An are mutually exclusive events with probabilities P(Ai), the probability that any one of them occurs is

P(A1 or A2 or ... or An) = Σ_i P(Ai).

Conditional probability

Conditional probability lets us talk about the chance that one event occurs given that another occurs.

Definition 4. The conditional probability of A given B is

P(A|B) = P(A and B) / P(B).

We will write P(AB) as shorthand for P(A and B). Note that we can also write

P(AB) = P(A|B) P(B).

We can extend this to write down a chain rule for probabilities. This rule tells us the probability that a series of events A1, A2, ..., An happen in succession. This will be one of the main foundations that we'll use in working with stochastic processes:

P(A1 A2 A3 ... An) = P(A1) P(A2|A1) P(A3|A2 A1) ... P(An|An-1 ... A3 A2 A1).

Definition 5. A partition is a collection of non-overlapping subsets of events B1, B2, B3, ... that together cover the whole sample space, such that Σ_i P(Bi) = 1.

Theorem 1 (Law of total probability). If {Bi} is a partition, the probability of an event A is the probability-weighted sum of the conditional probabilities of A given the Bi:

P(A) = Σ_i P(A|Bi) P(Bi).
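The law of total probability can be checked by exact enumeration of a small sample space. The following Python sketch (the helper names P and P_given are my own, not from the notes) verifies it for two fair dice, taking A to be "the dice sum to 7" and partitioning on the value of the first die:

```python
from fractions import Fraction
from itertools import product

# Sample space: all 36 equally likely ordered outcomes of two fair dice.
omega = list(product(range(1, 7), repeat=2))

def P(event):
    """Probability of an event, given as a predicate on outcomes."""
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

def P_given(a, b):
    """Conditional probability P(A|B) = P(A and B) / P(B)."""
    return P(lambda w: a(w) and b(w)) / P(b)

A = lambda w: w[0] + w[1] == 7  # event: the two dice sum to 7

# Law of total probability: P(A) = sum_i P(A|Bi) P(Bi),
# where Bi is the event "the first die shows i".
total = sum(P_given(A, lambda w, i=i: w[0] == i) * P(lambda w, i=i: w[0] == i)
            for i in range(1, 7))
print(total, total == P(A))  # 1/6 True
```

Using Fraction keeps every probability exact, so the identity holds with `==` rather than a floating-point tolerance.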
Independence

We say that two events A and B are independent if the probability that A occurs does not depend on whether B has occurred, and vice versa. That is,

P(A|B) = P(A|not B) = P(A).

When two events are independent, the probability that both occur is simply the product of the probabilities that each occurs:

P(AB) = P(A) P(B).

Similarly, we can extend this to more than two events. (Proof: define a new event C as the event "A and B", then apply the two-event rule again.)

Bayes' Rule

Bayes' rule allows us to work out the conditional probability of A given B from the conditional probability of B given A:

P(A|B) = P(B|A) P(A) / P(B).

Joint Distributions

The joint distribution F of two random variables X and Y can be written as

F(x, y) = P(X = x and Y = y).

[For future revisions, a more detailed explanation of marginal distributions would be very helpful.]

The marginal distribution

P(Y = y) = Σ_x P(X = x and Y = y)

gives us a single-variable distribution on the values of Y. Notice that if X and Y are independent,

F(x, y) = P(x) P(y),

where P(x) and P(y) are the marginal distributions of X and Y.

Mean and variance

The expected value of a discrete random variable X is defined as

E[X] = Σ_x x P(X = x).

Expectations are additive, even for random variables that are not independent. Thus

E[X + Y] = E[X] + E[Y].

If two random variables are independent, the expectation of their product is equal to the product of their expectations:

E[XY] = E[X] E[Y].

However, note that in general this is not true. Moreover, the expectation of a function is not generally equal to the function of the expectation; i.e., in general E[f(X)] ≠ f(E[X]).

The variance of a random variable X is defined as

Var(X) = E[(X - E[X])^2].

We can also write this as

Var(X) = E[X^2] - (E[X])^2

by expanding the definition of variance and recognizing that E[X E[X]] = (E[X])^2, since E[X] is a constant.

If two random variables are independent, the variance of their sum is equal to the sum of their variances:

Var(X + Y) = Var(X) + Var(Y).
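These expectation and variance identities are easy to verify by brute-force enumeration over a small sample space. A minimal Python sketch (the helper names E and Var are my own, not from the notes), taking X and Y to be two independent fair dice:

```python
from fractions import Fraction
from itertools import product

# Joint distribution of two independent fair dice, enumerated exactly.
omega = list(product(range(1, 7), repeat=2))
p = Fraction(1, len(omega))  # each of the 36 outcomes is equally likely

def E(f):
    """Expected value of f(X, Y) under the joint distribution."""
    return sum(f(x, y) * p for x, y in omega)

def Var(f):
    """Variance via the identity Var = E[Z^2] - (E[Z])^2."""
    return E(lambda x, y: f(x, y) ** 2) - E(f) ** 2

# Additivity of expectation: E[X + Y] = E[X] + E[Y]
assert E(lambda x, y: x + y) == E(lambda x, y: x) + E(lambda x, y: y)

# For independent X, Y: E[XY] = E[X]E[Y] and Var(X + Y) = Var(X) + Var(Y)
assert E(lambda x, y: x * y) == E(lambda x, y: x) * E(lambda x, y: y)
assert Var(lambda x, y: x + y) == Var(lambda x, y: x) + Var(lambda x, y: y)

# Variance of a single fair die: E[X^2] - (E[X])^2 = 91/6 - 49/4
print(Var(lambda x, y: x))  # 35/12
```

Exact Fraction arithmetic means each identity can be asserted with `==`; with floats one would compare within a small tolerance instead.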