# Calculus Level I STAT 3025

UCONN


These 78 pages of class notes were uploaded by Blair Williamson on Thursday, September 17, 2015. The notes belong to STAT 3025 at the University of Connecticut, taught by Staff in the fall. Since upload, they have received 23 views. For similar materials see /class/205901/stat-3025-university-of-connecticut in Statistics at the University of Connecticut.


## Parameter Estimation, Part III

Cyr Emile M'Lan, PhD (mlan@stat.uconn.edu)

### Introduction

- Text reference: *Introduction to Probability and Statistics for Engineers and Scientists*, Chapter 7.
- Reading assignment: Sections 7.4–7.5, December 1st.

### Constructing a Confidence Interval for $\mu_1 - \mu_2$

**Example 7.20 (two normal populations, variances $\sigma_1^2$ and $\sigma_2^2$ known).**

- Let $X_1, X_2, \ldots, X_n$ denote a simple random sample (SRS) from $N(\mu_1, \sigma_1^2)$, with $\sigma_1$ a known constant.
- Let $Y_1, Y_2, \ldots, Y_m$ denote a SRS from $N(\mu_2, \sigma_2^2)$, with $\sigma_2$ a known constant.
- Find a $1-\alpha$ confidence interval for the difference in means $\mu_1 - \mu_2$.

**Solution.** Start with

$$Z = \frac{(\bar{X} - \bar{Y}) - (\mu_1 - \mu_2)}{\sqrt{\dfrac{\sigma_1^2}{n} + \dfrac{\sigma_2^2}{m}}} \sim N(0, 1),$$

and derive the confidence interval as we did before in other cases. Thus a $100(1-\alpha)\%$ confidence interval for $\mu_1 - \mu_2$ is

$$\bar{X} - \bar{Y} \pm z_{\alpha/2}\sqrt{\frac{\sigma_1^2}{n} + \frac{\sigma_2^2}{m}}.$$

- A $100(1-\alpha)\%$ one-sided upper confidence interval for $\mu_1 - \mu_2$ is
  $$\left(-\infty,\;\; \bar{X} - \bar{Y} + z_{\alpha}\sqrt{\frac{\sigma_1^2}{n} + \frac{\sigma_2^2}{m}}\,\right).$$
- A $100(1-\alpha)\%$ one-sided lower confidence interval for $\mu_1 - \mu_2$ is
  $$\left(\bar{X} - \bar{Y} - z_{\alpha}\sqrt{\frac{\sigma_1^2}{n} + \frac{\sigma_2^2}{m}},\;\; \infty\right).$$
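The known-variance interval above is a one-line formula. A minimal Python sketch follows; the function name `two_mean_z_ci` and the numeric inputs are illustrative assumptions, not values from the notes.

```python
import math

def two_mean_z_ci(xbar, ybar, var1, n, var2, m, z):
    """Two-sided CI for mu1 - mu2 when both population variances are known.

    z is the critical value z_{alpha/2} (e.g. 1.96 for 95% confidence).
    """
    center = xbar - ybar
    half_width = z * math.sqrt(var1 / n + var2 / m)
    return center - half_width, center + half_width

# Illustrative numbers (not from the notes): xbar = 5.0, ybar = 4.2,
# sigma1^2 = 1.0 with n = 40, sigma2^2 = 1.5 with m = 50, 95% confidence.
lo, hi = two_mean_z_ci(5.0, 4.2, 1.0, 40, 1.5, 50, 1.96)
```

The interval is centered at $\bar{x} - \bar{y}$, so its midpoint recovers the point estimate and its width is twice the margin of error.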
### Constructing a Confidence Interval for $\mu_1 - \mu_2$ (cont.)

**Example 7.21 (two normal populations, variances unknown but equal: $\sigma_1^2 = \sigma_2^2 = \sigma^2$).**

- Let $X_1, X_2, \ldots, X_n$ denote a SRS from $N(\mu_1, \sigma^2)$, and let $Y_1, Y_2, \ldots, Y_m$ denote a SRS from $N(\mu_2, \sigma^2)$, with the common variance $\sigma^2$ unknown.
- Find a $1-\alpha$ two-sided confidence interval for the difference in means $\mu_1 - \mu_2$.

**Solution.** We have

$$\operatorname{Var}(\bar{X} - \bar{Y}) = \frac{\sigma^2}{n} + \frac{\sigma^2}{m} = \sigma^2\left(\frac{1}{n} + \frac{1}{m}\right).$$

- The statistics $S_1^2$ and $S_2^2$ (the sample variances for the two populations) are two unbiased estimates of $\sigma^2$, i.e. $E(S_1^2) = \sigma^2$ and $E(S_2^2) = \sigma^2$. In addition, under the normality assumption,
  $$\frac{(n-1)S_1^2}{\sigma^2} \sim \chi^2_{n-1} \qquad\text{and}\qquad \frac{(m-1)S_2^2}{\sigma^2} \sim \chi^2_{m-1}.$$
  As a consequence, $\operatorname{Var}(S_1^2) = \dfrac{2\sigma^4}{n-1}$ and $\operatorname{Var}(S_2^2) = \dfrac{2\sigma^4}{m-1}$.
- The statistic
  $$S_p^2 = \frac{(n-1)S_1^2 + (m-1)S_2^2}{n+m-2}$$
  is an unbiased estimator of $\sigma^2$ and also has smaller variance than either sample variance $S_1^2$ or $S_2^2$; it is therefore a better estimator. $S_p^2$ is called the pooled sample variance. Indeed, we have
  $$E(S_p^2) = \frac{(n-1)E(S_1^2) + (m-1)E(S_2^2)}{n+m-2} = \sigma^2,$$
  $$\operatorname{Var}(S_p^2) = \frac{(n-1)^2\operatorname{Var}(S_1^2) + (m-1)^2\operatorname{Var}(S_2^2)}{(n+m-2)^2} = \frac{2\sigma^4}{n+m-2}.$$
- It can be shown that
  $$\frac{(n+m-2)S_p^2}{\sigma^2} \sim \chi^2_{n+m-2}.$$
- Hence, the sampling distribution of the statistic $T$ is as follows:
  $$T = \frac{(\bar{X} - \bar{Y}) - (\mu_1 - \mu_2)}{S_p\sqrt{\dfrac{1}{n} + \dfrac{1}{m}}} \sim t_{n+m-2}.$$

Thus a $100(1-\alpha)\%$ two-sided confidence interval for $\mu_1 - \mu_2$ is

$$\left(\bar{X} - \bar{Y} - t_{\alpha/2,\,n+m-2}\,S_p\sqrt{\frac{1}{n} + \frac{1}{m}},\;\; \bar{X} - \bar{Y} + t_{\alpha/2,\,n+m-2}\,S_p\sqrt{\frac{1}{n} + \frac{1}{m}}\right),$$

or equivalently

$$\bar{X} - \bar{Y} \pm t_{\alpha/2,\,n+m-2}\,S_p\sqrt{\frac{1}{n} + \frac{1}{m}}.$$

### Small-Sample Confidence Intervals for $\mu_1 - \mu_2$

**Example 7.22.** A farm-equipment manufacturer wants to compare the average daily downtime for two sheet-metal stamping machines located in two different factories. Investigation of company records for 10 randomly selected days on each of the two machines gave the following results:

$$n_1 = 10, \quad \bar{x}_1 = 12 \text{ min}, \qquad n_2 = 10, \quad \bar{x}_2 = 9 \text{ min},$$

together with the two sample variances $s_1^2$ and $s_2^2$. Assume that the common-variance assumption holds. Estimate the difference between the average daily downtimes for the two machines with confidence coefficient 0.95. What additional assumptions are necessary for the method used to be valid?

With $t_{0.025,18} = 2.101$, the interval is

$$12 - 9 \pm 2.101\, s_p\sqrt{\frac{1}{10} + \frac{1}{10}}.$$
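The pooled interval of Example 7.22 can be sketched as follows. `pooled_t_ci` is a hypothetical helper name, and the sample variances 6 and 8 below are assumed purely for illustration, since the variance values are not recoverable from the notes.

```python
import math

def pooled_t_ci(xbar1, xbar2, s1_sq, n, s2_sq, m, t_crit):
    """Two-sided CI for mu1 - mu2 under the equal-variance assumption.

    t_crit is t_{alpha/2, n+m-2}; the pooled variance weights each
    sample variance by its degrees of freedom.
    """
    sp_sq = ((n - 1) * s1_sq + (m - 1) * s2_sq) / (n + m - 2)
    half = t_crit * math.sqrt(sp_sq) * math.sqrt(1 / n + 1 / m)
    center = xbar1 - xbar2
    return center - half, center + half

# Downtime data in the spirit of Example 7.22: n1 = n2 = 10,
# xbar1 = 12, xbar2 = 9, with *assumed* s1^2 = 6 and s2^2 = 8,
# and t_{0.025,18} = 2.101.
lo, hi = pooled_t_ci(12, 9, 6.0, 10, 8.0, 10, 2.101)
```

With these assumed variances the pooled variance is $s_p^2 = (9\cdot 6 + 9\cdot 8)/18 = 7$, and the interval is centered at $12 - 9 = 3$.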
### A Guide to Political Philosophy

You have two cows.

- Socialism: You give one to your neighbour.
- Communism: The government takes both and gives you the milk.
- Fascism: The government takes both and sells you the milk.
- Nazism: The government takes both and shoots you.
- Capitalism: You sell one and buy a bull.
- Trade Union: They take both from you, shoot one, milk the other, and throw the milk away.

### Constructing a Confidence Interval for $p$

**Example 7.23 (Bernoulli population with $p$ unknown).** Let $X_1, X_2, \ldots, X_n$ denote a SRS from a Bernoulli distribution with parameter $p$ unknown. Find an approximate $100(1-\alpha)\%$ two-sided confidence interval for $p$.

**Solution.** Let $X = X_1 + X_2 + \cdots + X_n$, the number of successes, and let $\hat{p} = X/n$. Let

$$Z = \frac{X - np}{\sqrt{np(1-p)}}.$$

Then we have $Z \approx N(0, 1)$ for large $n$. Thus an approximate $100(1-\alpha)\%$ two-sided confidence interval for $p$ is

$$\hat{p} \pm z_{\alpha/2}\sqrt{\frac{\hat{p}(1-\hat{p})}{n}}.$$

This confidence interval is only valid if $n\hat{p}(1-\hat{p}) > 10$.

### Constructing a Confidence Interval for $p$ (cont.)

**Example 7.2 revisited.** Find an approximate 99% two-sided confidence interval for $p$.

**Solution.** Here $\hat{p} = 0.19$ and $n = 200$, so $n\hat{p}(1-\hat{p}) = 30.78 > 10$. The desired 99% two-sided confidence interval is then

$$0.19 \pm 2.576\sqrt{\frac{0.19\,(1-0.19)}{200}} = 0.19 \pm 0.0715,$$

or $(0.1185,\; 0.2615)$.

- A better approximate $100(1-\alpha)\%$ two-sided confidence interval for $p$ is
  $$\frac{\hat{p} + \dfrac{z_{\alpha/2}^2}{2n} \pm z_{\alpha/2}\sqrt{\dfrac{\hat{p}(1-\hat{p})}{n} + \dfrac{z_{\alpha/2}^2}{4n^2}}}{1 + \dfrac{z_{\alpha/2}^2}{n}}.$$
- **Example 7.2 revisited.** At the 95% level ($z_{\alpha/2} = 1.96$), this improved interval is $(0.1417,\; 0.2500)$. Note that this interval is narrower than the interval $(0.1185,\; 0.2615)$ derived previously.
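The Wald interval and its rule-of-thumb validity check can be reproduced directly; `wald_ci` and `wald_valid` are illustrative names, and the numbers are those of the 99% interval worked above.

```python
import math

def wald_ci(p_hat, n, z):
    """Approximate large-sample (Wald) CI for a Bernoulli proportion p."""
    half = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - half, p_hat + half

def wald_valid(p_hat, n):
    """Rule of thumb from the notes: require n * p_hat * (1 - p_hat) > 10."""
    return n * p_hat * (1 - p_hat) > 10

# The 99% interval with p_hat = 0.19, n = 200, z_{alpha/2} = 2.576.
lo, hi = wald_ci(0.19, 200, 2.576)
```

Rounding the endpoints to four decimals recovers $(0.1185, 0.2615)$.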
### Constructing a Confidence Interval for $p_1 - p_2$

An approximate $100(1-\alpha)\%$ two-sided confidence interval for $p_1 - p_2$ is

$$\hat{p}_1 - \hat{p}_2 \pm z_{\alpha/2}\sqrt{\frac{\hat{p}_1(1-\hat{p}_1)}{n_1} + \frac{\hat{p}_2(1-\hat{p}_2)}{n_2}}.$$

This confidence interval is only valid if $n_1\hat{p}_1(1-\hat{p}_1) > 10$ and $n_2\hat{p}_2(1-\hat{p}_2) > 10$.

### Large-Sample Confidence Intervals (cont.)

**Example 7.24.** A large firm made up of several companies has instituted a new quality-control inspection policy. Among 60 artisans sampled in Company A, only 15 objected to the new policy. Among 64 artisans sampled in Company B, 20 objected to the policy. Estimate the true difference between the proportions voicing no objection to the new policy for the two companies with confidence level 0.98.

**Solution.** Let $p_1$ denote the proportion corresponding to Company A and $p_2$ that of Company B. A 98% CI for $p_1 - p_2$ is

$$\hat{p}_1 - \hat{p}_2 \pm z_{\alpha/2}\sqrt{\frac{\hat{p}_1(1-\hat{p}_1)}{n_1} + \frac{\hat{p}_2(1-\hat{p}_2)}{n_2}}.$$

Here $\hat{p}_1 = 15/60 = 0.25$ and $\hat{p}_2 = 20/64 = 0.3125$. We also have $n_1\hat{p}_1(1-\hat{p}_1) = 11.25 > 10$ and $n_2\hat{p}_2(1-\hat{p}_2) = 13.75 > 10$. Hence the desired 98% CI is

$$(0.25 - 0.3125) \pm 2.33\sqrt{\frac{0.25\,(1-0.25)}{60} + \frac{0.3125\,(1-0.3125)}{64}} = -0.0625 \pm 0.1876,$$

or $(-0.2501,\; 0.1251)$. This interval contains zero. Thus a zero value for the difference between the proportions, $p_1 - p_2$, is believable at the 98% level on the basis of the observed data.

- Also, this interval contains $-0.1$ and $0.1$. Thus both $-0.1$ and $0.1$ also represent believable values for $p_1 - p_2$ at the 98% level on the basis of the data. In general, any value in the interval represents a believable value for $p_1 - p_2$.
- General comments on large-sample CIs:
  1. In practice, caution should be taken when the sample size is not too large. In that case the CI may not be accurate.
  2. The CI could be wrong if the model assumption is incorrect.

### Choice of the Sample Size

The length of the confidence interval $\bar{x} \pm z_{\alpha/2}\,\sigma/\sqrt{n}$, namely

$$2z_{\alpha/2}\frac{\sigma}{\sqrt{n}},$$

gives an idea of the precision, or accuracy, of the point estimate $\bar{x}$. Setting $2z_{\alpha/2}\sigma/\sqrt{n}$ equal to $l$ leads to the sample size

$$n = \frac{4z_{\alpha/2}^2\,\sigma^2}{l^2}.$$

Hence this is the sample size required for the CI to have length $l$.

### Choice of the Sample Size (cont.)

For a proportion $p$, the sample size required for the CI to have length $l$ is

$$n = \frac{4z_{\alpha/2}^2\,p(1-p)}{l^2}.$$

When $p$ is unknown, we use the worst-case-scenario sample size

$$n = \frac{z_{\alpha/2}^2}{l^2},$$

which corresponds to setting $p = 0.5$.
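The two sample-size formulas above can be sketched as small helpers; the function names and the worked numbers (σ = 2, desired length 0.5, 95% confidence) are illustrative assumptions. `math.ceil` rounds up, since $n$ must be an integer at least as large as the formula's value.

```python
import math

def n_for_mean_ci(sigma, length, z):
    """Smallest n so the z-interval for mu has total length <= `length`."""
    return math.ceil(4 * z**2 * sigma**2 / length**2)

def n_for_proportion_ci(length, z, p=0.5):
    """Sample size for a proportion CI of total length `length`.

    p = 0.5 is the worst case, which reduces to n = z^2 / length^2.
    """
    return math.ceil(4 * z**2 * p * (1 - p) / length**2)

# sigma = 2, desired total length 0.5, 95% confidence (z = 1.96):
n_mean = n_for_mean_ci(2.0, 0.5, 1.96)
# worst-case proportion, total length 0.1, 95% confidence:
n_prop = n_for_proportion_ci(0.1, 1.96)
```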
## Random Variables and their Expectation, Part III

Cyr Emile M'Lan, PhD (mlan@stat.uconn.edu)

### Introduction

- Text reference: *Introduction to Probability and Statistics for Engineers and Scientists*, Chapter 4.
- Reading assignment: Sections 4.3, 4.7, 4.8, and 4.9; October 20–October 22.
- So far we have studied probability models for a single random variable. Many problems in probability and statistics lead to models involving several random variables simultaneously.

In these lecture notes, we discuss probability models for the joint behavior of several random variables. We revisit and re-emphasize the notion of independence, and of the conditional distribution of one random variable given information about two or more random variables.

### Jointly Distributed Discrete Random Variables

- The probability mass function of a single random variable $X$ specifies how much probability mass is placed on each possible value $x$ of $X$.
- **The joint probability mass function of two discrete random variables.** The joint probability mass function of two discrete random variables $X$ and $Y$ describes how much probability mass is placed on each possible pair of values $(x, y)$.

**Definition 4.9.** Let $X$ and $Y$ be two discrete random variables associated with a random experiment, each assuming values in the sample spaces $S_1$ and $S_2$, respectively. The joint probability mass function $p(x, y)$ is defined for each pair of numbers $(x, y)$ in $S = S_1 \otimes S_2$ by

$$p(x, y) = P(X = x,\; Y = y).$$

Let $A$ be any event in the sample space $S$, that is, an event consisting of pairs $(x, y)$ with $x \in S_1$ and $y \in S_2$. Then the probability that the random pair $(X, Y)$ lies in $A$ is obtained by summing the joint probability mass function over pairs $(x, y)$ in $A$:

$$P\big((X, Y) \in A\big) = \sum_{(x,y) \in A} p(x, y).$$

**Theorem 4.7.** Let $X$ and $Y$ be two discrete random variables with joint probability mass function $p(x, y)$ defined on $S$. Then $p(x, y)$ must satisfy

1. $p(x, y) \ge 0$ for all $x$ and $y$;
2. $\displaystyle\sum_{(x,y) \in S} p(x, y) = 1$.

**Definition 4.10.** Let $X$ and $Y$ be two discrete random variables with joint probability mass function $p(x, y)$ defined on $S$. The joint distribution function $F(x, y)$ is defined as

$$F(x, y) = P(X \le x,\; Y \le y) = \sum_{u \le x}\,\sum_{v \le y} p(u, v).$$

Given the joint probability mass function of two discrete random variables $X$ and $Y$, one can find the probability distribution of $X$ or $Y$ alone.
The probability distributions of the individual random variables are called the marginal distributions.

**Definition 4.11.** Let $X$ and $Y$ be two discrete random variables with joint probability mass function $p(x, y)$ defined on $S$. The marginal probability mass functions of $X$ and $Y$ are defined by

$$p_X(x) = P(X = x,\; -\infty < Y < \infty) = \sum_{y} p(x, y),$$

$$p_Y(y) = P(-\infty < X < \infty,\; Y = y) = \sum_{x} p(x, y).$$

**Example 4.20.** A large insurance agency services a number of customers who have purchased both a homeowner's policy and an automobile policy from the agency. For each type of policy, a deductible amount must be specified. For an automobile policy the choices are \$100 and \$250, whereas for a homeowner's policy the choices are \$0, \$100, and \$200. Suppose an individual with both types of policy is selected at random from the agency's files. Let $X$ = the deductible amount on the auto policy and $Y$ = the deductible amount on the homeowner's policy. Suppose the joint probability mass function is:

| $p(x, y)$ | $y = 0$ | $y = 100$ | $y = 200$ |
|---|---|---|---|
| $x = 100$ | .20 | .10 | .20 |
| $x = 250$ | .05 | .15 | .30 |

(a) Find $P(X = 100,\; Y \ge 100)$. (b) Find the marginal probability mass functions of $X$ and $Y$, and $P(Y \ge 100)$.

**Solution.** (a) $P(X = 100,\; Y \ge 100) = p(100, 100) + p(100, 200) = .10 + .20 = .30$.

(b) The marginal probability mass functions of $X$ and $Y$ are obtained by summing the rows and columns of the table:

| $p(x, y)$ | $y = 0$ | $y = 100$ | $y = 200$ | $p_X(x)$ |
|---|---|---|---|---|
| $x = 100$ | .20 | .10 | .20 | .50 |
| $x = 250$ | .05 | .15 | .30 | .50 |
| $p_Y(y)$ | .25 | .25 | .50 | 1.00 |

Hence $P(Y \ge 100) = P(Y = 100) + P(Y = 200) = .25 + .50 = .75$.
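The table of Example 4.20 maps naturally onto a dictionary keyed by $(x, y)$, and the marginals fall out by summation; the helper names below are illustrative.

```python
# Joint pmf of Example 4.20 as a dict {(x, y): probability}.
joint = {
    (100, 0): 0.20, (100, 100): 0.10, (100, 200): 0.20,
    (250, 0): 0.05, (250, 100): 0.15, (250, 200): 0.30,
}

def marginal_x(joint):
    """Sum the joint pmf over y to get p_X."""
    pX = {}
    for (x, y), p in joint.items():
        pX[x] = pX.get(x, 0.0) + p
    return pX

def marginal_y(joint):
    """Sum the joint pmf over x to get p_Y."""
    pY = {}
    for (x, y), p in joint.items():
        pY[y] = pY.get(y, 0.0) + p
    return pY

# P(X = 100, Y >= 100) and P(Y >= 100) by direct summation.
p_a = sum(p for (x, y), p in joint.items() if x == 100 and y >= 100)
p_b = sum(p for y, p in marginal_y(joint).items() if y >= 100)
```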
### Jointly Distributed Continuous Random Variables

**Definition 4.12.** Let $X$ and $Y$ be two continuous random variables associated with a random experiment, each assuming values in the sample spaces $S_1$ and $S_2$, respectively. The joint probability density function $f(x, y)$ for $X$ and $Y$ is the function such that for any two-dimensional set $A$,

$$P\big((X, Y) \in A\big) = \iint_A f(x, y)\, dx\, dy.$$

In particular,

$$P(a \le X \le b,\; c \le Y \le d) = \int_a^b \int_c^d f(x, y)\, dy\, dx.$$

**Theorem 4.8.** Let $X$ and $Y$ be two continuous random variables with joint probability density function $f(x, y)$. Then $f(x, y)$ must satisfy

1. $f(x, y) \ge 0$ for all $x$ and $y$;
2. $\displaystyle\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x, y)\, dx\, dy = 1$.

**Definition 4.13.** Let $X$ and $Y$ be two continuous random variables with joint probability density function $f(x, y)$. The joint distribution function $F(x, y)$ is defined as

$$F(x, y) = P(X \le x,\; Y \le y) = \int_{-\infty}^{x}\int_{-\infty}^{y} f(u, v)\, dv\, du.$$

**Definition 4.14.** Let $X$ and $Y$ be two continuous random variables with joint probability density function $f(x, y)$. The marginal probability density functions of $X$ and $Y$, $f_X(x)$ and $f_Y(y)$ respectively, are given by

$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy, \qquad f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx.$$

**Example 4.21.** A bank operates both a drive-up facility and a walk-up window. On a randomly selected day, let $X$ = the proportion of time that the drive-up facility is in use and $Y$ = the proportion of time that the walk-up window is in use. Suppose the joint probability density function is given by

$$f(x, y) = \begin{cases} \dfrac{6}{5}\left(x + y^2\right) & 0 \le x \le 1,\; 0 \le y \le 1, \\ 0 & \text{elsewhere.} \end{cases}$$

(a) Verify that this is a legitimate joint density function.
(b) Calculate the probability $P\left(0 \le X \le \frac{1}{4},\; 0 \le Y \le \frac{1}{4}\right)$.
(c) Find the marginal probability density functions $f_X(x)$ and $f_Y(y)$ of $X$ and $Y$.
(d) Calculate the probability $P\left(\frac{1}{4} \le Y \le \frac{3}{4}\right)$.

**Solution.** (a) We have

$$\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x, y)\, dx\, dy = \frac{6}{5}\int_0^1\int_0^1 (x + y^2)\, dx\, dy = \frac{6}{5}\left(\frac{1}{2} + \frac{1}{3}\right) = 1.$$

(b) We have

$$P\left(0 \le X \le \tfrac{1}{4},\; 0 \le Y \le \tfrac{1}{4}\right) = \frac{6}{5}\int_0^{1/4}\int_0^{1/4} (x + y^2)\, dx\, dy = \frac{6}{5}\left(\frac{1}{128} + \frac{1}{768}\right) = \frac{7}{640} \approx 0.0109.$$

(c) The marginal density functions of $X$ and $Y$ are

$$f_X(x) = \int_0^1 \frac{6}{5}(x + y^2)\, dy = \frac{6}{5}x + \frac{2}{5}, \quad 0 \le x \le 1,$$

$$f_Y(y) = \int_0^1 \frac{6}{5}(x + y^2)\, dx = \frac{6}{5}y^2 + \frac{3}{5}, \quad 0 \le y \le 1.$$

(d) We have

$$P\left(\tfrac{1}{4} \le Y \le \tfrac{3}{4}\right) = \int_{1/4}^{3/4} \left(\frac{6}{5}y^2 + \frac{3}{5}\right) dy = \frac{37}{80} = 0.4625.$$
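Because the density in Example 4.21 is a polynomial, its integrals can be checked in exact rational arithmetic using the antiderivatives written out above; `rect_prob` and `FY` are illustrative helper names.

```python
from fractions import Fraction as F

# f(x, y) = (6/5)(x + y^2) on the unit square. Integrating the
# antiderivatives of x and y^2 gives the rectangle probability
# P(0 <= X <= a, 0 <= Y <= b) = (6/5) * (b*a^2/2 + a*b^3/3).
def rect_prob(a, b):
    a, b = F(a), F(b)
    return F(6, 5) * (b * a**2 / 2 + a * b**3 / 3)

total = rect_prob(1, 1)                  # should be 1 (legitimate density)
p_quarter = rect_prob(F(1, 4), F(1, 4))  # Example 4.21(b)

# P(1/4 <= Y <= 3/4) from the marginal f_Y(y) = (6/5) y^2 + 3/5,
# whose antiderivative is (2/5) y^3 + (3/5) y.
def FY(y):
    y = F(y)
    return F(2, 5) * y**3 + F(3, 5) * y

p_y = FY(F(3, 4)) - FY(F(1, 4))
```

Exact fractions confirm the closed-form answers $7/640$ and $37/80$ without any floating-point round-off.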
### Joke

Stranger: "Good morning, Doctor. I just dropped in to tell you how much I benefited from your treatment."
Doctor: "But you're not one of my patients."
Stranger: "I know. But my Uncle Bill was, and I'm his heir."

### Independent Random Variables

**Definition 4.15.** Two random variables $X$ and $Y$ are said to be independent if, for any two sets of real numbers $A$ and $B$,

$$P(X \in A,\; Y \in B) = P(X \in A)\, P(Y \in B).$$

In other words, $X$ and $Y$ are independent if, for all $A$ and $B$, the events $E_A = \{X \in A\}$ and $F_B = \{Y \in B\}$ are independent. Note that it is sufficient to restrict oneself to sets of the type $E_a = \{X \le a\}$ and $F_b = \{Y \le b\}$. Hence independence between $X$ and $Y$ holds if and only if

$$P(X \le a,\; Y \le b) = P(X \le a)\, P(Y \le b),$$

or equivalently $F(a, b) = F_X(a)\, F_Y(b)$.

- When $X$ and $Y$ are discrete random variables, the condition of independence can be replaced by
  $$p(x, y) = p_X(x)\, p_Y(y) \quad \text{for all } x, y.$$
- When $X$ and $Y$ are continuous random variables, the condition of independence can be replaced by
  $$f(x, y) = f_X(x)\, f_Y(y) \quad \text{for all } x, y.$$

**Example 4.22.** Two components of a microcomputer have the following joint probability density function for their useful lifetimes $X$ and $Y$:

$$f(x, y) = \begin{cases} x\,y\,e^{-(x+y)} & x > 0,\; y > 0, \\ 0 & \text{elsewhere.} \end{cases}$$

(a) What are the marginal probability density functions of $X$ and $Y$? Are the two lifetimes independent?
(b) What is the probability that the lifetime of at least one component exceeds 3?

**Solution.** (a) It is easy to show that the marginal probability density functions of $X$ and $Y$ are

$$f_X(x) = x\,e^{-x}, \quad x > 0, \qquad f_Y(y) = y\,e^{-y}, \quad y > 0.$$

We also have $f(x, y) = f_X(x)\, f_Y(y)$ for all $x, y$. Hence the two lifetimes are independent.

(b) We have

$$P(X < 3) = \int_0^3 x\,e^{-x}\, dx = 1 - 4e^{-3} = P(Y < 3).$$

The probability that the lifetime of at least one component exceeds 3 is

$$1 - P(X < 3,\; Y < 3) = 1 - P(X < 3)\, P(Y < 3) = 1 - \left(1 - 4e^{-3}\right)^2 \approx 0.3586.$$
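The closed form in Example 4.22(b) can be verified numerically: a crude midpoint Riemann sum of the marginal density $x\,e^{-x}$ should agree with $1 - 4e^{-3}$, and independence turns the joint probability into a product.

```python
import math

# P(X < 3) for the marginal density x * exp(-x), closed form 1 - 4*exp(-3)
# (obtained by integration by parts).
p_less_3 = 1 - 4 * math.exp(-3)

# By independence, P(at least one lifetime exceeds 3)
# = 1 - P(X < 3) * P(Y < 3).
p_at_least_one = 1 - p_less_3 ** 2

# Cross-check P(X < 3) with a midpoint Riemann sum of x * e^{-x} on [0, 3].
n = 100000
h = 3 / n
riemann = sum((i + 0.5) * h * math.exp(-(i + 0.5) * h) * h for i in range(n))
```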
### Conditional Distributions

**Discrete case.** Suppose $X$ and $Y$ are two discrete random variables with joint probability mass function $p(x, y)$.

- The conditional distribution of $X$ given $Y = y$ is defined by
  $$p_{X|Y}(x \mid y) = P(X = x \mid Y = y) = \frac{p(x, y)}{p_Y(y)}, \quad \text{where } p_Y(y) > 0.$$
- Similarly, we define the conditional distribution of $Y$ given $X = x$ as
  $$p_{Y|X}(y \mid x) = P(Y = y \mid X = x) = \frac{p(x, y)}{p_X(x)}, \quad \text{where } p_X(x) > 0.$$

**Continuous case.** Suppose $X$ and $Y$ are two continuous random variables with joint probability density function $f(x, y)$.

- The conditional distribution of $X$ given $Y = y$ is defined by
  $$f_{X|Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)}, \quad \text{where } f_Y(y) > 0.$$
- Similarly, we define the conditional distribution of $Y$ given $X = x$ as
  $$f_{Y|X}(y \mid x) = \frac{f(x, y)}{f_X(x)}, \quad \text{where } f_X(x) > 0.$$

**Example 4.23.** The joint density of $X$ and $Y$ is given by

$$f(x, y) = \begin{cases} \dfrac{12}{5}\,x\,(2 - x - y) & 0 < x < 1,\; 0 < y < 1, \\ 0 & \text{elsewhere.} \end{cases}$$

Compute the conditional density of $X$ given that $Y = y$, where $0 < y < 1$.

**Solution.** For $0 < x < 1$, $0 < y < 1$, we have

$$f_{X|Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)} = \frac{x(2 - x - y)}{\int_0^1 x(2 - x - y)\, dx} = \frac{x(2 - x - y)}{\frac{2}{3} - \frac{y}{2}} = \frac{6x(2 - x - y)}{4 - 3y}.$$
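A conditional density must integrate to 1 in $x$ for every fixed $y$. Taking the conditional density worked out in Example 4.23 as $6x(2-x-y)/(4-3y)$, the exact antiderivative of the numerator, $6x^2 - 2x^3 - 3x^2 y$, evaluates at $x = 1$ to $4 - 3y$, which cancels the denominator; the sketch below checks this at several values of $y$.

```python
# Total conditional probability mass of f_{X|Y}(x|y) = 6x(2-x-y)/(4-3y)
# over 0 < x < 1: the antiderivative of 6x(2-x-y) is 6x^2 - 2x^3 - 3x^2*y,
# so the integral in x over (0, 1) equals (4 - 3y)/(4 - 3y) = 1.
def cond_density_mass(y):
    return (6 * 1**2 - 2 * 1**3 - 3 * 1**2 * y) / (4 - 3 * y)

masses = [cond_density_mass(y / 10) for y in range(1, 10)]
```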
### Joke

Did you hear about the man who accidentally swallowed all the tiles from a Scrabble set? His doctor told him that the problem would eventually sort itself out, but not in so many words.

### Expected Values and Covariances

**Theorem 4.9.** Let $X$ and $Y$ be two random variables with joint probability density (or mass) function, and let $g$ be a function of the two random variables. Then

$$E\big[g(X, Y)\big] = \begin{cases} \displaystyle\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x, y)\, f(x, y)\, dx\, dy & \text{if continuous,} \\[2ex] \displaystyle\sum_{x}\sum_{y} g(x, y)\, p(x, y) & \text{if discrete.} \end{cases}$$

- If $g(X, Y) = XY$, we have $E[XY] = \iint xy\, f(x, y)\, dx\, dy$ if continuous, or $\sum_x\sum_y xy\, p(x, y)$ if discrete.
- If $g(X, Y) = X$, we have $E[X] = \iint x\, f(x, y)\, dx\, dy$ if continuous, or $\sum_x\sum_y x\, p(x, y)$ if discrete.
- If $g(X, Y) = X + Y$, we have $E[X + Y] = E[X] + E[Y]$.

**Definition 4.16.** The covariance of two random variables $X$ and $Y$, written $\operatorname{Cov}(X, Y)$, is defined by

$$\operatorname{Cov}(X, Y) = E\big[(X - \mu_X)(Y - \mu_Y)\big].$$

- Shortcut computational formula: $\operatorname{Cov}(X, Y) = E[XY] - E[X]\,E[Y]$.
- We have the following results:
  - $\operatorname{Cov}(X, X) = \operatorname{Var}(X)$;
  - $\operatorname{Cov}(aX, Y) = a\operatorname{Cov}(X, Y)$;
  - $\operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\operatorname{Cov}(X, Y)$;
  - if $X$ and $Y$ are independent, then $\operatorname{Cov}(X, Y) = 0$ and $\operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y)$.

In general, a positive covariance is an indication that an increase in $X$ results in an increase in $Y$ on average. A negative covariance is an indication that an increase in $X$ results in a decrease in $Y$ on average.

**Definition 4.17.** The correlation between two random variables $X$ and $Y$, written $\operatorname{Corr}(X, Y)$ or $\rho$, is defined by

$$\rho = \operatorname{Corr}(X, Y) = \frac{\operatorname{Cov}(X, Y)}{\sqrt{\operatorname{Var}(X)\operatorname{Var}(Y)}},$$

and we have $-1 \le \operatorname{Corr}(X, Y) \le 1$. The correlation measures the strength of the relationship between $X$ and $Y$.

**Example 4.24 (Example 4.20 revisited).** Compute the correlation between $X$ and $Y$.

**Solution.** The marginal probability mass functions of $X$ and $Y$ are

| $x$ | 100 | 250 |
|---|---|---|
| $p_X(x)$ | 1/2 | 1/2 |

| $y$ | 0 | 100 | 200 |
|---|---|---|---|
| $p_Y(y)$ | 1/4 | 1/4 | 1/2 |

Thus

$$E[X] = 100\cdot\tfrac{1}{2} + 250\cdot\tfrac{1}{2} = 175, \qquad E[X^2] = 100^2\cdot\tfrac{1}{2} + 250^2\cdot\tfrac{1}{2} = 36{,}250,$$

$$\operatorname{Var}(X) = E[X^2] - (E[X])^2 = 5625,$$

$$E[Y] = 0\cdot\tfrac{1}{4} + 100\cdot\tfrac{1}{4} + 200\cdot\tfrac{1}{2} = 125, \qquad E[Y^2] = 0^2\cdot\tfrac{1}{4} + 100^2\cdot\tfrac{1}{4} + 200^2\cdot\tfrac{1}{2} = 22{,}500,$$

$$\operatorname{Var}(Y) = E[Y^2] - (E[Y])^2 = 6875.$$

We have

$$E[XY] = 100\cdot 0\cdot .20 + 100\cdot 100\cdot .10 + 100\cdot 200\cdot .20 + 250\cdot 0\cdot .05 + 250\cdot 100\cdot .15 + 250\cdot 200\cdot .30 = 23{,}750,$$

$$\operatorname{Cov}(X, Y) = E[XY] - E[X]\,E[Y] = 23{,}750 - 175\cdot 125 = 1875.$$

Hence

$$\operatorname{Corr}(X, Y) = \frac{1875}{\sqrt{5625 \cdot 6875}} \approx 0.3015.$$
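Example 4.24 can be reproduced by summing over the joint pmf of Example 4.20; the generic `expect` helper below is an illustrative name, not from the notes.

```python
import math

# Joint pmf of Example 4.20 / 4.24.
joint = {
    (100, 0): 0.20, (100, 100): 0.10, (100, 200): 0.20,
    (250, 0): 0.05, (250, 100): 0.15, (250, 200): 0.30,
}

def expect(g):
    """E[g(X, Y)] by direct summation over the joint pmf (Theorem 4.9)."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

ex = expect(lambda x, y: x)
ey = expect(lambda x, y: y)
var_x = expect(lambda x, y: x * x) - ex**2
var_y = expect(lambda x, y: y * y) - ey**2
cov = expect(lambda x, y: x * y) - ex * ey       # shortcut formula
corr = cov / math.sqrt(var_x * var_y)
```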
### Tchebyshev's Inequality

The following theorem can be used to determine a bound for the probability that a random variable $X$ falls in an interval $\mu \pm k\sigma$.

**Theorem 4.10 (Tchebyshev's Theorem).** Let $X$ be a random variable with mean $\mu$ and finite variance $\sigma^2$. Then, for any constant $k > 0$,

$$P\big(|X - \mu| \ge k\sigma\big) \le \frac{1}{k^2}, \qquad\text{or equivalently}\qquad P\big(|X - \mu| < k\sigma\big) \ge 1 - \frac{1}{k^2}.$$

This result applies to any probability distribution, whether it is skewed or not.

### The Weak Law of Large Numbers

The following theorem states that the probability that the average of the first $n$ terms in a sequence of independent and identically distributed random variables differs from its mean by more than $\varepsilon$ goes to 0 as $n$ increases to infinity.

**Theorem 4.11 (The Weak Law of Large Numbers).** Let $X_1, X_2, \ldots, X_n$ be a sequence of $n$ independent and identically distributed random variables, each having mean $E[X_i] = \mu$. Then, for any $\varepsilon > 0$,

$$\lim_{n \to \infty} P\left(\left|\frac{X_1 + X_2 + \cdots + X_n}{n} - \mu\right| > \varepsilon\right) = 0.$$

### Joke

A somewhat advanced society has figured out how to package basic knowledge in pill form. A student needing some learning goes to the pharmacy and asks what kinds of knowledge pills are available. The pharmacist says, "Here's a pill for English literature." The student takes the pill, swallows it, and has new knowledge about English literature. "What else do you have?" asks the student. "Well, I have pills for art history, biology, and world history," replies the pharmacist. The student asks for these, swallows them, and has new knowledge about those subjects. Then the student asks, "Do you have a pill for statistics?" The pharmacist says, "Wait just a moment," goes back into the storeroom, brings back a whopper of a pill about twice the size of a jawbreaker, and plunks it on the counter. "I have to take that huge pill for statistics?" inquires the student. The pharmacist understandingly nods his head and replies, "Well, you know, statistics always was a little hard to swallow."
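The weak law stated above can be illustrated by simulation: running averages of fair coin flips concentrate around $\mu = 0.5$ as $n$ grows. The seed and sample sizes below are arbitrary choices for reproducibility, not from the notes.

```python
import random

random.seed(12345)  # fixed seed so the illustration is reproducible

def sample_mean(n):
    """Average of n iid Bernoulli(0.5) coin flips."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

# The deviation |mean_n - 0.5| tends to shrink as n grows,
# in line with the weak law of large numbers.
means = {n: sample_mean(n) for n in (100, 10000, 1000000)}
```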
## Probability, Part I

Cyr Emile M'Lan, PhD (mlan@stat.uconn.edu)

### Introduction

- Text reference: *Introduction to Probability and Statistics for Engineers and Scientists*, Chapter 3.
- Reading assignment: Sections 3.1–3.5; September 17–September 22.
- In Chapter 2 we introduced graphical and numerical descriptive methods. Although these tools are useful for understanding the structure of variability in a given data set, one is particularly interested in developing statistical inference.
- Statistical inference is the process by which one acquires information about a population from samples. A critical component of statistical inference is probability, because it provides the link between the population and the sample. Probability permits one to make inference from a sample to the population and, in addition, to measure how accurate and reliable the inference is.

### Random Experiments

Many experiments, such as tossing a coin, rolling a die, drawing a card, spinning a roulette wheel, counting the number of arrivals at an emergency room, guessing tomorrow's weather, or measuring the lifetime of a bulb, have unpredictable outcomes. We cannot say with absolute certainty which outcome will show up. Such experiments are called random experiments.

- A random experiment (or probability experiment) is an action or process that leads to one of several possible outcomes; before it is performed, one cannot guess which outcome will come out.
- An outcome is a result of an experiment.
- Examples:
  - Experiment: record marks on a statistics test (out of 100). Outcomes: numbers between 0 and 100.
  - Experiment: record student evaluations. Outcomes: poor, fair, good, very good, and excellent.

### Sample Space and Events

**Sample space.** A sample space $S$ is the set that consists of all possible outcomes of an experiment.

- A probability experiment consists of determining the gender of a newborn child. The sample space is $S = \{\text{girl}, \text{boy}\}$.
- A probability experiment consists of tossing a coin. The sample space is $S = \{H, T\}$.
- A probability experiment consists of tossing two coins simultaneously. The sample space is $S = \{(H, H), (H, T), (T, H), (T, T)\}$.
- A probability experiment consists of selecting a faculty member at UConn and recording her income. The sample space is $S = [0, \infty)$.
- A probability experiment consists of selecting 5 faculty members at UConn and recording their incomes. The sample space is $S = [0, \infty)^5$.

A sample space is said to be discrete if it contains either a finite or a countable number of distinct sample points.

- A probability experiment consists of tossing a coin repeatedly until a head is obtained. The sample space is $S = \{H, TH, TTH, TTTH, \ldots\}$.
- A probability experiment consists of counting the number of bacteria on a computer keyboard. The sample space is $S = \{1, 2, 3, 4, 5, \ldots\}$.

In the two-coin example, one might be interested in the outcome "at least one tail." This corresponds to the set $E = \{(H, T), (T, H), (T, T)\}$, which is a subset of the sample space $S$.

**Event.** An event is a subcollection of outcomes of a probability experiment.

- The events "two heads" $A = \{(H, H)\}$, "one head and one tail" $B = \{(H, T)\}$, "one tail and one head" $C = \{(T, H)\}$, and "two tails" $D = \{(T, T)\}$ are each called simple events. Only one simple event occurs in a single trial of an experiment.
- The event "at least one tail," $E = \{(H, T), (T, H), (T, T)\}$, is called a composite event or compound event; it can always be expressed using two or more simple events. Many composite events can occur simultaneously in an experiment.

**Approaches to defining events.** We can define an event $E$ by (a) making an exhaustive list of its simple outcomes, if possible, or (b) imposing some condition satisfied by the outcomes in $E$.

**Occurrence of an event.** An event $E$ occurs if the outcome of the experiment is an element of $E$.

- For example, suppose we toss a die once and observe the number on the face which comes up. In this case the sample space is $S = \{1, 2, 3, 4, 5, 6\}$. Let $E$ = the event that the observed number is even $= \{2, 4, 6\}$. If the die is tossed and 6 comes up, we say that $E$ occurs; if 5 comes up, we say that $E$ does not occur.
- Special events: the certain event (the sample space $S$) and the impossible event, or null event (the empty set $\emptyset$).

More complicated events can be constructed from simpler events using basic concepts of set theory.

**Relationships between events.** Let $E$ and $F$ denote two events. Then:

(a) $E \subset F$ means that if $E$ occurs, $F$ must occur.
(b) $E = F$ means that (i) if $E$ occurs, $F$ must occur, and (ii) if $F$ occurs, $E$ must occur.
(c) $F$ is the complement of $E$, denoted $F = E^c$, if $F$ contains all the outcomes in $S$ that are not contained in $E$.
(d) $E \cup F$ is the event that contains all outcomes in either $E$ or $F$.
(e) $E \cap F$ is the event that contains all outcomes in both $E$ and $F$.
(f) $E$ and $F$ are mutually exclusive, or disjoint, if the occurrence of $E$ precludes the occurrence of $F$, and vice versa. Mathematically, we have $E \cap F = \emptyset$.
(g) $F - E$ denotes the event that contains all outcomes in $F$ but not in $E$; in other words, it is $F \cap E^c$.
(h) De Morgan's laws: $(E \cup F)^c = E^c \cap F^c$ and $(E \cap F)^c = E^c \cup F^c$.
(i) Venn diagrams can be used to picture these relationships.

### An Example

Flip a coin three times:

$$S = \{TTT, TTH, THT, THH, HTT, HTH, HHT, HHH\}.$$

Let $A$ = more than one head, $B$ = more than one tail, and $C$ = at least one head and one tail. Then

- $A = \{HHH, HHT, HTH, THH\}$;
- $B = \{TTT, TTH, THT, HTT\}$;
- $C = \{TTH, THT, THH, HTT, HTH, HHT\}$.

It is easy to verify that

- $A \cup B = S$ and $A \cap B = \emptyset$ (disjoint);
- $C^c = \{HHH, TTT\}$;
- $A^c$ = at most one head $= B$;
- $A \cap C = \{THH, HTH, HHT\}$;
- $C - A = \{TTH, THT, HTT\}$.
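The three-flip example maps directly onto Python set operations, which mirror the event relationships (d)–(h) above; the variable names are illustrative.

```python
from itertools import product

# The three-flip sample space and the events A, B, C as Python sets.
S = {''.join(t) for t in product('HT', repeat=3)}
A = {s for s in S if s.count('H') > 1}          # more than one head
B = {s for s in S if s.count('T') > 1}          # more than one tail
C = {s for s in S if 'H' in s and 'T' in s}     # at least one of each

union_ab = A | B          # E union F
inter_ab = A & B          # E intersect F
c_complement = S - C      # complement within S
c_minus_a = C - A         # outcomes in C but not in A
```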
05067 Around 1900 the English statistician Karl Pearson heroically tossed a coin 24000 resulting in 12012 heads ie 1201224000 05005 3 When we say that McGuire batting average was 333 we really L mean that the probability of a hit during his career is about 1 in 3 Probability Part p 13339 Definition of Probability lib More precisely suppose a very large set of N identical systems are available to be run together Suppose we are interested in the occurrence of an event E Let nE be the number of occurrences of the event E The probability of the occurrence of E denoted PrE is defined as n E 0 Note that this definition requires infinite number of trials which is not practical In real world we would monitor the limit for few repeated trials and guess the limit 9 P represents the probability of the event E in a L single trial Probability Part p 14339 Interpretation of Probability If We always interpret probability as the relative frequency or fraction or proportion an outcome occurs in a infinite number of repeat of an experiment 9 You will discover later that this is the way we link a population and a sample in statistical inference Probability Part p 15339 Joke r A statistics major was completely hung over the day of his final exam It was a TrueFalse test so he decided to flip a coin for the answers The stats professor watched the student the entire two hours as he was flipping the coinwriting the answerflipping the coinwriting the answer At the end of the two hours everyone else had left the final except for the one student The professor walks up to his desk and interrupts the student saying Listen l have seen that you did not study for this statistics test you didn t even open the exam If you are just flipping a coin for your answer what is taking you so long The student replies bitterly as he is still flipping the coin Shhh I am checking my answers Probability Part p 16339 Definition of Probability r It is clear that this intuitive definition has clear I limitations 
- It has no application for a unique event. E.g., what does the phrase "the probability of re-election for the current president is 60%" really mean, given the empirical definition?

**Subjective definition.** Probability can be thought of as a degree of belief, not necessarily the same from one person to the next. E.g., "There is a high probability (99%) of another form of life in the universe." Note that here we are not counting occurrences or non-occurrences, nor did we sample the universe to ascertain where we would find a new form of life. We mean that our degree of belief in other forms of life elsewhere in the universe is 99%.

**Classical definition.** When assuming that some form of symmetry exists among the outcomes, one does not need to conduct a probability experiment to compute probabilities. If a probability experiment has $n$ possible outcomes $w_1, w_2, \ldots, w_n$, all equally likely, then each individual outcome $w_i$ has probability $1/n$, i.e. $P(w_i) = 1/n$, and the probability of an event $E$ is

$$P(E) = \frac{m}{n},$$

where $m$ denotes the number of ways that the event $E$ can occur.

In other words, if a sample space is defined so that all simple outcomes are equally likely, this is a convenient way to compute probabilities of events by just counting.

**Example 2.1 (Guessing, Yet Passing, a Pop Quiz).** For a three-question multiple-choice pop quiz, a student is totally unprepared and randomly guesses the answer to each question. Each question has two options.

(a) Find the probabilities of each possible student outcome for the quiz, in terms of whether each response is correct (C) or incorrect (I).
(b) Find the probability the student passes, answering at least two questions correctly.

**Solution.** (a) The sample space is

$$S = \{CCC, CCI, CIC, CII, ICC, ICI, IIC, III\}.$$

Each question has two options, and with guessing the responses are expected to be equally likely.
outcomes is 18 b The probability of at least two passes is PCCC CCI CIC CC g o In general one needs counting rules to compute L probability using the classical definition Probability Part p 20339 r In any case we need a better definition of probability o Definition of Probability Kolmogorov s Probability Axioms Definition 21 Let S be a sample space associated with a probability ex periment To every event E in S E is a subset of S we assign a number PE called the probability of E so that the following three axioms hold Axiom 1 For every event E C S 0 g PE g 1 Axiom 2 PS 1 Axiom 3 If E1 E2 En is a sequence of painNise mutually exclusive events in 8 Le EL 0 E7 0 mm 7A j then n P Ema Probability Part I N 1 1339 T Kolmogorov s Probability Axioms 1 These axioms were proposed by Andrey Nikolaevich Kolmogorov in 1931 as the first mathematical foundation to probability Kolmogorov s definition of probability states the conditions an assignment probabilities must satisfy However it comes short in telling us how to assign specific probabilities number to events in the first place Probability assignment in practice are based on empirical rule or a careful thought about the selection process classical rule or are just subjective Probability Part p 22339 Kolmogorov s Probability Axiom r Steps for Calculating Probabilities 0 Step 1 Define the experiment ie describe the process leading to the observation and the type of observation that will be recorded Step 2 List all simple events and the sample space Step 3 Assign probabilities to the simple events Step 4 Determine the collection of simple events that make up an event of interest 0 Step 5 Sum the simple event probabilities to get the probability of the event of interest Probabilities of composite events can be calculated using the probabilities of simple events Note that this process works only well for experiment with few simple sample L points In other cases we need rules for counting Probability Part p 23339 b b b Kolmogorov 
s Probability Axioms If Example 22 All human blood can be typed as one of O A B or AB but the distribution of the types varies a bit with race Here is how blood type is distributed in any African American blood type 0 A B AB probability 49 27 20 a What is the probability of type AB blood Justify b Interpret these number c Maria has type B blood She can safely receive blood transfusions from people with either blood type 0 or B what is the probability that a randomly chosen African American can donate blood to Maria Probability Part I p 243 Kolmogorov s Probability Axioms If Solution a PAB 1 PO PA PB 04 b These numbers represents the proportion of African American with these blood types Another way to think of it is if I were to select randomly an African American the probability that it would have a blood type A will be 27 c Let D denote the event of a successful donation PD PO PB 49 20 69 9 The odds of an event E is defined as PA PA PAC 1 PA L It measures how much more likely it is that A occurs than that it does not occur Probability Part I p 253
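The quiz and blood-type examples above can be checked by direct computation. The Python sketch below (an illustration, not from the original notes) enumerates the 8 equally likely quiz outcomes, uses Axiom 2 to recover the missing blood-type probability, and computes the odds of a successful donation:

```python
from itertools import product
from fractions import Fraction

# Quiz example (classical definition): the 2**3 = 8 answer patterns
# (C = correct, I = incorrect) are equally likely, so each has
# probability 1/8.
outcomes = ["".join(pattern) for pattern in product("CI", repeat=3)]
p_each = Fraction(1, len(outcomes))

# P(at least two correct) = (number of favorable outcomes) / 8
favorable = [w for w in outcomes if w.count("C") >= 2]
p_pass = len(favorable) * p_each
print(p_pass)  # 1/2

# Blood-type example: Axiom 2 forces the four probabilities to sum
# to 1, which pins down P(AB).
p = {"O": 0.49, "A": 0.27, "B": 0.20}
p["AB"] = 1.0 - sum(p.values())
p_donor = p["O"] + p["B"]  # Maria can receive from type O or B donors

# Odds of the donation event D: P(D) / (1 - P(D))
odds_donor = p_donor / (1.0 - p_donor)

print(f"P(AB) = {p['AB']:.2f}, P(D) = {p_donor:.2f}, odds of D = {odds_donor:.2f}")
```

Since the odds of D come out above 2, a successful donation is more than twice as likely to occur as not, matching the interpretation of odds given above.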
