# ELEG310 Midterm 1 Study Guide

This 26 page Study Guide was uploaded by Cindy Nguyen on Sunday March 6, 2016. The Study Guide belongs to ELEG310 at University of Delaware taught by Dr. Daniel Weile in Spring 2016. Since its upload, it has received 149 views. For similar materials see Random Signals and Noise in Electrical Engineering at University of Delaware.

ELEG 310 Midterm 1 Study Guide: Chapters 1 and 2. (Note: in the original notes, major concepts are colored blue and some major examples red.)

## Two Types of Models

- Deterministic
- Stochastic

## Symbols

- ∈ — element of
- ∉ — not an element of
- ∩ — intersection ("and")
- ∪ — union ("or")
- ⊂ — subset
- ⊄ — not a subset
- Aᶜ — "A-complement"; not in A
- Ø — null (empty) set

## Definitions to Know

- **Set S** — the set of outcomes of an experiment.
- **Sample space** — the space of possible outcomes; it can be discrete or continuous.
- **Event** — a subset A ⊂ S of the sample space.
- **Sample points** — the members of the set.
- **ζ (zeta)** — represents a single outcome.
- **Probability P[A]** — for every event, we associate a number, its probability.

Two special events: the **certain event** S and the **impossible event** Ø.

## Set Theory Review

Some rules:

- **Subset rule:** A = B ↔ (A ⊂ B and B ⊂ A).
- **Commutative rules:** A ∪ B = B ∪ A (union); A ∩ B = B ∩ A (intersection).
- **Associative rules:** A ∪ (B ∪ C) = (A ∪ B) ∪ C; (A ∩ B) ∩ C = A ∩ (B ∩ C).
- **Distributive rules:** A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C); A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C). (See Week 1 Notes for proofs by Venn diagram.)
- **De Morgan's rules** (see Week 1 Notes for proof): (A ∪ B)ᶜ = Aᶜ ∩ Bᶜ and (A ∩ B)ᶜ = Aᶜ ∪ Bᶜ.

These can be proven with the definition (Aᶜ)ᶜ = A.

- Proof: x ∈ (Aᶜ)ᶜ ↔ x ∉ Aᶜ ↔ x ∈ A.

## Probability Axioms

Probability is a number assigned to an event. A **probability law** assigns, ∀ (for all) A ⊂ S, a number P[A]. The mapping A ↦ P[A] must satisfy:

I. 0 ≤ P[A]
II. P[S] = 1
III. If A ∩ B = Ø (A and B are disjoint), then P[A ∪ B] = P[A] + P[B].

Side note: if there is an infinite sequence of events, Axiom III is not sufficient, so it is strengthened to III′.
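The set identities above can be spot-checked numerically on small finite sets; a minimal sketch (the sample space and the events A, B, C are arbitrary illustrative choices):

```python
# Check the distributive and De Morgan's rules on small finite sets.
S = set(range(1, 11))               # sample space: outcomes 1..10
A, B, C = {1, 2, 3}, {3, 4, 5}, {5, 6}

def comp(X):
    """Complement relative to the sample space S."""
    return S - X

# Distributive rules
assert A | (B & C) == (A | B) & (A | C)
assert A & (B | C) == (A & B) | (A & C)

# De Morgan's rules
assert comp(A | B) == comp(A) & comp(B)
assert comp(A & B) == comp(A) | comp(B)

# (A^c)^c = A
assert comp(comp(A)) == A
print("all set identities hold")
```

Python's `set` operators `|`, `&`, and `-` map directly onto ∪, ∩, and set difference, which is what makes this one-line-per-identity check possible.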
**Axiom III′ (countable additivity):** if A₁, A₂, … is a sequence of events with Aᵢ ∩ Aⱼ = Ø for all i ≠ j, then P[⋃ᵢ Aᵢ] = Σᵢ P[Aᵢ].

## Corollaries

- **Corollary 1:** P[Aᶜ] = 1 − P[A]
- **Corollary 2:** P[A] ≤ 1
- **Corollary 3:** P[Ø] = 0
- **Corollary 4** (using I–III): if Aᵢ ∩ Aⱼ = Ø ∀ i ≠ j, then P[A₁ ∪ … ∪ Aₙ] = Σᵢ P[Aᵢ] for a union of a *finite* number of sets (not infinite).
- **Corollary 5** (*very important*): P[A ∪ B] = P[A] + P[B] − P[A ∩ B] (A and B need not be disjoint).
- **Corollary 5′:** P[A ∪ B ∪ C] = P[A] + P[B] + P[C] − P[A∩B] − P[A∩C] − P[B∩C] + P[A∩B∩C].

  Proof by Venn diagram: adding P[A] + P[B] + P[C] accounts for all of the regions of A, B, and C in the union, including their overlaps. But each pairwise (football-shaped) intersection is added twice, so each must be subtracted once so that it is counted only once. The middle region A∩B∩C has then been added three times and subtracted three times, so it has not been accounted for; thus it is added back in.
- **Corollary 6:** the inclusion–exclusion pattern of Corollary 5′ generalizes to n events (alternating sums over the pairwise, triple-wise, … intersections).
- **Corollary 7:** if A ⊂ B then P[A] ≤ P[B] (if A happened, we know B happened).

## Intro to Continuous Probability

Example: the lifetime of a computer chip. The sample space is S = (0, ∞), and the probability law assigns probabilities to semi-infinite intervals, decaying exponentially at rate α:

Probability law: P[(t, ∞)] = e^(−αt)

Check: P[S] = P[(0, ∞)] = e^(−α·0) = 1.

Side note: P[chip dies at exactly time t] = 0 for every t. This is a feature of continuous probability — the area of a single point is 0, BUT that doesn't mean it won't happen.

The probability the chip dies between times r and s (with r < s) follows from the probability law:

P[(r, s)] = P[(r, ∞)] − P[(s, ∞)] = e^(−αr) − e^(−αs).

## Combinatorics — the Math of "Counting Things"

Suppose we have an urn with n balls in it, and we pick a ball k times. How many possible outcomes are there? This depends on 2 things:

1. Whether we put the ball back each time, or not; a.k.a. "sampling with replacement" and "sampling without replacement," respectively.
2. Whether we care about the order of the draws, e.g. whether (1, 2) and (2, 1) count as different outcomes.
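Returning to the chip-lifetime example above, the exponential law can be evaluated directly; a small sketch (the rate α = 0.5 and the interval (1, 2) are made-up illustrative values):

```python
import math

alpha = 0.5                         # decay rate (illustrative value)

def p_survive(t):
    """P[(t, inf)] = probability the chip is still alive at time t."""
    return math.exp(-alpha * t)

# Axiom check: P[S] = P[(0, inf)] = e^0 = 1
assert p_survive(0.0) == 1.0

# P[chip dies in (r, s)] = P[(r, inf)] - P[(s, inf)] = e^(-ar) - e^(-as)
r, s = 1.0, 2.0
p_rs = p_survive(r) - p_survive(s)
print(f"P[die in ({r}, {s})] = {p_rs:.4f}")
```

Note how the death-in-an-interval probability is a *difference* of two survival probabilities, mirroring the derivation above.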
### Example 1: WITH replacement, WITH ordering

This is "the easiest possible case": each of the k picks has n possible results, so there are n^k possible outcomes.

### Example 2: WITHOUT replacement, WITH ordering

- Pick 1: n possible outcomes
- Pick 2: n(n − 1) (one ball has been pulled out)
- Pick 3: n(n − 1)(n − 2)
- … keep multiplying down to the kth pick: n(n − 1)(n − 2) ⋯ (n − k + 1).

### Example 3: WITHOUT replacement, WITHOUT ordering

You have a standard deck of cards: 52 cards, 4 suits (clubs, diamonds, hearts, spades) and 13 ranks (Ace, 2, 3, 4, 5, 6, 7, 8, 9, 10, Jack, Queen, King). How many possible poker hands are there? C(52, 5) = 2,598,960. What is the probability of a royal flush? (A royal flush is the highest hand: the 10, Jack, Queen, King, and Ace of the same suit.) There are exactly 4 royal flushes (of clubs, of diamonds, of hearts, of spades), so the probability is 4 / 2,598,960 ≈ 0.00000154 = 0.000154%.

### Example 4: WITH replacement, WITHOUT ordering

We can use an encoding to illustrate this. Let k = the number of repetitions and n = the number of distinct things. Experiment: pick a ball, look at it, replace it; when you pick ball p, mark bin p with an "x." There are n bins, and slashes "/" separate the bins. Each code counts one outcome: there are n − 1 slashes separating the bins, and k x's, because you repeated the experiment k times. For example, with n = 5 balls and k = 8 picks, the code

x x / x x x / / x x x /

means ball 1 was picked 2 times, ball 2 three times, ball 3 zero times, and so on. Other legal codes: x x x x x x x x / / / / and x / x x / x x x / x / x. Every legal code has n − 1 + k symbols ((n − 1) slashes, k x's), so the number of outcomes is C(n − 1 + k, k).

### Permutations

Suppose we are interested in ordering n objects; each ordering is called a "permutation." This is the same as sampling with ordering, WITHOUT replacement, for k = n. The formula becomes n(n − 1)(n − 2)(n − 3) ⋯ (1) = n! ("n factorial") possible permutations, multiplying down to the last ball.

### Combinations

We're most commonly interested in "without replacement, without ordering" combinations. Experiment: choose k balls from an urn containing n balls, without replacement, without ordering.
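The four counting regimes above map directly onto Python's `math` module; a quick sketch (the values n = 5, k = 3 are arbitrary):

```python
import math

n, k = 5, 3

# Example 1 — with replacement, with ordering: n^k
assert n ** k == 125

# Example 2 — without replacement, with ordering: n(n-1)...(n-k+1)
assert math.perm(n, k) == 5 * 4 * 3

# Combinations — without replacement, without ordering: n choose k
assert math.comb(n, k) == 10

# Example 4 — with replacement, without ordering: C(n-1+k, k)
assert math.comb(n - 1 + k, k) == 35

# Example 3 — poker hands and the royal-flush probability
hands = math.comb(52, 5)
assert hands == 2_598_960
print(f"P[royal flush] = {4 / hands:.8f}")   # 4 royal flushes exist
```

`math.perm` and `math.comb` (Python 3.8+) compute the falling factorial and binomial coefficient exactly in integer arithmetic, so there is no floating-point rounding in the counts.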
(Professor's example: choosing 5 students from a class of 30 to form a committee.) We have a combination of k things (e.g. 5 members) from a set of n things (e.g. from a class of 30 students). The notation C(n, k) is said "n choose k" — e.g. 80 students, choose 5 for the committee.

Suppose you choose k things. You know how many orders (permutations) of them there are: k!. Therefore, it makes sense that if you multiply the number of possible different orders (k!) by the thing we want to know (the number of combinations), you get the number of possible ways considering order but without replacement:

C(n, k) · k! = n(n − 1) ⋯ (n − k + 1), so C(n, k) = n! / (k!(n − k)!).

## Conditional Probability

This is the probability of an event given that another event occurred. Suppose you pick a ball from an urn of 10 numbered balls, and we know the ball is even. What is the probability of the ball being 2? It is 1/5.

Formalizing this idea: suppose A = {2} (A is the set containing ball 2) and B = {2, 4, 6, 8, 10} (B is the set of even balls). Then

P[A | B] = P[A ∩ B] / P[B]   (the "|" is read "given that")

Testing it out: P[A ∩ B] = 1/10 and P[B] = 1/2, so P[A | B] = (1/10)/(1/2) = 1/5. ✓

Rearranging, we get P[A ∩ B] = P[A | B] P[B] = P[B | A] P[A]. Using this idea helps calculate probabilities.

## Partition of a Set into Sets

Partition: the set is the union of all the sets that we partitioned it into, and the sets DO NOT intersect (they are pairwise disjoint; see below).

## The Theorem of Total Probability

The gist: A can be broken along the Bᵢ of a partition, and we can factor the probabilities out:

P[A] = Σᵢ P[A | Bᵢ] P[Bᵢ].

## Bayes' Rule

P[Bⱼ | A] = P[A | Bⱼ] P[Bⱼ] / Σᵢ P[A | Bᵢ] P[Bᵢ].

### Monty Hall Problem

- Monty Hall was the game show host of "Let's Make a Deal" in the 1960s.
- The host makes deals with people on the show.
- There are 3 doors; behind 2 are goats, behind 1 is a new car.
- You pick door δ; door η is the one Monty opens; door σ is the remaining door.
- Probability-wise, should you: (1) switch your door, (2) not switch, or (3) it doesn't matter?
- The correct solution is: you should switch. The reason is in Bayes' rule.

Proof: let A be the event that Monty opens door η, and let B_α be the event that the car is behind door α. (B_η and B_σ are treated the same way.) What are the probabilities of the B_α, given A?
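Before working through Bayes' rule, the claim that switching wins can be checked by simulation; a sketch (the door labels mirror the δ/η/σ notation above, and the trial count is arbitrary):

```python
import random

def monty_trial():
    """One round: you pick delta; Monty opens a goat door; one door remains.
    Returns (stay_wins, switch_wins) as booleans."""
    doors = ["delta", "eta", "sigma"]       # by symmetry you always pick "delta"
    car = random.choice(doors)
    # Monty opens a door that is neither yours nor the car's
    openable = [d for d in doors if d != "delta" and d != car]
    opened = random.choice(openable)
    remaining = next(d for d in doors if d not in ("delta", opened))
    return car == "delta", car == remaining

N = 100_000
stay = switch = 0
for _ in range(N):
    s, w = monty_trial()
    stay += s
    switch += w
print(f"P[win | stay]   ~ {stay / N:.3f}")    # close to 1/3
print(f"P[win | switch] ~ {switch / N:.3f}")  # close to 2/3
```

The simulated frequencies land near 1/3 and 2/3, matching the Bayes'-rule answer derived next.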
By Bayes' rule, let α ∈ {δ, η, σ}. The priors are P[B_α] = 1/3 ∀ α, and the likelihoods are:

- P[A | B_δ] = 1/2 (the other 2 doors have goats, so Monty opens η or σ at random)
- P[A | B_η] = 0 (if the car is behind η then Monty won't open it)
- P[A | B_σ] = 1 (this is the probability that Monty opens door η given that the car is behind door σ: you picked door δ, so he can't open δ, nor can he open σ because it hides the car — he must open η, which has a goat)

The denominator becomes (1/2)(1/3) + (0)(1/3) + (1)(1/3) = 1/2. The numerators then give

P[B_δ | A] = (1/2)(1/3) / (1/2) = 1/3 and P[B_σ | A] = (1)(1/3) / (1/2) = 2/3,

so switching to door σ doubles your chance of winning.

### Drug Testing Example

The reliability of drug testing depends on:

1. **Sensitivity** — the probability that the test is positive for a drug user.
2. **Specificity** — the probability that the test is negative for a non-drug user.

Both of these are technical terms, and both are equally important. It's not good if a test is too sensitive (then everyone would appear "positive"), nor is it good for it to be too specific (then everyone would appear "negative").

Say the test has a 1% error rate: 99% sensitive and 99% specific. This means that 1% of drug users will come up negative for drugs, and 1% of non-drug users will come up positive. Suppose 0.5% of the population are drug users. (We have no idea what the true figure is — it depends on the drug — but it's important that it is "fairly rare.")

From the given information: let U = {users} (the set of drug users). Then P[U] = 0.005, P[+ | U] = 0.99 (+ is the outcome that the test says "user"), and P[− | Uᶜ] = 0.99 (− is the outcome that the test says "not a user"). We are interested in P[U | +]. By Bayes' theorem:

P[U | +] = P[+ | U] P[U] / (P[+ | U] P[U] + P[+ | Uᶜ] P[Uᶜ]) = (0.99)(0.005) / ((0.99)(0.005) + (0.01)(0.995)) ≈ 0.332.

## Independence

Let event A = "it's raining in Walla Walla, WA" and event B = "the stock market is rising." Knowing one has nothing to do with the other. A and B are independent when P[A ∩ B] = P[A] P[B].

Example (independence of 2 events): two numbers x and y are chosen at random in [0, 1], so x, y ∈ [0, 1]. Let A = {x > 0.5}, B = {y > 0.5}, and C = {x > y}.

a) Are A and B independent? Yes: P[A] P[B] = 1/4 = P[A ∩ B].
b) Are A and C independent? No: P[A ∩ C] = 3/8 (the area of the region where x > 1/2 and y < x), while P[A] P[C] = (1/2)(1/2) = 1/4.

### Independence of 3 Events

This is more strict than pairwise independence; in addition to each pair being independent, we need:
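The drug-testing numbers plug straight into Bayes' rule with total probability in the denominator; a sketch using the figures given above:

```python
def posterior_user(prior, sensitivity, specificity):
    """P[U | +] via Bayes' rule: P[+|U]P[U] / (P[+|U]P[U] + P[+|U^c]P[U^c])."""
    p_pos_given_user = sensitivity          # P[+ | U]
    p_pos_given_clean = 1 - specificity     # P[+ | U^c] = false-positive rate
    numer = p_pos_given_user * prior
    denom = numer + p_pos_given_clean * (1 - prior)
    return numer / denom

p = posterior_user(prior=0.005, sensitivity=0.99, specificity=0.99)
print(f"P[user | positive test] = {p:.3f}")   # about 0.332
```

Even with a 99%-accurate test, only about a third of positives are true users — because the condition is rare, the many non-users generate more false positives than the few users generate true positives.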
P[A ∩ B ∩ C] = P[A] P[B] P[C].

Assume again that x, y ∈ [0, 1] are chosen at random. Take 3 events: B = {y < 1/2}, D = {x < 1/2}, and F = (B ∩ Dᶜ) ∪ (Bᶜ ∩ D) ("in B but not in D, or in D but not in B").

Determining pairwise independence (each event has probability 1/2):

- P[B ∩ D] = 1/4, so B and D are independent.
- P[D ∩ F] = 1/4, so D and F are independent.
- P[F ∩ B] = 1/4, so F and B are independent.

HOWEVER, proving pairwise independence is not enough to say that the 3 events are independent: B ∩ D ∩ F = Ø because their intersection is an empty space, so P[B ∩ D ∩ F] = 0 ≠ P[B] P[D] P[F] = 1/8. So these 3 events are NOT independent. This concept generalizes to as many events as you want.

## Counting Bernoulli Trials

Example: the probability of rain vs. no rain in Walla Walla, WA, over a 3-day sequence. This is a Bernoulli-trial setting, meaning a "success or failure" type of trial. One way to solve it is to list the outcomes, where R is rain (probability p) and S is sunny:

- P[RRR] = p³ (probability of rain all 3 days)
- P[RRS] = P[SRR] = P[RSR] = p²(1 − p)
- P[SRS] = P[RSS] = P[SSR] = p(1 − p)²
- P[SSS] = (1 − p)³

In summary, letting N be the number of rainy days:

- P[N = 0] = (1 − p)³ (i.e. P[SSS])
- P[N = 1] = 3p(1 − p)²
- P[N = 2] = 3p²(1 − p)
- P[N = 3] = p³

This generalizes: P[k successes in n trials] = C(n, k) p^k (1 − p)^(n−k), where n − k is the number of failures.

A **Bernoulli trial** has 2 outcomes, "success" or "failure." Success has probability p; failure has probability 1 − p. E.g., flipping a coin and getting HTTH (heads, tails, tails, heads), with p the probability of heads "H" and 1 − p the probability of tails "T": P[HTTH] = p(1 − p)(1 − p)p = p²(1 − p)². For getting exactly 2 heads regardless of order, HHTT, HTHT, etc. all count as the same outcome. We are interested in counting the number of successes, so we write C(4, 2) p²(1 − p)² — "four choose two" times the probability of any one sequence with exactly two heads.

### Binomial Random Variable

With n the number of trials and k the number of successes:

P[B = k] = C(n, k) p^k (1 − p)^(n−k), and P[B ∈ {0, 1, 2, …, n}] = 1 (the total probability over the set must be 1).

Binomial theorem: (a + b)^n = Σ_{k=0}^{n} C(n, k) a^k b^(n−k).

### Geometric Distribution

Recall and memorize the geometric sum: Σ_{i=0}^{∞} r^i = 1/(1 − r) for |r| < 1.
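The binomial pmf and the enumerated 3-day rain table can be checked numerically; a sketch (p = 0.3 is an arbitrary rain probability, not from the notes):

```python
from math import comb

def binom_pmf(k, n, p):
    """P[B = k] = C(n, k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 3, 0.3
# Matches the enumerated table: (1-p)^3, 3p(1-p)^2, 3p^2(1-p), p^3
assert abs(binom_pmf(0, n, p) - (1 - p)**3) < 1e-12
assert abs(binom_pmf(1, n, p) - 3 * p * (1 - p)**2) < 1e-12
assert abs(binom_pmf(2, n, p) - 3 * p**2 * (1 - p)) < 1e-12
assert abs(binom_pmf(3, n, p) - p**3) < 1e-12
# The pmf sums to 1 -- the binomial theorem with a = p, b = 1 - p
assert abs(sum(binom_pmf(k, n, p) for k in range(n + 1)) - 1) < 1e-12
print("binomial pmf checks pass")
```

The final assertion is exactly the binomial theorem in action: Σ C(n,k) p^k (1−p)^(n−k) = (p + (1−p))^n = 1.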
Repeat a Bernoulli trial until we get a success, and let M be the number of the trial on which the first success occurs (write q = 1 − p):

- P[M = 1] = p (success on the 1st trial)
- P[M = 2] = (1 − p)p = qp (must fail first to get a success on the second trial)
- P[M = 3] = (1 − p)²p = q²p
- P[M = m] = q^(m−1) p

Checking it (it must take somewhere between 1 and ∞ trials to guarantee a success):

Σ_{m=1}^{∞} q^(m−1) p = p Σ_{j=0}^{∞} q^j = p · 1/(1 − q) = p/p = 1. ✓

## Random Variables

An outcome ζ (zeta) ∈ S is mapped to a number X(ζ). X is a "mapping": X : S → ℝ, where S is the domain of the mapping and the range is S_X ⊂ ℝ. The capital letter X denotes a "random variable," which — despite the name — is not a variable; it is a function that maps the outcome to a number, so that we can write things like P[X < x].

E.g., suppose you toss a coin 3 times and want to count the heads:

ζ:    TTT TTH THT THH HTT HTH HHT HHH
X(ζ):  0   1   1   2   1   2   2   3

S_X = {0, 1, 2, 3} is the range of the random variable.

Below is the same experiment, but we assign different rules to the outcomes, so we use a different random variable, Y. Prize money: $0 for zero or one head, $1 for getting 2 heads, $8 for getting 3 heads:

Y(ζ): $0  $0  $0  $1  $0  $1  $1  $8

S_Y = {$0, $1, $8}.

## Discrete Random Variables

### Probability Mass Function (pmf)

A discrete function that assigns probability to each atomic outcome:

P_X(x) = P[X = x] = P[{ζ : X(ζ) = x}], for x ∈ ℝ, where X is the random variable.

Axioms of the probability mass function:

(i) P_X(x) ≥ 0 (each probability must be positive or 0)
(ii) Σ_{x ∈ S_X} P_X(x) = 1
(iii) P[X ∈ B] = Σ_{x ∈ B} P_X(x) for every B ⊂ S_X

We can define random variables corresponding to probability laws like the geometric, Bernoulli, etc.

### Important Examples

- **Example 1: Bernoulli random variable** (a.k.a. success/fail). Define success as ζ ∈ A. The indicator function is I_A(ζ) = 1 if ζ ∈ A, and 0 if ζ ∉ A.
- **Example 2: Geometric random variable:** P_X(k) = P[X = k] = (1 − p)^(k−1) p.
- **Example 3: Binomial random variable:** P_X(k) = C(n, k) p^k (1 − p)^(n−k).

What else can we do with these concepts?
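The coin-toss mappings X and Y and the pmf axioms can be made concrete by enumerating the sample space; a sketch (a fair coin is assumed, so every outcome has probability 1/8):

```python
from itertools import product
from collections import Counter

# Sample space: all sequences of 3 tosses, each with probability 1/8
S = list(product("HT", repeat=3))

X = lambda zeta: zeta.count("H")                    # X = number of heads
Y = lambda zeta: {0: 0, 1: 0, 2: 1, 3: 8}[X(zeta)]  # prize money in dollars

# Build the pmf of X by counting outcomes in each preimage {zeta : X(zeta) = x}
pmf_X = {x: c / len(S) for x, c in Counter(map(X, S)).items()}
assert pmf_X == {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
assert abs(sum(pmf_X.values()) - 1) < 1e-12         # pmf axiom (ii)

pmf_Y = {y: c / len(S) for y, c in Counter(map(Y, S)).items()}
print(pmf_Y)   # range S_Y = {0, 1, 8}, with probabilities 4/8, 3/8, 1/8
```

Note how a random variable really is a plain function on outcomes here, and its pmf falls out of counting the preimage of each value.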
## Expectation

To fully specify a random variable, we must know the distribution completely. Often we don't want every value — we want the center (expected) value of the distribution if we were to repeat the experiment over and over. This is the mean, which "summarizes our gut feelings":

E[X] = Σ_{x ∈ S_X} x P_X(x) = Σ_k x_k P_X(x_k),

where the x_k are all possible outcomes and P_X is the probability mass function of X.

E.g., what is the mean of a Bernoulli random variable?

E[I_A] = 0 · P(0) + 1 · P(1) = 0(1 − p) + 1(p) = p.

### Uniform Random Variable

P_X(x) = 1/M for x ∈ {0, 1, 2, …, M − 1}:

E[X] = Σ_{k=0}^{M−1} k (1/M) = (1/M) Σ_{k=0}^{M−1} k = (1/M)[0 + 1 + 2 + … + (M − 2) + (M − 1)] = (1/M) · M(M − 1)/2 = (M − 1)/2.

### Geometric Random Variable (**this could be on the test!**)

With q = 1 − p:

E[X] = Σ_k x_k P_X(x_k) = Σ_{k=1}^{∞} k p q^(k−1)
= p Σ_{k=1}^{∞} k q^(k−1)   — this looks like a derivative:
= p Σ_{k=1}^{∞} d/dq (q^k)   — use the geometric series to simplify the sum:
= p · d/dq [1/(1 − q)]   — now take the derivative:
= p / (1 − q)² = p/p² = 1/p.

### Binomial Random Variable

P_X(k) = C(n, k) p^k (1 − p)^(n−k), and by the definition of expectation:

E[X] = Σ_{k=0}^{n} k C(n, k) p^k (1 − p)^(n−k)
= Σ_{k=0}^{n} k · n!/(k!(n − k)!) · p^k (1 − p)^(n−k)
= Σ_{k=1}^{n} n!/((k − 1)!(n − k)!) · p^k (1 − p)^(n−k)
= np Σ_{j=0}^{n−1} (n − 1)!/(j!(n − 1 − j)!) · p^j (1 − p)^(n−1−j)   (substituting j = k − 1)
= np Σ_{j=0}^{n−1} C(n − 1, j) p^j (1 − p)^((n−1)−j)
= np,

since the remaining sum is (p + (1 − p))^(n−1) = 1 by the binomial theorem. Purpose: illustrate calculating the mean, and the use of the binomial theorem.

### Expectation Properties

- E[aX] = Σ_k a x_k P_X(x_k) = a Σ_k x_k P_X(x_k) = a E[X]
- E[X + c] = Σ_k (x_k + c) P_X(x_k) = Σ_k x_k P_X(x_k) + c Σ_k P_X(x_k) = E[X] + c
- E[g(X) + h(X)] = E[g(X)] + E[h(X)]

## Variance

- Deviance: D_X = (X − E[X])²
- Variance: σ_X² = VAR[X] = E[(X − E[X])²]
- Standard deviation: STD[X] = √(σ_X²) = σ_X

What is the purpose of STD when you already have the variance? The standard deviation has the same units as the variable we are measuring, while the variance squares the units.

We can simplify computation using: VAR[X] = E[X²] − E[X]².

### Examples of Computing Variance

**Bernoulli random variable (Bernoulli trial):** I_A = 1 with probability p and 0 otherwise, so E[I_A] = p and E[I_A²] = 0²(1 − p) + 1²(p) = p. Plugging in, we get:

VAR[I_A] = E[I_A²] − E[I_A]² = p − p² = p(1 − p).

If p is 0, no variance; if p is 1, no variance. We maximize variance by p being fair (p = 1/2).

**Geometric random variable:** from the previous lecture, E[X²] = (2 − p)/p², so

VAR[X] = E[X²] − E[X]² = (2 − p)/p² − (1/p)² = (1 − p)/p².

**Uniform random variable:** X ∈ {0, 1, …, M − 1} with P_X(x) = 1/M and E[X] = (M − 1)/2.

E[X²] = Σ_{k=0}^{M−1} k² (1/M) = (1/M) · (M − 1)M(2M − 1)/6 = (M − 1)(2M − 1)/6

VAR[X] = E[X²] − E[X]² = (M − 1)(2M − 1)/6 − ((M − 1)/2)² = (M − 1)(M + 1)/12 = (M² − 1)/12.
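The closed-form means and variances above can be checked against direct pmf sums; a sketch (p = 0.3 and M = 10 are arbitrary illustrative values, and the geometric pmf is truncated far out in the tail):

```python
def mean_var(pmf):
    """E[X] and VAR[X] = E[X^2] - E[X]^2 from a dict {x: P_X(x)}."""
    ex = sum(x * p for x, p in pmf.items())
    ex2 = sum(x * x * p for x, p in pmf.items())
    return ex, ex2 - ex**2

p = 0.3
# Bernoulli: mean p, variance p(1 - p)
m, v = mean_var({0: 1 - p, 1: p})
assert abs(m - p) < 1e-12 and abs(v - p * (1 - p)) < 1e-12

# Uniform on {0, ..., M-1}: mean (M-1)/2, variance (M^2 - 1)/12
M = 10
m, v = mean_var({k: 1 / M for k in range(M)})
assert abs(m - (M - 1) / 2) < 1e-9 and abs(v - (M * M - 1) / 12) < 1e-9

# Geometric (tail truncated, negligible mass left): mean 1/p, variance (1-p)/p^2
geo = {k: (1 - p) ** (k - 1) * p for k in range(1, 2000)}
m, v = mean_var(geo)
assert abs(m - 1 / p) < 1e-6 and abs(v - (1 - p) / p**2) < 1e-4
print("variance checks pass")
```

Matching the brute-force sums to the derived formulas is a useful way to catch algebra mistakes before the exam.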
