PSY 202 Chapter 7 -Day 1
These 2 pages of class notes were uploaded by Stephanie on Saturday, September 24, 2016. The notes are for Psy 202 (Elementary Statistics) at the University of Mississippi, taught by Matthew Mervin in Fall 2016.
PSY 202: Elementary Statistics
Chapter 7: Describing the Relationship Between 2 Quantitative Variables: Regression – Day 1

I. Introduction
   a. With regression we are trying to find a causal relationship
   b. Independent variables are called predictors
   c. Dependent variables are called the criterion
   d. We use the information we get from the predictors to explain what is going on with the criterion
   e. Regression tells "how" in addition to "if"
      i. The slope tells how much X and Y change together
II. Reviewing the Algebra of Lines
   a. High school algebraic form
      i. y = f(x) = mx + b
         1. f(x): you can have only one Y for each X
         2. m: slope
         3. b: y-intercept
      ii. The y-intercept tells the value of Y when X equals 0
   b. Moving from algebraic to standard notation
      i. Yhat = bX + a
         1. Yhat: the predicted value of Y
      ii. You use this equation to get the regression line
         1. Think of the regression line as the central tendency of a bivariate distribution
III. The Notion of Predicting Y From X
   a. If we know the relationship between two variables, we can use one variable to get information about the other variable
   b. Simplest way to get the regression line: connect the dots
      i. This doesn't always give us a straight line
      ii. It's not very good for predictions either
      iii. We need a line that summarizes the entire distribution
      iv. For most scatter plots there are infinitely many lines that could summarize the scatter plot
   c. We'll be using squared errors instead
      i. The error is the distance between the predicted score and the actual score, which we then square
      ii. We look for the regression line that gives the smallest sum of squared errors
   d. Regression analysis
      i. Error of estimate: the difference between the predicted value of Y (Yhat) and the actual value of Y
         1. Square the errors before adding them so that positive and negative errors don't cancel
            a. The smaller the sum, the better the regression line
         2. Where an X value meets the regression line, the height of the line is the predicted value of Y (Yhat)
            a. The gap between Y and Yhat is the error of estimate
      ii. Find the line that minimizes the sum of squared errors
IV. The Standard Error of the Estimate
   a. This is, roughly, the standard deviation of the scores around the regression line
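The least-squares idea in section III can be sketched in plain Python: compute the slope and y-intercept that minimize the sum of squared errors. The formulas (slope = sum of cross-products over sum of squared X deviations; the line passes through the two means) are the standard elementary-statistics ones, and the data points below are invented for illustration, not taken from the notes.

```python
# Least-squares regression line: find the slope (b) and y-intercept (a)
# of Yhat = bX + a that minimize the sum of squared errors, sum((Y - Yhat)^2).

def regression_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: sum of cross-products divided by sum of squared deviations of X
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    # Intercept: the least-squares line always passes through (mean_x, mean_y)
    a = mean_y - b * mean_x
    return b, a

# Invented example scores
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]
b, a = regression_line(xs, ys)
print(b, a)  # slope and y-intercept of the best-fitting line
```

Any other line through this scatter plot would produce a larger sum of squared errors than the one these two numbers define.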
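Section IV's standard error of the estimate can be sketched the same way: it treats the errors of estimate (Y minus Yhat) like deviation scores and takes a standard-deviation-style average of them. Elementary-statistics texts typically divide the sum of squared errors by n − 2 rather than n; the data and the fitted slope/intercept below are invented for illustration.

```python
import math

# Standard error of the estimate: the "standard deviation" of the actual
# Y scores around the regression line Yhat = bX + a.

def standard_error_of_estimate(xs, ys, b, a):
    # Sum of squared errors between actual Y and predicted Yhat
    sse = sum((y - (b * x + a)) ** 2 for x, y in zip(xs, ys))
    # Divide by n - 2 (the usual degrees of freedom for simple regression)
    return math.sqrt(sse / (len(xs) - 2))

# Invented example scores, with an assumed already-fitted line Yhat = 0.6X + 2.2
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]
s_est = standard_error_of_estimate(xs, ys, 0.6, 2.2)
print(s_est)
```

A smaller value means the points cluster tightly around the regression line, so predictions of Y from X will tend to be close to the actual scores.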