PSYC 245 Study Guide Exam 1
This 12 page Study Guide was uploaded by Sara on Tuesday September 27, 2016. The Study Guide belongs to PSYC 245 at University of Illinois at Urbana-Champaign taught by Angela Lee in Fall 2016. Since its upload, it has received 130 views. For similar materials see Industrial Organizational Psychology in Psychology at University of Illinois at Urbana-Champaign.
PSYC 245 Study Guide: Exam 1

General I/O:
• Psychology: the scientific study of thinking, feeling, and behavior
• I/O Psychology: the application of psychological principles, theory, and research to the workplace
• The "I" vs. "O" split is a false dichotomy:
o Disagreement over how to classify topics; some fall on either side depending on the research question
o Topics are interdependent
o Research is necessarily becoming more complex

Scientist-Practitioner Model:
• Example: a call center has a three-week training program for new employees, yet the average employee quits after three months
• Science: draw on job turnover theories and past empirical studies
• Practitioner: create a plan to address the problem

SIOP (Society for Industrial and Organizational Psychology):
• Provides information about graduate programs, jobs, and what I/O psychology is
• Also publishes reports on salary, employment outlook, etc.

History of I/O:
Hugo Münsterberg is considered the father of I/O psychology.

Structuralism vs. Functionalism:
Structuralism – Wilhelm Wundt (founder of psychology)
• Sought laws and principles governing human behavior
o Tried to make psychology more legitimate and similar to the natural sciences (e.g., biology, physics)
o Ignores individual differences
Functionalism – William James
• Focused on the functions of human behavior and how mental processes operate

Scientific Management – Frederick W. Taylor
• Basic principles: jobs can be scientifically studied and the one best method for doing them can be found; there must be a match between a worker's abilities/skills and the job tasks; money is the worker's prime motivator; tools and work systems need to be designed to fit the worker's needs
• Taylor used "Time and Motion" studies to analyze jobs:
o Break a job down into its core movements
o Time each movement
o Use this information to find more time-efficient ways of doing the job

Hawthorne effect and how the Hawthorne studies were conducted:
• Elton Mayo became involved in a series of studies at the Hawthorne plant
• Human engineering project: vary workplace conditions (e.g., more or less light) to find out what helps and hinders performance
• Performance always improved!
• Workers were interviewed by Mayo, and productivity improved because the researchers were watching them – a demand effect
• The workers also believed the studies would result in more demanding quotas, so they adjusted their output down

Human Relations Movement:
• Core beliefs: good relationships are the most important determinant of productivity; social factors override financial considerations in many cases; workers are more responsive to peers' values than to management's

Civil Rights Act of 1964:
• Legislation banning discriminatory practices in employment
• Designed to protect underrepresented groups
• Forced organizations to reexamine their selection and promotion practices, including various tests
• I/O psychologists have worked over the years to help develop standards for fair and unbiased testing and selection

Henry Murray's assessment centers and what they were developed for:
• Murray developed the first US assessment center for the Office of Strategic Services – to pick overseas spies
• Assessed personality and ability
• Used simulations to test the ability to deal with stressful and frustrating circumstances
o Alcohol Test, Stress Interview, Brook Situation, Construction Situation

Research Methods:
Vocab:
• Independent Variable (predictor): the presumed "cause"
o Changes in the IV lead to or predict changes in the DV
• Dependent Variable: the "effect," outcome, or criterion
• Confounding Variable: correlates with both the IV and the DV, offering an alternative explanation for their relationship
• Construct: a psychological concept or characteristic we are trying to measure (intangible; e.g., intelligence)
• Operational Definition (operationalization): how the construct or other variable is measured or manipulated in the study

Moderator vs. Mediator:
• Moderator: a third variable that influences the strength or direction of the relationship between two variables; an "if-then" statement
• Mediator: a third variable that represents the generative mechanism through which the IV influences the DV

3 Criteria of Causation (John Stuart Mill):
• X is related to Y (correlation)
• X precedes Y (temporal precedence)
• No alternative explanations for the relationship between X and Y (no confounds)

Experimental design:
• Experiments are used to detect cause-and-effect relationships
• Can be conducted in the lab or in the field
• Manipulation of the IV (with a control group)
• Random assignment to experimental conditions

Quasi-experiment:
• Individuals are not randomly assigned to groups (sometimes because random assignment is not feasible; e.g., personality traits, intelligence, cancer, pregnancy)
• Participants can still be assigned to experimental and control groups

Non-experiment:
• Does NOT include manipulation of variables or random assignment
o Cannot make causal claims – can only say two variables are "associated"

Survey design:
• A series of questions used to study one or more variables of interest
o Because of limited control, it is difficult to infer causation

Random Sampling vs. Random Assignment:
• Random Sampling: every member of the population has an equal chance of being selected into the study; determines who is in the sample
• Random Assignment: participants already in the study are randomly placed into the program (experimental) group or the comparison (control) group; determines which condition each person experiences

Correlation:
• How much do two variables relate to each other? Example: Is intelligence related to dating success?

Regression:
• How well can an outcome variable be predicted by an explanatory variable? Example: Is job performance predicted by intelligence?

Mean difference (t-test, ANOVA):
• How much do two groups differ on a variable? Example: Do introverts have more dating success than extraverts?

Reliability: the consistency or stability with which something is measured

Validity:
• Face Validity: does the test appear (to the test-taker) to measure what it is supposed to?
• Content Validity: the test samples from the entire domain of possible behaviors
• Construct Validity: the test measures what it is intended to measure
• Criterion Validity: the test can justifiably be used to predict what you want it to predict (e.g., behavior at work)
• External Validity: the generalizability of your findings; inferences about the extent to which a causal relationship holds across different types of people, settings, and times

Restriction of Range: a correlation decreases when there is less variability on one of the variables of interest. Example: ACT scores & GPA

Meta-Analysis:
• Individual studies suffer from small sample sizes, unreliable measures, and other factors that can distort results
• Meta-analysis: a statistical method for combining and analyzing the results of many studies to draw a general conclusion about relationships among variables

Correlation coefficient: represents the relationship between two variables
• Captures both the direction and strength of a linear relationship
• CORRELATION DOES NOT EQUAL CAUSATION!
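The restriction-of-range idea above can be demonstrated numerically: the same pair of variables correlates less strongly when only a narrow slice of one variable is observed. A minimal sketch, using invented deterministic data (not real ACT/GPA scores):

```python
# Restriction of range demo: correlate x and y over the full range,
# then again after keeping only high values of x.
# All numbers are made up for illustration.
import numpy as np

x = np.arange(20, dtype=float)        # e.g., a predictor such as a test score
noise = np.tile([1.5, -1.5], 10)      # fixed "noise" so the demo is reproducible
y = x + noise                         # a criterion that tracks x imperfectly

r_full = np.corrcoef(x, y)[0, 1]      # correlation over the full range

mask = x >= 10                        # restrict the range: top half of x only
r_restricted = np.corrcoef(x[mask], y[mask])[0, 1]

print(round(r_full, 3), round(r_restricted, 3))
```

The restricted correlation comes out noticeably lower than the full-range one, which is why validating a test only on already-admitted or already-hired people understates its true validity.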
Cognitive Ability:
Individual Differences:
• Dissimilarities between two or more people; relatively stable or enduring dispositions or traits

Intelligence:
• Correlated with job performance and training success (Schmidt & Hunter's meta-analysis)
• Class definition: "repertoire of acquired skills, knowledge, learning sets, and generalization tendencies that is available at any one period of time"

Flynn Effect:
• Scores on intelligence tests have increased substantially over time (between generations)

Incremental Validity:
• When we want to use more than one predictor, the second should add significantly to our ability to predict the criterion
• Incremental validity: the gain in predictive validity, i.e., the increase in the multiple correlation (from regression), when a predictor is added

Factor analysis:
• Give a sample of people a test to see how it is "structured," then examine the intercorrelations – how the items group together
• When a set of items hangs together, those items define a "factor" or dimension

Bandwidth vs. Fidelity:
• You can measure something broadly (bandwidth) or precisely (fidelity)
• If you are interested in a broad outcome (like overall job performance), it is best to use a broad measure, like g

Emotional Intelligence: Ability vs. Mixed Models
• Mixed Model: an array of non-cognitive capabilities, competencies, and skills that influence one's ability to succeed in coping with environmental demands and pressures
• Ability Model: a type of intelligence
o The ability to carry out accurate reasoning about emotions, and to use emotions and emotional knowledge to enhance thought

Personality:
Cattell's 16PF Questionnaire:
• Reducing the trait list was beneficial because it allowed statistical analyses to be conducted on the data

Big 5 & criticisms:
O – Openness to experience: intelligent, imaginative, curious
C – Conscientiousness: dependable, responsible
E – Extraversion: sociable and energetic
A – Agreeableness: cooperative, kind, trusting
N – Neuroticism (the opposite pole, Emotional Stability, describes someone calm, relaxed, and not anxious)
• Criticisms:
1. The Big 5 isn't comprehensive
2. Based on a narrowed set of descriptors
3. Factors are confounded (too broad)

Validity of personality tests (Barrick & Mount, 1991):
• Job performance is related to conscientiousness, agreeableness, and emotional stability (personality traits!)

Integrity Tests: Overt vs. personality-based
• Measure how honest and moral an applicant is
• Designed to predict deviant behaviors such as theft and absenteeism
• Two types:
1. Overt: asks directly about honest behavior
2. Covert or personality-based: asks about traits and behaviors related to dishonesty

What is faking and what can be done about it?
• Applicants can "fake good"
• Faking is a concern because applicants who feel a greater incentive to present themselves in a positive light can inflate their personality scores and be more likely to be hired without necessarily showing higher levels of performance
• One remedy is a forced-choice response format: given a pair of equally desirable statements, choose the one that is more like you

Group differences in personality:
• No reliable differences between ethnic/racial groups
• Small gender differences

Vocational Interests:
RIASEC model and preferred activities of each interest:
• RIASEC: Realistic, Investigative, Artistic, Social, Enterprising, Conventional
• Work environments can also be classified into the six types; each setting is populated by people with generally matching interests
• People seek work environments that match their interests

In Su et al. (2012), what was the strongest predictor of income? Of college persistence?
• The strongest predictor of income was interests
• The strongest predictor of college persistence was ability

Selection Assessment:
Construct vs. method:
• Methods: tests, interviews, SJTs, and assessment centers
• Constructs: cognitive ability, personality, specific skills, background, experience, knowledge
o Constructs are distinct from the methods used to measure them

Power vs. speeded tests:
• Power tests: there may be a time limit, but the time should be sufficient
o Some test-takers will still not finish
o Designed to assess ability more clearly
• Speeded tests: strict time limit; items are usually easier
o Many test-takers will not finish
o If the time limit is too short, the test may measure abilities other than the one it is designed to measure
o May be unfair

Proctored vs. unproctored tests:
• Heavily debated in I/O psychology
• Proctored testing is more commonly used for employee selection
• But unproctored testing has a number of advantages
• Unproctored testing can be used to narrow the applicant pool before moving to more expensive proctored tests or interviews

Paper-and-pencil vs. computerized tests:
• Paper-and-pencil tests are more common
• Paper-and-pencil tests are almost equivalent to computerized versions
o Speeded tests are slightly less equivalent
• Electronic page turners are paper-and-pencil tests delivered electronically
o Can provide more efficient and accurate scoring than traditional paper-and-pencil tests

Guion's (1998) definition of a test:
• "An objective and standardized procedure for measuring a psychological construct using a sample of behavior" (Guion, 1998)

Computer adaptive tests (CAT):
• After each answer, the next item given to the respondent matches the current estimate of their ability level
• As a result, different examinees see different items
• The process continues until: 1) a set number of items has been administered (GRE), or 2) a desired level of accuracy is reached (nurse licensure exam)

Structured interviews and types of questions:
• Past-focused/behavioral: "What did you do to fix the situation?"
• Future-focused/situational: "Something happened; how would you handle this situation?"
• Skill-level determiner: "This happens; what could be going wrong?"
• Organizational fit: "What type of workplace is best for you?"
• Clarifier: allows the interviewer to clarify information in the resume or cover letter
• Disqualifier: if the interviewee gives a particular answer, they are disqualified from the job. Example: "Can you work overtime without notice?"

Situational Judgment Tests (SJTs):
• Measure more contextually embedded reasoning than cognitive ability tests
• Can be administered via paper and pencil, but are more engaging when delivered by video/computer
• Present the participant with a description of a relatively complex situation
• The participant chooses a response from a set of options

Why are unstructured interviews bad?
• The questions are not job-related; many questions interviewers ask are illegal; interviewers make decisions in the first 3 minutes; contrast effects; negative-information bias; interviewer-interviewee similarity; interviewee appearance matters

What is the validity of interviews?
• Interview scores can be confounded with g

What constructs do interviews measure?
• Cognitive ability
• Personality
• Social skills
• Interests and preferences
• Organizational fit

Job Analysis:
KSAOs:
• Knowledge: domain-specific facts and information (declarative knowledge; know-what)
• Skills: practiced acts (procedural knowledge; know-how)
• Abilities: the capacity to engage in specific acts (e.g., cognitive, psychomotor)
• Other characteristics: personality, interests, experience

Purpose of job analysis:
• Provides the foundation for any integrated HR system
• Reduces role conflict and ambiguity; helps define standards of performance
• Mandated by legal requirements, which protect organizations and employees

Job analysis methods (e.g., observation, surveys):
Observation:
• Excellent for understanding and appreciating the conditions under which the job is performed
• May miss aspects of the job that occur infrequently but are still important for good performance
• Not as good for understanding why behaviors do or do not occur
• Can be very time-consuming and expensive

Interviews: conducted with subject matter experts (SMEs)
• SMEs may be job incumbents, their supervisors, or technical experts
• Should be structured
• The most commonly used method of gathering information
• Very adaptable
• Supervisors may not know everything their employees do, and job incumbents may try to make their job sound more important than it actually is

Critical Incident Technique and work diaries:
• Ask SMEs about critical aspects of performance in a job
• Information is primarily based on SMEs recalling specific incidents of outstanding or poor performance:
o What led up to the incident?
o What did the individual do?
o What were the consequences?
o Were the consequences controllable?
• Work diaries: SMEs (incumbents or supervisors) write down what they are doing at specific times throughout the day

Task-oriented vs. worker-oriented job analysis:
• Task-oriented: describes the tasks that are required in the job
o What does the employee do?
• Worker-oriented: focuses on the human characteristics needed to perform the job
o What attributes help the individual to perform the task?
• The worker-oriented approach can be helpful when hiring new employees
• Both approaches obtain information about what is necessary to perform the job well

Position Analysis Questionnaire (PAQ):
• The most famous job analysis survey is the Position Analysis Questionnaire (PAQ)
• Uses the worker-oriented approach
• Assesses all jobs on a common set of behaviors by judging the relevance of each behavior for the target job
• The PAQ methodology allowed factor analyses across a number of jobs
• The resulting factors were used to assess all jobs on several important criteria

O*NET & DOT:
• The DOT was replaced by a more comprehensive effort to describe jobs along standardized dimensions
• The DOT was too big
• It was expensive and time-consuming to update, so many job entries were out of date
• It focused only on tasks, which made it difficult to compare jobs
• The Occupational Information Network (O*NET) remedied these disadvantages

Competency modeling and how it compares to job analysis:
• Employers have begun to focus on the competencies important to their organization
• Competencies are not specific to a certain job; they are the KSAOs most important to the organization as a whole
• This approach contrasts with job analysis:
o Job analysis: "Answers questions for customers about the organization" and "assists customers in troubleshooting problems with their product"
o Competency modeling: "Provides the best service by giving the customer as much help and detail as possible"
• Job analysis: detailed, specific work- or worker-related characteristics
o Rigorous
o Ignores the organization! Doesn't say whether doing the task helps the org
o More detailed (distinguishes between jobs)
• Competency modeling: higher-order, general characteristics
o Considers the organization's needs
o Lacks assessment of reliability and documentation
o Not much variety
o Less detailed: competencies are relevant to an entire level

Job Performance:
Campbell's model of performance (multidimensional):
• Performance has 8 dimensions:
1. Job-specific task proficiency
2. Non-job-specific task proficiency
3. Written and oral communication proficiency
4. Demonstrating effort
5. Maintaining personal discipline
6. Facilitating team and peer performance
7. Supervision/leadership
8. Management/administration

Performance vs. Effectiveness vs. Productivity:
• Performance is behavior
o Behaviors under the direct control of the individual that are directed toward organizational goals
• Effectiveness is the evaluation of the outcomes of that behavior
o Often influenced by factors beyond the individual's control (e.g., market factors, equipment slowdowns or failures, teammates)
• Productivity is an output/input ratio
o Effectiveness relative to the cost of achieving that level of effectiveness

Ultimate vs. Actual Criterion:
• Ultimate criterion: describes the full domain of performance and includes everything that ultimately defines success on the job; it is impossible to actually measure this for any job
• Actual criterion: the evaluative standard actually used to measure a person's performance

Criterion deficiency vs. criterion contamination vs. criterion relevance:
• Criterion relevance: the overlap between the actual and the ultimate (conceptual) criterion; this is what you are trying to maximize
• Criterion deficiency: the part of the conceptual criterion that your actual criterion misses
• Criterion contamination: the part of the actual criterion you are measuring that is unrelated to the conceptual criterion

Task performance vs. organizational citizenship behavior (OCB):
• Intelligence is the best predictor of task performance
• Personality is the best predictor of OCB
o Conscientiousness predicts OCB-O (helping the organization)
o Agreeableness predicts OCB-I (helping people)

OCB facets (OCB-I and OCB-O):
• Two facets based on the target of the behavior
• OCB-I: interpersonal support; altruism (e.g., helping and cooperating with others, preventing conflicts)
• OCB-O: organizational support; generalized compliance (e.g., adhering to rules, demonstrating loyalty to the org, saying positive things about the org to outsiders)
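The incremental-validity idea from the Cognitive Ability section (a second predictor should raise the multiple correlation from regression) can be sketched numerically. The "ability" and "conscientiousness" scores below are invented for illustration, not real selection data:

```python
# Incremental validity sketch: does adding a second predictor (a personality
# score) raise R-squared over a single predictor (an ability score)?
# All numbers are hypothetical.
import numpy as np

ability = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
consci  = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
perf    = ability + 0.5 * consci          # criterion built from both predictors

def r_squared(predictors, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones_like(y)] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_one = r_squared([ability], perf)            # ability alone
r2_two = r_squared([ability, consci], perf)    # ability + conscientiousness
print(round(r2_one, 3), round(r2_two, 3))
```

The two-predictor model explains more criterion variance than ability alone; the difference (the gain in R-squared) is the incremental validity of the personality measure over the ability measure.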