Description

School: University of Alabama - Tuscaloosa
Department: MUTH
Course: Psychology of Learning
Professor: Craig Cummings
Term: Spring 2017
Name: PY 361 Exam 2 Study Guide
Description: Study guide for chapters 4-6, including practice questions, key terms, and specific topics from each chapter
Uploaded: 02/22/2017

PY 361 Exam 2 Study Guide – 2/24/17



Chapter 4

∙ How did the work of Barry Schwartz and Allen Neuringer support or advance the understanding of  response stereotypes and response variability?

o Schwartz argued that reinforcement produces response stereotypy/rigidity

o Neuringer argued that the contingencies of the Schwartz studies, not reinforcement itself,  produced response stereotypy.

o Neuringer demonstrated that creativity could be reinforced when the contingencies explicitly required variability.

o Variability is an operant

∙ Identify the main criticisms of the use of external rewards as motivational tools (hint, it has to do with  intrinsic/extrinsic motivation)


o It is controlling and reduces self-determination, intrinsic motivation, and creative performance

o Example: a child who enjoys drawing may begin to draw less when offered an extrinsic reward (money) contingent on drawing

∙ Be familiar with the Premack principle and how it can be applied to behavior modification (e.g.,  behavioral intervention strategies)

o One way to determine if something will be a reinforcer is by using Premack’s principle:  A higher frequency behavior will function as reinforcement for a lower frequency  behavior.

o Premack argued that reinforcement involves a contingency between two behaviors, rather than a behavior and a stimulus

 Traditional: SD (green light) : R (press lever) → SR+ (receive food pellet)

 Premack: SD (green light) : R (press lever) → SR+ (eat food)

o Determine reinforcers by measuring behavior in a free operant environment, create a hierarchy  from most to least frequent responses, and use any higher frequency behavior as a reinforcer  for a lower frequency response
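The procedure above (measure a free-operant baseline, build a hierarchy, use a higher-frequency behavior as the reinforcer) can be sketched as a short illustrative script. The behavior names and counts below are hypothetical, not from the course materials:

```python
def response_hierarchy(baseline):
    """Rank behaviors from most to least frequent (free-operant baseline)."""
    return sorted(baseline, key=baseline.get, reverse=True)

def can_reinforce(baseline, contingent, instrumental):
    """Premack: a higher-frequency behavior can reinforce a lower-frequency one."""
    return baseline[contingent] > baseline[instrumental]

# Hypothetical free-operant baseline: minutes spent on each activity
baseline = {"video games": 40, "reading": 25, "homework": 5}

print(response_hierarchy(baseline))
# ['video games', 'reading', 'homework']
print(can_reinforce(baseline, "video games", "homework"))
# True: access to games can reinforce homework
```

Any behavior higher in the hierarchy can, in principle, serve as a reinforcer for any behavior below it.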

∙ According to Skinner, what measure of operant behavior was superior to response latency?

o Rate of response is a better measure than latency

∙ Be familiar with the definition of the following terms and any relevant distinctions between, or among them: operant class, behavior topography, behavior function, discriminative stimulus (SD and S∆), response hierarchy, free operant, discrete trial, extinction vs. punishment, extinction burst, variability (following extinction)

o All terms located under KEY TERMS

Chapter 5

∙ Understand the difference between continuous reinforcement and intermittent reinforcement

o Continuous reinforcement occurs when reinforcement is delivered after every single target behavior

o Intermittent reinforcement is delivered after some behaviors or responses but never after each one

∙ Be familiar with the 4 basic schedules of reinforcement (fixed ratio, fixed interval, variable ratio,  variable interval) and the typical, steady-state response patterns that are generated on a cumulative  record under each schedule

o Fixed-Ratio – a schedule that delivers reinforcement after a fixed number of responses

o Fixed-Interval – a schedule in which the operant is reinforced after a fixed amount of time has passed

o Variable-Ratio – a schedule in which the number of responses required for reinforcement  changes after each reinforcer is presented

o Variable-Interval – schedule in which the first operant to occur after a variable amount of time is  reinforced

o What does PRP stand for? Post-reinforcement pause
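As an illustrative sketch (not from the course materials), the ratio schedules above can be modeled as rules that decide whether a given response earns a reinforcer:

```python
import random

class FixedRatio:
    """FR n: reinforcement is delivered after every nth response."""
    def __init__(self, n):
        self.n = n
        self.count = 0

    def respond(self):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True   # reinforcer delivered; a post-reinforcement pause typically follows
        return False

class VariableRatio:
    """VR n: the response requirement changes after each reinforcer, averaging around n."""
    def __init__(self, n):
        self.n = n
        self.count = 0
        self.requirement = random.randint(1, 2 * n - 1)

    def respond(self):
        self.count += 1
        if self.count >= self.requirement:
            self.count = 0
            self.requirement = random.randint(1, 2 * self.n - 1)  # new requirement each time
            return True
        return False

fr5 = FixedRatio(5)
print([fr5.respond() for _ in range(10)])
# [False, False, False, False, True, False, False, False, False, True]
```

Interval schedules would instead track elapsed time since the last reinforcer; the unpredictable requirement in VR is what produces the high, steady response rates seen on cumulative records.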

∙ Be familiar with progressive ratio schedules. Also, be familiar with the distinction between arithmetic and geometric ratio progressions. What is the breaking point?

o Progressive Ratio Schedule – a schedule where the number of responses required for reinforcement is increased systematically


o Geometric Progressive Ratio – a schedule of reinforcement where the previous response  requirement is multiplied by a set number

o Arithmetic Progression – a schedule of reinforcement where a set number is added to the  previous response requirement

o Breakpoint – the highest ratio value completed on a progressive ratio schedule
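The arithmetic vs. geometric distinction is easy to see numerically. This minimal sketch (starting values chosen purely for illustration) generates the first few response requirements under each progression:

```python
def arithmetic_pr(start, step, n):
    """Arithmetic progression: add a set number to the previous requirement."""
    reqs = [start]
    for _ in range(n - 1):
        reqs.append(reqs[-1] + step)
    return reqs

def geometric_pr(start, factor, n):
    """Geometric progression: multiply the previous requirement by a set number."""
    reqs = [start]
    for _ in range(n - 1):
        reqs.append(reqs[-1] * factor)
    return reqs

print(arithmetic_pr(5, 5, 5))  # [5, 10, 15, 20, 25]
print(geometric_pr(5, 2, 5))   # [5, 10, 20, 40, 80]
# The breakpoint is the highest requirement the subject actually completes.
```

Note how quickly the geometric requirements outpace the arithmetic ones, which is why geometric progressions reach the breakpoint in fewer reinforcers.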

Chapter 6

∙ Know the difference between primary aversive stimuli and conditioned aversive stimuli

o Primary Aversive Stimuli – aversive stimuli based on phylogeny (innately aversive)

o Conditioned Aversive Stimuli – an aversive stimulus that acquires its function through a history of conditioning

∙ Identify two side effects of punishment

o Emotional distress and increased aggression

∙ Be able to name 4 factors that improve the effectiveness of punishment (slide 15, CH 6 PPTs)

o The punisher is introduced suddenly at moderate to high intensity

o High-intensity punishers are employed

o Punishers are delivered immediately after the response

o Effectiveness of positive reinforcement for target behavior is reduced

∙ Be familiar with avoidance and escape. How are they similar and how are they different?

o Escape – the operant response that removes the aversive stimulus while the stimulus is present

 Example: pushing the snooze button when your alarm sounds

o Avoidance – the operant response that prevents or postpones the aversive stimulus prior to its onset

 Example: turning off the alarm before it sounds

Key Terms

∙ Operant – behavior that operates on the environment

∙ Operant Class – a set of responses that vary in topography but produce a common environmental  consequence

∙ Behavior Topography – physical form or characteristics of the response

∙ Behavior Function – the environmental changes produced by the operant response that control the response

∙ Discriminative Stimulus (SD) – an event or stimulus that precedes an operant and sets the occasion for reinforcement

∙ S-Delta (S∆) – an event that precedes a response and signals a lower probability of emitting the operant

∙ Discrete Trial – method of teaching in simplified and structured steps


∙ Positive Reinforcement – a stimulus or event the presentation of which increases or maintains the rate  of response

∙ Negative Reinforcement – a stimulus or event the removal of which increases or maintains the rate of  the response

∙ Positive Punishment – a stimulus or event the presentation of which decreases the rate of the response

∙ Negative Punishment – a stimulus or event the removal of which decreases the rate of the response

∙ The Premack Principle – a higher frequency behavior will function as reinforcement for a lower frequency behavior

∙ Response Hierarchy – differences in frequency of different responses in a free-choice setting

∙ Instrumental Response – behavior that produces the opportunity to engage in some activity

∙ Contingent Response – the activity obtained by making the instrumental response

∙ Response Deprivation – when access to the contingent behavior is restricted and falls below baseline

∙ Latency – the time from the onset of one event to the onset of another

∙ Law of Effect – Thorndike's principle that consequences "stamp in" or "stamp out" a response

∙ Rate of Response – the number of responses in a specified interval

∙ In-Vitro Reinforcement – reinforcement demonstrated at the level of the neuron

∙ Free Operant Method – the organism is free to respond or not respond over a period of time

∙ Deprivation Operation – a procedure for restricting access to a reinforcing event

∙ Magazine Training – pairing the click of the feeder with the presentation of food

∙ Conditioned Reinforcer – an event or stimulus that is effective because of the organism's life history

∙ Continuous Reinforcement – each response produces reinforcement

∙ Shaping – differential reinforcement of successive approximations of a desired response

∙ Satiation – the tendency for the effectiveness of a consumable reinforcer to decrease after repeated presentations

∙ Extinction – withholding reinforcement for a previously reinforced response

∙ Discriminated Extinction – a low rate of operant behavior as a function of an S-Delta

∙ Resistance to Extinction – the continuation of operant behavior when it is placed on extinction

∙ Intermittent Schedule of Reinforcement – a schedule where only some responses are reinforced

∙ Partial Reinforcement Effect – the more intermittent the reinforcement, the greater the resistance to change

∙ Spontaneous Recovery – an increase in the magnitude of a response after operant extinction has  occurred

∙ Schedule of Reinforcement – states how and when stimuli and behavioral consequences will be  presented

∙ Steady-State Performance – behavior occurring at a steady operant level/rate

∙ Mechner Notation – notation that describes what the experimenter does, not the behavior of the organism

∙ Resurgence – the increase in behavioral variability during extinction

∙ Ratio Schedules – schedules of reinforcement based on the number of emitted responses

∙ Interval Schedules – schedules of reinforcement based on time since the last consequence occurred

∙ Break-and-Run – a steep period of responding followed by reinforcement, then a pause in responding

∙ Post-Reinforcement Pause – the pause in responding following a consequence

∙ Scalloping – a cumulative record pattern that shows an increasing response rate as reinforcement approaches (associated with fixed-interval schedules)

∙ Assumption of Generality – the idea that the effects of contingencies of reinforcement extend across  species

∙ Limited Hold – contingency where the reinforcer is only available for a set time after an interval  schedule has timed out

∙ Transition State – the period between initial steady-state performance and the next steady state

∙ Inter-Reinforcement Interval – the time that passes between reinforcers

∙ Contingency Management – systematic use of reinforcement to establish desired behavior


∙ Molecular Accounts of Schedule Performance – analysis of the small moment-to-moment relations  between behavior and consequences

∙ Behavioral Dynamics – the change in behavior allocation across time

∙ Pause-and-Run Pattern of Behavior – a quick burst of responses followed by a pause

∙ Resistance to Extinction – the number of responses emitted during extinction before the behavior disappears

∙ Fixed Time Schedule – schedule of reinforcement where response-independent reinforcer is delivered  after a set time

∙ Behavioral Momentum – behavior that persists in the presence of a stimulus despite disruption

∙ Charles Bohris Ferster – published Schedules of Reinforcement with Skinner

∙ Aversive Stimuli – environmental stimuli or events an organism escapes or avoids

∙ Punishment – a behavioral contingency that decreases the rate of response

∙ Punisher – an event or stimulus that decreases the rate of operant behavior

∙ Positive Punishment – when the delivery of a stimulus or event decreases the rate of a response

∙ Overcorrection – a positive punishment procedure that includes practicing an appropriate response multiple times

∙ Negative Punishment – a stimulus or event the removal of which decreases the rate of a response

∙ Timeout Procedure – the contingent removal of access to positive reinforcers following a problem behavior

∙ Response Cost – a negative punishment procedure where reinforcers are removed based on behavior

∙ Relativity of Punishment – requiring a lower frequency operant will punish a higher frequency behavior

∙ Permanence of Punishment – the maintenance of response suppression over time

∙ Shock-Shock Interval – the time between shock presentations

∙ Response-Shock Interval – the time between a response and the presentation of a shock

∙ Discriminated Avoidance – avoidance behavior emitted in response to a warning stimulus

∙ Non-Discriminated (Sidman) Avoidance – avoidance training with no presented warning stimulus

∙ Molecular Perspective – focus on small moment-to-moment relationships

∙ Molar Perspective – focus on large-scale factors that regulate responding

∙ Timeout from Avoidance – negative reinforcement of behavior that prevents/postpones avoidance  contingencies

∙ Learned Helplessness – when an animal gives up avoiding or escaping an aversive situation

∙ Reflexive Aggression – aggressive responses elicited by an aversive stimulus

∙ Operant Aggression – aggressive behavior reinforced by the removal of an aversive stimulus

∙ Social Disruption – when the person who delivers the punishment, and the context, become conditioned aversive stimuli

∙ Coercion – use of punishment to get others to act as we like

∙ Positive Punishment – elicits reflexive behavior that prevents the occurrence of operant behaviors

∙ Delayed Punishment – the observed decrease in effectiveness of punishers not delivered immediately

∙ Water Misting – a form of punishment used to suppress self-injury where water is sprayed in the face of a participant

∙ Punishment Debate – the debate over whether, when, and how punishment should be used

∙ Ineffective Punishers – punishers that have to be used repeatedly

∙ Conditioned Punishing Stimuli – what a social agent that frequently uses punishment becomes

Chapter 4 Sample Questions

1. The term operant comes from the verb ________ and refers to behavior that ________

a. Opponent; opposes its consequences in a given environment

b. Opendum; opens the door to its effects on a given occasion

c. Operates; operates on the environment to produce effects

d. Opara; presents the opportunity to respond on a given occasion

2. What defines a contingency of reinforcement?

a. Discriminative stimulus

b. Operant


c. Reinforcement

d. All of the above

3. Which of the following is not one of the four basic contingencies?

a. Positive reinforcement

b. Positive extinction

c. Negative punishment

d. Negative reinforcement

4. In terms of rewards and intrinsic motivation, Cameron et al. (2001) conducted a statistical  procedure called ________, and one of the findings indicated that verbal rewards ________  performance and interest on tasks

a. Multivariate analysis; decreased

b. Meta-analysis; decreased

c. Meta-analysis; increased

d. Multivariate analysis; increased

5. The Premack principle states that a higher-frequency behavior will:

a. Function as reinforcement for a lower-frequency behavior

b. Function as punishment for a high-frequency behavior

c. Function as intermittent reinforcement for a low-frequency behavior

d. None of the above

6. To experimentally study the probability of response, a researcher uses ________ as the basic  measure and follows the ________ method

a. Latency; T-maze

b. Latency; free operant

c. Operant rate; T-maze

d. Operant rate; free operant

7. Shaping of behavior involves:

a. The molding of a response class by the physical arrangement of the operant chamber

b. Reinforcing closer and closer approximations to the final performance

c. Withholding and giving food for correct performance of a specified level of response

d. None of the above

8. A classic experiment on the effects of extinction by Antonitis (1951) involved:

a. Nose poking by rats for food reinforcement

b. Photographs of the rats’ position and body angle

c. Increased variability of nose poking during extinction

d. All of the above

9. In terms of response stereotypes, variability, and reinforcement, the work by Barry Schwartz shows  that reinforcement can produce ________ patterns of behavior, while the work of Neuringer and his  colleagues indicates that reinforcement can produce ________

a. Stereotyped; response variability

b. Response variability; stereotyped

c. Stereotyped; response stability

d. Response stability; response variability

10. Which of the following is involved in the partial reinforcement effect?

a. Longer extinction on intermittent reinforcement compared with CRF

b. The higher the rate of reinforcement the greater the resistance to change

c. Discrimination between reinforcement and extinction is more rapid on CRF

d. All of the above

Answers: 1, c; 2, d; 3, b; 4, c; 5, a; 6, d; 7, b; 8, d; 9, a; 10, d

Chapter 5 Sample Questions

1. Schedules of reinforcement were first described by:

a. Charles Ferster

b. Francis Mechner

c. B. F. Skinner

d. Fergus Lowe

2. Infrequent reinforcement generates responding that is persistent. What is this called?

a. Post-reinforcement pause

b. Partial reinforcement effect


c. Molar maximizing

d. Intermittent resistance

3. Mechner notation describes:

a. Stimulus effects

b. Dependent variables

c. Response contingencies

d. Independent variables

4. Resurgence happens when:

a. Behavior is put on extinction

b. Reinforcement magnitude is doubled

c. High-probability behavior persists

d. Response variability

5. Schedules that generate predictable stair-step patterns are:

a. Fixed interval

b. Fixed ratio

c. Variable ratio

d. Random ratio

6. Variable-ratio schedules generate:

a. Post-reinforcement pauses

b. Locked rates

c. Break-and-run performance

d. High rates of response

7. Schedules that combine time and response are called:

a. Partial reinforcement schedules

b. Complex schedules

c. Interval schedules

d. Fixed-time schedules

8. The shape of the response pattern generated by FI is called a:

a. Scallop

b. Ogive

c. Break and pause

d. Accelerating dynamic

9. Human performance on FI differs from animal data due to:

a. Intelligence differences

b. Self-instruction

c. Contingency effects

d. Alternative strategies

10. Behavior is said to be in transition when it is between:

a. A rock and a hard place

b. Stable states

c. One schedule and another

d. A response run

Answers: 1, c; 2, b; 3, d; 4, a; 5, b; 6, d; 7, c; 8, a; 9, b; 10, b

Chapter 6 Sample Questions

1. In terms of aversive stimuli, attacks and foul odors are ________, while threats and failing grades are ________

a. Potent; impotent

b. Natural; secondary

c. Primary; conditioned

d. Primitive; cultured

2. If wheel running is a higher-frequency operant, then wheel running will ________ drinking; if wheel  running is a lower-frequency operant, then wheel running will ________ drinking


a. Decrease; increase

b. Reinforce; punish

c. Determine; undermine

d. Diminish; exacerbate

3. Research on the use of skin-shock punishment in the treatment of self-injurious behavior:

a. Shows many side effects of punishment

b. Indicates that skin shocks have no effect on non-targeted behavior

c. Found an increase in aggressive and destructive behavior with skin shocks

d. Indicates that skin-shock treatment eliminates the need for physical restraint

4. The time between shocks or the ________ interval and the time away from shocks produced by responses or the ________ interval are two aspects of escape and avoidance

a. Temporal shock; response time

b. Shock-shock; response-shock

c. Shocking; responding

d. Aversive; postponement

5. The procedure of non-discriminative avoidance is also called:

a. Signaled avoidance

b. Sensory aversion

c. Sidman avoidance

d. Stevens aversions

6. In terms of operant-respondent interactions, persistence, and avoidance:

a. Operant avoidance prevents respondent extinction

b. Operant avoidance interacts with respondent aggression

c. Operant avoidance competes with respondent avoidance

d. Operant avoidance sets the occasion for respondent aversion

7. For learned helplessness, pre-exposure to escape ________ the helplessness brought on by ________  aversive stimulation

a. Enhances; non-contingent

b. Causes; excessive

c. Augments; expected

d. Blocks; inescapable

8. With regard to respondent aggression, Ulrich and Azrin (1962) found that the probability of attack  for any single shock:

a. Decreased as the number of shocks increased

b. Remained constant as the number of shocks increased

c. Increased as the number of shocks went up

d. Increased and then decreased as the number of shocks went up

9. Skinner (1953) reported a game played by sailors in the 18th century. This game involved the  following:

a. Tying several boys in a ring

b. Telling each boy to hit another boy when he himself was hit

c. A slight tap on one boy

d. All of the above

10. In terms of dropping out, Sidman (2001) indicates that one basic element is:

a. Escape due to negative reinforcement

b. Escape due to punishment

c. Escape due to contingencies of avoidance

d. Escape due to a history of inescapable shock

Answers: 1, c; 2, b; 3, d; 4, b; 5, c; 6, a; 7, d; 8, c; 9, d; 10, a
