Copyright © 2007-2017 Russ Dewey
Operant conditioning is distinguished from classical conditioning in several ways. Classical conditioning emerged around 1900 in Russia, with Pavlov. Operant conditioning emerged in 1938 in the United States with Skinner.
The types of behaviors involved are different: voluntary, learned behaviors in operant conditioning rather than the inborn reflexes of classical conditioning. The prototypical set-up is different, too: an operant chamber instead of Pavlov's harnessed dog.
Operant conditioning is also called instrumental conditioning, because the animal uses its behaviors as instruments to pursue a goal. It operates on the environment (which is where operants got their name).
Skinner defined an operant by its effect on the environment. A bar-press operant is any behavior resulting in a bar press, whether it is accomplished with the animal's paw or nose.
In an operant conditioning laboratory using rats (a "rat lab"), students start by teaching a rat to find food pellets in a small enclosure called the food magazine. Next, the rat is reinforced (given food pellets) for any behavior that brings it close to a bar protruding from the cage wall.
Next the rat is reinforced for touching the bar. Finally, it is required to press the bar down, to receive a food pellet.
This process of reinforcing steps toward a desired behavior (first approaching the bar, then touching it, then pressing it) is called shaping. It is also known as the method of successive approximations.
Skinner defined a reinforcer as a stimulus that, when it follows a behavior, makes that behavior more frequent or probable. A punishing stimulus is one that makes the behavior it follows less frequent or probable. Punishment is not the same thing as negative reinforcement.
Negative reinforcement is a form of reinforcement that occurs when something aversive is taken away after a behavior. That makes the behavior more frequent.
Punishment is any stimulus that makes a behavior less frequent, when the stimulus follows the behavior. Electric shock is a potent punishing stimulus for almost all behaviors.
Response cost ("negative punishment") occurs when a behavior is punished by taking away something good. That makes the behavior less frequent, hence this is a form of punishment.
Antecedents are stimuli that come before a behavior. An S+ is a stimulus that indicates reinforcement is available if a behavior is performed. An S- is a stimulus that indicates reinforcement is not available.
An S+ or an S- is called a discriminative stimulus. It helps the animal discriminate between situations when reinforcement is available or not available.
When animals learn to perform a behavior to escape from pain or other aversive stimulation, this is called escape conditioning. When animals receive a stimulus indicating something aversive is about to happen, they try to escape ahead of time. That is called avoidance learning.
Avoidance learning is marked by its persistence. It produces relief as a reinforcer, and that reinforcement continues even after the actual threat is removed. The animal keeps avoiding the imagined threat, feeling relief each time.
Learning can occur through observation. Modeling is common among humans, who can demonstrate a behavior others learn by observation. Imitation is observed in the animal kingdom, for example, when young lions learn to hunt by watching adults from a concealed location.
Extinction occurs when the reinforcer that maintains a behavior is stopped, and the behavior goes away. A behavior that is extinguished will often appear again, which is called spontaneous recovery. Extinction must be continued to completely eliminate a behavior.
When extinction is prolonged, animals may engage in vigorous and variable behavior in an apparent attempt to make reinforcement start again. This is called an extinction burst or extinction-induced resurgence. It is handy for trainers of performing animals who are looking for novel behaviors to reinforce.
Write to Dr. Dewey at email@example.com.