Operant conditioning is a theory of learning within behaviorism that focuses on changes in an individual's observable behaviors. In operant conditioning, behaviors are strengthened or weakened by the consequences that follow them. Research on this principle of learning was first conducted by Edward L. Thorndike in the late 1800s and was later popularized by B. F. Skinner in the mid-1900s. Much of this research informs current practices in education, behavior modification, and animal training.
Skinner's Theories of Operant Conditioning
Almost half a century after Thorndike first published the law of effect, Skinner attempted to extend this work with a stronger claim: that all behaviors are in some way the result of operant conditioning. Skinner theorized that if a behavior is followed by reinforcement, that behavior is more likely to be repeated, but if it is followed by some form of aversive stimulus, or punishment, it is less likely to be repeated. He also believed that this learned association could end, or become extinct, if the reinforcement or punishment was removed.
B. F. Skinner
Skinner was responsible for defining the segment of behaviorism known as operant conditioning, the process by which an organism learns from the consequences of its behavior in its environment.
Skinner's Experiments
Skinner's most famous research studies were simple reinforcement experiments conducted on lab rats and domestic pigeons, which demonstrated the most basic principles of operant conditioning. He conducted most of this research in an operant conditioning chamber, now commonly referred to as a "Skinner box," typically paired with a cumulative recorder that charted the rate of his subjects' responses. In these boxes he would present his subjects with positive reinforcement, negative reinforcement, or aversive stimuli delivered on various schedules that were designed to produce or inhibit specific target behaviors.
In his first work with rats, Skinner would place the rats in a Skinner box with a lever attached to a feeding tube. Whenever a rat pressed the lever, food was released. After multiple trials, the rats learned the association between the lever and food and began to spend more of their time in the box procuring food than performing any other action. It was through this early work that Skinner started to understand the effects of behavioral contingencies on actions. He discovered that the rate of response, as well as changes in response features, depended on what occurred after the behavior was performed, not before. Skinner named these actions operant behaviors because they operated on the environment to produce an outcome. The process of arranging the contingencies of reinforcement responsible for producing a certain behavior then came to be called operant conditioning.
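The core contingency described above can be illustrated with a minimal simulation sketch. This is not Skinner's apparatus or data; the starting press probability, the learning rate, and the simple update rule are illustrative assumptions, chosen only to show how a response that is followed by reinforcement becomes more frequent over successive blocks of trials.

import random

random.seed(1)

press_probability = 0.05   # assumed starting tendency to press the lever
learning_rate = 0.1        # assumed effect of a single reinforcement

for block in range(5):          # five blocks of 50 opportunities to respond
    presses = 0
    for _ in range(50):
        if random.random() < press_probability:
            presses += 1
            # The consequence follows the behavior: food is delivered, and
            # the reinforced response becomes more probable in the future.
            press_probability += learning_rate * (1.0 - press_probability)
        # Responses that are not emitted produce no food and no change.
    print(f"block {block + 1}: {presses} lever presses out of 50")

Running the sketch shows the lever-press count climbing from block to block, which is the pattern a cumulative recorder would trace as a steepening response curve.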
To support his claim that operant conditioning could account for all behavior, he later created a "superstitious pigeon." He delivered food to the pigeon at fixed intervals (every 15 seconds) regardless of what it was doing and observed its behavior. He found that the pigeon's actions would change depending on what it had been doing in the moments before the food was dispensed, even though those actions had no actual effect on the delivery of the food. In this way, he discerned that the pigeon had fabricated a causal relationship between its actions and the presentation of the reward. It was this development of "superstition" that led Skinner to believe all behavior could be explained as a learned reaction to specific consequences.
In his operant conditioning experiments, Skinner often used an approach called shaping. Instead of rewarding only the target, or desired, behavior, the process of shaping involves the reinforcement of successive approximations of the target behavior. Behavioral approximations are behaviors that, over time, grow increasingly closer to the actual desired response. For example, a rat might first be rewarded simply for approaching the lever, then for touching it, and finally only for pressing it.
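The following sketch compresses shaping into a toy model under stated assumptions: each emitted behavior is reduced to a single number, where 1.0 stands for the full target response, and the criterion schedule, step sizes, and noise level are hypothetical values rather than anything from Skinner's procedures. It shows the trainer reinforcing successive approximations while gradually tightening the criterion.

import random

random.seed(2)

target = 1.0            # 1.0 stands for the complete target behavior
criterion = 0.2         # reinforce anything at least this close to the target
typical_behavior = 0.1  # assumed starting tendency of the simulated subject

for trial in range(200):
    # The subject emits a behavior that varies around its current tendency.
    behavior = min(1.0, max(0.0, random.gauss(typical_behavior, 0.15)))
    if behavior >= criterion:
        # Reinforce the successive approximation: the subject's typical
        # behavior shifts toward the response that was just rewarded.
        typical_behavior += 0.2 * (behavior - typical_behavior)
        # Gradually tighten the criterion toward the full target behavior.
        criterion = min(target, criterion + 0.02)

print(f"final criterion: {criterion:.2f}, typical behavior: {typical_behavior:.2f}")

By the end of the run the criterion has moved close to 1.0 and the subject's typical behavior has followed it, mirroring how reinforcing ever-closer approximations eventually yields the full target response.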
Skinner believed that all behavior is determined by past and present events in the objective world. He left no room in his account for ideas such as free will or individual choice; instead, he posited that all behavior could be explained by observable, physical factors, including an individual's learning history and evolutionary history. His work remains extremely influential in the fields of psychology, behaviorism, and education.