B. F. Skinner was born on March 20, 1904, in Susquehanna, Pennsylvania (Vargas, 2005). As a boy he enjoyed inventing and building things, a skill he would later put to use in his psychological experiments (Cherry, 2012b). In 1926, Skinner received a B.A. in English literature from Hamilton College. He spent time as a writer until he discovered the writings of Watson and Pavlov (Cherry, 2012b). He found these writings interesting and impressive and wanted to learn more about them (Vargas, 2005). He then gave up his career as a writer and entered the psychology graduate program at Harvard University (Cherry, 2012b). Although rebellious and impatient, Skinner found a mentor, William Crozier (Vargas, 2005). Crozier's study of the behavior of "the animal as a whole" was an exact match for Skinner's goal of relating behavior to experimental conditions (Vargas, 2005). During his career, Skinner moved to Bloomington, Indiana, and became chair of the psychology department at Indiana University in 1945 (Cherry, 2012b). He later became one of the leaders of behaviorism, and his work contributed greatly to the development of experimental psychology (Cherry, 2012b).
Skinner’s Life and His Experiments
Using the building skills he had learned as a boy (Cherry, 2012b), Skinner developed a device called the "cumulative recorder," which displayed rates of responding as a sloped line (Vargas, 2005). Using this device, he found that behavior did not depend on the preceding stimulus, as Watson and Pavlov maintained, but on what happened after the response; Skinner called this operant behavior (Cherry, 2012b; Vargas, 2005).

In 1936, at the age of 32, Skinner and his wife, Yvonne Blue, moved to Minnesota (Vargas, 2005). By 1944, during World War II, bombs and missiles were common, yet there were no missile guidance systems (Vargas, 2005). Skinner, eager to help, requested funding to train pigeons to guide bombs. He trained the birds to peck at a target to keep the bomb on course, and the pigeons kept pecking even amid warlike sounds and a rapid descent from the sky (Vargas, 2005). The pigeon project was discontinued because of advances in radar (Vargas, 2005).

One day, while visiting his daughter's fourth-grade math class for Father's Day, Skinner was struck with inspiration (Vargas, 2005). As he put it, the teacher was unintentionally violating everything he knew about learning (Vargas, 2005). In shaping, you adjust what you ask of the animal based on how it performs, but in the math class some children had no idea how to solve a problem while others knew exactly how to do it (Vargas, 2005). This raised the question of how the teacher could reinforce a correct or incorrect answer (Vargas, 2005). Intending to help teachers, Skinner built his first teaching machine later that day (Vargas, 2005). The first machine presented questions in random order, with feedback immediately after each response (Vargas, 2005). The machine did not teach new behavior, but it provided practice for skills already learned (Vargas, 2005). Within three years, Skinner had designed programmed instruction, in which students responded to material broken into small steps (Vargas, 2005). As performance improved, less help was needed and given; a student who finished could do something he or she could not do before using the machine (Vargas, 2005).
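To make the teaching machine's logic concrete, here is a minimal sketch of the two features Vargas (2005) describes: items presented in random order and feedback given immediately after each response. Skinner's machines were mechanical, not software, so the question bank, function name, and feedback wording below are purely illustrative assumptions.

```python
import random

# Hypothetical practice items; Skinner's machines drilled already-learned
# skills rather than teaching new behavior.
QUESTIONS = [
    {"prompt": "7 + 5 = ?", "answer": "12"},
    {"prompt": "9 - 4 = ?", "answer": "5"},
    {"prompt": "6 * 3 = ?", "answer": "18"},
]

def run_session(questions):
    """Present items in random order, giving feedback immediately
    after each response, as Skinner's first machine did."""
    random.shuffle(questions)  # random order of presentation
    for q in questions:
        response = input(q["prompt"] + " ")
        # Feedback follows every response right away.
        if response.strip() == q["answer"]:
            print("Correct.")
        else:
            print(f"Not quite; the answer is {q['answer']}.")

if __name__ == "__main__":
    run_session(QUESTIONS)
```

The immediate feedback after each response is the key design point: the learner's correct answer is reinforced at once rather than days later when homework is graded.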
Later Life
Skinner turned to philosophical and moral issues because of his concern about the implications of behavioral science for society (Vargas, 2005). At the end of his life, Skinner was still professionally active in psychology (Vargas, 2005). In 1989 he was diagnosed with leukemia, and he did what he could with his steadily declining strength (Vargas, 2005). He was still giving speeches and lectures up until ten days before he died (Vargas, 2005).
Accomplishments
Skinner received numerous awards throughout his life, including the Edward Lee Thorndike Award, the National Medal of Science (presented by President Lyndon B. Johnson), the Gold Medal of the American Psychological Foundation, the Humanist of the Year Award, and the Citation for Outstanding Lifetime Contribution to Psychology. The Thorndike Award recognizes scientific research that contributes to the knowledge of educational psychology (American Psychological Association, 2012a). The National Medal of Science is awarded to people who deserve special recognition for outstanding contributions to the physical, biological, mathematical, and engineering sciences (National Science Foundation, 2012). The Gold Medal of the American Psychological Foundation is awarded to someone with a distinguished career and an enduring contribution to psychology (American Psychological Association, 2012b). The Humanist of the Year Award is given to someone who has significantly improved the condition of humanity (American Humanist Association, 2008). The Citation for Outstanding Lifetime Contribution to Psychology was awarded to Skinner for his lifetime of contributions to psychology and to the world at large. Skinner's contributions to psychology included publishing over 200 articles and 20 books. In a 2002 survey of psychologists, he was identified as the most influential twentieth-century psychologist. While behaviorism is no longer a dominant school of thought, his work in operant conditioning remains vital to psychology today (Cherry, 2012b). Examples occur in everyday life: mental health professionals often use operant techniques when working with clients, teachers frequently use reinforcement and punishment to shape behavior in the classroom, and animal trainers rely heavily on the same techniques to train dogs and other animals (Cherry, 2012b).
Skinner and Operant Conditioning
Skinner believed that we do have a mind, but that it is simply more productive to study observable behavior rather than internal mental events (McLeod, 2007). He believed that the best way to understand behavior is to look at the causes of an action and the consequences of that action, an approach he called operant conditioning (McLeod, 2007). Skinner's theory of operant conditioning was built on the work of Thorndike, who studied learning in animals using a puzzle box and proposed the theory known as the "Law of Effect" (McLeod, 2007). Skinner introduced a new term into the Law of Effect: reinforcement. Behavior that is reinforced tends to be repeated; behavior that is not reinforced tends to die out, or be extinguished (McLeod, 2007). Skinner studied operant conditioning by conducting experiments with animals (typically rats), which he placed in a "Skinner box" similar to Thorndike's puzzle box (McLeod, 2007). Operant conditioning roughly means changing behavior through reinforcement, whether positive or negative, given after the desired response (McLeod, 2007).

Skinner identified three types of responses that can follow behavior. Neutral operants are responses from the environment that neither increase nor decrease the probability of a behavior being repeated. Reinforcers are responses from the environment that increase the probability of a behavior being repeated; they can be either positive or negative. Punishers are responses from the environment that decrease the likelihood of a behavior being repeated; punishment weakens behavior (McLeod, 2007).

Skinner demonstrated positive reinforcement by placing a hungry rat in his box, which contained a lever on one side. As the rat moved about the box, it would accidentally knock the lever, and immediately a food pellet would drop into a container next to the lever (McLeod, 2007). After only a few times of being put in the box, the rats learned to go straight to the lever. The consequence of receiving food when they pressed the lever ensured that they would repeat the action again and again (McLeod, 2007). Positive reinforcement strengthens a behavior by providing a consequence one finds rewarding (McLeod, 2007). As an example, McLeod (2008) states that "if your teacher gives you $15 each time you complete your homework you are more likely to repeat this behavior in the future, thus strengthening the behavior of completing your homework."

The removal of an unpleasant stimulus can also strengthen a behavior (McLeod, 2007). This is known as negative reinforcement, because the removal of an adverse stimulus is rewarding to the animal (McLeod, 2007). Negative reinforcement strengthens behavior because it removes an unpleasant experience (McLeod, 2007). For example, if you had to pay your teacher $15 whenever you failed to complete your homework, you would complete your homework to avoid paying, thus strengthening the behavior of completing it (McLeod, 2007). Skinner demonstrated negative reinforcement by placing a rat in his Skinner box and subjecting it to an unpleasant electric current that caused it some discomfort. As the rat moved about the box, it would accidentally knock the lever, and immediately the electric current would switch off (McLeod, 2007).
After a few times of being put in the box, the rats learned to go straight to the lever to escape the shock (McLeod, 2007). The consequence of escaping the electric current ensured that they would repeat the action again and again (McLeod, 2007). Skinner even taught the rats to avoid the electric current altogether by turning on a light just before the current came on (McLeod, 2007). The rats soon learned to press the lever when the light came on, because doing so prevented the electric current from being switched on and shocking them (McLeod, 2007). These two learned responses are known as escape learning and avoidance learning (McLeod, 2007).
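The positive-reinforcement dynamic described above can be illustrated with a toy simulation. This is a sketch under invented assumptions, not Skinner's actual procedure: the starting press probability and the learning rate are arbitrary illustrative values, and real acquisition curves are more complex than a single update rule.

```python
import random

def simulate_positive_reinforcement(trials=30, learning_rate=0.2):
    """Toy model of positive reinforcement in a Skinner box: each lever
    press is immediately followed by food, which raises the probability
    of pressing again."""
    p_press = 0.05  # assumed low chance of an accidental press at first
    for trial in range(1, trials + 1):
        pressed = random.random() < p_press
        if pressed:
            # Reinforcer: the food pellet strengthens the behavior.
            p_press = min(1.0, p_press + learning_rate * (1.0 - p_press))
        print(f"trial {trial:2d}: pressed={pressed}, p_press={p_press:.2f}")

simulate_positive_reinforcement()
```

Running the sketch shows the press probability climbing toward 1.0 once the first accidental press is reinforced, mirroring how the rats quickly learned to go straight to the lever.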
Synopsis of Operant Conditioning
In these experiments, the animals received either a positive or a negative reinforcer. With the positive reinforcer, the animals kept pressing the lever to obtain food until they were satisfied. With the negative reinforcer, the animals quickly learned to press the lever when the light came on in order to avoid the shock.
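The avoidance contingency, in which the light cue comes to control lever pressing, can be sketched the same way. Again, this is an illustrative model with invented numbers (initial response probability, learning rate), not a reproduction of Skinner's experiment.

```python
import random

def simulate_avoidance(trials=30, learning_rate=0.3):
    """Toy model of avoidance learning: a light precedes the shock, and
    pressing the lever while the light is on prevents the shock."""
    p_press = 0.05  # assumed low initial response to the light cue
    for trial in range(1, trials + 1):
        # Each trial begins with the warning light switched on.
        pressed = random.random() < p_press
        shocked = not pressed  # pressing in time averts the current
        if pressed:
            # Negative reinforcement: avoiding the aversive shock
            # strengthens pressing the lever in response to the light.
            p_press = min(1.0, p_press + learning_rate * (1.0 - p_press))
        print(f"trial {trial:2d}: pressed={pressed}, "
              f"shocked={shocked}, p_press={p_press:.2f}")

simulate_avoidance()
```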
Biblical Integration
Operant conditioning can help people learn Biblical principles. It can be used to signal when a person has done something good or something bad. When a person does something good, something in keeping with the Bible and what it teaches, that person is rewarded with things like Heaven and eternal life through Jesus. When a person does something wrong, he or she is punished with the threat of an eternity in hell. Nonetheless, operant conditioning can be used to teach Biblical principles in a good, loving, caring, and patient way. All that is required is a few reinforcers; whether they are good or bad depends on whether the person seeks to know Jesus and God or pushes away and rejects God.
Conclusion
Without operant conditioning, we would not know anything close to what we know today about learning and how behaviors relate to their consequences. Operant conditioning plays a vital role in the way people teach and in the way people learn. The world would most definitely be a different place without it.