To study operant conditioning, Skinner invented the operant conditioning chamber (also known as the Skinner box), [8] and to measure rate of response he invented the cumulative recorder. Using these tools, he and Charles Ferster produced Skinner's most influential experimental work, outlined in their 1957 book Schedules of Reinforcement.
B. F. Skinner first identified and described the principles of operant conditioning that are used in clicker training. [6] [7] Two of Skinner's students, Marian Kruse and Keller Breland, worked with him on pigeon behavior research and training projects during World War II, when pigeons were taught to "bowl" (push a ball with their beaks). [8]
Operant conditioning, also called instrumental conditioning, is a learning process in which voluntary behaviors are modified by their association with the addition (or removal) of rewarding or aversive stimuli. The frequency or duration of the behavior may increase through reinforcement or decrease through punishment or extinction.
Kenneth MacCorquodale (June 26, 1919 - February 28, 1986) was an American psychologist who played a major role in developing scientifically validated operant conditioning methods. He was a student of B. F. Skinner at the University of Minnesota. [1] [2]
Central to operant conditioning is the use of a three-term contingency (discriminative stimulus, response, reinforcing stimulus) to describe functional relationships in the control of behavior. A discriminative stimulus (S^D) is a cue or stimulus context that sets the occasion for a response. For example, food on a plate sets the occasion for eating.
That includes his study of the basic principles. For example, the original behaviorists treated the two types of conditioning in different ways. The approach most widely used, following B. F. Skinner, considers classical conditioning and operant conditioning to be separate and independent principles. In classical conditioning, if a piece ...
The free operant has advantages in this respect, because it removes restrictions on the frequency with which a response can occur and permits the observation of moment-to-moment changes in frequency. (C. B. Ferster, "The use of the free operant in the analysis of behavior," Psychological Bulletin, 1953, 50, 263-274.)
One bird pecked more than 10,000 times in 45 minutes (note, 2/20/89, B. F. Skinner Foundation and author's collection). [2] As long as the target remained in the center of the screen, the screen would not move, but if the bomb began to go off track, the image would move towards the edge of the screen. The pigeons would follow the image, pecking at it ...