B. F. Skinner's Theory of Operant Conditioning (Educational Psychology)

An experimental study of the conditions for acquiring truly new behavior, as well as of the dynamics of learning, was the focus of attention of the American psychologist E. Thorndike. Thorndike's work mainly studied the patterns by which animals solve problem situations.

An animal (cat, dog, monkey) had to independently find a way out of a specially designed "problem box" or a maze. Later, small ...

The mechanisms of respondent and operant learning proved insufficient to explain the acquisition of complex social behavior. In search of an answer, paramount importance came to be attached to a special type of learning - visual learning, or learning through observation.

A. Bandura (b. 1925) calls this method of learning social-cognitive and, accordingly, calls his theory of social learning social-cognitive theory. Cognitive learning implies much greater activity on the part of the learner; can ...

The first generation (1930s-1960s) - N. Miller, D. Dollard, R. Sears, B. Whiting, B. Skinner (these researchers are classified both under behaviorism and under social learning theories).

The second generation (1960s-1970s) - A. Bandura, R. Walters, S. Bijou, J. Gewirtz and others.

The third generation (since the 1970s) - W. Hartup, E. Maccoby, J. Aronfreed, U. Bronfenbrenner and others. N. Miller and D. Dollard were the first representatives of the social learning direction, who tried to supplement the basic principles of behavioral ...

The most prominent theorist of strict behaviorism, B. F. Skinner (1904-1990), insisted that all human behavior can be known by scientific methods, since it is determined objectively (by the environment). Skinner rejected the concept of hidden mental processes such as motives, goals, feelings, unconscious tendencies, etc. He argued that a person's behavior is almost entirely shaped by his external environment.

This position is sometimes called environmentalism (from the English word environment) ...

At the end of the 1930s, a powerful psychological school of social learning emerged in America. The term "social learning" itself was introduced by N. Miller and D. Dollard to denote the lifelong construction of the individual's social behavior through the transmission to him of behavior patterns, roles, norms, motives, expectations, life values, and emotional reactions.

Socialization is seen as a process of gradual transformation of a biological being, an infant, into a full-fledged member of a family, a group...

We are moving on to the next major stage in the development of psychology. It was marked by the fact that completely new facts were introduced into psychology - the facts of behavior.

What do they mean when they talk about the facts of behavior, and how do they differ from the phenomena of consciousness already known to us?

In what sense can we say that these are different areas of facts (and some psychologists even opposed them)?

According to the tradition that has developed in psychology, behavior is understood as external manifestations of mental ...

Skinner, Burrhus Frederic (b. 1904), is an American psychologist, a representative of modern behaviorism. He spoke out against neobehaviorism, believing that psychology should limit itself to describing externally observable, regular connections between stimuli, reactions, and the reinforcement of these reactions.

He put forward the concept of "operant" learning (from "operation"), according to which the organism acquires new reactions because it reinforces them itself, and only after that does an external stimulus come to elicit them ...

The problems of freedom and responsibility are in a number of respects fundamental to counseling and psychotherapy. But in recent years, we find ourselves in the grip of several topical and important dilemmas that are directly related to these problems. These dilemmas are inextricably linked to the radical shift and transformation of values in Western culture, particularly in America, in the last three or four decades. Of course, it is by no means accidental that these decades coincided with...

At the base of Skinner's theory of operant conditioning lies a simple fact: the actions of a living being are not always a reaction to one or another combination of external influences (stimuli). Quite often (according to Skinner, in most cases) behavior appears without being preceded by any visible stimuli. In Skinner's famous experiments, a laboratory rat was placed in an empty box with a pedal inside (the so-called "Skinner box") and given complete freedom of action. While chaotically exploring the box, the rat inevitably touched the pedal and received a portion of food. After a few accidental presses on the pedal, the rat formed a new form of behavior that was not associated with any preceding stimuli: now, when hungry, the rat went purposefully to the pedal and, by pressing it, got what it wanted.

Thus, the key difference between operant conditioning and classical conditioning is that in operant conditioning a living organism actively influences its environment through its behavior and meets with certain consequences. In the formation of a conditioned reflex, no such influence is observed: the animals in Pavlov's experiments were deliberately deprived of any opportunity to influence the environment in order to maintain the purity of the experiment. In this sense, operant behavior is active and aimed at exploring the surrounding world, whereas respondent behavior is reactive and merely follows certain influences which, in the course of classical conditioning, have acquired a signal value for the organism.

By itself, however, exploratory activity gives nothing - it only increases the chances of encountering certain consequences. How behavior is modified depends primarily on the nature of the consequences: whether they are pleasant or unpleasant. Skinner called pleasant consequences "reinforcements". Experimenting with different types of reinforcement, Skinner deduced one indisputable and always reproducible pattern: patterns of behavior (operants) that are followed by pleasant consequences occur more often in the future. A rat presses the pedal more often if it receives a piece of food immediately after doing so. A pigeon placed in a cage with a red spot on the floor may at first peck at it only by chance; but if it receives a grain of food immediately afterwards, this operant will occur more often in the future. A person who gets a tasty meal in one of the city's restaurants will visit that restaurant more often, even if it is quite far from home. Skinner called this pattern "the law of acquisition (gain)"; it is also sometimes called the first law of operant learning.

For Skinner and his followers, the law of acquisition meant the following: if a therapist or teacher faces the task of forming new habits or new patterns of behavior, the only way that gives predictable and reliable results is to deliberately create positive consequences for the so-called "target" behavior, i.e. the behavior we would like to see more often in the future. By reinforcing this behavior, we will certainly achieve our goal: it will occur more often.
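The law of acquisition can be pictured with a small simulation. The sketch below is purely illustrative and not part of Skinner's own work: the action names, weights, and number of trials are assumptions chosen only to show how an operant that is followed by reinforcement comes to be emitted more often.

```python
import random

# Illustrative sketch of the law of acquisition: operants followed by a
# pleasant consequence (food) become more probable on later trials.
ACTIONS = ["press_pedal", "sniff_corner", "groom", "rear_up"]
weights = {action: 1.0 for action in ACTIONS}   # initially all equally likely

def choose_action():
    """Emit an operant at random, proportionally to its current weight."""
    return random.choices(list(weights), weights=list(weights.values()))[0]

def reinforce(action, amount=1.0):
    """A pleasant consequence strengthens the operant that produced it."""
    weights[action] += amount

for trial in range(200):
    action = choose_action()
    if action == "press_pedal":        # only pedal presses are followed by food
        reinforce(action)

share = weights["press_pedal"] / sum(weights.values())
print(f"share of weight on pressing the pedal after training: {share:.2f}")
```

Run repeatedly, the share of weight on "press_pedal" climbs toward 1, which is the whole content of the law of acquisition: reinforced behavior occurs more often.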

Operant Conditioning Theory (Thorndike)

Operant-instrumental learning

According to this theory, most forms of human behavior are voluntary, i.e. operant; they become more or less probable depending on whether their consequences are favorable or unfavorable. In accordance with this idea, the following definition was formulated.

Operant (instrumental) learning is a type of learning in which the correct response or change in behavior is reinforced and made more likely.

This type of learning was experimentally studied and described by American psychologists E. Thorndike and B. Skinner. These scientists introduced into the learning scheme the need to reinforce the results of exercises.

The concept of operant learning is based on the scheme "situation - reaction - reinforcement".

The psychologist and educator E. Thorndike introduced the problem situation as the first link in the learning scheme; the way out of it was found through trial and error, leading to accidental success.

Edward Lee Thorndike (1874-1949) was an American psychologist and educator. He conducted research on animal behavior in "problem boxes", authored the theory of learning by trial and error together with a description of the so-called "learning curve", and formulated a number of well-known laws of learning.

E. Thorndike conducted an experiment with hungry cats in problem cages. An animal placed in the cage could get out of it and receive food only by activating a special device - pressing a spring, pulling a loop, etc. The animals made many movements, rushed about in different directions, scratched the box, etc., until one of the movements happened to be successful. With each new success, the cat produced more and more of the reactions leading to the goal, and fewer and fewer useless ones.


"Trial, error and random success" - such was the formula for all types of behavior, both animals and humans. Thorndike suggested that this process is determined by 3 laws of behavior:

1) the law of readiness - for a skill to be formed, the organism must be in a state that pushes it toward activity (for example, hunger);

2) the law of exercise - the more often an action is performed, the more often this action will be chosen subsequently;

3) the law of effect - the action that gives a positive effect (“rewarded”) is repeated more often.
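The following toy learner is only an illustration; the mapping from the three laws to code, the action names, and the numeric constants are assumptions of this text, not Thorndike's. Readiness gates learning, every repetition strengthens the chosen action slightly (exercise), and an action that opens the cage is strengthened much more (effect).

```python
import random

# Toy trial-and-error learner in the spirit of Thorndike's problem cage.
actions = {"pull_loop": 1.0, "scratch_box": 1.0, "push_spring": 1.0}
SUCCESSFUL = "push_spring"            # the action that opens the cage

def trial(hungry: bool) -> bool:
    if not hungry:                    # law of readiness: no driving state, no learning
        return False
    pick = random.choices(list(actions), weights=list(actions.values()))[0]
    actions[pick] += 0.05             # law of exercise: repetition strengthens the action
    if pick == SUCCESSFUL:
        actions[pick] += 1.0          # law of effect: success ("reward") strengthens it more
        return True
    return False

escapes = sum(trial(hungry=True) for _ in range(300))
print("successful escapes:", escapes)
print({name: round(w, 2) for name, w in actions.items()})
```

Over repeated trials the successful action accumulates most of the weight, reproducing the familiar shape of the learning curve: fewer useless movements, faster escapes.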

Concerning the problems of schooling and education, E. Thorndike defined "the art of teaching as the art of creating and delaying stimuli in order to cause or prevent certain reactions". Stimuli here can be words addressed to the child, a look, a phrase he will read, etc., and the responses can be the student's new thoughts, feelings, and actions, or his state. This provision can be considered using the example of the development of educational interests.

Through his own experience, the child acquires a variety of interests. The task of the teacher is to see the "good" ones among them and, building on them, to develop the interests needed for learning. In directing the child's interests in the right direction, the teacher uses three methods. The first is to connect the work being done with something important to the student that gives him satisfaction, for example, with his position (status) among his peers. The second is to use the mechanism of imitation: a teacher who is himself interested in his subject will interest the class he teaches. The third is to give the child information that will sooner or later arouse his interest in the subject.

Another well-known behavioral scientist, B. Skinner, revealed the special role of reinforcing the correct response, which involves "designing" a way out of the situation and ensuring the correctness of the answer (this was one of the foundations of programmed learning). According to the laws of operant learning, behavior is determined by the events that follow it. If the consequences are favorable, the likelihood of the behavior being repeated in the future increases. If the consequences are unfavorable and not reinforced, the likelihood of the behavior decreases. Behavior that does not lead to the desired effect is not learned: you will soon stop smiling at a person who does not smile back. Crying, too, is learned - in a family with small children, crying becomes a means of influencing adults.

At the heart of this theory, as in Pavlov's, lies the mechanism of establishing links (associations). Operant learning is also based on the mechanisms of conditioned reflexes, but these are conditioned reflexes of a different type from the classical ones. Skinner called such reflexes operant or instrumental. Their peculiarity is that the activity is first generated not by a signal from outside, but by a need from within. This activity is chaotic and random in character. In the course of it, not only innate responses become associated with conditioned signals, but any random actions that have received a reward. In the classical conditioned reflex the animal, as it were, passively waits for what will be done to it; in the operant reflex the animal itself actively searches for the right action and, when it finds it, learns it.

The technique of developing "operant reactions" was used by Skinner's followers in the education of children, their upbringing, and in the treatment of neurotics. During World War II, Skinner worked on a project to use pigeons to control aircraft fire.

Having once visited an arithmetic lesson at the college where his daughter studied, B. Skinner was horrified at how little the findings of psychology were used. In order to improve teaching, he invented a series of teaching machines and developed the concept of programmed learning. He hoped, on the basis of the theory of operant reactions, to create a program for "manufacturing" people for a new society.

Operant learning in the works of E. Thorndike. An experimental study of the conditions for acquiring truly new behavior, as well as of the dynamics of learning, was the focus of attention of the American psychologist E. Thorndike. Thorndike's work mainly studied the patterns by which animals solve problem situations. An animal (cat, dog, monkey) had to independently find a way out of a specially designed "problem box" or a maze. Later, small children also took part in similar experiments as subjects.

When analyzing such complex spontaneous behavior as searching for a way to solve a maze problem or to unlock a door (as opposed to respondent behavior, i.e. a response), it is difficult to isolate the stimulus that causes a particular reaction. According to Thorndike, the animals initially made many chaotic movements - trials - and only accidentally produced the necessary ones, which led to success. On subsequent attempts to get out of the same box, the number of errors and the amount of time spent decreased. The type of learning in which the subject, as a rule, unconsciously tries out different variants of behavior - operants (from the English "operate" - to act) - from which the most suitable, most adaptive one is "selected", is called operant conditioning.

The method of "trial and error" in solving intellectual problems began to be considered as general pattern characterizing the behavior of both animals and humans.

Thorndike formulated four basic laws of learning.

1. Law of repetition (exercise). The more often the connection between stimulus and response is repeated, the faster it is fixed and the stronger it becomes.

2. Law of effect (reinforcement). Of the reactions being learned, those that are accompanied by reinforcement (positive or negative) become fixed.

3. Law of readiness. The state of the subject (the hunger or thirst it experiences) is not irrelevant to the development of new reactions.

4. Law of associative shift (adjacency in time). A neutral stimulus linked by association with a significant one also begins to elicit the desired behavior.

Thorndike also singled out additional conditions for the success of a child's learning - the ease of distinguishing between a stimulus and a reaction and awareness of the connection between them.

In operant learning the organism is more active; its behavior is controlled (determined) by its results and consequences. The general tendency is that if actions have led to a positive result, to success, they will be fixed and repeated.

The labyrinth in Thorndike's experiments served as a simplified model of the environment. The labyrinth technique does, to some extent, model the relationship between the organism and the environment, but very narrowly, one-sidedly and in a limited way; and transferring the patterns discovered within this model to the social behavior of a person in a complexly organized society is extremely difficult.

The operant behaviorism of B. F. Skinner was subordinated to the main task - to predict and control the behavior of specific individuals.

The main provisions of the theory of B. F. Skinner:

Behavior can be reliably defined, predicted and controlled by environmental conditions. To understand behavior means to control it and vice versa.

He did not accept the idea of a person or self that stimulates and directs behavior.

He emphasized intensive analysis of the characteristic features of a person's past experience and unique innate abilities.

The study of personality involves finding the peculiar nature of the relationship between the behavior of the organism and the results that reinforce it.

He believed that people are dependent on past experience.

B. F. Skinner considered the human organism a "black box". Behavior is only a function of its consequences, of lawful S-R relationships. He considered personality merely a set of forms of reactions characteristic of a given behavior. The individual's personality consists of relatively complex, yet independently acquired, reactions. To understand behavior, one only needs to understand a person's past learning experience.

In B. F. Skinner's system, behavior consists of specific elements - operant reactions. He recognized two main types of behavior:

respondent as a response to a familiar stimulus,

operant, determined and controlled by the result that follows it.

Operant conditioning, according to B. F. Skinner, denotes a special way of forming conditioned reflexes, which consists in reinforcing a reaction that arises spontaneously in the subject rather than a stimulus (in contrast to the "classical" Pavlovian way). Reinforcement is the key concept of Skinner's system. Reinforcing stimuli can be divided into primary and secondary. Primary stimuli have reinforcing properties in themselves (for example, food, water, comfort). A secondary stimulus (for example, money, attention, approval, etc.) is an event or object that acquires the ability to provide reinforcement through close association with a primary reinforcer.
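A minimal sketch of the primary/secondary distinction follows; the value-transfer rule and all numbers are assumptions made for illustration, not Skinner's own formulation. A secondary reinforcer (a token, approval) starts with no reinforcing value and acquires it only through repeated pairing with a primary reinforcer (food).

```python
class PrimaryReinforcer:
    """Reinforcing in itself (food, water, comfort)."""
    def __init__(self, value: float):
        self.value = value

class SecondaryReinforcer:
    """Initially neutral (money, attention, approval)."""
    def __init__(self):
        self.value = 0.0

    def pair_with(self, primary: PrimaryReinforcer, rate: float = 0.2) -> None:
        # Close association with a primary reinforcer transfers value to it.
        self.value += rate * (primary.value - self.value)

food = PrimaryReinforcer(value=10.0)
token = SecondaryReinforcer()
for _ in range(15):                   # repeated pairings of token and food
    token.pair_with(food)
print(f"reinforcing value of the token after pairing: {token.value:.1f}")
```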

B. F. Skinner did not consider it necessary to treat a person's internal forces or motivational states as a causal factor in behavior, but focused on the relationship between certain environmental phenomena and overt behavior. He was of the opinion that personality is nothing more than certain forms of behavior acquired through operant learning.

The ideas of Watson were continued and developed by Burrhus F. Skinner (1904-1990), who developed the theory of operant learning. He is the leader of the modern form of behaviorism (neobehaviorism).

Skinner considered psychoanalytic theories speculative, based on assumptions: they posit the existence of intrapsychic factors (drives, the unconscious) that cannot be tested empirically. Skinner believed that human behavior should be studied from the position that it is shaped by the circumstances of the environment (surroundings and people). All human actions and behavior are explained by the influence of the environment.

Skinner argued that the human organism is a "black box". Its contents (emotions, motives, intrapsychic conflicts, drives) cannot be objectively measured, so they should be excluded from the sphere of empirical observation.

Human behavior can and should be measured reliably and objectively. In this way Skinner's theory moves from the category of the speculative to the category of the empirical (scientifically substantiated). He placed the science of behavior among the natural sciences, i.e. sciences that (1) are based on facts and (2) aim to predict and control the phenomenon under study.

As a method of studying behavior, Skinner proposed the functional analysis of behavior. He pointed out that behavior is best studied by considering how it relates to preceding events. He believed that behavior can be studied and controlled by manipulating the environment in which the organism is included; in this case there is no need to consider the mechanisms operating inside the organism.

Thus, functional analysis makes it possible to establish precise, conditioned relationships between overt behavior (the reaction) and the environmental conditions (stimuli) that control it. Functional analysis makes it possible to establish a causal relationship between behavior and the environment. By manipulating environmental variables (the independent variables - those manipulated by the experimenter), it is possible to predict and measure human behavior (the dependent variable - the one that changes as a result of the manipulation).
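As a hedged illustration of what such a functional analysis amounts to experimentally (the response rule and all numbers below are invented for the example): the independent variable is an environmental condition the experimenter manipulates, and the dependent variable is the measured rate of the overt behavior.

```python
import random

def response_rate(reinforcement_prob: float, trials: int = 1000) -> float:
    """Dependent variable: how often the behavior is emitted, given the
    manipulated environmental condition (probability of reinforcement)."""
    p_behavior = 0.1                  # baseline tendency to emit the behavior
    emitted = 0
    for _ in range(trials):
        if random.random() < p_behavior:
            emitted += 1
            if random.random() < reinforcement_prob:   # the environment reinforces it
                p_behavior = min(1.0, p_behavior + 0.01)
    return emitted / trials

# Independent variable: how reliably the environment reinforces the behavior.
for p in (0.0, 0.5, 1.0):
    print(f"reinforcement probability {p:.1f} -> response rate {response_rate(p):.2f}")
```

The point of the sketch is only the shape of the relationship: changing the environmental variable changes the measured rate of behavior, with no reference to anything "inside" the organism.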

Skinner did not accept the idea of a person or self that directs or stimulates behavior. He believed that it is necessary to abandon the idea that behavior is generated by forces inside the individual (traits, needs, thoughts, feelings) in favor of more scientific ideas about forces outside of man. He believed that human behavior is regulated not from the inside but from the outside - by the environment. According to Skinner, the study of personality consists in finding the peculiar nature of the relationship between the behavior of the organism and the results of this behavior, which subsequently reinforce it. This approach focuses on predicting and controlling observable behavior.

Like Watson, Skinner paid great attention to learning, but unlike Watson his main interest was not classical but so-called operant learning. In classical learning, the organism associates different stimuli; in operant learning, the organism associates its behavior with the result that follows it. Operant learning is governed by the law of effect, which was discovered by the American psychologist Edward Thorndike at the end of the 19th century. In his experiments, Thorndike used so-called problem cages in which he placed hungry cats. To get out of such a problem cage, the cat had to pull a rope or lift a hook. While observing the animals, Thorndike noticed that a cat placed in the problem cage would at first rush around it randomly and in the end accidentally touch the rope or hook. However, with each subsequent attempt the animal's activity concentrated more and more around the rope or hook, and after repeated trials the cat learned to leave the cage. This kind of learning is also called trial-and-error learning. It is subject to the law of effect, according to which, if a behavior leads to the desired result (is rewarded), the likelihood of its repetition increases.

In his approach to understanding personality, Skinner distinguishes two types of behavior: respondent and operant.

Respondent behavior refers to a response that is elicited by a stimulus. The stimulus always precedes the response.

There are two types of respondent behavior:

  1. conditioned reflex
  2. unconditioned reflex.