Premack principle
The Premack principle arises in the context of operant conditioning and posits a psychological dimension that determines whether a behavior is repeated or extinguished. This dimension is the value that the individual attaches to a specific event, a value generated through their interactions with that event.
The principle became one of the major postulates of operant conditioning in the mid-twentieth century, as it broke with the traditional definition of “reinforcer”, a shift with important consequences for learning models and the study of motivation.
Definition and origins of the Premack principle
Between 1954 and 1959, the American psychologist David Premack and his wife and collaborator Ann James Premack conducted several studies on operant conditioning, analyzing the behavior of monkeys of the genus Cebus.
These investigations were initially carried out at the Yerkes Laboratories of Primate Biology, located in Florida, then continued at the University of Missouri in Columbia, later at the University of California, and finally at the University of Pennsylvania.
Premack’s hypothesis was that any response A will reinforce any other response B if and only if the probability of occurrence of response A is greater than that of response B. That is, they set out to show that an infrequent behavioral response can be reinforced by another response, provided that the latter is preferred over the former.
In other words, the Premack principle holds that a behavior or activity that arouses little interest is unlikely to occur spontaneously. However, if performing it is immediately followed by the opportunity to engage in another, more interesting behavior or activity, the first one (the uninteresting one) becomes significantly more likely to be repeated.
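As a rough sketch of this contingency rule (not Premack’s own notation; the activity names and baseline probabilities below are hypothetical), the idea can be expressed in a few lines of code:

```python
# Minimal sketch of the Premack contingency rule.
# The activities and baseline probabilities are made up for illustration.

def can_reinforce(p_preferred: float, p_target: float) -> bool:
    """A response can reinforce another response only if its baseline
    probability of occurrence is higher than the target response's."""
    return p_preferred > p_target

# Hypothetical baseline probabilities observed under free access:
baseline = {"watch_tv": 0.7, "do_homework": 0.3}

# Making the preferred response contingent on the less preferred one
# is predicted to increase the frequency of the less preferred response.
if can_reinforce(baseline["watch_tv"], baseline["do_homework"]):
    print("'watch_tv' can serve as a reinforcer for 'do_homework'")
```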
Contributions to operant conditioning
In Skinner’s operant conditioning, reinforcers are stimuli with the intrinsic property of increasing the incidence of a behavior. The very definition of a “reinforcer” was thus given by its effects on behavior: a reinforcer was any stimulus capable of increasing a behavior whenever it was applied. This made the reinforcer itself the center of efforts to increase any behavior.
But once Premack’s hypothesis was confirmed, Skinner’s theory of operant conditioning took an important turn: far from operating in absolute terms, reinforcers operate relatively.
That is, the reinforcer does not matter in itself; what matters is how many response opportunities it offers the individual. In this sense, what determines the effect of an event is the value that the subject attributes to it. For this theory, responses are central, so what increases the appearance of a behavior is not so much “a reinforcer” as a series of “reinforcing events”.
The theory of response deprivation
Later, other experiments and research carried out in the context of operant conditioning questioned the operation of the Premack principle.
Among them is the theory of response deprivation. In general terms, it suggests that there are situations in which restricting access to the reinforcing response, far from increasing preference for the instrumental response, instead increases motivation for the restricted response itself and, therefore, for the series of behaviors associated with it. In a nutshell, the less accessible a behavior is, the more motivation it generates.
The value according to this theory
According to Pereira, Caycedo, Gutiérrez and Sandoval (1994), given the importance the Premack principle attaches to the motivation generated by reinforcing events, one of its central concepts is “value”, which can be summarized as follows:
Organisms classify world events according to a hierarchy of values.
Value is measured by the probability that an organism will respond to a stimulus. In turn, that probability can be measured by the duration of interaction with the response. That is, the more time an individual spends on an activity, the greater the value that activity has for them (a brief sketch below illustrates the idea).
If a more valued event is presented immediately after a less valued one, the behaviors associated with the latter are reinforced. Likewise, the less valued event and the behaviors involved in it acquire “instrumental” value.
If the opposite occurs (a lower-valued event follows immediately after a higher-valued one), the result is punishment of the instrumental behavior; that is, the probability that the instrumental behavior will be repeated decreases.
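These points can be brought together in a brief sketch, assuming (as a simplification not spelled out in the text) that value is estimated from the share of free time spent on each activity; the activities and durations are hypothetical:

```python
# Illustrative sketch: value estimated from time allocated under free access.
# Activities and durations are hypothetical.

observed_minutes = {"drawing": 45, "tidying_room": 5}
total = sum(observed_minutes.values())

# Relative value = proportion of interaction time devoted to each event.
value = {activity: t / total for activity, t in observed_minutes.items()}

def contingency_effect(instrumental: str, consequence: str) -> str:
    """Predicts the effect of presenting `consequence` immediately after
    `instrumental`, following the value hierarchy described above."""
    if value[consequence] > value[instrumental]:
        return "reinforcement: the instrumental response becomes more likely"
    return "punishment: the instrumental response becomes less likely"

print(contingency_effect("tidying_room", "drawing"))  # reinforcement
print(contingency_effect("drawing", "tidying_room"))  # punishment
```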
Likewise, “value” is defined as a psychological dimension that individuals attribute to events, just as they attribute other properties (size, color, weight, for example). In the same sense, value is assigned according to the specific interaction an individual establishes with the event.
It is this psychological dimension that determines the probability of a behavior occurring or disappearing, that is, the effect of reinforcement or punishment. For this reason, to ensure that a behavior occurs or is extinguished, it is essential to analyze the value that the individual attaches to it.
This entails analyzing the individual’s present and past interactions with the event one wishes to reinforce, as well as the opportunities it offers to generate other responses or events.
The pinball and candy experiment
To make all of the above concrete, we conclude by describing an experiment that David Premack and his collaborators conducted with a group of children. In the first part, two alternatives (called “responses”) were presented: eating candy or playing with a pinball machine.
In this way, it was possible to establish which of the two behaviors each child was more likely to repeat (and, with this, each child’s level of preference).
In the second part of the experiment, the children were told that they could eat candy provided they played with the pinball machine. Thus, “eating candy” was the reinforcing response and “playing with the pinball machine” was the instrumental response. The result was as follows: only the children with a greater preference for “eating candy” showed reinforcement of their less likely, less interesting behavior, “playing with the pinball machine”.
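As a closing illustration (with made-up numbers, not Premack’s actual data), the logic of the two phases can be sketched as follows:

```python
# Toy sketch of the pinball-and-candy design; preferences are hypothetical.

# Phase 1: free access -- each child's baseline preference is recorded.
children = {
    "candy_preferring_child": {"eat_candy": 0.8, "play_pinball": 0.2},
    "pinball_preferring_child": {"eat_candy": 0.3, "play_pinball": 0.7},
}

# Phase 2: eating candy is made contingent on playing pinball.
# Prediction: pinball playing increases only where candy is the more
# probable (more preferred) response at baseline.
for child, prefs in children.items():
    reinforced = prefs["eat_candy"] > prefs["play_pinball"]
    outcome = "pinball playing is reinforced" if reinforced else "no reinforcement expected"
    print(f"{child}: {outcome}")
```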