The representativeness heuristic is a psychological term for people's tendency to judge the probability or frequency of a hypothesis by how much the hypothesis resembles available data, rather than by performing a Bayesian calculation. Bayesian probability is one of several interpretations of the concept of probability and belongs to the category of evidential probabilities; the Bayesian interpretation can be seen as an extension of logic that enables reasoning with uncertain statements. Though often very useful in everyday life, the representativeness heuristic can also result in neglect of relevant base rates and other cognitive biases.
Like any other rule of thumb, the representativeness heuristic has advantages and drawbacks. The base rate fallacy, also called base rate neglect, is an error that occurs when the conditional probability of some hypothesis H given some evidence E is assessed without taking into account the "base rate" or "prior probability" of H and the total probability of the evidence E. A cognitive bias is a pattern of deviation in judgment that occurs in particular situations; implicit in the concept of a "pattern of deviation" is a standard of comparison. The representativeness heuristic also shapes how people react to others who are different from them.
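The base rate fallacy can be made concrete with a small sketch of Bayes' theorem. All the numbers below are hypothetical, chosen only to show how a low prior can outweigh evidence that strongly "resembles" the hypothesis:

```python
# Illustrative sketch of base rate neglect via Bayes' theorem.
# The probabilities here are invented for illustration, not study data.

def posterior(prior_h, p_e_given_h, p_e_given_not_h):
    """P(H|E) = P(E|H) P(H) / P(E), with P(E) found by total probability."""
    p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)
    return p_e_given_h * prior_h / p_e

# Evidence nine times more likely under H than under not-H...
# ...but H has a base rate of only 5%.
p = posterior(prior_h=0.05, p_e_given_h=0.9, p_e_given_not_h=0.1)
print(round(p, 3))  # 0.321: still under 50% despite the strong resemblance
```

Judging by resemblance alone amounts to comparing only the likelihoods P(E|H) and P(E|not-H), which is exactly the term Bayes' theorem says must be weighted by the prior.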
The representativeness heuristic was first proposed in the early 1970s by Amos Tversky, a cognitive and mathematical psychologist, and Daniel Kahneman, an Israeli-American psychologist. Kahneman won the 2002 Nobel Prize in Economics.
The representativeness heuristic is at the heart of how society becomes institutionally biased. It forms in childhood, when we are taught to learn by association. For instance, we learn to associate dark clouds with storms, so we do not need a weather forecast to tell us what is coming when we see dark clouds. In the same way, we absorb the inclinations and prejudices of grown-ups. Even at that tender age, while still very impressionable, we take in ideas rapidly. We hear adults make comments about others, call names, and make jokes that, in our eyes, seem acceptable as we grow up. As children, we do not know better.
In order to test for the representativeness heuristic, Kahneman and Tversky gave their subjects the following information:
Tom W. is of high intelligence, although lacking in true creativity. He has a need for order and clarity, and for neat and tidy systems in which every detail finds its correct place. His writing is quite dull and mechanical, seldom enlivened by somewhat corny puns and by flashes of imagination of the sci-fi type. He has a strong drive for competence. He seems to feel little sympathy for other people and does not enjoy mingling with others. Self-centered, he nonetheless has a deep moral sense.
The subjects receiving the information were then divided into three groups, each given a different decision task:
The results of the study showed that subjects had a strong tendency to assign Tom W. to the engineering group on the basis of representativeness alone, despite the fact that engineering students were quite rare at the school where the study was conducted, making up significantly less than 1/9 of all students. Misled by representativeness, the subjects ignored the background probability of Tom W. being in any given major, whatever his personal qualities. Extensive subsequent testing has found that this pathology is universal and applies in a wide variety of problem domains. The lesson of these results is that instead of judging something by its qualities alone, one should also consider the background probabilities and avoid making too many assumptions.
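The Tom W. pattern can be sketched numerically. The base rates and "resemblance" likelihoods below are invented for illustration, as are the major names; the point is only that ranking by resemblance and ranking by Bayesian posterior can disagree when base rates are skewed:

```python
# Hypothetical sketch of a Tom W.-style setup.
# For each major: (invented base rate, invented P(description | major)).
majors = {
    "engineering":     (0.05, 0.80),
    "social sciences": (0.60, 0.10),
    "humanities":      (0.35, 0.15),
}

# Representativeness alone ranks by the likelihood term only.
by_resemblance = max(majors, key=lambda m: majors[m][1])

# Bayes weights each likelihood by its base rate and normalizes.
evidence = sum(prior * lik for prior, lik in majors.values())
posteriors = {m: prior * lik / evidence for m, (prior, lik) in majors.items()}
by_posterior = max(posteriors, key=posteriors.get)

print(by_resemblance)  # engineering
print(by_posterior)    # social sciences: the low base rate flips the answer
```

Here engineering's strong resemblance (0.80) is outweighed by its 5% base rate, which is the error the subjects made in the opposite direction.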
The use of the representativeness heuristic may lead to a disjunction fallacy. In probability theory, the disjunction of two events is at least as likely as either of the events individually. For example, being either a physics or a biology major is at least as likely as being a physics major. Yet when a personality description (the data) seems highly representative of a physics major (e.g., a pocket protector) rather than a biology major, people judge it more likely that the person is a physics major than a natural sciences major (a superset of physics).
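The disjunction rule itself is simple to state in code: P(A or B) >= max(P(A), P(B)). The judged probabilities below are hypothetical, standing in for the kind of incoherent judgments the fallacy produces:

```python
# The disjunction rule: a union can never be less probable than a part.
# Judged probabilities below are invented for illustration.

def violates_disjunction_rule(p_part, p_union):
    """True if a judged union probability falls below one of its parts."""
    return p_union < p_part

# A representativeness-driven judgment for the "pocket protector" description:
# physics major rated above the broader natural-sciences category.
p_physics_judged = 0.6
p_natural_sciences_judged = 0.4  # incoherent: physics is a subset

print(violates_disjunction_rule(p_physics_judged, p_natural_sciences_judged))  # True
```

Any coherent assignment must give the superset category at least the probability of each of its members, which is exactly what the representativeness-based judgment fails to do.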
Jon Krosnick, a professor of Communication at Stanford, has proposed in his work that the effects Kahneman and Tversky observed may be partially attributed to information order effects. When the order of information was reversed, so that the probability figures came later, many of the effects were mitigated.