The '''representativeness heuristic''' is used when making judgments about the probability of an event under uncertainty.<ref name="kt72">Kahneman & Tversky, 1972</ref> It is one of a group of [[heuristics in judgment and decision making|heuristics]] (simple rules governing judgment or decision making) proposed by psychologists [[Amos Tversky]] and [[Daniel Kahneman]] in the early 1970s.

Heuristics are described as "judgmental shortcuts that generally get us where we need to go – and quickly – but at the cost of occasionally sending us off course."<ref name="GS96">Gilovich, T. & Savitsky, K. (1996). Like goes with like: The role of representativeness in erroneous and pseudoscientific beliefs. ''Skeptical Inquirer'', 34–40.</ref> Heuristics are useful because they reduce the effort and simplify the computations that decision making requires.<ref name="Shah08">Shah, A. K., & Oppenheimer, D. M. (2008). Heuristics made easy: An effort-reduction framework. ''Psychological Bulletin'', 134(2), 207–222.</ref>

Tversky and Kahneman defined representativeness as "the degree to which [an event] (i) is similar in essential characteristics to its parent population, and (ii) reflects the salient features of the process by which it is generated".<ref>{{cite book|last=Kahneman, Tversky|first=Daniel, Amos|title=Judgment under uncertainty: Heuristics and biases|year=1972|publisher=Cambridge University Press|location=Cambridge|editor=Kahneman, Slovic, Tversky|chapter=Subjective probability: A judgment of representativeness}}</ref><ref>{{cite book|last=Kahneman|first=D|title=Judgment under uncertainty: Heuristics and biases|year=1972|publisher=Cambridge University Press|location=Cambridge}}</ref> When people rely on representativeness to make judgments, they are likely to judge wrongly, because the fact that something is more representative does not make it more likely.<ref name="tk82">Tversky & Kahneman, 1982</ref> The representativeness heuristic can be described as assessing the similarity of objects and organizing them around a category prototype (e.g., like goes with like, and causes and effects should resemble each other).<ref name="GS96" />

This heuristic is used because it is an easy computation.<ref name="tk82" /> The problem is that people overestimate its ability to accurately predict the likelihood of an event.<ref>Fortune & Goodie, 2011</ref> Thus it can result in [[base rate fallacy|neglect of relevant base rates]] and other [[cognitive bias]]es.<ref>Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. ''Science'', New Series, Vol. 185, No. 4157, pp. 1124–1131.</ref><ref>Nisbett, R. E., & Ross, L. (1980). ''Human Inference: Strategies and Shortcomings of Social Judgment.'' Prentice Hall, Englewood Cliffs NJ, pp. 115–118.</ref>

==Determinants of representativeness==

Certain factors of the judgment or decision to be made make the use of the representativeness heuristic more likely.

===Similarity===

When judging the representativeness of a new stimulus or event, people usually pay attention to the degree of similarity between the stimulus or event and a standard or process.<ref name="kt72" /> It is also important that those features be salient.<ref name="kt72" /> Nilsson, Juslin and Olsson (2008) found this to be influenced by the exemplar account of memory (concrete examples of a category are stored in memory), so that new instances were classed as representative if they were highly similar to a category as well as frequently encountered.<ref>{{cite journal|last=Nilsson, Juslin, Olsson|first=H, P, H|title=Exemplars in the mist: The cognitive substrate of the representativeness heuristic|journal=Scandinavian Journal of Psychology|year=2008|volume=49|pages=201–212|doi=10.1111/j.1467-9450.2008.00646.x}}</ref>
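
A minimal sketch of an exemplar-similarity account of this kind (a toy illustration only, not the model used in the study; all feature values and category names here are hypothetical) shows how a new item's support for a category can grow both with how closely it matches stored exemplars and with how many exemplars of that category have been encountered:

<syntaxhighlight lang="python">
import math

def similarity(item, exemplar, sensitivity=1.0):
    """Exponential-decay similarity between a new item and one stored exemplar."""
    distance = sum(abs(a - b) for a, b in zip(item, exemplar))
    return math.exp(-sensitivity * distance)

def category_support(item, exemplars):
    """Summed similarity to all stored exemplars of a category.
    Closer matches and more stored exemplars (more frequent past encounters)
    both raise the category's support."""
    return sum(similarity(item, e) for e in exemplars)

# Hypothetical two-feature stimuli; category A has been encountered more often.
category_a = [(1.0, 0.9), (0.9, 1.1), (1.1, 1.0), (1.0, 1.0)]
category_b = [(3.0, 3.1)]
new_item = (1.2, 1.0)

print(category_support(new_item, category_a) > category_support(new_item, category_b))  # True
</syntaxhighlight>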

Several examples of similarity have been described in the representativeness heuristic literature. Research has focused on medical beliefs. People often believe that medical symptoms should resemble their causes or treatments. For example, people have long believed that ulcers were caused by stress, due to the representativeness heuristic, when in fact bacteria cause ulcers. In a similar line of thinking, in some alternative medicine beliefs patients have been encouraged to eat organ meat that corresponds to their medical disorder. Use of the representativeness heuristic can be seen in even simpler beliefs, such as the belief that eating fatty foods makes one fat.<ref name="GS96" /> Even physicians may be swayed by the representativeness heuristic when judging similarity, for example in diagnoses.<ref name="Garb">Garb, H. N. (1996). The representativeness and past-behavior heuristics in clinical judgment. Professional Psychology: Research and Practice, 27(3), 272–277.</ref> Garb found that clinicians use the representativeness heuristic in making diagnoses by judging how similar patients are to the stereotypical or prototypical patient with that disorder.<ref name="Garb" />

===Randomness===

Irregularity and local representativeness affect judgments of randomness. Things that do not appear to have any logical sequence are regarded as representative of randomness and thus more likely to occur. For example, THTHTH as a series of coin tosses would not be considered representative of randomly generated coin tosses, as it is too well ordered.<ref name="kt72" />
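
In fact, every specific sequence of fair coin tosses of a given length is equally likely, however patterned it looks. A short sketch (purely illustrative) makes the point:

<syntaxhighlight lang="python">
# Any particular sequence of six fair-coin tosses has probability (1/2)^6,
# whether it "looks random" or not.
for sequence in ("THTHTH", "HHHHHH", "HTTHTH"):
    probability = 0.5 ** len(sequence)
    print(sequence, probability)  # each line prints 0.015625, i.e. 1/64
</syntaxhighlight>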

Local representativeness is an assumption wherein people rely on the law of small numbers, whereby small samples are perceived to represent their population to the same extent as large samples (Tversky and Kahneman, 1971). A small sample which appears randomly distributed would reinforce the belief, under the assumption of local representativeness, that the population is randomly distributed. Conversely, a small sample with a skewed distribution would weaken this belief. If a coin toss is repeated several times and the majority of the results consists of "heads", the assumption of local representativeness will cause the observer to believe the coin is biased toward "heads".
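
Small samples are in fact far more variable than large ones, which is what makes the law of small numbers a fallacy. A brief simulation (illustrative only) shows the difference in spread:

<syntaxhighlight lang="python">
import random

random.seed(0)

def spread_of_heads_proportion(sample_size, trials=1000):
    """Smallest and largest proportion of heads observed across many samples of a fair coin."""
    proportions = [
        sum(random.random() < 0.5 for _ in range(sample_size)) / sample_size
        for _ in range(trials)
    ]
    return min(proportions), max(proportions)

print(spread_of_heads_proportion(10))    # wide spread, e.g. roughly 0.1 to 0.9
print(spread_of_heads_proportion(1000))  # narrow spread, e.g. roughly 0.45 to 0.55
</syntaxhighlight>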

==Tversky and Kahneman's classic studies==

===Tom W.===

In a study done in 1973,<ref>Kahneman, D., & Tversky, A. (1973). On the Psychology of Prediction. Psychological Review, 80, 237–251.</ref> Kahneman and Tversky divided their participants into three groups:

*"Base-rate group", which was given the instructions: "Consider all the first-year graduate students in the U.S. today. Please write down your best guesses about the percentage of students who are now enrolled in the following nine fields of specialization." The nine fields given were business administration, computer science, engineering, humanities and education, law, library science, medicine, physical and life sciences, and social science and social work.

*"Similarity group", which was given a personality sketch: "Tom W. is of high intelligence, although lacking in true creativity. He has a need for order and clarity, and for neat and tidy systems in which every detail finds its appropriate place. His writing is rather dull and mechanical, occasionally enlivened by somewhat corny puns and by flashes of imagination of the sci-fi type. He has a strong drive for competence. He seems to feel little sympathy for other people and does not enjoy interacting with others. Self-centered, he nonetheless has a deep moral sense." The participants in this group were asked to rank the nine fields listed above in terms of how similar Tom W. is to the prototypical graduate student of each field.

*"Prediction group", which was given the same personality sketch, along with the information: "The preceding personality sketch of Tom W. was written during Tom's senior year in high school by a psychologist, on the basis of projective tests. Tom W. is currently a graduate student. Please rank the following nine fields of graduate specialization in order of the likelihood that Tom W. is now a graduate student in each of these fields."

The judgments of likelihood were much closer to the judgments of similarity than to the estimated base rates. The findings supported the authors' prediction that people make predictions based on how representative something is (similarity), rather than on relative base-rate information. For example, more than 95% of the participants said that Tom would be more likely to study computer science than education or humanities, even though the base-rate estimates for education and humanities were much higher than for computer science.

===The taxicab problem===

In another study done by Tversky and Kahneman, subjects were given the following problem:

:"A cab was involved in a hit and run accident at night. Two cab companies, the Green and the Blue, operate in the city. 85% of the cabs in the city are Green and 15% are Blue.

:A witness identified the cab as Blue. The court tested the reliability of the witness under the same circumstances that existed on the night of the accident and concluded that the witness correctly identified each one of the two colors 80% of the time and failed 20% of the time.

:What is the probability that the cab involved in the accident was Blue rather than Green knowing that this witness identified it as Blue?"

Most subjects gave probabilities over 50%, and some gave answers over 80%. The correct answer, found using [[Bayes' theorem]], is lower than these estimates, as the following steps (and the explicit application of the theorem below) show:
*There is a 12% chance (15% times 80%) of the witness correctly identifying a blue cab.
*There is a 17% chance (85% times 20%) of the witness incorrectly identifying a green cab as blue.
*There is therefore a 29% chance (12% plus 17%) that the witness will identify the cab as blue.
*This results in a 41% chance (12% divided by 29%) that the cab identified as blue is actually blue.
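
In the notation of Bayes' theorem, with the proportion of Blue cabs as the prior and the witness's reliability as the likelihood, the same figure follows directly:

:<math>P(\text{Blue}\mid\text{identified Blue}) = \frac{0.80 \times 0.15}{0.80 \times 0.15 + 0.20 \times 0.85} = \frac{0.12}{0.29} \approx 0.41.</math>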

[[Image:TaxicabProblem.png]]

Representativeness is also cited as an explanation of similar effects, such as the [[gambler's fallacy]], the [[regression fallacy]] and the [[conjunction fallacy]].

==Problems in using the representativeness heuristic==

===Base rate neglect and base rate fallacy===

The use of the representativeness heuristic will likely lead to violations of [[Bayes' theorem]], which states:

:<math>P(H|D) = \frac{P(D | H)\, P(H)}{P(D)}.</math>

However, judgments by representativeness look only at the resemblance between the hypothesis and the data, so the inverse probabilities are equated:

:<math>P(H|D)=P(D|H)</math>

As can be seen, the [[base rate]] P(H) is ignored in this equation, leading to the [[base rate fallacy]]. A base rate is a phenomenon's basic rate of incidence. The base rate fallacy describes how people do not take the base rate of an event into account when solving probability problems.<ref name="Axelsson">Axelsson, S. (2000). The base-rate fallacy and the difficulty of intrusion detection. ACM Transactions on Information and System Security, 3(3), 186–205.</ref> This was explicitly tested by Dawes, Mirels, Gold and Donahue (1993),<ref>Dawes, Mirels, Gold, Donahue (1993). Equating inverse probabilities in implicit personality judgments. ''Psychological Science'', 4(6), 396–400.</ref> who had people judge both the base rate of people who had a particular personality trait and the probability that a person who had one personality trait also had another. For example, participants were asked how many people out of 100 answered true to the question "I am a conscientious person" and also, given that a person answered true to this question, how many would answer true to a different personality question. They found that participants equated inverse probabilities (e.g., <math>P(\text{conscientious}|\text{neurotic})=P(\text{neurotic}|\text{conscientious})</math>) even when it was obvious that they were not the same (the two questions were answered immediately after each other).

A medical example is described by Axelsson.<ref name="Axelsson" /> Suppose a doctor performs a test that is 99% accurate, and you test positive for the disease. However, the incidence of the disease is 1 in 10,000. Your actual chance of having the disease is about 1%, because the population of healthy people is so much larger than the population with the disease. This statistic often surprises people, due to the base rate fallacy, as many people do not take the basic incidence into account when judging probability. Research by Bar-Hillel (1980) suggests that perceived relevancy of information is vital to base-rate neglect: base rates are only included in judgments if they seem equally relevant to the other information.<ref name="BH80">Bar-Hillel, M. (1980). The base-rate fallacy in probability judgments. Acta Psychologica, 44, 211–233.</ref>
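
The arithmetic behind that figure can be checked directly. The sketch below assumes, since the example does not specify otherwise, that "99% accurate" means both a 99% true-positive rate and a 99% true-negative rate:

<syntaxhighlight lang="python">
# Posterior probability of disease given a positive result, via Bayes' theorem.
prevalence = 1 / 10_000          # base rate of the disease
sensitivity = 0.99               # assumed P(positive | disease)
false_positive_rate = 0.01       # assumed P(positive | no disease)

p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive
print(f"{p_disease_given_positive:.2%}")  # prints 0.98%, i.e. about 1%
</syntaxhighlight>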

Some research has explored base rate neglect in children, as there was a lack of understanding about how these judgment heuristics develop.<ref name="Davidson">Davidson, D. (1995). The representativeness heuristic and the conjunction fallacy effect in children's decision making. Merrill-Palmer Quarterly, 41(3), 328–346.</ref><ref name="Jacobs">Jacobs, J. E., & Potenza, M. (1991). The use of judgment heuristics to make social and object decisions: A developmental perspective. Child Development, 62, 166–178.</ref> The authors of one such study wanted to understand the development of the heuristic, whether it differs between social judgments and other judgments, and whether children use base rates when they are not using the representativeness heuristic. The authors found that the use of the representativeness heuristic as a strategy begins early on and is consistent. They also found that children use idiosyncratic strategies to make social judgments initially and use base rates more as they get older, but that use of the representativeness heuristic in the social arena also increases with age. Among the children surveyed, base rates were more readily used in judgments about objects than in social judgments.<ref name="Jacobs" /> After that research was conducted, Davidson (1995) explored how the representativeness heuristic and the conjunction fallacy in children related to children's stereotyping.<ref name="Davidson" /> Consistent with previous research, children based their responses to problems on base rates when the problems contained nonstereotypic information or when the children were older. There was also evidence that children commit the conjunction fallacy. Finally, as children got older, they used the representativeness heuristic on stereotyped problems, and so made judgments consistent with stereotypes.<ref name="Davidson" /> Overall, there is evidence that even children use the representativeness heuristic, commit the conjunction fallacy, and disregard base rates.

Research suggests that use or neglect of base rates can be influenced by how the problem is presented, indicating that the representativeness heuristic is not a "general, all purpose heuristic" but may have many contributing factors.<ref>Gigerenzer, G., Hell, W., Blank, H. (1988). Presentation and content: The use of base rates as a continuous variable. Journal of Experimental Psychology: Human Perception and Performance, 14(3), 513–525.</ref> Base rates may be neglected more often when the information presented is not causal.<ref>Ajzen, I. (1977). Intuitive theories of events and the effects of base-rate information on prediction. Journal of Personality and Social Psychology, 35(5), 303–314.</ref> Base rates are used less if there is relevant individuating information.<ref>Koehler, J. J. (1996). The base rate fallacy reconsidered: Descriptive, normative, and methodological challenges. Behavioral and Brain Sciences, 19, 1–17.</ref> Groups have been found to neglect base rates more than individuals do.<ref>Argote, L., Seabright, M. A., & Dyer, L. (1986). Individual versus group use of base-rate and individuating information. Organizational Behavior and Human Decision Processes, 38(1), 65–75.</ref> The use of base rates also differs based on context.<ref>Zukier, H. (1984). Social roles and strategies in prediction: Some determinants of the use of base-rate information. Journal of Personality and Social Psychology, 47(2), 349–360.</ref> Research on the use of base rates has been inconsistent, with some authors suggesting a new model is necessary.<ref>Medin, D. L. (1988). Problem structure and the use of base-rate information from experience. Journal of Experimental Psychology, 117(1), 68–85.</ref>

===Disjunction fallacy===

In addition to base-rate neglect, the conjunction fallacy, and extensionality violations (discussed below), the use of the representativeness heuristic may lead to a disjunction fallacy. In probability theory, the disjunction of two events is at least as likely as either of the events individually. For example, the probability of being either a physics or a biology major is at least as high as the probability of being a physics major, if not higher. However, when a personality description (the data) seems to be very representative of a physics major (e.g., wearing a pocket protector) rather than a biology major, people judge that it is more likely for this person to be a physics major than a natural sciences major (which is a superset of physics).

Further evidence that the representativeness heuristic may cause the disjunction fallacy comes from Bar-Hillel and Neter (1993).<ref>Bar-Hillel, M., & Neter, E. (1993). How alike is it? versus how likely is it?: A disjunction fallacy in probability judgments. ''Journal of Personality and Social Psychology'', 65, 1119–1131.</ref> They found that people judge a person who is highly representative of being a statistics major (e.g., highly intelligent, does math competitions) as being more likely to be a statistics major than a social sciences major (a superset of statistics), but they do not think that he is more likely to be a Hebrew language major than a humanities major (a superset of Hebrew language). Thus, only when the person seems highly representative of a category is that category judged as more probable than its superordinate category. These incorrect appraisals remained even in the face of losing real money in bets on probabilities.

===Conjunction fallacy===

The representativeness heuristic violates one of the fundamental properties of probability: [[extensionality]]. For example, participants were provided with a description of Linda that resembles a feminist. Participants were then asked to evaluate the probability of her being a feminist, the probability of her being a bank teller, and the probability of her being both a bank teller and a feminist. Probability theory dictates that the probability of being both a bank teller and a feminist (the [[Logical conjunction|conjunction]] of two sets) must be less than or equal to the probability of being either a feminist or a bank teller alone. A conjunction cannot be more probable than one of its constituents. Nevertheless, participants judged the conjunction (bank teller and feminist) as being more probable than being a bank teller alone.<ref>Tversky, A., Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgments. ''Psychological Review'', 90, 293–315.</ref>
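
Whatever the underlying rates, the conjunction can never be more frequent than either constituent. A small simulation (with made-up base rates, purely for illustration) demonstrates this:

<syntaxhighlight lang="python">
import random

random.seed(0)
n = 100_000
p_teller, p_feminist = 0.05, 0.30   # hypothetical base rates

teller = [random.random() < p_teller for _ in range(n)]
feminist = [random.random() < p_feminist for _ in range(n)]

n_teller = sum(teller)
n_both = sum(t and f for t, f in zip(teller, feminist))
print(n_both, "<=", n_teller, ":", n_both <= n_teller)  # always True
</syntaxhighlight>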

Some research suggests that the conjunction error may be partly due to subtle linguistic factors, such as inexplicit wording or the semantic interpretation of "probability".<ref>Fiedler, K. (1988). The dependence of the conjunction fallacy on subtle linguistic factors. Psychological Research, 50, 123–129.</ref><ref name="Politzer">Politzer, G. & Noveck, I. A. (1991). Are conjunction rule violations the result of conversational rule violations? Journal of Psycholinguistic Research, 20(2), 83–103.</ref> The authors argue that both logic and language use may relate to the error, and that it should be more fully investigated.<ref name="Politzer" />

===Extensionality violation===

In logic, extensionality requires "two formulas which have the same truth-value under any truth-assignments to be mutually substitutable ''salva veritate'' in a sentence that contains one of these formulas."<ref name="BG09">Bourgeois-Gironde, S. & Giraud, R. (2009). Framing effects as violations of extensionality. Theory and Decision, 67, 385–404.</ref> Put simply, objects that have the same external properties are equal. Applied to decision making, this principle implies that a decision should not be affected by how the problem is described: varied descriptions of the same decision problem should not give rise to different decisions. When judgments are instead made on the basis of such irrelevant features of the description, this is called an extensionality violation.<ref name="BG09" />

==See also==
{{Portal box|Psychology|Thinking}}
*[[Affect heuristic]]
*[[Attribute substitution]]
*[[Availability heuristic]]
*[[List of biases in judgment and decision making]]
*[[Heuristic]]
*[[Base rate fallacy]]
*[[Conjunction fallacy]]
*[[Amos Tversky]]
*[[Daniel Kahneman]]

==References==
<references/>

* Baron, J. (2000). ''Thinking and Deciding'' (3rd ed.). Cambridge University Press.
* Plous, S. (1993). ''The Psychology of Judgment and Decision Making''. New York: McGraw-Hill.
* Kahneman, D., & Tversky, A. (1973). On the Psychology of Prediction. ''Psychological Review, 80,'' 237–251.
* Tversky, A., & Kahneman, D. (1982). Evidential Impact of Base Rates. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), ''Judgment under Uncertainty: Heuristics and Biases.'' Cambridge: Cambridge University Press.

==External links==
* [http://posbase.uib.no/posbase/Presentasjoner/K_Representativeness.ppt Powerpoint presentation on the representativeness heuristic (with further links to presentations of classical experiments)]

{{DEFAULTSORT:Representativeness Heuristic}}
[[Category:Heuristics]]
[[Category:Prospect theory]]