Curse of knowledge

The curse of knowledge, also called the curse of expertise [1] or expert's curse, is a cognitive bias that occurs when a person who has specialized knowledge assumes that others share that knowledge. [2]

For example, in a classroom setting, teachers may have difficulty if they cannot put themselves in the position of the student. A knowledgeable professor might no longer remember the difficulties that a young student encounters when learning a new subject for the first time. This curse of knowledge also explains the danger behind thinking about student learning based on what appears best to faculty members, as opposed to what has been verified with students. [3]

History of concept

The term "curse of knowledge" was coined in a 1989 Journal of Political Economy article by economists Colin Camerer, George Loewenstein, and Martin Weber. The aim of their research was to counter the "conventional assumptions in such (economic) analyses of asymmetric information in that better-informed agents can accurately anticipate the judgement of less-informed agents". [4]

Such research drew from Baruch Fischhoff's 1975 work on hindsight bias, the cognitive bias whereby knowing the outcome of an event makes it seem more predictable than it actually was. [5] Fischhoff's research revealed that participants did not know that their outcome knowledge affected their responses, and, even when they did know, they still could not ignore or defeat the effects of the bias. Study participants could not accurately reconstruct their previous, less knowledgeable states of mind, which directly relates to the curse of knowledge. Fischhoff theorized that this poor reconstruction occurred because the participant was "anchored in the hindsightful state of mind created by receipt of knowledge". [6] This receipt of knowledge returns to the idea of the curse proposed by Camerer, Loewenstein, and Weber: a knowledgeable person cannot accurately reconstruct what a person without the knowledge, be it themselves or someone else, would think or how they would act. In his paper, Fischhoff questions this failure to empathize with ourselves in less knowledgeable states, and notes that how well people manage to reconstruct the perceptions of less informed others is a crucial question for historians and "all human understanding". [6]

This research led the economists Camerer, Loewenstein, and Weber to focus on the economic implications of the concept and to ask whether the curse harms the allocation of resources in an economic setting. The idea that better-informed parties may suffer losses in a deal or exchange was seen as important to bring into the sphere of economic theory. Most theoretical analyses of situations where one party knows less than the other had focused on how the less-informed party attempts to learn more information to minimize the information asymmetry. These analyses, however, assume that better-informed parties can optimally exploit their information advantage when, in fact, they cannot: people fail to set aside their additional information even when doing so would serve them in a bargaining situation. [5]

For example, suppose two people are bargaining over dividing money or provisions, and one party knows the size of the amount being divided while the other does not. To fully exploit their advantage, the informed party should make the same offer regardless of the amount to be divided. [7] In practice, however, informed parties offer more when the amount to be divided is larger: [8] [9] they are unable to ignore their better information, even when they should. [5]
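The logic behind the "same offer" prediction can be sketched with a toy ultimatum-style game (this setup is our illustration, not the cited experiments): only the proposer knows the pie size, and the responder, seeing only the offer, accepts anything at or above some threshold. Since the responder's decision cannot depend on the pie, the informed proposer's optimal offer is the threshold itself, whatever the pie size.

```python
def best_offer(pie, threshold):
    """Profit-maximizing offer for a proposer who privately knows the pie size.

    The responder sees only the offer and accepts any offer >= threshold.
    Offering less is rejected (payoff 0); offering more gives money away.
    So the optimal offer equals the threshold, independent of the pie --
    unless the pie is too small to cover it at all.
    """
    return threshold if pie >= threshold else None  # None: no profitable deal

print(best_offer(10, 3))   # small pie
print(best_offer(100, 3))  # large pie: the optimal offer is unchanged
```

Cursed proposers instead let their private knowledge leak through, offering more when the pie is larger, which is exactly the behavior reported in the experiments above.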

Experimental evidence

A 1990 experiment by Elizabeth Newton, then a Stanford University graduate student, illustrated the curse of knowledge with a simple task. One group of subjects "tapped" out well-known songs with their fingers, while another group tried to name the melodies. When the "tappers" were asked to predict how many of the tapped songs listeners would recognize, they consistently overestimated: tappers predicted that about half of the songs would be identified, while listeners actually identified only about 2.5 percent. The curse of knowledge is demonstrated here: the tappers were so familiar with what they were tapping that they assumed listeners would easily recognize the tune. [10] [11]

A study by Susan Birch and Paul Bloom involving Yale University undergraduate students used the curse of knowledge concept to explain how people's ability to reason about another person's actions is compromised by knowledge of the outcome of an event. The participant's perception of the plausibility of an event also mediated the extent of the bias: if the event was less plausible, knowledge was not as much of a "curse" as when there was a potential explanation for the way the other person could act. [12] However, a 2014 replication study found that this finding did not reliably reproduce across seven experiments with large sample sizes, and that the true effect size was less than half of that originally reported. The authors therefore suggest that "the influence of plausibility on the curse of knowledge in adults appears to be small enough that its impact on real-life perspective-taking may need to be reevaluated." [13]

Other researchers have linked the curse of knowledge bias with false-belief reasoning in both children and adults, as well as theory of mind development difficulties in children.

Related to this finding is the phenomenon experienced by players of charades: the actor may find it frustratingly hard to believe that their teammates keep failing to guess the secret phrase, known only to the actor, conveyed by pantomime.

Implications

In the Camerer, Loewenstein, and Weber article, it is noted that the setting closest in structure to their market experiments is underwriting, a task in which well-informed experts price goods that are sold to a less-informed public.

Investment bankers value securities, experts taste cheese, store buyers observe jewelry being modeled, and theater owners see movies before they are released. They then sell those goods to a less-informed public. If they suffer from the curse of knowledge, high-quality goods will be overpriced and low-quality goods underpriced relative to optimal, profit-maximizing prices; prices will reflect characteristics (e.g., quality) that are unobservable to uninformed buyers. [5]
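The pricing distortion described above can be sketched numerically (a minimal illustration of the idea, not the paper's formal model; the weight `w` is a hypothetical "curse" parameter): the optimal price for goods whose quality buyers cannot observe is the uniform expected value, while a cursed expert lets the true quality leak into each price.

```python
def cursed_price(true_value, expected_value, w=0.5):
    """Price set by an expert who cannot fully ignore private quality info.

    A fully rational seller facing uninformed buyers would charge
    expected_value for every good; a "cursed" seller (w > 0) mixes in
    the true, privately known value.
    """
    return w * true_value + (1 - w) * expected_value

qualities = [40, 100, 160]                 # true values, unobservable to buyers
optimal = sum(qualities) / len(qualities)  # uniform profit-maximizing price

for v in qualities:
    # high-quality goods come out above the optimal uniform price,
    # low-quality goods below it: prices reflect unobservable quality
    print(v, cursed_price(v, optimal))
```

With `w = 0`, the curse disappears and every good is priced at the uniform optimum; as `w` grows, prices increasingly reveal characteristics the buyers cannot see.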

The curse of knowledge has a paradoxical effect in these settings. By making better-informed agents think that their knowledge is shared by others, the curse helps alleviate the inefficiencies that result from information asymmetries (a better-informed party having an advantage in a bargaining situation), bringing outcomes closer to those of complete information. In such settings, the curse on individuals may actually improve social welfare ("you get what you pay for").

Applications

Marketing

Economists Camerer, Loewenstein, and Weber first applied the curse of knowledge phenomenon to economics in order to explain why and how the assumption that better-informed agents can accurately anticipate the judgments of less-informed agents is not inherently true. They also sought to support the finding that sales agents who are better informed about their products may, in fact, be at a disadvantage when selling against less-informed agents. Better-informed agents fail to ignore the privileged knowledge they possess and are thus "cursed", unable to sell their products at a value that more naïve agents would deem acceptable. [5] [14]

Education

It has also been suggested that the curse of knowledge contributes to the difficulty of teaching. [3] Because of the curse, it can be ineffective, if not harmful, to reason about how students view and learn material from the teacher's perspective rather than from what has been verified with students. The teacher already possesses the knowledge they are trying to impart, but the way they convey it may not be the best for those who do not yet possess that knowledge.

The curse of expertise may also be counterproductive for learners acquiring new skills. [15] [16] This matters because experts' predictions can influence educational equity and training, the personal development of young people, the allocation of time and resources to scientific research, and crucial design decisions. [17] Effective teachers must predict the issues and misconceptions that people will face when learning a complex new skill or an unfamiliar concept, which includes recognizing their own and each other's bias blind spots.

Quality assurance (QA) is one way of circumventing the curse of expertise through comprehensive quality management techniques. Professionals are, by definition, paid for technically well-defined work, so quality control procedures may be required that encompass the processes employed, the training of the expert, and the ethos of the expert's trade or profession. Some experts (lawyers, physicians, etc.) require a licence, which may include a requirement to undertake ongoing professional development (i.e., obtain OPD credits issued by collegiate universities or professional associations; see also normative safety).

Decoding the Disciplines is another way of coping with the curse of knowledge in educational settings. It intends to increase student learning by narrowing the gap between expert and novice thinking resulting from the curse of knowledge. The process seeks to make explicit the tacit knowledge of experts and to help students master the mental actions they need for success in particular disciplines.

Academics

Academics are usually employed in research and development activities that are less well understood than the work of professionals, and therefore submit themselves to peer review by other appropriately qualified individuals.

Computer programming

The curse can also show up in computer programming, where a programmer may fail to produce understandable code, for example by not commenting it, because its purpose seems obvious at the time of writing; a few months later, even the author may have no idea why the code exists. User interface design is another example from the software industry: software engineers, who have a deep understanding of the domain the software is written for, create user interfaces that they themselves can understand and use, but that end users, who do not possess the same level of knowledge, find difficult to use and navigate. This problem has become so widespread in software design that the mantra "You are not the user" [18] has become ubiquitous in the user experience industry to remind practitioners that their knowledge and intuitions do not always match those of the end users they are designing for.
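The commenting failure described above can be illustrated with a hypothetical snippet (the function and the 24-hour cutoff are invented for this example): the first version seemed self-evident to its author, while the second records the context a later reader will need.

```python
def prune(events, now):
    # "Obvious" at the time of writing: the bare constant 86_400 forces
    # future readers (including the original author) to rediscover that
    # it means one day in seconds.
    return [e for e in events if now - e < 86_400]

def prune_documented(events, now):
    """Keep only events from the last 24 hours.

    `events` and `now` are POSIX timestamps in seconds;
    86_400 = 60 s * 60 min * 24 h.
    """
    ONE_DAY_SECONDS = 86_400
    return [e for e in events if now - e < ONE_DAY_SECONDS]
```

Both functions behave identically; the difference is only in how much of the author's head-knowledge survives on the page.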

To-do lists

Another example is writing a to-do list and, on viewing it later, being unable to interpret an entry because the knowledge available at the time of writing has been lost. [19] [ self-published source ]

The difficulty that experienced people may encounter in communicating with the less informed is exemplified fictionally in Dr. Watson's discourses with the insightful detective Sherlock Holmes. [20]


References

  1. Hinds, Pamela J. (1999). "The curse of expertise: The effects of expertise and debiasing methods on prediction of novice performance". Journal of Experimental Psychology: Applied. 5 (2): 205–221. doi:10.1037/1076-898X.5.2.205. S2CID   1081055.
  2. Kennedy, Jane (1995). "Debiasing the Curse of Knowledge in Audit Judgment". The Accounting Review. 70 (2): 249–273. JSTOR   248305.
  3. Wieman, Carl (2007). "The 'Curse of Knowledge', or Why Intuition About Teaching Often Fails" (PDF). APS News. 16 (10). Archived from the original (PDF) on 2016-04-10.
  4. Froyd, Jeff; Layne, Jean (2008). "Faculty development strategies for overcoming the "curse of knowledge"". 2008 38th Annual Frontiers in Education Conference. doi:10.1109/FIE.2008.4720529. ISBN   978-1-4244-1969-2. S2CID   27169504.
  5. Camerer, Colin; Loewenstein, George; Weber, Martin (1989). "The Curse of Knowledge in Economic Settings: An Experimental Analysis" (PDF). Journal of Political Economy. 97 (5): 1232–1254. CiteSeerX 10.1.1.475.3740. doi:10.1086/261651. S2CID 8193254. Archived (PDF) from the original on 2015-03-06.
  6. Fischhoff, Baruch (1975). "Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty". Journal of Experimental Psychology: Human Perception and Performance. 1 (3): 288–299. doi:10.1037/0096-1523.1.3.288. Reprinted: Fischhoff, Baruch (2003). "Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty". Qual Saf Health Care. 12 (4): 304–11. doi:10.1136/qhc.12.4.304. PMC 1743746. PMID 12897366.
  7. Myerson, Roger B. "Negotiation in Games: A Theoretical Overview". In Un-certainty, Information, and Communication: Essays in Honor of Kenneth J. Arrow, vol. 3, edited by Walter P. Heller, Ross M. Starr, and David A. Starrett. New York: Cambridge Univ. Press, 1986.
  8. Forsythe, Robert; Kennan, John; Sopher, Barry (1991). "An Experimental Analysis of Strikes in Bargaining Games with One-Sided Private Information". The American Economic Review. 81 (1): 253–278. JSTOR   2006799. Archived (PDF) from the original on 2016-05-08.
  9. Banks, Jeff; Camerer, Colin F.; and Porter, David. "Experimental Tests of Nash Refinements in Signaling Games." Working paper. Philadelphia: Univ. Pennsylvania, Dept. Decision Sci., 1988.
  10. Heath, Chip; Heath, Dan (Dec 2006). "The Curse of Knowledge". Harvard Business Review. Retrieved 26 April 2016.
  11. Elizabeth L., Newton (1990). The rocky road from actions to intentions (PDF) (PhD thesis). Stanford University.
  12. Birch, S. A.J.; Bloom, P. (2007). "The Curse of Knowledge in Reasoning About False Beliefs" (PDF). Psychological Science. 18 (5): 382–386. CiteSeerX   10.1.1.583.5677 . doi:10.1111/j.1467-9280.2007.01909.x. PMID   17576275. S2CID   18588234. Archived (PDF) from the original on 2016-05-07.
  13. Ryskin, Rachel A.; Brown-Schmidt, Sarah (25 March 2014). "Do Adults Show a Curse of Knowledge in False-Belief Reasoning? A Robust Estimate of the True Effect Size". PLOS ONE. 9 (3): e92406. Bibcode:2014PLoSO...992406R. doi: 10.1371/journal.pone.0092406 . PMC   3965426 . PMID   24667826.
  14. Birch, Susan A. J.; Bernstein, Daniel M. (2007). "What Can Children Tell Us About Hindsight Bias: A Fundamental Constraint on Perspective–Taking?" (PDF). Social Cognition. 25 (1): 98–113. CiteSeerX   10.1.1.321.4788 . doi:10.1521/soco.2007.25.1.98. Archived (PDF) from the original on 2016-05-07.
  15. Beilock, Sian (2011-09-09). Choke: What the Secrets of the Brain Reveal About Getting It Right When You Have To. Atria Publishing Group/Simon & Schuster. ISBN 978-1416596189.
  16. The curse of Expertise
  17. Hinds, Pamela J. (1999). "The Curse of Expertise: The Effects of Expertise and Debiasing Methods on Predictions of Novice Performance". Journal of Experimental Psychology: Applied. 5 (2). American Psychological Association: 205–221. doi:10.1037/1076-898X.5.2.205. ISSN   1076-898X. S2CID   1081055.
  18. Budiu, Raluca. "You Are Not the User: The False-Consensus Effect". Nielsen Norman Group. Retrieved 2021-04-07.
  19. Berg, Al. "Principle 69 - The Curse of Knowledge". 112 Key Principles for Success.
  20. BBC Future: What Sherlock Holmes can teach us about the mind