The problem with experts (and those who do not listen to them)

1 Introduction

The seminar will develop students’ understanding of the role of experts in a technocratic society. We will try to explain what issues arise when experts give advice, in what sense those issues are unavoidable, and how to manage and reduce those problems.

The seminar will also help students to develop their understanding of modern society and of their role in it.

Finally, the seminar will be an opportunity to practice writing a literature review. Students will summarize and connect different works on the same topic, identify their differences and respective contributions, and combine them in a logical and structured way.

In the introductory lecture, I will present a list of the main issues with experts and with the way people respond to information. Each student will choose one of those issues and prepare a presentation and an essay. I offer a few academic references for each issue as a starting point.

2 Organization of the seminar and grading

• There will be a brief presentation of the seminar on November 6 from 10.00 to 12.00 in person and/or online, depending on the coronavirus situation. Attendance is not mandatory. I will present the main themes of the seminar, answer questions and assist students in their choice of topic.

• The deadline for sending me your choice of a topic is on November 20. After that date, those who did not make a choice will be assigned a topic by me.

• Presentations will take place on January 15 and January 22 as a block seminar from 10.00 to 12.00 and from 14.00 to 17.00 in person and/or online. Students must attend all presentations by all other students and participate in the discussion by asking questions.

• The deadline for submitting the essay is February 5. Essays must be submitted as a PDF file sent to my email address. You will receive an acknowledgment of receipt; contact me if you do not.

2.1 Choosing a problem

Students will choose one from a list of problems with experts (Section 4). I provide a short list of associated academic articles as an introduction to each problem.

Assignment of problems to students is on a first-come, first-served basis. This means that a student may be asked to choose a problem other than their first choice if another student has already chosen it.

Students may present on a problem that is not in the list, but only after approval by the seminar leader.

2.2 Presentation

Students will present their chosen problem at the end of the semester to others along the following lines:

  1. What is the problem, its background and context?
  2. What theories have been advanced to explain the issue?
  3. Have those theories been tested, and what are the main findings from those tests?
  4. Which theory or combination of theories offers the best explanation for the problem?
  5. What are the possible solutions to the problem?
  6. What are the remaining unresolved questions from the literature?

2.3 Essay

Students will then write an essay on their chosen problem, taking into account the discussion and input received after their presentation. In writing the essay, please follow the general instructions here:

Presentation and essay may be in either English or German.

2.4 Grading

• The presentation (20 minutes + 10 minutes discussion) will count for 50% of the grade.

• The essay (max. 10 pages) will count for the other 50% of the grade.

• Presence and participation during the presentations of the other students are mandatory.

3 General references

The following books are a good general introduction to the topic. Read both for a balanced overview of the two sides of the debate about the role of experts: on the one hand, experts often fail to give good advice; on the other hand, even the best experts are only as good as the people they advise.

• Nichols, T. M. (2017). The Death of Expertise: The Campaign against Established Knowledge and Why it Matters. Oxford University Press, USA.

• Freedman, D. H. (2010). Wrong: Why experts keep failing us—and how to know when not to trust them. Little, Brown and Company.

4 List of topics

In the following list of topics, each topic is introduced by a brief motivating paragraph, followed by a few references. The references were chosen because they were the first on the topic, because they provide a review of it, or because they illustrate one particularly important point. The list of references is not meant to be complete; it only provides a starting point for the interested student. You need not cite these references in your presentation or essay if you find they are not relevant to your own approach to the topic. There are several ways to treat the same topic, and each student may choose his or her own, as long as it is properly motivated. If you choose a sub-topic within your chosen topic, you must show that you understand the main topic and explain why you chose to concentrate on only one sub-topic.

4.1 Problems with experts

4.1.1 Distant

Experts differ from the general population, and that matters for whether their advice is appropriate and likely to be accepted. There are many types of experts, including doctors, epidemiologists, policy-makers, researchers, engineers, technicians, economists, and financial advisers. They often have very specialized training, with its own specific language that is difficult for the rest of the population to understand. They also generally come from advantaged socio-economic backgrounds in terms of wealth and education. This makes it difficult for them to understand, care about, and take into account the concerns of the rest of the population, and it makes it less likely that their advice will be listened to and followed.

  1. Hofstadter, R. (1966). Anti-Intellectualism in American Life (1st edition). Vintage.
  2. Sapienza, P., & Zingales, L. (2013). Economic Experts versus Average Americans. American Economic Review, 103, 636–642.
  3. Taylor-Gooby, P. (2006). Social Divisions of Trust: Scepticism and Democracy in the GM Nation? Debate. Journal of Risk Research, 9(1), 75–95.

4.1.2 Suspicious

Trust in experts is low because of a series of failures and scandals, including the 2008-2009 global financial crisis, which was neither foreseen nor well managed. In the European and German context, events such as the 2010-2014 Greek financial crisis, plagiarism scandals, the diesel emissions scandal, and fraud in organ transplants have also lowered trust in institutions. Distrust of experts is often linked with distrust of science, at least in some domains (e.g. climate change). It has led to the rise of populist politicians who rail against the “establishment”, the technocracy, the media, and institutions in general.

  1. O’Neill, O. (2002). A Question of Trust: The BBC Reith Lectures. Cambridge University Press. Available as text and audio at
  2. Van der Linden, S., and S. Lewandowsky (2015). “How to combat distrust of science.” Scientific American.
  3. Rutjens, B. T., Sutton, R. M., & van der Lee, R. (2018). Not All Skepticism Is Equal: Exploring the Ideological Antecedents of Science Acceptance and Rejection. Personality & Social Psychology Bulletin, 44(3), 384–405.

4.1.3 Conflicted

In many respects, experts have conflicts of interest, and may be fraudulent, biased and self-serving. A good knowledge of their background and incentives should allow people to interpret their advice appropriately in order to find out the truth. However, it turns out that people are not aware of those issues, and even when they are, they are not able to extract the truth from biased expert advice. In this setting, can regulation, transparency and a concern for reputation solve the problem?

  1. Thompson, D. F. (1993). Understanding financial conflicts of interest. New England Journal of Medicine, 329, 573-573.
  2. Inderst, R., & Ottaviani, M. (2012). Financial Advice. Journal of Economic Literature, 50(2), 494–512.
  3. Loewenstein, G., Cain, D. M., & Sah, S. (2011). The Limits of Transparency: Pitfalls and Potential of Disclosing Conflicts of Interest. American Economic Review, 101(3), 423–428.

4.1.4 Wrong

When are experts wrong, and when do they fail to anticipate issues? One issue is that policy problems are often “wicked problems”: there is no single good solution, only combinations of possible improvements. Another is that experts often underplay the uncertainty inherent in their recommendations and predictions.

  1. Ioannidis, J. P. A. (2005). Why Most Published Research Findings Are False. PLOS Medicine, 2(8), e124.
  2. Colander, D., Föllmer, H., Haas, A., Goldberg, M. D., Juselius, K., Kirman, A., … & Sloth, B. (2009). The financial crisis and the systemic failure of academic economics. Univ. of Copenhagen Dept. of Economics Discussion Paper, (09-03).
  3. Rittel, H. W., & Webber, M. M. (1973). Dilemmas in a general theory of planning. Policy Sciences, 4, 155-169. (this paper is the source of the term “wicked problems”).

4.1.5 Undemocratic

Are experts too influential in policy-making, and is technocracy undemocratic? A large share of decisions by democratic leaders is informed by expert opinion, and many important decisions are subject to lobbying and made with little democratic control, notably at the level of supra-national organisations (WTO, EU, WHO, UN, etc.). How can decisions be made democratically when most people cannot understand why those decisions have to be made? How can better communication and education help solve the issue?

  1. Turner, Stephen. “What Is the Problem with Experts?” Social Studies of Science 31, no. 1 (February 1, 2001): 123–49.
  2. Shapiro, M. (2004). Deliberative, independent technocracy v. democratic politics: will the globe echo the EU. Law & Contemp. Probs., 68, 341.
  3. Gluckman, Peter. “Policy: The Art of Science Advice to Government.” Nature News 507, no. 7491 (March 13, 2014): 163.
  4. Dür, A. (2008). Measuring Interest Group Influence in the EU: A Note on Methodology. European Union Politics, 9(4), 559–576.

4.1.6 Conformist

Experts are reluctant to challenge the status quo and tend to follow the herd to avoid being singled out for being wrong. As a result, experts often all share the same opinion and care about being seen as “politically correct”. This can lead them to hide what they know when it contradicts the prevailing consensus. On the other hand, experts also sometimes all contradict each other, which leads to general distrust and confusion in the population.

  1. Devenow, A., & Welch, I. (1996). Rational herding in financial economics. European Economic Review, 40(3), 603–615.
  2. Morris, S. (2001). Political correctness. Journal of Political Economy, 109, 231-265.
  3. Azoulay, P., Fons-Rosen, C., & Zivin, J. S. G. (2015). Does Science Advance One Funeral at a Time? (Working Paper No. 21788). National Bureau of Economic Research.

4.1.7 Overconfident

Experts often exaggerate their knowledge and fail to provide accurate and transparent advice. Overconfidence is a general phenomenon with many causes, and experts are not immune to it. The issue is that overconfident experts are trusted more than those who are honest about the uncertainty in their predictions, so experts have little incentive to downplay their expertise.

  1. Lichtenstein, S., Fischhoff, B., & Phillips, L. D. (1982). Calibration of probabilities: the state of the art to 1980. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: heuristics and biases. Cambridge University Press.
  2. Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77, 1121.
  3. Zarnoth, P., & Sniezek, J. A. (1997). The social influence of confidence in group decision making. Journal of Experimental Social Psychology, 33, 345–366.
  4. Kennedy, J. A., Anderson, C., & Moore, D. A. (2013). When overconfidence is revealed to others: Testing the status-enhancement theory of overconfidence. Organizational Behavior and Human Decision Processes, 122(2), 266–279.

4.1.8 Advice that is inapplicable, ineffective, insensitive and unskillful

Expert advice is often difficult to apply: it frequently goes against habits, contradicts advice from other, closer sources, and ignores practical obstacles that prevent people from following it. It is also often too general, not appropriate for all situations, and insensitive to people’s motivations for not following it. This is the case, for example, with health advice (nutrition, smoking), but also with environmental protection and in many other domains. The challenge is then to find ways to change people’s habits, to relay information and advice via community leaders and other trusted intermediaries, and to identify the practical issues that prevent people from following the advice.

  1. Kearney, J. M., & McElhone, S. (1999). Perceived barriers in trying to eat healthier–results of a pan-EU consumer attitudinal survey. British Journal of Nutrition, 81(S1), S133-S137.
  2. Brown, M. T., & Bussell, J. K. (2011). Medication Adherence: WHO Cares? Mayo Clinic Proceedings, 86, 304–314.
  3. Garvin, D. A., & Margolis, J. D. (2015). The art of giving and receiving advice. Harvard Business Review, 93(1), 14.
  4. Barnett-Howell, Z., & Mobarak, A. M. (2020). Should Low-income countries impose the same social distancing guidelines as Europe and North America to halt the spread of COVID-19?. Yale School of Management, New Haven, CT. http://yrise.yale.edu/wp-content/uploads/2020/04/covid19_in_low_income_countries.pdf.

4.2 Problems with those not listening to experts

4.2.1 Overconfident

Those who need advice tend not to take it. This may be because they think they know enough to make decisions on their own, or because they lack the means or knowledge to obtain or understand advice. Knowing that you need help requires knowing that you are doing something wrong, which in turn requires knowing how to evaluate your own actions, something beyond many people’s ability and knowledge. The references below focus on why people do not seek advice about what to consume; there are many other areas where people fail to get and follow advice.

  1. Lewis, David. “The Perils of Overconfidence: Why Many Consumers Fail to Seek Advice When They Really Should.” Journal of Financial Services Marketing 23 (July 3, 2018).
  2. Alyousif, Maher, and Charlene M. Kalenkoski. “Who Seeks Financial Advice?” SSRN Scholarly Paper. Rochester, NY: Social Science Research Network, March 3, 2017.
  3. Bachmann, Kremena, and Thorsten Hens. “Investment Competence and Advice Seeking.” Journal of Behavioral and Experimental Finance 6 (June 1, 2015): 27–41.

4.2.2 Inflexible

It can be hard to change behavior and habits in response to nudges and expert recommendations. However, it may also be that people simply do not want to do what is “best for themselves” from the point of view of experts. This leads to a discussion of the legitimacy of a paternalistic State and of well-meaning experts.

  1. Duhigg, C. (2012). The power of habit: Why we do what we do in life and business. Random House.
  2. Allcott, H., & Rogers, T. (2014). The Short-Run and Long-Run Effects of Behavioral Interventions: Experimental Evidence from Energy Conservation. American Economic Review, 104, 3003–3037.
  3. Sugden, R. (2016). Do people really want to be nudged towards healthy lifestyles? International Review of Economics, 1–11.

4.2.3 Information avoidance

People may avoid information even when it is easy to find and understand. This is the case, for example, when stock market prices are down, because the information hurts them emotionally. It can also be the case when people suspect they may be ill but do not want to deal with the consequences (quarantine, going to the hospital, being treated). Should they be given information against their will, as is done, for example, by exposing people to graphic warnings about the dangers of cigarette smoking?

  1. Golman, R., Hagmann, D., and Loewenstein, G. (2017). Information Avoidance. Journal of Economic Literature 55, 96–135.
  2. Karlsson, N., Loewenstein, G., & Seppi, D. (2009). The ostrich effect: Selective attention to information. Journal of Risk and Uncertainty, 38(2), 95–115.
  3. Sullivan, P. S., Lansky, A., Drake, A., & HITS-2000 Investigators. (2004). Failure to return for HIV test results among persons at high risk for HIV infection: results from a multistate interview project. JAIDS Journal of Acquired Immune Deficiency Syndromes, 35(5), 511-518.

4.2.4 Reactance

People may resist expert advice because they resent it as restricting their freedom of action. Accepting the advice is experienced as relinquishing control of one’s actions to someone else, even if only indirectly. Not accepting the advice, on the other hand, can lead to cognitive dissonance (a split between what one is doing and what one should be doing). This dissonance may be resolved by changing one’s values and norms to fit one’s behavior, rather than by trying to follow one’s morals or reason.

  1. Fitzsimons, G. J., & Lehmann, D. R. (2004). Reactance to recommendations: When unsolicited advice yields contrary responses. Marketing Science, 23, 82-94.
  2. Dillard, J. P., & Shen, L. (2005). On the nature of reactance and its role in persuasive health communication. Communication Monographs, 72(2), 144-168.
  3. Clee, M. A., & Wicklund, R. A. (1980). Consumer behavior and psychological reactance. Journal of Consumer Research, 6(4), 389-405.

4.2.5 Confirmation bias and motivated beliefs

People may seek information that supports their beliefs rather than look for unbiased information. This leads to the problem of “information bubbles” and “echo chambers”, whereby people seek information only from sources or people they already know will support their chosen opinion. Confirmation bias also affects research itself and prevents beliefs from being updated in light of new results.

  1. Nickerson, Raymond S. (1998). “Confirmation bias: A ubiquitous phenomenon in many guises.” Review of General Psychology 2.2: 175.
  2. Nelson, J. A. (2014). The power of stereotyping and confirmation bias to overwhelm accurate assessment: The case of economics, gender, and risk aversion. Journal of Economic Methodology, 21(3), 211–231.
  3. Bénabou, R., & Tirole, J. (2016). Mindful Economics: The Production, Consumption, and Value of Beliefs. Journal of Economic Perspectives, 30(3), 141–164.

4.2.6 Conspiracy theories

Why do people believe in conspiracy theories, and how can those beliefs be countered? Contrary to common belief, education does not directly predict lower belief in conspiracies; rather, other psychological factors related to education seem to explain such beliefs better. Belief in conspiracies is related to a lack of trust in experts and to the power of information bubbles.

  1. Goertzel, T. (1994). Belief in conspiracy theories. Political Psychology, 731-742.
  2. Douglas, K. M., Sutton, R. M., & Cichocka, A. (2017). The Psychology of Conspiracy Theories. Current Directions in Psychological Science, 26(6), 538–542.
  3. Sunstein, C. R., & Vermeule, A. (2009). Conspiracy Theories: Causes and Cures. Journal of Political Philosophy, 17(2), 202–227.

4.2.7 Gullibility

People trust the wrong experts, and for the wrong reasons. They tend to trust confident experts more than those who are honest about the uncertainty in their predictions and recommendations. They also tend to value simple advice that can be applied by all, even in complicated, individual situations. Finally, they do not sufficiently lose confidence in experts who have been shown to be wrong. There are, however, ways in which people can be educated to make better choices about what information to trust and to deal better with uncertainty.

  1. Van Swol, L. M., & Sniezek, J. A. (2005). Factors affecting the acceptance of expert advice. British Journal of Social Psychology, 44(3), 443-461.
  2. Freedman, D. H. (2010). The Idiocy of Crowds, Chapter 4 in Wrong: Why experts keep failing us—and how to know when not to trust them. Little, Brown and Company.
  3. Tauritz, R. (2012). How to handle knowledge uncertainty: Learning and teaching in times of accelerating change. In Learning for sustainability in times of accelerating change.