What Is Happening
When confronted with climate change data, people dismiss the evidence as "not yet definitive," overestimate their own abilities, and "unsee" inconvenient health examination results.
These are not instances of ignorance forced by external power or institutions. Individuals voluntarily, and often unconsciously, choose to "remain ignorant." This is a different dimension from the "strategically manufactured ignorance" primarily analyzed by Proctor's (2008) agnotology—the ignorance systematically produced by tobacco and fossil fuel industries.
In "The Knowledge Illusion," Steven Sloman & Philip Fernbach (2017; Japanese translation 2018) demonstrated that human cognition does not operate solely "inside individual minds." We understand the world far less than we think we do, and we are barely aware of the gap. Cognitive science identifies this "illusion of explanatory depth" as a fundamental characteristic of human thought.
When the "illusion of knowledge" combines with motivated reasoning, a powerful mechanism of self-produced ignorance emerges. People develop the illusion that they already know enough and actively avoid information that doesn't align with their beliefs. There's no need for external forces to impose ignorance—a structure exists where individuals choose and maintain their own ignorance.
Background and Context
The Illusion of Knowledge—Sloman & Fernbach's Discovery
The core finding of Sloman & Fernbach (2017) is that humans suffer from an "illusion of explanatory depth."
The experiment they cite by Rozenblit & Keil (2002) clearly demonstrates this phenomenon. When participants were asked "How does a zipper work?", many answered that they "knew." However, when asked to explain step-by-step, most people got stuck partway through. We don't understand that we don't understand.
Regarding the mechanism behind this illusion, Sloman & Fernbach explain it using the concept of "cognitive division of labor." Human cognition doesn't operate in isolation but functions through collaboration with others and the environment. The feeling that we "know" how zippers work stems from the implicit assumption that we can ask someone or look it up when needed.
The problem is that this cognitive division of labor systematically distorts self-assessment of "what we actually know." We develop the illusion that knowledge distributed among others and the environment is our own personal knowledge. Sloman & Fernbach described this as "confusion between community knowledge and individual knowledge."
The Dunning-Kruger Effect—The Lower the Ability, the Greater the Confidence
The effect reported by Justin Kruger & David Dunning (1999) further sharpens the illusion of knowledge. In tasks involving logical reasoning, grammar, and humor evaluation, participants with lower performance showed greater tendencies to overestimate their own performance.
Tachibana Akira (2022) in "Fools and Ignorance" discussed the social implications of this effect. If those with low ability overestimate their capabilities while those with high ability underestimate theirs, then in social debates, the loudest voices may belong to those with the poorest judgment.
The Dunning-Kruger effect is important for agnotology because it systematically causes "metacognitive failure"—not knowing what one doesn't know. Proctor's agnotology analyzes "strategically manufactured ignorance" where external actors intentionally obstruct knowledge. However, in the Dunning-Kruger effect, individuals' inability to recognize the limits of their own cognitive abilities generates an attitude of "not wanting to know." If you don't know that you don't know, motivation to learn doesn't arise.
Confirmation Bias and Motivated Reasoning
The illusion of knowledge generates false confidence that "I know enough," while the Dunning-Kruger effect supports overconfidence that "my judgment is correct." When confirmation bias and motivated reasoning are layered on top, the self-production mechanism of ignorance is complete.
Confirmation bias is the tendency to selectively collect and interpret information that supports one's existing beliefs while ignoring or underestimating contradictory information. This tendency is universally observed in human cognition and operates even in people with high intelligence and education levels.
Motivated reasoning takes confirmation bias one step further. This refers to the phenomenon where the reasoning process itself is distorted to reach desired conclusions. As Dan Kahan's (2013) research showed, people with higher scientific literacy tend to interpret scientific data to align with their political beliefs.
This finding is counterintuitive. Most people would expect that higher scientific literacy would enable objective data evaluation. However, in reality, scientific literacy also functions as "the ability to more skillfully construct reasons supporting one's beliefs." As knowledge increases, the "weapons" for motivated reasoning multiply.
Cognitive Dissonance—The Emotional Foundation of "Not Wanting to Know"
The cognitive dissonance theorized by Leon Festinger (1957) explains the emotional foundation of motivated ignorance.
When encountering information that contradicts one's beliefs or actions, people experience unpleasant psychological tension (dissonance). There are three ways to resolve this dissonance: (1) deny the information, (2) modify one's beliefs, or (3) find a new interpretation that reconciles both.
Psychologically, (1) denying the information is easiest, while (2) modifying one's beliefs is most difficult. Humans therefore tend to drift toward "un-knowing" inconvenient information.
Smokers who know lung cancer risk data but don't quit smoking are a well-known example of cognitive dissonance. However, what's interesting from an agnotological perspective is smokers' behavior of "actively avoiding" risk information. Looking away from warning images printed on cigarette packages, not reading articles about smoking and health—these behaviors show a state of voluntarily choosing to "remain ignorant" despite the existence of information.
Reading the Structure
A Three-Layer Model of Motivated Ignorance
Integrating the above cognitive science findings, motivated ignorance can be described as a model consisting of three layers.
Layer 1: Cognitive Laziness
As Daniel Kahneman (2011) showed in "Thinking, Fast and Slow," human cognition has two modes: System 1 (fast, automatic, low-effort) and System 2 (slow, conscious, high-effort). System 1 dominates by default, and activating System 2 requires conscious effort.
The illusion of knowledge is connected to this efficiency of System 1. By developing the illusion that "I already know," there's no need to activate System 2 for deep thinking. Cognitive laziness maintains the comfortable state of "not knowing that you don't know."
This layer of ignorance involves little intention or emotion. It simply results from avoiding the cognitive cost of deep thinking.
Layer 2: Emotional Defense
As cognitive dissonance shows, inconvenient information causes psychological discomfort. At the second layer, information acceptance is refused to avoid this discomfort.
Data showing one's investment decisions were wrong, report cards showing one's child's academic performance is below average, test results showing one's health is deteriorating—this information is emotionally avoided because acceptance would demand modification of self-image.
Emotional defense, unlike cognitive laziness, is an active process. Information exists and is accessible, but it's consciously or unconsciously avoided to prevent emotional discomfort.
Layer 3: Identity Defense
The most powerful layer is identity defense. When information is not merely "inconvenient" but threatens the core of one's identity, avoidance motivation becomes extremely strong.
The phenomenon Kahan (2013) calls "cultural cognition" falls into this layer. People who deny scientific data on climate change don't lack understanding of the data's content. They reject the data because accepting climate change contradicts the identity of the community they belong to—conservative values, free market support, skepticism toward government regulation.
At the identity defense level, "what you know" directly connects to "who you are." Acknowledging climate change might mean "defecting" to the camp supporting environmental regulation. When accepting facts carries the risk of exclusion from one's reference group, rejecting facts becomes a "rational" choice for the individual.
The "post-truth" state analyzed by McIntyre (2018) in "Post-Truth" can be understood as the result of this identity defense operating on a social scale. Truth hasn't become unimportant; rather, maintaining identity has taken priority over accepting truth.
Mutual Amplification of Structural Ignorance and Motivated Ignorance
While the analysis so far has focused on the individual cognitive level, motivated ignorance doesn't operate independently from the structural ignorance analyzed by Proctor. Both mutually amplify each other.
The mechanism by which structural ignorance amplifies motivated ignorance is as follows: When the tobacco industry systematically spreads doubt that "the causal relationship between smoking and lung cancer is not established," smokers obtain "material" to resolve their cognitive dissonance. The framing that "scientists have divided opinions" (which is actually false) provides "reasons why one doesn't need to know." Structurally produced uncertainty legitimizes individual-level motivated ignorance.
Conversely, there's also a mechanism by which motivated ignorance amplifies structural ignorance. When consumers "don't want to know" inconvenient information, media and politicians emerge to meet that demand. Conspiracy theory media, populist politicians, and pseudo-scientific health food advertisements that provide the message "what you don't want to know, you don't need to know" systematically exploit individual motivated ignorance and scale it up to structural ignorance production.
This mutual amplification dynamic has important implications for agnotology's analytical framework. Analyzing structural ignorance and motivated ignorance separately is insufficient; we need to understand the mechanisms of their interaction.
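This loop can be sketched as a toy dynamical model. Everything below is an illustrative assumption rather than an empirical claim from the text: the two state variables (levels of structural and motivated ignorance on a 0-1 scale), the coupling constants, and the update rule are all invented for exposition.

```python
def step(structural, motivated, coupling=0.3, decay=0.1):
    """One round of mutual amplification: each kind of ignorance grows
    in proportion to the other (toward a ceiling of 1.0) and fades on
    its own when that reinforcement is absent."""
    s_next = structural + coupling * motivated * (1 - structural) - decay * structural
    m_next = motivated + coupling * structural * (1 - motivated) - decay * motivated
    return s_next, m_next

# With mutual reinforcement, a small initial level locks in at a high one.
s, m = 0.2, 0.2
for _ in range(200):
    s, m = step(s, m)
print(f"coupled:   s={s:.2f}, m={m:.2f}")   # settles near 0.67

# Without coupling, the same starting level decays toward zero.
s, m = 0.2, 0.2
for _ in range(200):
    s, m = step(s, m, coupling=0.0)
print(f"uncoupled: s={s:.2f}, m={m:.2f}")   # settles near 0.00
```

The qualitative point of the sketch is that the same individual-level tendency ends at very different equilibria depending on whether a structural supply of doubt feeds it, which is why analyzing either layer in isolation misses the dynamic.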
The Cost Structure of "Knowing"
Let's organize the conditions under which motivated ignorance becomes a rational choice from the perspective of cost structure.
"Knowing" involves multiple costs. First, the cognitive cost of information gathering (investment of time and attention). Second, the psychological cost of belief modification (experiencing cognitive dissonance). Third, the practical cost of behavioral change (changing lifestyle habits, restructuring social relationships). Fourth, the cost of identity reconstruction (modifying self-image, changing reference groups).
When these costs exceed the benefits of "knowing," "remaining ignorant" becomes a rational choice for the individual. The fourth cost of identity reconstruction is particularly high, and when this cost is large, knowledge updating is unlikely to occur regardless of how much information is provided.
This analysis suggests that information provision alone cannot counter motivated ignorance. Fact-checking and science education lower the first cost (information gathering) but don't address the second through fourth costs. Rather, designing social environments that increase psychological safety for belief modification and identity reconstruction is necessary.
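The cost argument above can be stated as a minimal sketch. The four-cost decomposition follows the text, but the numeric scale, the example values, and the simple sum-versus-benefit threshold rule are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class KnowingCosts:
    # Arbitrary 0-10 scale; the values used below are invented for illustration.
    information: float   # cognitive cost of gathering the information
    belief: float        # psychological cost of revising beliefs (dissonance)
    behavior: float      # practical cost of changing habits and relationships
    identity: float      # cost of rebuilding self-image and reference groups

def remains_ignorant(c: KnowingCosts, benefit_of_knowing: float) -> bool:
    """'Remaining ignorant' is individually rational when the summed
    costs of knowing exceed its perceived benefit."""
    return c.information + c.belief + c.behavior + c.identity > benefit_of_knowing

smoker = KnowingCosts(information=1.0, belief=4.0, behavior=5.0, identity=8.0)
print(remains_ignorant(smoker, benefit_of_knowing=12.0))           # True

# Fact-checking can only lower the first cost; even dropping it to zero
# leaves the identity-dominated total above the benefit threshold.
after_factcheck = KnowingCosts(information=0.0, belief=4.0, behavior=5.0, identity=8.0)
print(remains_ignorant(after_factcheck, benefit_of_knowing=12.0))  # True
```

Under these assumed numbers, zeroing the information cost never flips the decision, which mirrors the claim that fact-checking alone cannot counter identity-driven ignorance.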
Questions for Our Research Lab
The analysis of motivated ignorance poses the following questions for our research lab:
- What are effective interventions for individuals to become aware of the illusion of knowledge (improving metacognition)?
- Are there methods to lower the threshold at which identity defense operates—that is, to design environments where belief modification doesn't threaten identity?
- Where are the intervention points to break the mutual amplification loop between structural ignorance and motivated ignorance?
- Where should the boundary be drawn between the "right not to know" and the "duty to know"?
These questions are positioned at the intersection of cognitive science findings and agnotology's structural analysis.
References
- Sloman, S., & Fernbach, P. (2017). The Knowledge Illusion. (Japanese edition: 知ってるつもり——無知の科学, trans. Nami Hijikata, Hayakawa Shobo, 2018.)
- Tachibana, A. (2022). バカと無知——人間、この不都合な生きもの [Fools and Ignorance: Humans, These Inconvenient Creatures]. Shincho Shinsho.
- McIntyre, L. (2018). Post-Truth. MIT Press.
- Proctor, R. N., & Schiebinger, L. (Eds.). (2008). Agnotology: The Making and Unmaking of Ignorance. Stanford University Press.