“Maybe they’re just ignorant?” I’ve lost count of how many times I have heard this hopeful suggestion from students and colleagues trying to navigate ideological divides. It’s usually offered as a charitable way of understanding why someone doesn’t share a particular viewpoint on a controversial issue, often one related to identity or equality. It implies that the person would agree with the speaker if only they knew better. In some cases, that may be true. But its use as a default rationalization has turned it into something I call the fallacy of equal knowledge: the unstated assumption that if we all had the same information, we’d all agree.
The difficulty of talking across political divides owes much to this assumption. I saw a clear example of it in the fall of 2020, while teaching a course in social problems. It was the semester following a summer of nationwide protests on the issue of race and policing. For weeks, my students and I had been discussing social ills from various perspectives, gradually building up trust. At one point, we found ourselves talking about law enforcement.
Given the timing of the course and the events of recent months, the death of George Floyd was on many people’s minds. Over the course of our discussion, I asked the class if they thought a reasonable person could view his killing solely through the lens of bad policing, not race. In the poll I conducted, about 60 percent said yes, they thought this was possible.
I was surprised by their openness to this idea. But as the discussion unfolded, it became clear that several people in that group of 60 percent had something else in mind. Many assumed that an otherwise reasonable person could only hold this view if they didn’t yet understand that the reality of racism made it important—even necessary—to see Floyd’s death through a racial lens. This point is controversial, even within the black community, but the students assumed that, once informed, such a person would change his mind.
I ran the poll again. This time, I asked: could a reasonable person, with the same information you have, perceive the killing of George Floyd solely through the lens of bad policing and be unsure about whether it should also be seen through the lens of race? This time, the share of students answering yes dropped to 30 percent.
The assumption that someone disagrees with a particular political position or claim because they’re ignorant is one I encounter frequently. By way of context, much of my job involves facilitating conversations about topics that make people uncomfortable. The fallacy of equal knowledge tends to emerge among people used to thinking in a specific way about hot-button political topics. When they consider a view such as opposition to affirmative action, the idea that gender-dysphoric children may be influenced by peers, or even opposition to Covid mandates, they reach for ignorance as the explanation.
However, when treated as a default supposition, this outlook stands in the way of constructive engagement. It rests on the often-false assumption that what divides people on controversial social issues is misinformation, and it implies that giving those with opposing views more or better information must be the solution.
To be clear, sometimes ignorance is a real obstacle. But recognizing that doesn’t mean all differences on controversial questions can be resolved by simply getting everyone on the same page about the facts. No one likes being condescended to as though they simply don’t know any better, yet the fallacy of equal knowledge does just that. It fails to take opposing values seriously.
Unfortunately, the notion is pervasive. The vast majority of traditional diversity, equity, and inclusion (DEI) training programs are based on it. One DEI consulting firm states on its website that participants will, on completion of the course, “notice how their unconscious biases have been impacting their interactions with others.” This is despite research showing that unconscious bias does not consistently predict problematic behavior. Indeed, because of this inconsistency, the British government phased unconscious bias training out of its programming just over a year ago.
The fallacy of equal knowledge also underpins certain curricula on empathy and social and emotional learning (SEL). One of the biggest firms in the world of SEL tweeted last year, “We hold fast to the belief that our work must actively contribute to antiracism.” But the concept of antiracism is itself infused with particular assumptions about how the world works: for example, that the right way to solve social problems is to see them through the lens of race.
Ultimately, these programs assume that, once informed about unconscious bias and the need to adopt an antiracist stance, previously reluctant people will see the error of their ways. This commits the fallacy of equal knowledge: it assumes that the same information will lead people to the same position on these issues.
This fallacy may partly explain why such programming is so fraught. Strong evidence suggests that DEI training doesn’t yield positive results and can even be counterproductive, generating resentment. Better options would focus on building a stronger workplace through open communication, while respecting a variety of viewpoints.
The upshot is that missing information isn’t always what makes people disagree. When we pretend that it is, we make it even harder to communicate across our political and ideological differences.