Mark Twain is often credited with the quip, “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” The problem is that we rarely question our own beliefs. Once a false assumption takes hold, it becomes a default lens we use to interpret the world—and dislodging it becomes incredibly difficult.
One very basic assumption that lies at the heart of many change efforts is that information is power—the notion that if you arm people with the right knowledge, they will act on it. That’s why so many change programs are rooted in education and training: they assume the right information will change people’s behavior.
There’s even a name for this assumption: the information deficit model. Decades of research show that it doesn’t hold up. The truth is that we rarely change our behavior after being exposed to new facts. When confronted with evidence that contradicts our beliefs, we’re more likely to question the evidence than to update our views. Our brains prefer stability to change.
The Information Deficit Model
The core assumption of the information deficit model is that when people lack basic knowledge, exposing them to new evidence will change their opinions. But that assumes their minds are blank slates, which is rarely true. We all have preconceived notions of how the world works and tend to cling to our views.
For example, people who believe in a flat earth don’t simply lack knowledge of a round earth, but have a model of the universe in which the earth is flat. In order to change those beliefs, they would not only need to accept new evidence, but also to discard old beliefs that they have relied on to navigate the world.
To do that they face a number of barriers they will need to overcome, including the synaptic pathways built up in their brains that are devoted to their existing model, the social pressure of people in their community who hold similar views, and the switching costs involved in changing their behavior based on their new knowledge.
That’s why people not only tend to resist new knowledge, but also why they can be actively hostile to it. It doesn’t feel like gaining insight—it feels like losing part of their identity, their community, and the mental models that help them make sense of the world.
How We Dig In Our Heels
The tendency to reject new evidence that contradicts our existing beliefs is so prevalent and consistent that there’s even a name for it: the Semmelweis effect. It’s named after a young Hungarian doctor, Ignaz Semmelweis, who pioneered the use of handwashing in hospitals in the 1840s and then was ostracized by the medical establishment.
We like to think of ourselves as rational actors, objectively weighing evidence to arrive at our conclusions, but the evidence shows that’s not really true. For example, one study found that when offered opposing research about the death penalty, subjects embraced the evidence that confirmed their views while discounting facts to the contrary. Another study of attitudes toward affirmative action and gun control found similar results.
We shouldn’t assume that intelligence and education will make us immune, either. In fact, smart people are often the most easily fooled. Hucksters like Elizabeth Holmes of Theranos and Anna Sorokin of New York’s art scene fooled some of the world’s most sophisticated and successful figures with shams that should have been obvious even to laypeople.
The problem is that smart people expect to see what others miss, so they often look for alternative narratives even when the evidence to support them is thin. This is related to what Stanford professors Jeffrey Pfeffer and Robert Sutton call the “smart-talk trap,” when executives prioritize eloquent discussion and jargon over action.
How We Really Change Our Minds
Much more than we realize, our reasoning is socially motivated. Decades of research show that we conform to the opinions of our peers and that the effect extends to three degrees of social distance. So it is not only the people we know well, but even the friends of our friends’ friends—people we don’t know at all—who shape our opinions.
A vivid example is the spread of air conditioners in the 1950s. Back then, units were installed in windows. The sociologist William Whyte observed that adoption of the cooling appliances wasn’t uniform but clustered from building to building. People weren’t buying air conditioners after merely hearing about them. They bought one after visiting a neighbor who owned one.
In a similar vein, David McRaney, in his book How Minds Change, found that people involved in cults or conspiracy theory groups didn’t change their opinions when confronted with new facts, but when they changed their social environment. Research also shows that our social networks influence things like happiness, obesity, and behaviors related to cooperation and trust.
The truth is that the best indicator of what people do and think is what the people around them do and think. Instead of trying to shape opinions, we need to shape networks. That’s why we need to focus on crafting cultures rather than wordsmithing slogans. To do that, we need to understand the subtle ways we influence each other.
For leaders, the lesson is clear: Lasting change comes less from the facts we share and more from the networks and cultures we create.
Starting With A Majority
We like to think we can shape the ideas of others. That’s why most transformation efforts start out with some snappy slogans, a communication program, and a big launch. Most generate a burst of excitement and activity, only to fizzle out within months. This fuels change fatigue, making success for the next initiative even less likely.
We need to be far more humble about our persuasive powers. Anyone who has ever been married or raised kids knows how difficult it is to convince even a single person of something. If you expect to shift the opinions of dozens or hundreds—much less thousands or millions—with pure sophistry, you’re bound to be disappointed.
A simple alternative is to start with a majority, focusing on people who already buy into the idea.
Go out and find people who are as enthusiastic as you are, who are willing to support your idea, strengthen it, and help you work through the inevitable problems along the way. Even if that majority is only three people in a room of five, you can always expand it outward.
That’s how you can begin to gain traction and build a sense of shared mission. As you begin to work out the kinks, you can embark on a keystone project, show some progress, build a track record, and accumulate social proof. That’s how you get out of the business of selling an idea and into the business of selling success. As you gain momentum, you can build support through peer networks.
Real change doesn’t come from persuading the unconvinced with more information. It is small groups, loosely connected but united by a shared purpose, that drive truly transformational change.