Today you can scarcely read a story about government without wondering how our country and our politics became so divided. A recent Pew Research study reported that “fully 86 percent describe the country as more politically divided today than in the past,” up from about 46 percent who said the same thing in 2009.
This is surprising because the rise occurred at a time when people have more access to information, data and facts than at any other point in history. The facts that should help all of us make more intelligent decisions and bring us together often fail to resonate, and can even harden the divide. This raises an interesting question: why don’t facts change minds?
Recent studies offer some insight into how we think and process information, and a review of them suggests three fundamental reasons why facts fail us. First, from an evolutionary perspective, human reasoning is an evolved trait that developed to help us cooperate in large groups, which makes it easy to fall victim to groupthink regardless of the facts; we have a powerful biological system designed to serve, and survive within, a group. Second, we are all subject to “confirmation bias”: when we want an idea to be true, we tend to believe it is true. Third, we suffer from an “illusion of explanatory depth,” which is simply overconfidence in our ability to understand complexity. We believe we know more than we actually do.
It’s better to agree with the group than to get kicked out of the group:
The recent New York Times bestseller “Sapiens: A Brief History of Humankind” by Yuval Noah Harari does a remarkable job of recounting how humans came to dominate the world. The key trait was our ability to cooperate flexibly in very large numbers. It’s what set us apart from all other species.
Studies have shown that reason, and our reasoning abilities, developed not to enable us to solve abstract problems or draw conclusions from unfamiliar data, but to resolve the problems posed by living in collaborative groups. For our ancestors, it was more important to figure out how to remain in the group than to determine whether what the group was doing was right.
This means we are hardwired to place a higher value on maintaining our status or inclusion in a defined group (a political party, a religious affiliation, a political cause) than on using our reasoning to discover what may be wrong with our group’s positions.
We like information that supports our current view and discount information that doesn’t:
Confirmation bias is a well-researched trait, described as the tendency to embrace information that supports our beliefs and to reject information that contradicts them. A Stanford University experiment involving students with opposing views on capital punishment is an interesting example of the power of false information. In the study, researchers gave students who supported capital punishment data showing that it deterred crime, while giving students who opposed capital punishment data that called its deterrent effect into question. The researchers then told the students that the data had been made up. Yet when the students were tested again, the fabricated information had still left its mark: both groups were more adamant about their original positions. Those who supported capital punishment were even more in favor of it, while those against it were even more opposed.
Today, we see the effects of confirmation bias throughout our political environment. Republicans may believe President Trump is being treated unfairly by the media; if you believe that, you will see any negative story about the President as proof that the media are out to get him, and you won’t believe the information the story presents. Democrats may believe President Trump worked with Russia to hack the election; if you believe that, you will tend to filter any new information about the investigations as proof that you are right.
This means we all have a tendency to make decisions that are not based on the facts. We are more likely to start from preconceived notions and select the facts that support our original thinking.
We think we know more than we do, and other people do too:
The “illusion of explanatory depth” means we feel we understand the world in far greater detail, coherence and depth than we really do. The book “The Knowledge Illusion: Why We Never Think Alone” by Steven Sloman and Philip Fernbach describes why this trait is helpful in most parts of our lives but harmful in the political domain.
Consider new technology: people can use an iPhone without knowing exactly how it works. But it’s one thing not to understand how your iPhone works and quite another to favor (or oppose) an immigration ban without knowing what you are talking about. If your position on immigration is baseless and I rely on it, then my opinion is baseless too, and as others rely on my opinion in turn, the groundless view keeps spreading.
Additional research has shown that strong feelings about an issue do not come from a deep understanding of it. This is where our dependence on other minds compounds the problem.
This means that much of the information being shared comes from people who care passionately about an issue but may not actually know the facts. This is how a myth can become reality in people’s minds.
But if facts can’t change minds…what can?
Stay tuned. That will be the subject of a future post.