Confirmation Bias: How do you convince someone who believes the exact opposite?

Definition of confirmation bias: ‘the tendency to interpret new evidence as confirmation of one’s existing beliefs or theories.’

We all influence people every day. On a small scale (can you pass me the salt?) and on a larger scale (choose me, or my company). But how do you get people on your side who happen to believe the exact opposite of what you’re trying to convince them of? And how do you get people to adopt behaviors that are better for them, the community or the planet if they don’t believe those behaviors are the right thing to do, or find them too difficult to execute?

How do you convince someone who believes the exact opposite?

Often we resort to convincing people with information, arguments and reasons. Behavioral science sheds an attractive light on how we as humans process information and, more importantly, on how to ensure that people are not only willing to change, but also willing to consider the information that is necessary for that change.

We must first understand the information context of the people we are trying to influence. What do we need to consider when we want to convey our information? We live in an interesting information age, to say the least. News and messages come to us in many ways, but not all ways are created equal. This is the time when we are all stuck in filter bubbles. The digital ecosystem and algorithms tailor our information delivery to our pre-existing views with a preference for extreme viewpoints, creating more and more distance between different perspectives and a larger social divide.

Confirmation Bias: The godmother of information processing

Before we can understand how information is processed, we must realize that as humans we all suffer from what is known as confirmation bias. We process information to confirm what we already think or believe. In other words, we value evidence that confirms our beliefs more than we value new viewpoints. Whether that information or evidence is true or false does not matter. Our brain’s post-rationalization ability helps us feel good about our views.

“We, as humans, are champions of justification after something has happened.”

However, the combination of tailored information technology and confirmation bias can produce troublesome systemic effects. People who hold strong opinions about complex social issues are likely to examine relevant evidence in a biased way, which entrenches us ever more firmly in our beliefs. Instead of narrowing the differences of opinion that so desperately need to be bridged to solve current social challenges, we create polarized societies that show little willingness to cooperate or empathize.

Let me give an example of how strong confirmation bias can be. An experiment by researchers at Stanford University showed that even scientific evidence is rejected when it does not match our existing beliefs.

The researchers invited both opponents and supporters of the death penalty. Both groups were presented with the conclusions of two scientific studies on the effectiveness of capital punishment: one in favor of the death penalty and one against it. These research conclusions were as follows:

Research conclusion in favor of the death penalty

Kroner and Phillips compared murder rates for the year before and the year after the introduction of the death penalty in 14 states. In 11 of the 14 states, murder rates were lower after the introduction of the death penalty. This research supports the deterrent effect of the death penalty.

Research conclusion against the death penalty

Palmer and Crandall compared murder rates in 10 pairs of neighboring states with different death penalty laws. In 8 of the 10 pairs, murder rates were higher in the state with the death penalty. This research does not support the deterrent effect of the death penalty.

Proponents of the death penalty read the first conclusion and were strengthened in their conviction: “The experiment was well thought out, the data collected was valid, and they were able to respond to all criticism.” But after reading the second conclusion, they did not change their convictions; they rejected the study: “The evidence given is relatively meaningless without data on how overall crime rates increased over those years.” And also: “There were too many errors in choosing the conditions and too many variables involved in the experiment as a whole to change my opinion.”

It worked the same way in reverse. Opponents of the death penalty who read the conclusion against the death penalty agreed: “It shows a good direct comparison about the effectiveness of the death penalty. Using neighboring states helps make the experiment more accurate by using similar locations.” Meanwhile, the evidence for the death penalty was rejected: “I don’t think they have a complete set of data. Also, as suggested, the murder rates should be expressed as percentages, not absolutes.” The study showed clearly:

“People can come to different conclusions after being exposed to the same evidence, depending on their pre-existing beliefs.”

This bias is so strong that prior beliefs (also known as the prior attitude effect) caused people to reject even scientific evidence. The evidence only reinforced their beliefs and caused further polarization. Other research adds an interesting nuance to this conclusion: people accept “confirming” evidence readily, while evaluating disconfirming information much more critically. It’s not a fair game.
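To make this mechanism tangible, here is a minimal simulation sketch (my illustration, not part of the original article) that models confirmation bias as asymmetric weighting of evidence: two agents with mildly different starting beliefs see exactly the same mixed stream of evidence, but each discounts items that contradict their current position. The step size and discount factor are arbitrary assumptions.

```python
import random

def update(belief, evidence, discount=0.3):
    """Biased belief update. `evidence` is +1 (pro) or -1 (anti);
    `belief` runs from 0.0 (firmly anti) to 1.0 (firmly pro).
    Confirming evidence is taken at full weight; disconfirming
    evidence is discounted, mimicking biased assimilation."""
    confirming = (evidence > 0) == (belief > 0.5)
    weight = 1.0 if confirming else discount
    return min(1.0, max(0.0, belief + 0.05 * weight * evidence))

random.seed(42)
# The same mixed, roughly balanced evidence stream for both agents.
stream = [random.choice([+1, -1]) for _ in range(200)]

proponent, opponent = 0.6, 0.4  # mild initial leanings
for e in stream:
    proponent = update(proponent, e)
    opponent = update(opponent, e)

# Despite identical evidence, the beliefs drift apart: each agent
# ends up more extreme in the direction they already leaned.
print(f"proponent: {proponent:.2f}, opponent: {opponent:.2f}")
```

Because confirming items count fully and disconfirming items only partially, the same balanced evidence pushes the two agents toward opposite extremes, exactly the polarization pattern the Stanford study describes.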

Confirmation bias in interpretation and memory

We now know that confirmation bias determines how we interpret information: what we focus on, what we value and what we prefer. But it also affects what we remember. We all suffer from selective memory, or memory bias. Research on schema theory, for example, has shown that information confirming our prior beliefs is readily stored in memory, whereas contradictory evidence is not.

Confirmation bias not only affects our individual decision-making; it also affects groups. We, as humans, are social animals; we communicate with each other and we want to belong. However, our need to belong causes us to adapt our views to those of the group.

We seek recognition by aligning our position with the group’s. This creates a tendency toward groupthink. Our need to conform precludes both the consideration of different viewpoints and exploratory thinking, which can negatively affect the quality of group decisions.

Why does confirmation bias occur?

Our brain constantly tries to reduce our cognitive load. It does this by using shortcuts (heuristics) to interpret the information we are confronted with, such as relying on past experiences, social norms or instincts. Incorporating new information, evidence, facts and figures takes energy. Confirmation bias is a perfect shortcut for processing information more easily.

“This means that we as humans never make a fully informed decision; we automatically choose the path of least resistance and rely on shortcuts.”

But there is more. A firm belief is part of your identity. Holding on to that belief even helps us maintain our identity and self-respect. Changing our mind almost feels like admitting we didn’t make an intelligent decision the first time, so we’d rather hold on to what we previously saw as truth.

I think we’ve all experienced how painful it can be to admit that our strong beliefs were wrong. Being proven wrong hurts; admitting we were wrong is not one of our favorite pastimes. This does not mean that we can never convince someone who holds different beliefs than we do. We just have to be mindful of confirmation bias.

How do you convince someone with different beliefs?

What we can learn from this is that if we want to convince someone who holds strong beliefs, or does not yet share our beliefs about a desired behavior, we should consider three things:

  1. First, we need to know where we stand. When we look at the decision we want someone to make or the behavior we want someone to perform, are we in someone’s zone of acceptance or rejection? In other words, how much distance is there between you and them? A key to persuading people is to reduce the belief distance between you and them.
  2. Second, we need to know how strong these beliefs are. Feeling strongly about something narrows our zone of acceptance and broadens our zone of rejection. In short, if you want someone to go along with you, you need a clear picture of where you stand on the field of influence.
  3. Third, identify what we call the “movable middle.” You must realize (or accept) that getting everyone on your side is an uphill battle. Haters will be haters. Or, put another way, people with really extreme beliefs are extremely hard to get moving. People who are vehemently against abortion, climate deniers who think climate change is a hoax, or conspiracy thinkers who are convinced that Covid doesn’t exist… Well, don’t waste your energy on them.

The truth is that in any group there is a large majority of people who are not yet sure, people whose zones of acceptance and rejection are somewhat balanced. Consider swing voters, often a large group of people who decide on election day whom they are going to vote for. Often this group flips a coin. These are the best people to target. The trick is not to try to influence everyone; it’s to focus on those with an agile mind.
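As a toy illustration (mine, not the author’s), you can think of this segmentation as a simple filter over belief scores: drop the firm rejecters and the already-convinced, and keep the movable middle. The score scale and cutoffs below are arbitrary assumptions.

```python
from typing import List, Tuple

def movable_middle(audience: List[Tuple[str, float]],
                   reject_cutoff: float = -0.6,
                   accept_cutoff: float = 0.6) -> List[str]:
    """Keep people who are neither firmly in the zone of rejection
    (score near -1.0) nor already in the zone of acceptance
    (score near +1.0) -- the group worth targeting."""
    return [name for name, score in audience
            if reject_cutoff < score < accept_cutoff]

audience = [
    ("hardcore denier", -0.9),   # don't waste energy here
    ("skeptical parent", -0.3),  # reachable via common ground
    ("undecided voter", 0.0),    # flips a coin on election day
    ("mild supporter", 0.4),     # a small nudge is enough
    ("activist", 0.95),          # already convinced
]

print(movable_middle(audience))
# -> ['skeptical parent', 'undecided voter', 'mild supporter']
```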

Our job is to narrow the distance between us and the people we are trying to change or convince. Not by giving people more evidence or information; that only activates confirmation bias and causes people to dig in their heels. We need to use behavioral psychology. So, how can we do this?

Getting around confirmation bias (1): Find common ground

First, we need to see if we can find common ground. Suppose you want people to engage actively in behaviors that promote sustainability. You may encounter skeptics, people who believe that climate change is not that bad. Instead of counterarguing with facts, first find a belief you might both have in common, for example, the belief that family is important. Beliefs, in turn, are closely related to motivations. The belief that family is important might be a strong motivator for someone to do everything possible to give their children the most carefree life possible.

We call this a job-to-be-done: a deeper motivation that drives behavior, something people want to achieve in their lives and for which they are willing to take action, make decisions or exhibit behavior. I may not like paying extra on my mortgage (behavior), but I want to take financial action to ensure that I can still live in my beloved family home after I retire (JTBD). If that involves additional costs, so be it. It’s fascinating that even people with very different beliefs can have similar jobs-to-be-done. That is where you can find common ground and build a bridge to narrow the distance between you. For example, you could use a principle from behavioral psychology called question substitution.

Let’s go back to the climate change skeptics. Instead of asking them, “Do you want to engage in sustainable behavior?” you could ask them another question: “Do you want to help build a healthy community for your children?” That is a much easier question for them to answer because it fits with their beliefs about the importance of family. The fact that helping to create that family-friendly community (also) requires sustainable behavior, such as preventing litter in local parks, limiting car use in the neighborhood, buying locally grown produce or planting flowers that attract bees, is exactly the behavioral side effect we were aiming for. That’s how you can stretch the acceptance zone of skeptics.

Getting around confirmation bias (2): Social proof

Let’s add something to the previous example. You’re still dealing with climate change skeptics, and let’s say you need to design recycling behavior. In short, people with strong beliefs need more evidence before they are willing to change. However, in this context evidence does not mean research, facts, figures or arguments. Evidence is what other people do.

We humans have a strong need for certainty. When designing a choice or behavior, you have to realize that engaging in new behavior or making a new decision involves uncertainty. It is new, so it is different, so it is uncertain. This causes our status quo bias and tendency toward loss aversion to come into play. We are simply afraid of losing what we have now and take our current state (the status quo) as our baseline or reference point for future decisions. Any change from that baseline is perceived as a loss. Inertia is the result. We rely heavily on shortcuts to see whether a particular behavior or decision makes sense.

One of the strongest shortcuts our brains take is watching what other people are doing. We are “wired” as social animals. As children, we learn by watching others; we prefer to belong to the in-group, and as we have seen, we even adapt our beliefs to conform to a group. We addressed above how this tendency can harm group decision-making, but we can also put this human inclination to follow the beliefs and behavior of others to more positive use: simply by showing that many people exhibit recycling behavior. We provide social proof and activate the bandwagon effect.

“We adjust our beliefs and behaviors because many other people are doing the same thing.”

If you want this behavioral intervention to work, it’s best to show multiple people exhibiting the desired behavior. If you highlight only one person, you might run into what is called a translation problem: “That person is not like me or someone I aspire to be, so why follow his/her behavior?” We prefer to follow similar others. That’s why, when you book a hotel room online, you value reviews from people like you more than reviews from random others. If you are a young couple, reviews from families with multiple children are less relevant to you. In the absence of another you, however, quantity counts, simply because it’s harder to argue against more people.

On top of that, the more (different) sources say the same thing, the more social proof it provides. People need to hear something from multiple sources to change their beliefs. Social proof can also work to our advantage in another way: it creates network effects. If we can get more people to change their minds, maybe the people around them will change their minds, too.

The question remains: how many people do you need to create network effects? The answer is: it depends. If you’re dealing with weaker attitudes and beliefs, people don’t need evidence from many sources. However, if you are dealing with stronger opinions, you need more sources to prove your point. How does this work in practice? Jonah Berger distinguishes between a sprinkler and a fire hose strategy. Or, put another way, a dispersion or a concentration approach.

If you’re trying to persuade people on the outer edges of the movable middle (i.e., leaning more strongly toward the zone of rejection) to engage in sustainable behavior, a concentration strategy is more effective. That is, focus on a smaller group of people whom you confront with evidence multiple times over a short period (the more time there is between pieces of evidence, the less impact they have). However, if you are dealing with people who tend toward the zone of acceptance, one or two examples will be enough for others to change their beliefs and mimic the behavior. In this case, you can provide less social proof per person and focus on influencing more people at once.
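As a back-of-the-envelope sketch (my illustration, with made-up numbers, not from the article): assume a fixed budget of exposures and a threshold, i.e., the number of independent sources someone must hear within a short window before they move. The choice between fire hose and sprinkler then follows from simple arithmetic.

```python
def people_moved(budget, per_person, threshold):
    """How many people change, given a fixed exposure budget.

    budget      -- total pieces of social proof you can deliver
    per_person  -- exposures concentrated on each person
                   (fire hose: high, sprinkler: 1)
    threshold   -- exposures a person needs before they move
    """
    reached = budget // per_person
    return reached if per_person >= threshold else 0

budget = 60

# Strongly opposed audience: each person needs 3 sources.
print(people_moved(budget, per_person=3, threshold=3))  # fire hose: 20 moved
print(people_moved(budget, per_person=1, threshold=3))  # sprinkler: 0 moved

# Audience leaning toward acceptance: 1 source is enough.
print(people_moved(budget, per_person=3, threshold=1))  # fire hose: 20 moved
print(people_moved(budget, per_person=1, threshold=1))  # sprinkler: 60 moved
```

With a high threshold, spreading exposures thinly moves nobody, while concentrating them moves a few; with a low threshold, spreading the same budget moves the most people.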

Getting around confirmation bias (3): Don’t ask for too much

A final intervention from behavioral science is to tone it down a bit. Rome was not built in a day; the same is true of behavioral change. We have seen that deeply ingrained, often unconscious human tendencies cause us to prefer inertia to change.

So getting someone to change overnight is difficult, if not undoable. Does that mean we as humans are incapable of change? Absolutely not! We are constantly changing (or are you still wearing your 1990s hairstyle and outfits?). Almost everyone has something they want to change; often, the barrier to change is simply too high. When we want to convince someone to adopt new behavior, we often ask for too much at once.

We underestimate that new behavior takes time, money, effort or energy. To lower the threshold for change, we need to break down end goals into smaller, more specific behaviors. What’s the difference? For example, an end goal might be a healthy life. Specific behaviors that will help achieve this end goal might include drinking six glasses of water every day, buying vegetables on Saturdays and doing two 20-minute workouts a week. These specific behaviors together will add up to the end goal. Furthermore, we must understand that:

“Behavior is a process; if you can commit someone to the process, change will happen.”

There is a magic word in the quote above: commit. Our brain likes simplicity. If you can get people to commit to an initial request, chances are they will act on it. If you’ve said “yes” to A, doing A is the easiest; it just feels logical and requires no further cognitive effort. This is known in behavioral science as the commitment/consistency principle. The trick is to start with a small request rather than a big one. To link back to what we learned earlier, start with a small request built on common ground. Find that area of common ground that helps you build an initial connection.

Let’s go back to the climate skeptics who hold their family so dear: don’t ask them to live more sustainably from now on. Instead, ask them, for example, to drop off their scrap paper at their children’s school. Put a paper container next to the schoolyard. Make it easy for them (they are there to pick up their children anyway). Make it relevant to them (their motivation is to provide the best living conditions for their children, so collecting paper in a container next to the playground is a subconscious reminder of a way to keep their children’s playground clean).

But most importantly, by doing this you have gotten them to say “yes” to an initial small request. You can build on this first request and slowly open up their zone of acceptance, perhaps even reversing their initial beliefs about sustainability, which gives you the space to offer them more information about sustainable behavior. In other words, you’ve used behavioral psychology to prepare their brains to be amenable to both the change and the information needed for that change.

Summary

Confirmation bias is the human tendency to seek out, focus on or prefer only information that confirms our existing beliefs. It is a strong bias because it operates largely unconsciously in our brains and is reinforced by the filter bubbles we all find ourselves in.

Confirmation bias reinforces our prior beliefs and makes societies more polarized. If you want to change the behavior of people who do not share the same opinion, you do not achieve this by providing more information, evidence, facts or figures.

You achieve this by using behavioral psychology to reduce the distance between you and the people you are trying to influence, taking human psychology and deep human understanding as your starting point. If you want someone to change, you must first make them willing to listen to the information necessary for that change.

 

Source: (Dutch) Astrid Groenewegen, CEO at SUE Amsterdam/Behavioral Design Academy
Published: https://www.marketingfacts.nl/berichten/confirmation-bias-hoe-overtuig-je-iemand-die-precies-het-tegenovergestelde-gelooft/
