The Psychology Of False Beliefs

VEDANTAM: At some point, though, her conviction started to waver. Those doting moms on Facebook, they had some weird beliefs.

DYNDA: People denying that AIDS exists, people saying that the reason there’s gay people is vaccines — on and on and on with really crazy conspiracy theories.

VEDANTAM: And then it hit her. If she didn’t believe those ideas, why was she trusting them on vaccines?

DYNDA: And I stepped back. I stopped going to the Facebook group as much, and I decided I needed to look at this issue from a purely logical perspective — no emotion in it, no, oh, my God, what if something happens to my baby? And I completely readdressed the issue all over again pretty much from the start.

Fear works in two situations. It works when people are already stressed out. And it also works when what you're trying to do is get someone not to do something, an inaction. For example, if you try to get someone not to vaccinate their kids, fear may work. If there's, you know, an apple that looks bad, I don't eat it. Fear is actually not such a good motivator for inducing action, while hope, on average, is better at motivating action.

VEDANTAM: You talk about one study in your book where a hospital managed to get its staff to practice hand hygiene, to wash their hands regularly. But it turned out the most effective approach wasn't frightening the staff about the risks of transmitting infections. It was something else.

SHAROT: So in a hospital on the East Coast, a camera was installed to see how often medical staff actually sanitize their hands before and after entering a patient's room. And the medical staff knew that the camera was installed, and yet only 1 in 10 medical staff sanitized their hands before and after entering a patient's room. But then an intervention was introduced: an electronic board that was put above each door, and it gave the medical staff positive feedback in real time. It showed them the percentage of medical staff who washed their hands in the current shift and the weekly rate as well. So anytime a member of the medical staff washed their hands, the numbers would immediately go up, and there would be positive feedback saying, you know, good job. And that affected the likelihood of people washing their hands significantly. It went up from 10% to 90%, and it stayed there.

Instead of using the normal approach, instead of saying, you know, you have to wash your hands because otherwise you’ll spread the disease — basically instead of warning them of all the bad things that can happen in the future, which actually results in inaction, they gave them positive feedback.

To test how people update their beliefs when confronted with new information, she presented statements to two kinds of people: those who believed that climate change was real and those who were deniers. She found that for both groups, when the statement confirmed what they already thought, it strengthened their beliefs. But when it challenged their views, they ignored it. Tali says it's because of a powerful phenomenon known as confirmation bias.

There are four factors that determine whether we're going to change our beliefs: our old belief, our confidence in that old belief, the new piece of data and our confidence in that piece of data. And the further away the piece of data is from what you already believe, the less likely it is to change your belief. And on average, as you go about the world, that is not a bad approach. However, it also means that it's really hard to change false beliefs. So if someone holds a belief very strongly but it is a false belief, it's very hard to change it with data.
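Those four factors can be caricatured as a weighted-average update in which surprising data gets discounted. This is only a toy sketch, not a model from the book; the discount rule `data_conf / (1 + distance)` is an illustrative assumption of mine.

```python
def update_belief(prior_mean, prior_conf, data, data_conf):
    """Toy model of the four factors Sharot lists: the old belief,
    confidence in it, the new data point, and confidence in the data.
    Surprising data is discounted, so far-away evidence barely moves
    a strongly held belief."""
    distance = abs(data - prior_mean)
    # Illustrative discount: the further the data sits from the
    # current belief, the less weight it effectively carries.
    effective_weight = data_conf / (1.0 + distance)
    return (prior_conf * prior_mean + effective_weight * data) / (
        prior_conf + effective_weight
    )

# Nearby evidence moves the belief a large fraction of the way toward
# the data; distant evidence of equal stated confidence moves it only
# a small fraction of the way.
near = update_belief(0.0, 1.0, 1.0, 1.0)    # moves ~33% of the way to 1.0
far = update_belief(0.0, 1.0, 10.0, 1.0)    # moves ~8% of the way to 10.0
```

The point of the sketch is only the qualitative behavior Sharot describes: the same piece of evidence shifts a belief less, proportionally, the further it sits from what the person already thinks.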

The factors that affect whether you're influential include: Can you elicit emotion in the other person? Can you tell a story? Are you taking into account the state of mind of the person in front of you? Are you giving them data that conforms to their preconceived notions? All the factors that make one speech more influential than another, or more likely to create an impact, can be used for good and can be used for bad.

VEDANTAM: So as I was reading the book, I was reflecting on the things that I know or the things that I think I know, and I couldn’t come up with a good answer for how I actually know that it’s the Earth that revolves around the sun and not the other way around.

O’CONNOR: Yeah. That’s right. Ninety-nine percent of the things you believe probably you have no direct evidence of yourself. You have to trust other people to find those things out, get the evidence and tell it to you. And so one thing that we talk a lot about in the book is the fact that we all have to ground our beliefs in social trust. So we have to decide what sources and what people we trust and therefore what beliefs we’re going to take up because there’s just this problem where we cannot go verify everything that we learned directly. We have to trust someone else to do that for us.

VEDANTAM: We trust the historian who teaches us about Christopher Columbus. We trust the images from NASA showing how our solar system is organized. Now, we say we know Columbus was Italian, and we know the Earth revolves around the sun. But, really, what we mean to say is we trust the teacher, and we trust NASA to tell us what is true.

They tend to trust those who are more like them. They also tend to trust those who share beliefs and values and practices with them.

O'CONNOR: So in the end, she did something really smart, which took advantage of the ways that we use our social connections to ground our beliefs and our trust. She ended up convincing Princess Caroline of Ansbach to variolate her own two small daughters and to do it in this kind of public way. So she got one of the most influential people in the entire country to engage in this practice. That did two things. Number one, because she did it in this kind of public way and her daughters were fine, it gave people evidence that this is, in fact, a safe practice and a good idea. But it also made clear to people that if they want to conform to the norm, if they want to share a practice with this really influential person, then they should do the same thing. And after Princess Caroline did this, variolation spread much more quickly, especially among people who had a personal connection to either Mary Montagu or to the princess.

VEDANTAM: What’s fascinating here is that this wasn’t, in some ways, a rational way to solve the problem. It wasn’t saying, look; there’s really convincing evidence here. You’re almost using a technique that’s pretty close to propaganda.

Propagandists tend to be very savvy about the ways that people use their social connections to ground trust and knowledge and choose their beliefs, and they take advantage of those. In this case, that social trust was being used for good.

Because scientific theories in the past have always eventually been overturned, we ought to think that our theories now will probably be overturned as well.

But there is actually an optimistic side to this, which is that if you look at many theories in the past, ones that were overturned, often the reason people believed them is that even if they were wrong, they were a good guide to action. Even the theory of stomach acid causing ulcers — well, if you treat stomach acid, it actually does help with ulcers. You know, it wasn’t a completely unsuccessful theory. It’s just that it wasn’t totally right, and it wasn’t as successful as the bacteria theory of ulcers because antibiotics do better.

For us to actually be on the right side of the misinformation/information divide, it's helpful for us to think in probabilistic terms rather than in binary terms.

O’CONNOR: Yeah, that’s absolutely right. So we do think it’s really important to think about belief in terms of degrees and evidence and believing something strongly enough.


But, ultimately, not being sure about something is not what matters. We're never really 100% sure about anything. I mean, take any belief you could have, say, that the sun will come up tomorrow. Well, it always has in the past, but that doesn't mean we can be 100% sure it will tomorrow. There's a really good chance it will. We shouldn't be looking for certainty. Instead, we need to be asking ourselves: when do we have enough evidence to make good decisions?
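Acting on degrees of belief rather than certainty can be sketched as a simple expected-value check. The function and the stakes below are hypothetical illustrations of that idea, not anything proposed in the interview.

```python
def enough_evidence(p_true, benefit_if_true, cost_if_false):
    """Decide without certainty: act when the expected benefit of
    acting outweighs the expected cost of being wrong. We never
    require p_true == 1.0, only enough probability for the stakes."""
    return p_true * benefit_if_true > (1.0 - p_true) * cost_if_false

# 90% sure, symmetric stakes: the evidence is good enough to act on.
print(enough_evidence(0.9, benefit_if_true=10, cost_if_false=10))   # True
# Coin-flip confidence with a huge downside: wait for more evidence.
print(enough_evidence(0.5, benefit_if_true=1, cost_if_false=100))   # False
```

The same probability can justify action in one situation and not another; what changes is the cost of being wrong, which is exactly the "enough evidence for good decisions" framing above.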


