
Why Propaganda Works (The “Illusory Truth Effect”)

Freedom of speech does not include the right to amplification of speech.

David Epstein

Mar 15

This coda to a New York Times piece about the war in Ukraine caught my eye:

Your Twitter timeline may be filled with profile pics featuring azure and yellow flags, showing support for Ukraine, but (to borrow a Washington Post headline) Putin is probably still “winning the information war that counts” — the one at home and in China.

Some of the Russian propaganda seems so transparent that it’s hard to believe it’s effective. But propaganda works. To learn a bit about why it works, and how to combat misinformation in general, I called Lisa Fazio, a Vanderbilt psychologist who studies misinformation. Below is an edited version of our conversation, which had some concrete takeaways for how I plan to think about and behave on social media.

This post is longer than usual (more like a 10-minute read than the usual 5-minute read) because I thought the conversation was interesting and important. I broke the chat up into sections, in case you want to pick one that seems interesting. The section in which Lisa says what she’d do as emperor of anti-misinformation is the last one. Let’s begin:

The “Illusory Truth Effect” — Repetition, Repetition, Repetition... Repetition...

David Epstein: Putin has been repeating this idea that he wants to “de-Nazify Ukraine,” which sounds random and ridiculous to a lot of the rest of the world. But apparently state-controlled media has been building up that idea in Russia for years now. As Novaya Gazeta, one of the last independent outlets in Russia, put it: “Russian television never tires of reminding about the Nazis.” I think this might be related to the “illusory truth effect” you’ve studied. Can you explain that?

Lisa Fazio: This is a term we use for the finding that when you hear something multiple times, you're more likely to believe that it's true. So, for example, in studies, say that you know that the short, pleated skirt that men wear in Scotland is called a “kilt,” but then you see something that says it’s a “sari.” You’re likely to think that’s definitely false. If you see it twice, most people still think it's false, but they give it a slightly higher likelihood of being true. The illusory truth effect is simply that repetition of these statements leads to familiarity and also to this feeling of truth.

DE: Is it possible those people just didn’t know the correct answer to begin with?

LF: We’ve studied that, and this is true even for people who answered the question correctly two weeks earlier. When you present the false statement twice, they’re still more likely to think that it’s true.

DE: This reminds me of a phrase I saw in your work: “knowledge neglect.” So these people know the right answer. Are they just not thinking about the knowledge they actually have?

LF: Exactly. You can think of two main ways that we could determine the truth of the statement. One would be to actually consult our knowledge base — to think about everything else we know about the topic. And the other would just be to use this quick heuristic or "gut level" feeling of “Does this feel true?” And it's that kind of quick "gut level" feeling that's affected by things like repetition.

DE: So if you can get people to slow down and check with their prior knowledge, does that help?

LF: It does seem to help. We've done studies where we get people to pause and tell us how they know that the statement is true or false. And when people do that, they seem to be less likely to rely on repetition.

DE: Does this hold for more impactful statements than suggesting that a kilt is actually a sari?

LF: We tried some really bizarre, health-related claims that are false, like that women retain DNA from every man they’ve ever slept with. And with those, people were more likely to slow down and consider their existing knowledge. But it’s complicated, and plausibility doesn’t necessarily matter. So crazy statements like the Earth is a perfect square, or smoking prevents lung cancer — we still see some increase in how likely people are to think those are true when they’re repeated. It’s a smaller effect, but it’s still there.

DE: How do you measure the effect size?

LF: We have people rate the statement on a scale from “definitely false” to “definitely true,” so you see less movement with outlandish statements, but there’s still an effect.

DE: So if repeating something twice makes a difference, does repeating it 30 times make a massive difference, or does the effect diminish?

LF: We just published a study where we actually texted people different trivia statements. So they were just going about their daily lives and we would text them some statement. Later, when we asked them to rate the truth of these statements, some were new to them, and some they may have seen two, four, eight, or sixteen times. And we got this pretty logarithmic curve — those initial repetitions cause a larger increase in truth ratings than later repetitions do. But it’s still going up from eight to sixteen repetitions.
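
To make the shape of that curve concrete, here is a rough, purely illustrative sketch in Python. The baseline rating and the size of the bump per doubling are made-up numbers, not figures from Lisa’s study; the point is just that on a logarithmic curve, ratings keep climbing while each individual repetition adds less than the one before.

import math

# Illustrative only: hypothetical truth ratings on a 1-to-7 scale, not data
# from the study. Ratings grow with the log of (1 + repetitions), so early
# exposures move the needle most, but the curve never quite flattens out.
def illustrative_truth_rating(repetitions, baseline=3.5, bump_per_doubling=0.25):
    return baseline + bump_per_doubling * math.log2(1 + repetitions)

for n in [0, 1, 2, 4, 8, 16]:
    print(f"{n:2d} repetitions -> rating of about {illustrative_truth_rating(n):.2f}")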

DE: Does this hold for various levels of cognitive ability?

LF: We’ve seen the illusory truth effect from five-year-olds to Vanderbilt undergrads, and other adults. I think one of the big takeaways from all of the research we've done on misinformation is that we all like to believe this is something that only happens to other people. But, in reality, just given the way our brains work, we're all vulnerable to these effects.

DE: Right, I get that. It applies to pretty much everyone. But, I don’t know, maybe not to me. I mean, this is me we’re talking about.

LF: Exactly.

The “Information Deficit Model” and Effective Debunking — i.e., the “Truth Sandwich”

DE: In your review paper, you and colleagues suggest that the traditional “information deficit model” — the idea that people just don’t have enough information, and will accept the truth if they get more info — isn’t really adequate. The idea that you just provide correct facts, and that’ll fix everything, isn’t borne out by the research. So what might effective debunking look like?

LF: One idea is what we call a “truth sandwich.” Facts are useful, but not enough to actually fix the issue. You have to address the false information directly. So in a truth sandwich, you start with true information, then discuss the false information and why it’s wrong — and who might have motivation for spreading it — and come back to the true information. It’s especially useful when people are deliberately misinforming the public. For someone who has a false belief about climate change, you can pull back the curtain and say, “No, actually this is a narrative that's been pushed by these oil companies, with these motivations for having you believe this. Here's why it's wrong, and here is what's actually true.”

DE: So if you just tell someone that something isn’t true, but don’t replace it with truth, does that not work because it just leaves an information vacuum? Like, you have to be Indiana Jones where he replaces the idol with a bag of sand.

LF: Exactly. People have already created this causal story in their mind of how something happened. So in a lot of the experiments, there’s a story about how a warehouse fire happens. And initially people are provided with some evidence that it was arson — there were gas cans found on the scene of the crime. And then in one case you just tell people, "Oh, oops, sorry, that was wrong. There were no gas cans found there." Versus in another you give them an alternative story to replace it — that there weren't any gas cans at all; instead, it turns out that there was a faulty electrical switch that caused the fire. If you only tell people the gas cans weren't there, they still think it's arson. They just are like, "Oh, yeah. The gas cans weren't there, but it was still arson, of course." Whereas in the second story, they'll actually revise the story they had in mind and now remember it was actually accidental.

DE: So even when people are told that the evidence they based their judgment on is gone, the judgment stays behind anyway, unless it’s replaced?

LF: Exactly.

DE: Fascinating. That also seems like a difficult challenge because what if you just don’t know what caused the fire?

LF: Yeah, and with false information you can make it really engaging, really catchy, really easy to believe. And the truth is often complicated and nuanced and much more complex. So it can be really hard to come up with easy ways of describing complicated information in a way that makes it as easy to believe as the false information.

DE: And given the importance of repetition, does that mean you have to attempt to match disinformation repetition with debunking repetition?

LF: Yes. Unfortunately, memories fade, and current evidence is that debunkings fall into the same category as everything else. A week later, or a couple of weeks later, you’ve forgotten it. You might also forget the false information too, but if you keep seeing it again, then a one-time correction isn’t doing too much for you. A good example was the situation with the Sharpies in Arizona. If one time you read a debunking that it wasn’t actually evidence of election fraud, but then later you see 20 posts talking about it being fraud, that one correction doesn’t have much of a chance.

What Lisa Fazio Would Do Tomorrow As Emperor of Anti-Disinformation

DE: Ok so some of this is a little depressing. False information is readily available, catchy, and you just have to repeat it a lot, and make it hard to replace. You’re reminding me of a line I just read in a New York Review of Books piece in which author Emily Witt describes the internet as “structurally engineered to shove a bouquet of the dumbest arguments in human history in our faces several times a day.” So is there any hope for us?

LF: Yes, I think there is hope, but I do think it requires revamping some of the ways that information is currently spread. Right now, there are so many ways in which the advantage is with people pushing disinformation. And there's not one simple fix, but if we did a bunch of small fixes, we'd be in a much better place. So one simple thing social media companies can do is apply more scrutiny to larger accounts, the accounts with more followers. Around the 2020 election, the Election Integrity Partnership found that a small number of accounts were spreading most of the disinformation about the U.S. election.

DE: Well, this newsletter is published on Bulletin, a new platform from Facebook’s parent company, Meta, and I happen to know that some people who work there read this newsletter. So if I can put you on the spot: If you were named emperor of anti-disinformation tomorrow, what would be your first move?

LF: Ooh, I’ll give two. One is the bigger focus on large accounts. Right now it’s easier to get banned as a small account than as a large one, because banning a small account isn’t controversial. The idea should be that greater platforms come with greater responsibility. And if I’m YouTube or Twitter or Facebook, and I have someone whose content I’m broadcasting to millions of people, I think I want to have more say on whether the information they share is correct. I am concerned about private businesses making decisions about what’s true and false and all of that, but they’re already making decisions about who to amplify and who not to amplify. It’s already happening, so they should use that power.
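
As a thought experiment on that “greater platforms, greater responsibility” idea, here is a minimal Python sketch of what reach-weighted review could look like. The account names, follower counts, and claims are hypothetical, and this is not a description of any platform’s actual system; the only point is that flagged posts from the biggest megaphones get human attention first.

from dataclasses import dataclass

# Hypothetical accounts and numbers, used only to illustrate reach-weighted
# review: flagged posts are queued for human moderators in order of the
# potential audience of the account that posted them.
@dataclass
class FlaggedPost:
    account: str
    followers: int
    claim: str

flagged = [
    FlaggedPost("small_account", 1_200, "recycled missile-strike photo"),
    FlaggedPost("large_account", 4_500_000, "recycled missile-strike photo"),
    FlaggedPost("medium_account", 80_000, "miscaptioned video"),
]

for post in sorted(flagged, key=lambda p: p.followers, reverse=True):
    print(f"review {post.account} ({post.followers:,} followers): {post.claim}")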

DE: You and your colleagues wrote that “freedom of speech does not include the right to amplification of speech.”

LF: And then a second, simpler thing that I think we could fix tomorrow, if people would pay attention to it and implement it, is adding some metadata to photos and videos to prevent some of these cheap fakes. So there's a set of photos that we know will be used anytime there's a new conflict, as evidence of missile strikes. And having that set up so that it's easy to identify — that this is reused, this is a photo that someone took four years ago — would prevent a lot of easy-to-create misinformation we're seeing right now.
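
Lisa is describing provenance metadata attached to the photo itself. A related technique, which I'm swapping in here purely as an illustration, is perceptual hashing: compare a newly uploaded image against a library of known recycled ones. Here is a rough sketch of that adjacent idea, with hypothetical filenames; it assumes the Pillow imaging library is installed.

from PIL import Image  # requires the Pillow library (pip install Pillow)

def average_hash(path, size=8):
    # Shrink to an 8x8 grayscale thumbnail and mark each pixel as above or
    # below the mean brightness, giving a tiny fingerprint of the image.
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    # Count the positions where two fingerprints differ.
    return sum(x != y for x, y in zip(a, b))

# Hypothetical filenames: a small Hamming distance suggests the new upload is
# a recycled copy of an image already in the known-reuse library.
known = average_hash("missile_strike_photo_2018.jpg")
upload = average_hash("viral_upload_today.jpg")
if hamming(known, upload) <= 5:
    print("Likely a reused image; label it with its original date and source.")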

DE: Wait, there’s like a usual suspects of misinformation images?

LF: Yeah, the classic one is anytime we get a hurricane, there's a photoshopped image of a shark swimming on a flooded highway. That one goes around anytime there's a storm. And meteorologists have picked up on this — they know it's coming. And yet it goes viral every time.

DE: Well, maybe every storm washes a large shark onto the highway, ever think of that??

LF: Haha, unfortunately someone photoshopped it years ago.

DE: Right. So, in both cases there’s an easy-targeting advantage: a small number of large accounts and a small number of known images do a lot of the dirty work. That at least bodes well for theoretical fixes. Thanks so much, Lisa, for sharing insights that are actionable for individuals and institutions. I’m less depressed than I was a few minutes ago!

Thank you for reading. If you would like more detail from this interview in next week's newsletter, let me know in the comments.

I loved Lisa's tips for what platforms can do to make a huge dent quickly. If you want to help combat misinformation by sharing this post, here's a link. And, as always, you can subscribe here. Until next week...

David

P.S. Shout-out to the reader who sent me a note of appreciation about the chef Sanji reference in last week’s newsletter. You know who you are!
