Unbelievable News? Read It Again and You Might Think It’s True
The spread of fake news on social media has no easy fix.
By Lisa Fazio, Vanderbilt University
In the weeks since the U.S. election, concerns have been raised about the prominence and popularity of false news stories spread on platforms such as Facebook. A BuzzFeed analysis found that the top 20 false election stories generated more shares, likes, reactions, and comments than the top 20 election stories from major news organizations in the months immediately preceding the election. For example, the fake article “Pope Francis Shocks World, Endorses Donald Trump for President, Releases Statement” was engaged with 960,000 times in the three months prior to the election.
Facebook has discounted the analysis, saying that these top stories are only a tiny fraction of the content people are exposed to on the site. In fact, Facebook CEO Mark Zuckerberg has said, “Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way – I think it is a pretty crazy idea.” However, psychological science suggests that exposure to false news does affect people’s opinions and beliefs. It may not have changed the outcome of the election, but false news stories almost certainly affected people’s opinions of the candidates.
Psychological research, including my own, shows that repeated exposure to false information can lead people to believe it is true. This phenomenon is called the “illusory truth effect.”
This effect happens to us all – including people who know the truth. Our research suggests that even people who knew Pope Francis made no presidential endorsement would be susceptible to believing a “Pope Endorses Trump” headline after seeing it multiple times.
Repetition Leads to Belief
People think that statements they have heard twice are more true than those they have encountered only once. That is, simply repeating false information makes it seem more true.
In a typical study, participants read a series of true statements (“French horn players get cash bonuses to stay in the U.S. Army”) and false ones (“Zachary Taylor was the first president to die in office”), and rate how interesting they find each sentence. Then, they are presented with a number of statements and asked to rate how true each one is. This second round includes both the statements from the first round and entirely new statements, both true and false. The outcome: Participants reliably rate the repeated statements as being more true than the new statements.
In a recent study, my colleagues and I found that this effect is not limited to obscure or unknown statements like those about French horn players and Zachary Taylor. Repetition can also bolster belief in statements that contradict participants’ prior knowledge.
For example, even among people who can identify the skirt that Scottish men wear as a kilt, the statement “A sari is the skirt that Scottish men wear” is rated as more true when it is read twice rather than only once. On a six-point scale, the participants’ truth ratings increased by half a point when the known falsehoods were repeated. The statements were still rated as false, but participants were much less certain, moving their ratings from “probably false” toward “possibly false.”
This means that having relevant prior knowledge does not protect people from the illusory truth effect. Repeated information feels more true, even if it goes against what you already know.
Even Debunking Could Make Things Worse
Facebook is looking at ways to combat fake news on the site, but some of the proposed solutions are unlikely to fix the problem. According to a Facebook post by Zuckerberg, the site is considering labeling stories that have been flagged as false with a warning message. While this is a commonsense suggestion and may help reduce the sharing of false stories, psychological research suggests that it will do little to prevent people from believing that the articles are true.
People tend to remember false information, but forget that it was labeled as false. A 2011 study gave participants statements from sources described as either “reliable” or “unreliable.” Two weeks later, the participants were asked to rate the truth of several statements – the reliable and unreliable statements from before, and new statements as well. They tended to rate the repeated statements as more true, even if they were originally labeled as unreliable.
This can also apply to reporting about false public statements. Even a debunking-focused headline like CNN’s “Trump falsely claims ‘millions of people who voted illegally’ cost him popular vote” can reinforce the falsehood Trump was spreading.
Correcting After the Fact Doesn’t Help Much
When media outlets publish articles that contain factual errors – or that make assertions that are later proved false – they print corrections or retractions. But when people have strong preconceptions, after-the-fact updates often have no effect on their beliefs, even when they remember the information has been retracted.
In the early days of the second Iraq war, many news events were initially presented as true and then retracted. Examples included allegations that Iraqis captured U.S. and allied soldiers as prisoners of war and then executed them, in violation of the Geneva Conventions.
In 2005, cognitive psychologist Stephan Lewandowsky gave Americans and Germans statements about various news events during the war. Some of the statements were true; others were reported as true but later retracted; still others were false – though those labels were not provided to the study participants.
The participants were then asked to rate whether they remembered the news event, whether they thought it was true or false, and whether the information had been retracted after its initial publication. Participants were also asked how much they agreed with official statements about the causes of the Iraq war.
Americans who remembered reports that had been retracted, and who remembered the retractions, still rated those items just as true as accurate reports that had not been retracted. German participants rated the retracted events as less true. In responding to other questions in the study, the Americans had shown themselves to be less suspicious of the official justifications for the war than the Germans were.
The researchers concluded that the Germans’ suspicions made them more likely to adjust their beliefs when the information was retracted. Americans, more likely to believe the war was justified, were also less likely to change their beliefs as new information arrived.
The study suggests that Clinton supporters, who tend to be suspicious of positive information about Trump, may remember that the pope-endorsement story was false, and discount the information. Trump supporters, by contrast, would be left with a more positive opinion of Trump, even if they remembered that the story was false.
There is no easy solution to the problem of fake news. But it’s clear that it is a problem: Exposure to false news stories can affect readers’ beliefs and opinions. Simply labeling the information as false is unlikely to reduce this effect.
A true solution would somehow limit the spread of these fake stories, preventing people from seeing them in the first place. A first step that each of us can take is to check our sources and not share unreliable articles on social media, even if they affirm our beliefs.
Lisa Fazio, Assistant Professor of Psychology, Vanderbilt University.
This article was originally published on The Conversation. Read the original article.