Fake News: 80% of Shares During 2016 Election Came From a Few Twitter Users
"We did a double take on that one."
by Sarah Sloat

The fake news phenomenon has created two competing schools of concern. On one side are citizens anxious that the circulation of fake news on social media led to the 2016 election of Donald Trump. On the other side are Trump supporters who worry that legitimate sources of news are actually fake. As the president’s definition of what’s “fake” continues to expand, scientists publishing in Science are seriously asking: How much fake news is actually out there?
According to a study released Thursday, fake news on Twitter during the 2016 presidential election was actually shared by a very small group of Americans. An analysis of 16,442 registered voters on the social networking site revealed that roughly 1 percent of those users accounted for 80 percent of all exposures to fake news content. Furthermore, only 0.1 percent of the same users were responsible for 81 percent of the fake news shared.
"We did a double take on that one.
This result, says study co-author and Northeastern University professor David Lazer, Ph.D., came as a surprise to the team. “We did a double take on that one,” Lazer tells Inverse. “We did expect that it would be concentrated, but if you had asked me before the study, I probably would have said something on the order of magnitude of 2 to 5 percent.”
These results, based on tweets sent out from August to December 2016, demonstrate that most people were exposed to news that came from factual media outlets. Fake news outlets were defined as those that had the “trappings of legitimately produced news” but lacked “the news media’s editorial norms and processes for ensuring the accuracy and credibility of information.”
Lazer and his colleagues found that the small fraction of Twitter users who consumed and shared fake news tended to be older, conservative, and politically engaged. By comparing the 16,442 Twitter users in the study to a representative panel of U.S. voters on Twitter obtained by Pew Research Center, Lazer’s team demonstrated that their sample was reflective of the nation as a whole.
"It turns out that there are many basic things we don’t know.
A study published earlier in January in Science Advances came to a similar conclusion. In that study, researchers examined the characteristics of Americans who shared fake news on Facebook during the 2016 election and the frequency with which fake news was shared. They also found that “sharing this content was a relatively rare activity” and that conservatives were more likely to share articles from fake news domains.
The results of these studies suggest that the role of bots in sharing fake news needs to be explored further and that measuring the number of shares a fake news post receives is a misleading way of determining the breadth of its influence. This might change how we look at reports like a 2016 BuzzFeed analysis, which showed that, in the months preceding the election, the top 20 fake news election stories generated more shares, likes, reactions, and comments than the top 20 election stories from major news outlets. Lazer cautions that those particular stories may be outliers, and their number of shares may have been “artificially pumped up.”
“It turns out that there are many basic things we don’t know, and what we think we know is actually based on nonscientific foundations,” Lazer explains.
The good news is that fake news may not be as much of a systemic problem as some people believe, he says. The bigger concern we should have, he adds, is “the rhetorical use by political leaders around the world to bash media that would hold them accountable.” Underlying the fake news issue is the fact that the information ecosystem in the US has undergone rapid change, and because the way people get informed — or not — is fundamental to democracy, it’s crucial for us to understand it.