Solving Facebook's Fake News Problem Takes Ten Lines of JavaScript

Daniel Sieradski sought to show Facebook how to handle its growing problem.

by Joe Carmichael

Having initially denied the significance of the issue, Facebook is now publicly mulling what to do about the fake news sites running rampant across its social network and possibly affecting the outcome of elections. Though CEO Mark Zuckerberg maintains that Facebook isn’t a media company, he has grudgingly admitted that the company could do better with fake news. What he hasn’t said is how exactly he’d go about doing it, resorting instead to hypotheticals and commitments to address the problem over time. That’s where Daniel Sieradski comes in. Unwilling to wait for Zuck, the programmer took matters into his own hands and built a Google Chrome extension, the “B.S. Detector.”

Facebook has been struggling with news content for months. In May, it came under fire for suppressing conservative news, as well as any negative stories about itself, creating enough of a scandal that Congress started to ask questions. Eventually, the site fired all of its human moderators from the “Trending” news section and replaced them with algorithms — which wasted no time in disseminating bullshit all over the site. While Zuckerberg waffled, Sieradski stepped in to cut through the bureaucracy with a simple Google Chrome extension. He told Inverse about his quick fix, and why Facebook should take a page from his book.

Mark Zuckerberg and one of his many slogans.

How’d you make the extension?

I compiled a list of a few hundred sites that are known to be sources of fake news, or conspiracy theory, or unsourced claims and innuendo, and then created a very small snippet of JavaScript that looks at whatever page you are browsing for links to any of the sites in the list. Then, when you hover over that link, it will show a tooltip, which is like a little hover-tag, that tells you that the site itself is a questionable source.
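
The mechanism he describes maps to a very short content script. Here is a minimal sketch of that approach, assuming a hypothetical, hard-coded list of flagged domains; it is an illustration of the idea, not the B.S. Detector’s actual code:

```javascript
// Hypothetical sketch of the approach described above: scan every link on the
// page against a list of flagged domains and attach a warning tooltip.
// The domains below are placeholders, not Sieradski's actual list.
const flaggedDomains = ["fake-news-example.com", "questionable-source-example.net"];

function isFlagged(hostname) {
  // Match the flagged domain itself or any subdomain of it.
  return flaggedDomains.some(
    (domain) => hostname === domain || hostname.endsWith("." + domain)
  );
}

document.querySelectorAll("a[href]").forEach((link) => {
  if (isFlagged(link.hostname)) {
    // The title attribute renders as a native hover tooltip in the browser.
    link.title = "This website is considered a questionable source.";
  }
});
```

A real extension would also need to handle links added after the initial page load, since Facebook’s feed loads posts dynamically, but the core check is no more involved than this.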

The tooltip, warning about this source.

The code is very simple — it’s not more than like ten lines of JavaScript code. It’s in a Chrome plug-in framework, so I just dropped my code in, and uploaded it to the Chrome store. And that’s it.
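
For context, the plug-in framework he refers to amounts to a small manifest file that tells Chrome which script to inject into pages. A minimal sketch, with placeholder names rather than the B.S. Detector’s actual manifest, might look like this:

```json
{
  "manifest_version": 2,
  "name": "Questionable Source Warning (example)",
  "version": "0.1",
  "description": "Adds a warning tooltip to links that point at flagged domains.",
  "content_scripts": [
    {
      "matches": ["<all_urls>"],
      "js": ["content.js"]
    }
  ]
}
```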

What’s the reception been like?

I can’t see yet. I didn’t put Google Analytics inside the Chrome plug-in, because I am a privacy advocate. The Chrome store will tell me; it just takes a few days to update the numbers. So I won’t know, for a few days, how many people are using it. But the reception’s been very positive.

How hard would it be for Facebook to produce and then roll out this product?

It would be exceptionally easy. The issue is that Facebook is probably overthinking it, in terms of, ‘Oh, how do we design an algorithm to detect fake news?’ As opposed to: ‘Why don’t we just compile a list of sites that we know are sources of fake news?’ Their desire to be as impartial as possible, so that they can blame it on an algorithm instead of taking responsibility for the content on their site, is their way of indemnifying themselves against any claims that they’re censoring content because of political bias. So they’re making a business decision to avoid taking responsibility for that content.

Do you see any other reasons why Facebook is dragging its feet, and why Zuck is feigning innocence and ignorance?

Did you hear, yesterday — BuzzFeed reported that there’s a sort of uprising happening inside Facebook, where a group of engineers there are collaborating to address this issue?

I did.

That’s the thing. It’s like: They could do this. It seems that Zuckerberg is afraid of the alt-right, and the far-right extremists, screaming at him that he’s calling their websites questionable sources. But the reality is that they are [questionable sources]. And it just takes a spine to stand up to them. I don’t know that he has that spine. Or his business interest may be that he cares more about keeping the traffic of cranks than about being a responsible citizen.

Another example of the extension's functionality.

But, given that there’s a lot of subjectivity that goes into discerning fact from fiction, do you think it’s Facebook’s place to call one from the other?

I don’t think that, when it comes to fake news, that’s subjective. It’s very clear that a parody news site is a fake news site. It’s very clear that websites that don’t source their claims, and make wild allegations without any kind of factual basis, are not news sites, by any reasonable standard. So, I think that it’s kind of a farce to say that it’s subjective.

When it comes to the political sites on the right and the left — yes: They both advance claims that are dubious. Discerning which ones are legitimate sources making dubious claims, and which ones are not, I think, is also easy. Fox News makes dubious claims, but they are a legitimate, reputable news source that, like, hosts presidential debates. You can make these distinctions, and do it in a fair, reasonable way. And you can do it in a way that does not come across as biased, as long as you are able to articulate, clearly, what your criteria are.

I’m guessing you don’t really expect Facebook to roll this out immediately. Could you imagine an alternative solution that could be a temporary fix?

Well, what I’m doing is that temporary fix. My plug-in seeks to address the issue through user action, rather than through Facebook’s action. Hopefully, it’ll spread virally, and, you know, gain wide adoption, and have some success in addressing this issue. People who want to read fake news should be able to read fake news, but people should have a warning label. They shouldn’t necessarily believe everything they see on the internet.

Do you expect a call from Facebook with a job offer?

Not in the least. I have a reputation for bitching out large tech companies that adopt policies that enable abuse, fail to address hate speech, and fail to address issues like the viral spreading of false information. So, I don’t expect — as a new-media critic, of sorts — to be receiving any job offers from major tech firms anytime soon.

Try out the B.S. Detector here.

This interview has been edited for brevity and clarity.
