
Bullying might go up on Facebook during coronavirus

Zuckerberg discusses the big challenges facing his big company.

by David Grossman
picture alliance/Getty Images

The world's biggest social media network is ramping up its effort to battle coronavirus misinformation, Mark Zuckerberg told reporters on a call on Wednesday.

As a side effect, it would not be surprising if bullying went up on the platform, which counts 2.5 billion monthly active users.

"Productivity will go down" among the content moderators who enforce Facebook policies, Zuckerberg predicts.

The challenges Facebook faces amid the global coronavirus pandemic speak to the company's vast reach. While the chief concern of many companies is their employees, Facebook must also concern itself with combating the kind of misinformation that could lead to violence or injury, or sway political opinion.

To that end, the company has created the Coronavirus (COVID-19) Information Center, which will rely on information culled from the World Health Organization and the Centers for Disease Control and Prevention, as well as information from a user's local health authorities.

The Center will be given a prominent spot at the top of each user's News Feed, with the hope that users see it before encountering potential misinformation elsewhere.

“The top priority and focus for us has been making sure people can get access to good authoritative information from trusted health sources,” Zuckerberg told reporters.

“We are also very focused on making sure misinformation doesn’t spread. We’ve been able to partner with organizations, such as the WHO, to identify a list of claims that they classify as harmful misinformation and we’ve been able to take those down.”

The company, which according to Forbes saw over $55 billion in revenue in 2019, will make $20 million in matching donations to support COVID-19 relief efforts around the world. It will also start a $100 million grant program for small businesses around the world affected by the health crisis.

New Challenges

While Facebook faces unique challenges in any global event, the coronavirus pandemic has created a particularly stressful moment for some of the company's most-discussed workers: its contractors, especially those who work in content moderation.

Being a content moderator for Facebook is stressful enough within a "normal" news cycle. Reporting last year from The Verge painted a picture of contractors who lacked Facebook’s full-time benefits, dealt with the daily dread of severe racism and violence, and made under $30,000 a year. “Collectively,” The Verge wrote at the time, “the employees described a workplace that is perpetually teetering on the brink of chaos.”

Those workers will now shift to working from home, as is the case for the rest of Facebook’s employees (Zuckerberg, when asked, noted that he is also working from home). But some content moderators’ work can only be done from their worksite, Zuckerberg told reporters. “A lot of these folks aren't going to be able to work, but they’re going to continue to be paid as normal,” he said.

Paying the moderators during their forced downtime is surely a positive, but a question remains: Who will be doing their work?

There will be a focus, Zuckerberg said, on Facebook users whose posts suggest suicidal ideation or self-harm. Those posts will be moderated by full-time employees.

As for the rest, “there's about 20 different categories of harmful content that we track, everything from counterterrorism to child exploitation to graphic violence, to incitement of violence,” Zuckerberg said. While the company uses A.I. to track content violations, “all these different categories that you can imagine and each one basically requires somewhat different A.I. work and systems to be trained to find that content. And we basically hold ourselves accountable. Our goal is to make it so that as much of the content as we take down as possible, our systems can identify proactively before people need to look at it at all.”

Zuckerberg declined to say which categories of harmful content might flourish in this period, although he encouraged journalists to examine the company’s transparency reports. Categories like bullying and harassment have relied heavily on following up on user reports.

Vast Reach

Over the course of the call, Facebook’s vast reach as a company was on display. Zuckerberg referenced Instagram, the popular photo app, and the encrypted messaging app WhatsApp, fielding questions on how to prevent misinformation on those platforms. While Zuckerberg does not see any “heightened pressure” to remove WhatsApp’s famed encryption, the company has made it “hard for people to forward content to a lot of people at once,” citing the way misinformation spreads on the service.

The company has also seen successes across its various fronts. Churches, synagogues, and mosques have turned to Facebook Live to stream their services, doctors have fielded questions in chats, and Facebook Messenger has let socially distancing families and friends remain in contact.

But even these do not capture the full range of Facebook’s public face. Within Facebook itself, there’s Facebook Marketplace, a Craigslist-like venue for selling all sorts of products, where a casual search for items like hand sanitizer turned up prices as high as $30 for an 8-ounce bottle. “Our teams are monitoring the COVID19 situation closely and will make necessary updates to our policies if we see people trying to exploit this public health emergency for their own benefit,” a Facebook spokesperson told Inverse.

The Inverse Analysis

No company in history has had access to society like Facebook. Once seen as the coolest website in the world, with possibly revolutionary potential, the company has faced fierce blowback in recent years. Criticisms abound, from its spreading of fake news to its treatment of contract workers. If ever there were a moment when the company could work to regain the world’s trust, this is it.
