Facebook thinks these onions are too sexy for your timeline
It’s almost as if we’re not ready for AI to moderate social media.
On Monday, CBC reported that a Canadian seed company had one of its Facebook ads flagged for containing an “overtly sexual image.” The image contained only Walla Walla onions, but it seems Facebook’s AI registered them as female breasts. Facebook, mostly through its subsidiary Instagram, has a storied history of policing nipple content, a problem exacerbated here by its increased reliance on algorithms for moderation.
Sexy onions — Look, these are some pretty attractive onions, but that’s no reason to reject Gaze Seed Company’s ad. It’s clear that AI is taking some creative license when it comes to what nudes are... or that there are some interesting nudes out there confusing the algorithm.
"You'd have to have a pretty active imagination to look at that and get something sexual out of it,” Jackson McLean, a manager at the company, told CBC. “Hopefully an actual human gets to look at the photo to decide that it's not actually sexual at all. It's just onions."
A rough year for moderation — Facebook actually agrees with McLean; it’s had egg on its face from several moderation slip-ups this year. This particular error is lighthearted, but its algorithms have also blocked legitimate coronavirus information and let through ads inciting violence.
Unfortunately, the company is responding with a little too much zeal: hundreds of moderators were told last week that they would return to their offices en masse on October 12. Though the transition out of remote work back to the California and Austin offices was expected, this timeline abandons the staggered return moderators had been expecting for their safety. Given the scandal around their working conditions that kicked off this hellish year, it’s unsurprising that they now feel unprotected by their employer.
AI isn’t ready to do this job on its own, as evidenced by Facebook’s mishaps here and by flag-happy algorithms at other companies like YouTube. That doesn’t mean, however, that human moderators should be treated like expendable machines.