We Tested Facebook's VoiceOver Photo Recognition With Graphic Images

Turns out graphic content is not the new technology's strong point.

by Nickolaus Hines

Today, Facebook rolled out a feature called automatic alt text that makes the image-heavy world of social media more accessible to people who are blind or visually impaired.

Using iOS’s VoiceOver function, Facebook’s object-recognition technology lists the elements a given image may contain. Specifically, those elements include people (how many are pictured and whether they’re smiling), certain objects, indoor/outdoor scenes, actions, iconic places, and whether a picture has objectionable content.
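
As a rough illustration of the mechanics, the feature behaves as if an object-recognition model emits a set of detected concepts and a formatter turns them into the sentence VoiceOver reads aloud. The Python sketch below is a hypothetical mock-up of that formatting step; the build_alt_text function, its parameters, and the detector output it assumes are all invented for illustration, since Facebook hasn’t published its pipeline.

    # Hypothetical sketch: Facebook's real detector and wording rules are not public.
    NUMBER_WORDS = {1: "one", 2: "two", 3: "three", 4: "four", 5: "five", 6: "six"}

    def build_alt_text(people_count: int, smiling: bool, concepts: list[str]) -> str:
        """Assemble a VoiceOver-style 'Image may contain' description."""
        parts = []
        if people_count > 0:
            count = NUMBER_WORDS.get(people_count, str(people_count))
            noun = "person" if people_count == 1 else "people"
            parts.append(f"{count} {noun}")
            if smiling:
                parts.append("smiling")
        parts.extend(concepts)  # e.g., "indoor", "outdoor", "shoes"
        if not parts:
            return "Image may contain: no description available."
        return "Image may contain: " + ", ".join(parts).capitalize() + "."

    # Outputs matching descriptions quoted later in this article:
    print(build_alt_text(2, False, []))          # Image may contain: Two people.
    print(build_alt_text(1, False, ["indoor"]))  # Image may contain: One person, indoor.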

A 2014 study found that people who are visually impaired share fewer photos on Facebook but, on average, “like” other people’s pictures more than sighted users do. They also comment on photos just as frequently. Until now, though, understanding a photo meant parsing context clues out of metadata and captions, which we all know aren’t always the most straightforward descriptors.

With automatic alt text, however, users can gain new insight into a picture through a machine-generated description of the image.

So what happens when the pictures might be considered objectionable content under Facebook’s own guidelines?

To find out, I uploaded some images that Facebook has judged, or might judge, “objectionable” into a (private) album.

Breastfeeding

A breastfeeding image that Facebook has taken down in the past. (Photo: Beall Photography)

Stories about Facebook removing images of women breastfeeding took over the internet in early 2015. Among the women affected, newborn photographers seemed to run into the most trouble.

Jade Beall is one of those photographers. Images like the one above were repeatedly taken off her professional Facebook page. Today, Facebook allows images of breastfeeding on the site, which Beall celebrated by reposting the image above as her header photo with the caption:

“YAY for social media evolving and supporting the desexualization of women breastfeeding! I am still in awe that I can post this image without the fear of being FB banned!”

Facebook’s automatic alt text is also cool with breastfeeding, keeping the description to a simple:

“Image may contain: Six people.”

More direct, but still acceptable under Facebook’s standards.

(Photo: Getty)

In the case of a basic image of one mother breastfeeding, it simply offers:

“Image may contain: Two people.”

Drug Use

Nothing like a hit from an extremely tall bong. (Photo: Cannabis Culture)

Facebook explicitly bans photos and videos that contain drug use. Taking a fat bong hit is one clear example of “drug use.”

Automatic alt text deals with it like this:

“Image may contain: One person, indoor.”

Nudity

Along with drug use, Facebook also bans nudity. The line here gets a little fuzzy: Facebook allows some nudity, but only nudity the company’s censors deem to have artistic merit. Paintings generally get more leeway, while photography gets cracked down on a little harder.

Nude on a Blue Cushion by Amedeo Modigliani

In the case of the above painting, automatic alt text had no problem with the image:

“Image may contain: One person.”

In response to Facebook’s censorship guidelines, artists rallied under #FBNudityDay on January 14 to post the kinds of nude images Facebook tends to censor. Among them was a full-nudity body-painting image, and when put to the automatic alt text test, this was the response:

“Image may contain: One person.”

Graphic Violence

Facebook condemns photos and videos that depict and glorify graphic violence. Ethically, graphic violence is murky ground. The picture we used to test how automatic alt text would respond was one of a Venezuelan gunshot victim, posted in 2014 and owned by the San Francisco Chronicle.

The photo features a man on a gurney, in pain, blood everywhere. Beside him is a woman wearing gloves, though the image cuts her off at the shoulders.

This is where the most damning alt text response came in. VoiceOver simply stated:

“Image may contain: Shoes.”

Live Birth

Facebook’s most recent foray into deciding what should and should not be shown on people’s social media feeds involved a photo of a live birth. New York Magazine first brought attention to the image, which shows a woman named Francie holding her newborn baby, umbilical cord still attached.

It was posted in a private Facebook group but was removed after someone in the group reported it for violating Facebook’s nudity rules. Francie, however, said she considered the photo an example of female empowerment.

Automatic alt text had this to say:

“Image may contain: One or more people.”

As these examples show, Facebook’s automatic alt text isn’t as quick as Facebook’s human censors to judge what is and is not appropriate for the site. But there’s still time for it to catch up; Facebook says its new technology will improve as time goes on and the program learns more.

However well it functions at launch, the new tech will give visually impaired people the opportunity to interact with images on Facebook in ways they couldn’t before. We look forward to automatic alt text improving, though perhaps not so much that it starts snitching on “inappropriate” photos, right?