
So Facebook Turned A Threat Against A Reporter Into An Automated Ad

When algorithms fail us, it’s time to bring back the humans.


Facebook has been having a rough month (see: extremely not-okay ad targeting and, most likely, selling ads to Russians trying to meddle in the US election, for a start). The latest misstep came after Guardian reporter Olivia Solon tweeted that Facebook was using an “engaging” Instagram post of hers to advertise Instagram on Facebook (which just so happens to own Instagram). The problem, to put it mildly, was that Solon’s most engaged post was a screenshot of a rape threat she’d received via email.

Solon posted a screenshot of the violent threat last year. “Sadly this is all too common for women on the internet,” read Solon’s caption for the post, which, according to the Guardian’s Sam Levin, received three likes and more than a dozen comments.

The ad was brought to the internet’s attention after Solon tweeted a photo of the ad, which included the original screenshot underneath the words “See Olivia Solon’s photo and posts from friends on Instagram.” In an email, Solon told me her younger sister and brother also saw the ad.

Among the other serious ethical problems this situation raises, it underscores how “engagement” as determined by a machine can produce some pretty horrible experiences for the people actually engaging with the content. (For instance, a piece on The Verge from a few years back details how Facebook’s scrapbook-like features, such as Year in Review and On This Day, sometimes resurface painful memories.)

It’s worth noting that artificial intelligence and algorithms can be used for good, like potentially spotting and providing support to suicidal users. But the main problem is that, for now, those algorithms lack the nuance of a human reviewer. Which is, of course, the whole point of an algorithm.

That’s something Facebook recognizes: following the ProPublica report detailing that Facebook allowed advertisers to directly target people interested in extremely offensive subjects, including anti-Semitism, COO Sheryl Sandberg announced (via Facebook post) that the company will crack down on how ads work and add “more human review and oversight to our automated processes.”

The role that technology giants should play in monitoring their platforms, and making them habitable for all users, is murky. And obviously, humans aren’t perfect either. We have biases and make decisions, consciously or not, based on our subjective experiences—something algorithms are largely designed to eliminate. But at least for now, bringing humans back into the mix seems like a good place to start.

Solon told me the incident “highlights an ongoing problem of algorithmic accountability,” adding that “the oft-repeated excuse from tech companies of ‘Sorry, the algorithm did it’ is starting to wear very thin.”




