Facebook's going to block revenge porn with AI. Or humans. Or both

Lefthand dating ad: 'Righthand, let's get together soon'

+Comment Well, that's awkward. Facebook CEO Mark Zuckerberg and the company's head of global safety on Wednesday gave differing descriptions of the advertising network's just-launched "AI"-powered "online safety" initiative.

The idea is that if someone's intimate images are shared without permission as "revenge porn", the site's systems will be able to take them down as quickly as possible to stop them spreading.

Zuckerberg said this laudable aim is going to be achieved by artificial intelligence; its head of global safety, Antigone Davis, said the heavy lifting will be done by "specially trained representatives from our community operations."

Here's Zuck's post on his own account, screenshotted below in case it is deleted or edited:

Zuckerberg's Facebook post

It's pretty clear how it's been explained to Mark, as far as we can tell. He wrote: “We will now use AI and image recognition to prevent [revenge porn] being shared across all of our platforms.”

In her corporate blog post, Davis goes into somewhat more detail.

Facebookers seeing revenge porn images are asked to use the “report” button; the community operations staffers will judge whether the image violates Facebook's famously rigorous community standards, and if it does, it (and probably the poster) will get blocked.

It's after that point that technology gets involved, Davis explained: “We then use photo-matching technologies to help thwart further attempts to share the image on Facebook, Messenger and Instagram.”

Someone trying to share a reported-and-removed image will get a warning and the share will be blocked. Facebook also promises to “partner with safety organisations” to support victims.
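As a rough sketch of how such a report-then-block pipeline could hang together (this is The Register's own illustration, not Facebook's code; the hash store and function names here are entirely hypothetical):

import hashlib

# Hypothetical store of fingerprints for images a human reviewer has
# already confirmed as rule-breaking and removed.
reported_hashes = set()

def fingerprint(image_bytes):
    """Exact-match fingerprint: SHA-256 of the raw file bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def report_image(image_bytes):
    """Called once a reviewer confirms the image violates the rules."""
    reported_hashes.add(fingerprint(image_bytes))

def check_upload(image_bytes):
    """Return True if the upload should be blocked and a warning shown."""
    return fingerprint(image_bytes) in reported_hashes

In a scheme like this the offending image itself need not be kept around; only the fingerprint is stored and compared against future uploads.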

So in The Register's reading, there's perhaps a bit of AI in the photo-matching (though it may well be simple file-hash checks or similar), and, all in all, rather less than Zuckerberg seems to believe.
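If the matching is a touch cleverer than an exact file-hash lookup, the obvious candidate is a perceptual hash, which survives re-encoding, resizing and minor edits. Here's a minimal sketch of an "average hash", assuming the Pillow imaging library; the names and the threshold are ours, not anything Facebook has described:

from PIL import Image  # Pillow, assumed installed for this sketch

def average_hash(path, hash_size=8):
    """Shrink to hash_size x hash_size greyscale, then record one bit
    per pixel: 1 if brighter than the mean, 0 otherwise."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(h1, h2):
    """Count the bits on which two hashes differ."""
    return bin(h1 ^ h2).count("1")

def looks_like_match(h1, h2, threshold=5):
    """Treat two images as the same picture if their hashes are close."""
    return hamming_distance(h1, h2) <= threshold

A recompressed or lightly cropped copy of a reported image would still land within a few bits of the original's hash, whereas its SHA-256 would change completely.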

Reg comment

While we applaud the objective of the program, The Register is concerned about the process. First, there's a risk that having revenge porn images reviewed by humans might deter victims, already distressed by having intimate images shared online, from reporting them at all. Second, we hope the humans making those classification decisions are properly supported against the psychological trauma such work can inflict. ®
