
Facebook deleted my post and made me confirm pics of my kids weren't sexually explicit

When automated censorship goes awry

Usually when you read about Facebook blocking accounts or taking down videos, it's because something serious has happened: a woman filming her boyfriend dying at the hands of traffic cops, say, or someone going on a racist rant.

But this morning it happened to me.

Facebook informed this lowly reporter that it had removed one of his posts because it broke the company's policies on sexually explicit images.

It wasn't a request. The post had already gone; Facebook was just informing me, like a parent telling off a child after finding cigarettes hidden in their sock drawer. And there was no opportunity to question the decision: Mom and Dad had decided. It was done.

Which left the intriguing question: what on earth had Facebook discovered that was sexually explicit? Seeing as the author is a middle-aged father-of-two rather than a horny twenty-year-old with a poor sense of personal boundaries, it seemed somewhat unlikely that my Facebook account was host to saucy snaps.

Maybe it took exception to a breast-feeding pic from a few years ago, I thought. Although that also seemed unlikely given my wife's clearly and frequently stated ban on ever taking such a picture.

But no. Turns out it was a post about a book I had written and the fact I had been hired to turn it into a screenplay. The book? Sex.com: One Domain, Two Men, Twelve Years and the Brutal Battle for the Jewel in the Internet's Crown.

It is the story of the extraordinary legal and personal fight over the internet address Sex.com. And, naturally, I mentioned that fact in my post. Which had prompted Facebook to turn the title into a live hyperlink. And then to automatically pull in an image from the Sex.com website – which is, as you might expect, of an adult nature.

At least that's what I assume happened. Because when I published the post I don't recall any of that occurring – and I would almost certainly have deleted it if it had. The post said something like: "Great news. I've been hired to write a screenplay of my Sex.com book." I can't recall exactly because Facebook has deleted the post, and because I wrote it maybe six months ago.

Mine, mine

We all know that the deal with Facebook is that it's not actually your content. The company is gracious enough to publish it for you for free on its service. It saves you having to purchase a domain name and hosting deal and set up a website. And it has a huge built-in audience of your friends. And in return it sells every piece of information you provide it with to advertisers.

But even so, the fact that it can arbitrarily decide it doesn't like a post, because its own system broke its own policies, and then delete it is a little frustrating. So I clicked "OK" and the notification disappeared.

To be replaced with a second screen, which I had no choice but to interact with if I wanted to use the mobile app any further.

And that screen displayed the last 10 images from my Facebook photo feed and asked me to actively confirm that each of them did not contain any sexually explicit imagery.

They didn't. Because almost all of them were photos of one or the other of my daughters, typically goofing off or being adorable (to be fair, there was also one picture of some Japanese whisky).

Once I told Facebook that, no, that five-year-old smiling on a swing was not a 25-year-old spread-eagled on a bed, it let me continue on my way. Although you have to wonder how a company that can identify people's faces can't tell the difference between a whisky bottle and an erect penis. What's more, I have no doubt that at some point my declaration that there was nothing to see will be checked, and if I am found to have lied to Facebook Dad, I will presumably be grounded.

Why does this matter?

Because, as many people have pointed out these past few months, Facebook is in denial about having become a news organization. That transformation has come about through the policies and services it has devised and promoted, and from it the company seemingly makes more profit than all news organizations combined.

The move has already brought numerous issues, whether a perceived bias against conservative news or a "technical glitch" over the live feed of a shooting victim. The company has shown itself time and again to be one that thinks it is in charge of what you can and cannot see – and which even experiments on its users out of some kind of perverse voyeurism. Facebook has very little regard for its users, at least at an individual level, or for the content they produce that has made it a corporate monster.

The fact is that whatever automated process Facebook decided to run over the weekend – presumably in response to the recent shootings in Baton Rouge, St Paul and Dallas – was a buggy piece of crap which has no doubt deleted many thousands of posts from many thousands of users, with no recourse and very limited intelligent analysis. It can't even tell the difference between a photo uploaded to its service and one that it automatically pulled in through its own hyperlinking.

Idiocy

This is of course very, very far from the only time the company's automated idiocy has caused offense. Even within my peer group, there have been frequent reports of insensitive automatic actions shoved in people's faces; one of the worst being pictures of a parent's funeral flashed up in a joyful end-of-year message from "your friends at Facebook."

As insensitive as that was, it was not disturbing. Censoring content, however, is.

With Facebook under increasing pressure from politicians and law enforcement, and with its influence continuing to grow, we have now entered the age of Facebook-approved censorship.

And as my experience shows, it is going to do a terrible job of it and, when it has screwed up badly enough, tweak its settings and policies yet again. But what it won't do is imagine a situation where its users' content is anything but Facebook's to do with as it pleases.

We asked Facebook for comment on the precise details of my missing post, as well as on another recent case in which posts critical of the Singaporean police were removed. And of course, we are still waiting on confirmation from the company about what happened with the footage from St Paul. ®
