
Jeez, we'll do something about Facebook murder vids, moans Zuckerberg

OK, OK, we'll hire 3,000 people to police our site, sighs CEO

By Kieren McCarthy, 3 May 2017

Stung by global criticism over murder videos on his sprawling web empire, Facebook CEO Mark Zuckerberg has promised to swell the ranks of his moderator army.

"Over the last few weeks, we've seen people hurting themselves and others on Facebook – either live or in video posted later," Zuckerberg wrote in an update to his personal Facebook page. "It's heartbreaking, and I've been reflecting on how we can do better for our community."

And his solution is to add 3,000 people to the existing 4,500 in its "community operations team" over the course of 2017, which would bring the company's total number of employees to around 20,000. That community team is tasked with reviewing posts, images and vids that are flagged up by Facebook users as potentially awful.

In addition to the extra eyeballs, Zuckerberg said the company was looking at how to make it easier for users to report videos and how to speed up the takedown process. And he said the company would "keep working with local community groups and law enforcement who are in the best position to help someone if they need it – either because they're about to harm themselves, or because they're in danger from someone else."

The decision to act follows a global outcry after the Facebook Live video streaming service – which lets anyone with a smartphone stream instantly to Facebook through its app – was used to broadcast several murders.

In one incident last month, a man in Cleveland in the United States filmed himself driving around the city promising to kill random strangers, and uploaded a video of himself shooting and killing an elderly man. Just a week later, a man in Thailand streamed himself hanging his 11-month-old daughter before killing himself.

These shocking incidents follow a number of live suicide videos and videos depicting violent and abusive behavior as well as terrorist incidents.

Time is of the essence

Although Facebook is clearly not to blame for people's behavior, the company has come under severe criticism from users, the media and politicians for failing to do enough and for acting too slowly. The video of the hanged baby, for example, remained online for more than 24 hours after it was first reported to Facebook.

Zuckerberg fell back on the social media company's stock response to criticism that it is not doing enough to police its service: its sheer popularity makes it difficult.

"We're working to make these videos easier to report so we can take the right action sooner ... we'll be adding 3,000 people to our community operations team ... to review the millions of reports we get every week, and improve the process for doing it quickly," he noted.

That excuse is quickly wearing thin, however, with politicians publicly calling the company out over its enormous profits and asking why some of that money can't be spent on making the service safer.

On top of that, the German government, the European Commission and now the UK government have all made formal proposals to fine Facebook, Google, Twitter et al millions of dollars if they fail to remove illegal content within a short timeframe.

And just this week, the UK government pushed the issue one step further by arguing that social media companies should be actively searching for such content and removing it – and threatened to have UK police do that job instead and invoice the companies for their time if they failed to.

Culture

Although it is easy to see the issue in terms of Facebook as a highly successful company trying to deal with its own massive user base, people are increasingly questioning the culture of the company itself.

A leak recently divulged that Facebook's ad team had told a bank's executives it could identify teenagers who were feeling "stressed," "anxious" or "nervous," and pitched this as a plus, saying advertisers could target kids when they are "potentially more vulnerable." Facebook denies the report.

But a former Facebook exec wrote an article in which he argued that this sounds exactly like the company he worked for. He went into some detail about how Facebook's ad teams pitch political parties during elections, arguing that they can sway the results – while claiming the complete opposite in public.

"The question is not whether this can be done. It is whether Facebook should apply a moral filter to these decisions," wrote former product manager Antonio Garcia-Martinez. "Let's assume Facebook does target ads at depressed teens. My reaction? So what. Sometimes data behaves unethically."

That approach sums up much of the worst Silicon Valley tech bro culture: where people become little more than salable chunks of data and the push to make profits puts no value on moral and ethical judgments.

Additionally, just this week, another report covered an internal review into why women coders' code submissions were being rejected 35 per cent more often than those pushed by men. The review concluded that it was nothing to do with gender but rather the seniority – or lack thereof – of the software engineers themselves: something that many took to point to a lack of advancement for women at the company.

However, when the Wall Street Journal asked for comment, a Facebook spokesman gave a telling reply: the first analysis was "incomplete and inaccurate – performed by a former Facebook engineer with an incomplete data set."

At Facebook, everything – even the feelings of its own staff – means nothing compared to what the data and its analysis say.

Arrogant

That lack of humanity is further compounded by a petulant arrogance that has filtered down to Facebook employees from the very top.

A great example of that inability to admit fault came with the famous Vietnam photo of nine-year-old Phan Thi Kim Phuc running, naked, from a napalm attack in 1972.

When a user posted it alongside six others to highlight photos that had "changed the history of warfare," Facebook decided it represented an image of child abuse and demanded it be taken down. When the user – Norwegian Tom Egeland – refused, he was suspended from the service.

When his case was taken up by Norwegian newspaper Aftenposten, Facebook demanded that the newspaper remove or pixelate the photo. And it went one step further: when the prime minister of Norway herself, Erna Solberg, posted the picture to her account as a protest, Facebook reached into her account and deleted her post.

Faced with global criticism yet again, Facebook did what it has done many times in the past and continues to do today – most recently with fake news: without learning the broader lesson, it changed its policy on that one aspect and went on as before.

As its former product manager noted this week: "The hard reality is that Facebook will never try to limit such use of their data unless the public uproar reaches such a crescendo as to be un-mutable ... But they'll slip that trap as soon as they can. And why shouldn't they? At least in the case of ads, the data and the clickthrough rates are on their side." ®
