19th Jul 2018 by Steph Butcher

Inside Facebook: Nipples Worse Than Abuse – 10 Yetis Insight

New Channel 4 documentary goes behind the scenes at Facebook to show how it decides what content stays and what content goes!

For years now, Facebook has been one of the biggest platforms and a frontrunner in the world of social media. According to research, in April 2018 there were a total of 45 million Facebook users in the UK; that’s almost 70% of the population active on the platform. While a great deal of content will be light-hearted holiday snaps, cute baby pics and companies trying to get you to buy their products, there is a much darker side that rears its ugly head and can find its way into unsuspecting people’s feeds.

Last night’s documentary from Channel 4’s Dispatches went undercover at CPL Resources, an outsourcing company based in Dublin that works on behalf of Facebook to moderate inappropriate content.

Post moderation isn’t a small job, with 1.4 billion posts published around the world each day and thousands of them reported along the way. CPL Resources aims to assess 3,000 reports per day; however, at the time of filming, Dispatches revealed a backlog of 15,000 reports that had been missed or, even worse, ignored.

The most interesting part of the documentary, for me, was seeing how moderators decide whether a post is ignored, marked as disturbing (MAD) or removed. Unbelievably, very few posts were removed, no matter how graphic the content.

It seems strange that a nipple is classed as being more offensive and inappropriate than child abuse or graphic violence, but that’s how Facebook’s policy makers see it!

Some of the documentary was particularly hard viewing, with excerpts of child abuse videos, self-harm images and far-right imagery used to illustrate points. One video used in training for the moderators, which featured a small child being beaten and stamped on by an older man, was simply marked as disturbing. One trainer said that they “never delete it and never ignore it” and, unless the videos were published through Facebook Live, they weren’t reported to the police. Further research found that the video had been circulating on Facebook for more than six years, had over 44,000 shares and was still available on the site, despite the child having needed medical attention and the man having been imprisoned.

With all the adverts that are pushed into our Facebook feeds on a regular basis, and the announcement that Facebook earned $9.16 billion in the final quarter of 2017, we all know that Facebook runs on money. Unfortunately, that appears to be at the expense of victims too – no matter how many times Richard Allan, Facebook’s Vice President of Global Policy, tried to deny it.

As we all know from endless scrolling, captions on videos are usually the last thing you look at. However, if a caption on a violent or graphic video condemns the content, that, in Facebook’s eyes, is fine, and the company is more than happy to let the content be shared.

In the undercover footage, moderators said that “extremes” of views got “better user engagement” and so were more likely to stay on the site. This is why the likes of Britain First and Tommy Robinson’s pages were given the same protection as government and news organisations. It was explained that a page can receive up to five violation notices before it is removed – Britain First racked up as many as nine, and even then was only removed when its leaders were jailed for hate crimes. Tommy Robinson’s page, however, remains live despite his imprisonment for contempt of court, because of the number of followers and the engagement it attracts.

Facebook has pledged to employ 20,000 more people to work on security and content review by the end of 2018, but keeping up with the volume of reports looks like a Goliath of a task. And with so many loopholes and reasons not to delete content, Facebook doesn’t seem to be working particularly hard to limit the issues.

With Mark Zuckerberg banging on about making Facebook a safe place for everyone, it seems the post moderators aren’t in the same frame of mind. While we’re all responsible for the content we post on the site, some things cross a line. Many young and vulnerable people access the site too, and no one should be subjected to these types of videos and images. If this documentary has shown one thing, it’s that Facebook needs to take a long, hard look at its content policy before it can consider itself a safe place.

