Facebook realises it screwed up. Changes its policy (again) on beheading videos

Graham Cluley

Facebook knows that it screwed up badly with its recently reintroduced policy of not removing violent videos of people being decapitated shared on its network.

Many people, including British Prime Minister David Cameron, had expressed their revulsion and disgust that the social network was refusing to act on requests to remove a gruesome video of a woman being beheaded.

Facebook’s embarrassment was compounded by the fact that back in May it said it was taking action against violent video content, citing concerns about the psychological damage it could cause.

Ironically, the social network *did* take a stronger line against videos showing naked female breasts than against beheadings, leading one commentator on this site to say:


So if I made a video opposing breastfeeding on the grounds that it causes distress to the sort of people who don’t want to see breasts on Facebook, then, I could show exposed breasts in it (to illustrate the true horror of the situation) AND POST IT ON FACEBOOK! Who’s up for giving it a try?

The good news is that Facebook has clearly realised the weight of opinion was against its position, and hurriedly published a blog post saying it was (once again) returning to a policy of acting more strongly on reports of images and videos that celebrate violence.

Facebook blog post

As part of our effort to combat the glorification of violence on Facebook, we are strengthening the enforcement of our policies.

First, when we review content that is reported to us, we will take a more holistic look at the context surrounding a violent image or video, and will remove content that celebrates violence.

Second, we will consider whether the person posting the content is sharing it responsibly, such as accompanying the video or image with a warning and sharing it with an age-appropriate audience.

Based on these enhanced standards, we have re-examined recent reports of graphic content and have concluded that this content improperly and irresponsibly glorifies violence. For this reason, we have removed it.

The post doesn’t actually go so far as to say sorry to people who were traumatised by the video, or accept that it could have added to the distress of the family and friends of the person who was killed, or acknowledge that it has any responsibility to protect the millions of youngsters who use the network… but it’s a start.

Facebook says that it asks people who share graphic content for the purpose of condemning it to “do so in a responsible manner, carefully selecting their audience and warning them about the nature of the content so they can make an informed choice about it.”

No one expects Facebook to proactively prevent all gratuitously violent and distressing material from being posted on its social network. But it should make more of an effort to remove content which would clearly never be allowed on mainstream media once it has been reported by users.

If you are on Facebook, and want to be kept updated with news about security and privacy risks, and tips on how to protect yourself online, join the Graham Cluley Security News Facebook page.

Update: Rob Schifreen sums up things pretty nicely on Twitter:


Graham Cluley is an award-winning keynote speaker who has given presentations around the world about cybersecurity, hackers, and online privacy. A veteran of the computer security industry since the early 1990s, he wrote the first ever version of Dr Solomon's Anti-Virus Toolkit for Windows, makes regular media appearances, and is the co-host of the popular "Smashing Security" podcast. Follow him on Twitter, Mastodon, Threads, Bluesky, or drop him an email.
