What happens when you report abuse on Facebook?

If you encounter abusive content on Facebook, do you press the “Report abuse” button?

Facebook has lifted the veil on the processes it puts into action when one of its 900 million users reports abuse, in a post published on the site earlier this week by the Facebook Safety Group.

Reporting abuse on Facebook

Facebook has four teams who deal with abuse reports on the social network. The Safety Team deals with violent and harmful behaviour, the Hate and Harassment Team tackles hate speech, the Abusive Content Team handles scams, spam and sexually explicit content, and finally the Access Team assists users whose accounts have been hacked or impersonated.


Facebook User Operations teams

Clearly it’s important that Facebook is on top of issues like this 24 hours a day, and so the company has based its support teams in four locations worldwide. In the United States, staff are based in Menlo Park, California, and Austin, Texas; for coverage of other time zones, there are also teams operating in Dublin, Ireland, and Hyderabad, India.

According to Facebook, abuse complaints are normally handled within 72 hours, and the teams are capable of providing support in up to 24 different languages.

If posts are determined by Facebook staff to be in conflict with the site’s community standards then action can be taken to remove content and – in the most serious cases – inform law enforcement agencies.

Facebook has produced an infographic which shows how the process works, and gives some indication of the wide variety of abusive content that can appear on such a popular site.

The graphic is, unfortunately, too wide to show easily on Naked Security – but click on the image below to view or download a larger version.

Facebook reporting guide. Click to view large version of infographic

Of course, you shouldn’t assume that just because you find a piece of content abusive or offensive, Facebook’s team will necessarily agree with you.

As Facebook explains:

Because of the diversity of our community, it's possible that something could be disagreeable or disturbing to you without meeting the criteria for being removed or blocked. For this reason, we also offer personal controls over what you see, such as the ability to hide or quietly cut ties with people, Pages, or applications that offend you.

To be frank, the speed of Facebook’s growth has sometimes out-run its ability to protect users. It feels to me that there was a greater focus on getting new members than respecting the privacy and safety of those who had already joined. Certainly, when I received death threats from Facebook users a few years ago I found the site’s response pitiful.

I like to imagine that Facebook is now growing up. As the website approaches a billion users, Facebook loves to describe itself as one of the world’s largest countries.

Real countries invest in social services and other agencies to protect their citizens. As Facebook matures I hope that we will see it take even more care of its users, defending them from abuse and ensuring that their experience online can be as well protected as possible.

We would be interested in hearing about your experiences when you report abusive content to Facebook. Were you happy with Facebook’s response? Join the discussion on our Facebook page.


Graham Cluley is an award-winning keynote speaker who has given presentations around the world about cybersecurity, hackers, and online privacy. A veteran of the computer security industry since the early 1990s, he wrote the first ever version of Dr Solomon's Anti-Virus Toolkit for Windows, makes regular media appearances, and is the co-host of the popular "The AI Fix" and "Smashing Security" podcasts. Follow him on Bluesky and Mastodon, or drop him an email.
