Last year, Meta allowed thousands of paid ads containing sexually explicit imagery on its social media platforms, including Facebook and Instagram
By Jeremy Hsu
13 January 2025
Meta owns social media platforms including Facebook and Instagram (Image: JRdes/Shutterstock)
In 2024, Meta allowed more than 3300 pornographic ads – many featuring AI-generated content – on its social media platforms, including Facebook and Instagram.
The findings come from a report by AI Forensics, a European non-profit organisation focused on investigating tech platform algorithms. The researchers also discovered an inconsistency in Meta’s content moderation policies by re-uploading many of the same explicit images as standard posts on Instagram and Facebook. Unlike the ads, those posts were swiftly removed for violating Meta’s Community Standards.
“I’m both disappointed and not surprised by the report, given that my research has already exposed double standards in content moderation, particularly in the realms of sexual content,” says Carolina Are at Northumbria University’s Centre for Digital Citizens in the UK.
The AI Forensics report focused on a small sample of ads aimed at the European Union. It found that the explicit ads allowed by Meta primarily targeted middle-aged and older men with promotions for “dubious sexual enhancement products” and “hook-up dating websites”, and collectively racked up more than 8.2 million impressions.
Such permissiveness reflects a broader double standard in content moderation, says Are. Tech platforms often block content by and for “women, femme-presenting and LGBTQIA+ users”, she says. That double standard extends to male and female sexual health. “An example is lingerie and period-related ads being [removed] from Meta, while ads for Viagra are approved,” she says.