Facebook served ads on searches related to white supremacist groups, despite a ban on such content on the platform, according to a report by the Tech Transparency Project.
The report, first covered by The Washington Post, identified 119 Facebook pages and 20 Facebook groups affiliated with white supremacist organizations. Researchers searched Facebook for 226 groups designated as hate groups or dangerous organizations by sources including the Southern Poverty Law Center, the Anti-Defamation League, and Facebook itself, and found that more than a third had a presence on the platform.
The study found that ads appeared on 40 percent of the searches for those groups, despite Facebook’s insistence that the company doesn’t profit from hateful content.
The white supremacist pages identified by the report include two dozen that were auto-generated by Facebook. The platform automatically creates pages when users list interests, workplaces, or businesses that don’t already have a page. The issue of auto-generated white supremacist business pages was previously raised in a 2020 analysis, also by the Tech Transparency Project. Among the auto-generated pages identified in the 2022 report is “Pen1 Death Squad,” shorthand for a white supremacist gang.
Meta spokesperson Dani Lever says 270 groups designated by the company as white supremacist organizations are banned from Facebook, and that the company invests in technology, staff, and research to keep its platforms safe.
“We immediately resolved an issue where ads were appearing in searches for terms related to banned organizations and we are also working to fix an auto-generation issue, which incorrectly impacted a small number of pages,” Lever says. “We will continue to work with outside experts and organizations in an effort to stay ahead of violent, hateful, and terrorism-related content and remove such content from our platforms.”
In 2020, more than 1,000 advertisers boycotted Facebook over the platform’s handling of hate speech and misinformation. That same year, civil rights auditors released a report that found the company’s decisions resulted in “serious setbacks” for civil rights. Following the audit, Meta created a civil rights team in 2021, which has published the status of actions and recommendations issued by auditors.