A new report has found that big tech companies failed to act on most instances of anti-Muslim hate speech.
The Center for Countering Digital Hate (CCDH), an American non-profit organisation, says social media platforms collectively failed to act on 89 percent of posts containing anti-Muslim hatred and Islamophobia, even after they were reported to moderators.
YouTube was the worst offender, ignoring 100 percent of anti-Muslim and Islamophobic posts. Twitter failed to act on 97 percent, Facebook ignored 94 percent, Instagram 86 percent and TikTok 64 percent.
The CCDH flagged 530 posts, which had collectively been viewed at least 25 million times.
CCDH chief executive Imran Ahmed said a lot of the content was easily identifiable, and yet platforms chose not to act.
"Much of the hateful content we uncovered was blatant and easy to find - with even overtly Islamophobic hashtags circulating openly, and hundreds of thousands of users belonging to groups dedicated to preaching anti-Muslim hatred," he said.
"When social media companies fail to act on hateful and violent content, it normalises these opinions, gives offenders a sense of impunity, and can inspire offline violence."
He said the platforms were aware that highly emotional, hate-filled misinformation kept people glued to their screens, driving profit.
"They aren't incentivised to spend money on cleaning it up."
The study said Facebook also hosted several groups dedicated to spreading anti-Muslim hatred, with a combined 361,922 followers.
Instagram, TikTok and Twitter allowed users to use hashtags such as #deathtoislam, #islamiscancer and #raghead, with content spread using the hashtags receiving at least 1.3 million impressions.
Collectively, the platforms failed to address 89 percent of posts promoting the 'Great Replacement' conspiracy theory, a white supremacist and Islamophobic ideology which claims that non-white immigrants are 'replacing' white people and culture in Western countries.
This was despite pledges made following the 2019 Christchurch mosque terror attacks, Ahmed said.
Two months after the attacks, Prime Minister Jacinda Ardern and French President Emmanuel Macron led the 'Christchurch Call to Action', an international summit aiming to eliminate extremist content online.
Twitter, Meta, Google and YouTube all pledged to take measures to prevent the upload of terrorist and violent extremist content and to prevent its dissemination.
The CCDH said these findings echoed its previous Failure to Act reports - earlier this month, researchers found that Instagram failed to act on 90 percent of user reports of misogynist abuse sent via Direct Message, and in 2021 Big Tech platforms collectively ignored 84 percent of antisemitic posts.