(RNS) — Researchers analyzing TikTok for extremist content have discovered videos that portray Muslims as supporters of terrorism, clips supporting Holocaust denial and users glorifying the mass shooters behind the Christchurch mosque and Tree of Life Synagogue attacks.
The Institute for Strategic Dialogue, a London-based nonprofit that tracks extremism online, released a report on Tuesday (Aug. 24) that found TikTok “operates as a new arena for violence-endorsing, hateful ideologies.”
Over three months, ISD analyzed a sample of 1,030 videos, equivalent to about eight hours of content, and found that 312 of the clips promoted white supremacy. More than 240 videos showed support for organizations or individuals tied to extremism or terrorism.
The study, authored by ISD investigator Ciarán O’Connor, found that TikTok creators use coded language as well as the platform’s video effects, layout and music to promote hate. It also highlights tactics they use, such as restricting comments on their videos, to evade being reported to TikTok.
ISD said that TikTok has a “content moderation problem” and that its “enforcement gap is concerning.”
“The platform enables hatred targeting Muslims, Jews, Asians, Black people, refugees, women and members of the LGBTQ+ community, including content celebrating the death of people within these communities,” the report read.
In a statement provided to ISD, TikTok said it had used the group’s research to remove additional accounts.
“TikTok categorically prohibits violent extremism and hateful behavior, and our dedicated team will remove any such content as it violates our policies and undermines the creative and joyful experience people expect on our platform,” according to the statement highlighted in the ISD report.
On TikTok, support for white supremacy takes many forms, the ISD said.
ISD underscored content promoting the far-right “Great Replacement” and “white genocide” conspiracy theories that assert “white people are being systematically replaced.”
For example, ISD said a frequently shared clip featured a rabbi speaking on Russia Today about growing European solidarity between Jews and Muslims. ISD said TikTok creators used the video to justify arguments that these communities were “against Europeans” and that white people were in danger of being wiped out.
ISD noted TikTok creators use vintage and retro effects in far-right videos to “reminisce about the past” and to “evoke nostalgia.”
“The intended message in such videos states that the past is preferable to the current day because there was less diversity in the US and Europe, cultures were more homogenous, religion (especially Christian religions) held more influence in the daily lives of people,” the report found.
Among the most-viewed videos in ISD’s sample was a clip that used video game footage of the Auschwitz concentration camp to mock victims of the Holocaust and deny that it occurred. It had 1.8 million views. Researchers found that 153 of the sampled videos promoted antisemitic content.
TikTok creators have also used aspects of Jewish culture, such as the folk song “Hava Nagila,” to boost hatred of Jewish people, ISD said. In a number of videos, the folk song starts playing after a Jewish baby stops crying when given money.
ISD found that 81 of the videos it analyzed promoted anti-Muslim hatred.
In the videos ISD sampled, anti-Muslim hatred mainly materialized in content related to the Yugoslav Wars and the murder of Bosnian Muslims. “This included content about the 1995 genocide at Srebrenica, denial of this event, and glorification of those responsible,” the report said.
Other footage claimed there is a “systematic Islamification of Europe,” ISD said.
The report found that 30 videos featured support for Brenton Tarrant, who killed 51 people at two mosques in Christchurch, New Zealand. More than 10 of those videos were produced by Tarrant himself. Some clips featured video game footage designed to recreate the attack.
The ISD acknowledged that TikTok has taken steps to address these issues, like releasing transparency reports that detail its efforts to remove content violating community guidelines.
However, ISD said that TikTok’s interface is “severely limited in the data it provides to researchers or the public,” adding that TikTok needed to be more forthright in showing how its algorithm works.