The internet, particularly social media platforms, has long been scrutinised for its role in spreading harmful and extremist content. While such platforms have introduced measures to moderate and remove hate speech, the persistence of such content highlights the challenges of digital content moderation. A concerning development has emerged on TikTok, where audio clips of Nazi speeches and associated content have gained widespread popularity, becoming embedded in a disturbing trend that has attracted millions of views and likes.
Offensive Content and Disturbing Trends
A significant portion of this problematic content revolves around Nazi speeches, particularly those delivered by Adolf Hitler and Joseph Goebbels, Hitler’s chief propagandist. These speeches, full of antisemitic vitriol, decry the supposed influence of Jewish people in undermining peace in Europe, reflecting the hateful ideologies that fuelled the horrors of the Holocaust. What makes this content even more alarming is the integration of fast-paced music, particularly Drift Phonk—a genre popular on TikTok—laid over these hate-filled rants. This fusion creates a bizarre juxtaposition, with offensive propaganda set to music designed for rapid consumption and engagement by young audiences.
More than 50,000 TikTok posts featuring Nazi speeches were uncovered on 2 and 3 September 2023 alone. These posts, disturbing in both content and reach, have attracted significant engagement on the platform, with many receiving millions of likes. For instance, a single video featuring a Hitler speech was liked over 56,700 times. Some of the comments on these posts are equally shocking, with users expressing support for the dictator through statements such as “modern society absolutely needs him” and “we miss you,” demonstrating the dark undercurrent of extremist sympathies among certain segments of TikTok users.
The Role of Sounds in TikTok’s Ecosystem
TikTok’s structure as a content platform relies heavily on the use of ‘sounds’—audio clips that users can incorporate into their own videos. These sounds form the basis for many viral trends and challenges on the app. Nazi-related content, including speeches and Nazi-associated music like the German marching song “Erika,” has been repurposed into such sounds, allowing users to spread these clips widely across the platform.
“Erika,” although not explicitly political in its lyrics, has strong associations with the Nazi regime, having been composed by Herms Niel, a member of the Nazi party and conductor for the SS and other military branches. The song’s dark history has not been lost on those using it on TikTok, with videos featuring images of Hitler and swastikas often accompanying the sound. In one particularly macabre example, a version of the song attached to 405 videos displayed a grinning skeleton resembling Hitler in front of a German flag. Another version, used in over 8,800 videos, was linked with an edited photograph of Hitler.
The popularity of Nazi-related sounds is staggering. A review of the five most-used sounds made from Nazi speeches, combined with the top five “Erika” sounds, revealed a total of 21 million likes. The most popular post featuring a Hitler speech alone accumulated over 2.5 million likes. These figures demonstrate the wide reach and high level of engagement these disturbing videos have managed to attain.
Artists’ Unwitting Involvement
Some of the music used to overlay these hate-filled speeches was created by independent artists, who were unaware that their work had been repurposed in such a manner. One such artist is Pastel Ghost, who expressed her shock and disgust upon learning that her music had been used to promote Nazi ideology. In a statement, she said: “I was not previously aware that my music was being used in this way, and I find it shocking and deplorable.”
Her experience highlights a significant issue: many of these sounds are being created without the knowledge or permission of the original artists. It illustrates how far extremist elements are willing to go in adapting popular culture for their hateful messages. Pastel Ghost and her team are now actively working to remove instances where her work has been misappropriated for such purposes.
The Challenge for Moderation
One of the primary challenges that platforms like TikTok face in moderating hate speech and extremist content is the nature of audio content. While videos and text can be easily flagged for hateful language or imagery, audio presents a more complex issue. As Dr Joe Ondrak, a research, technology, and policy lead at Logically, explained, audio content is notoriously difficult to moderate. Extremist content creators have long recognised this weakness and have adapted their strategies accordingly.
Most sounds incorporating Nazi propaganda do not carry explicit titles, making them difficult to detect through traditional moderation methods. However, once a user clicks on one of these sounds, they are immediately directed to a large number of videos using the same audio. This system allows hate speech to spread relatively undetected, with the problematic content often coming to light only when specific posts are flagged.
This content not only spreads Nazi propaganda but also serves to gather like-minded individuals in online spaces where they can express their hateful ideologies with relative anonymity. While mainstream platforms have taken steps to prevent this kind of radicalisation, the use of covert and subtle content-sharing strategies makes moderating such material even more difficult.
Broader Implications and Efforts to Combat the Issue
The continued presence of Nazi content on TikTok raises serious questions about the effectiveness of the platform’s moderation policies. While TikTok has made efforts to address hate speech, including removing 91% of hateful content before it is reported, the platform clearly struggles to fully contain the spread of extremist material. Following Sky News’ findings, TikTok acted swiftly to remove the content identified as violating their guidelines. A spokesperson for the platform stated: “This content was immediately removed for breaching our strict policies against hate speech.” However, the fact that such content amassed millions of views before being flagged shows the limitations of TikTok’s moderation systems.
Hannah Rose, a hate and extremism analyst at the Institute for Strategic Dialogue, noted that while this content is shocking, it is not entirely surprising. She explained that the issue lies in platforms failing to adequately moderate the scale of hatred and extremism present on their sites. “We know and have known for a number of years that platforms have not adequately moderated the scale of hatred and extremist content on their platforms,” she said, pointing out that this is not limited to fringe websites but also mainstream platforms like TikTok.
The broader issue is that extremist content is not just confined to niche corners of the internet. It is now seeping into mainstream digital spaces, exposing wider audiences to hateful ideologies and potentially radicalising individuals who might not have been initially drawn to such content. This presents a grave danger, especially when considering the immense popularity of platforms like TikTok among younger users.
A Sobering Reminder
The integration of Nazi propaganda into popular trends on TikTok is a sobering reminder of how easily hate speech can permeate online spaces, even those designed for entertainment. The use of viral sounds, music, and imagery to cloak extremist messages in a format palatable to young audiences demonstrates the evolving tactics of hate groups in the digital age. While TikTok and other platforms have made strides in combating hate speech, this situation underscores the ongoing challenge of moderating content in a world where technology evolves faster than the regulations designed to control it.
It is crucial that social media platforms continue to refine their moderation systems, particularly concerning audio content. At the same time, there is a growing need for public awareness and education on the dangers of online extremism. Young users must be taught to recognise and reject hateful content, and governments and tech companies alike must collaborate to prevent the internet from becoming a breeding ground for extremist ideologies. The fight against digital hate is far from over, but it is one that society cannot afford to lose.