In 2015, a white supremacist shot and killed nine African American worshippers at a Bible study in Charleston, South Carolina. The attack was uncommon at the time in that the shooter had been radicalised predominantly online. His example has been followed with increasing frequency across the world, and terrorists often cite his actions as inspiration. This was seen most recently in the devastating mosque shootings in Christchurch, New Zealand. Nowadays, it no longer surprises many that racist and murderous views are expressed online, or that the internet plays a dominant role in the coordination of far-right action. However, the culture from which the shooter emerged, and the means by which this ideology is disseminated online, are still underestimated. Furthermore, the fact that many internet users share aspects of this culture, giving space for extremist ideas to permeate into more mainstream outlets, poses a worrying threat.
Following the Charleston attack, Brad Griffin, the founder of alt-right website Occidental Dissent, told journalist Jacob Siegel that, “By 2001 the [white nationalist] movement was already mostly online and it’s been that way ever since”. The Charleston shooting made the dark reality of this community plain, and in the years since, which have seen numerous acts of domestic terrorism throughout the world, its influence cannot be doubted. The manner in which neo-Nazis express their tropes and ideas through “trolling”, however, is a quagmire of misinformation and misdirection. In his Daily Beast article, Siegel described trolling as a “dominant form of online sensibility”, one that breeds a competitive edge and an eye for cruel provocation. In one form, the acts of trolling or “shitposting” are simply wind-ups, in-jokes designed to obfuscate at the expense of the recipient; in another guise, they act as devices to hide evil ideas and inspire mass murder. Yet, the murderous and ultimately genocidal component of the message is hidden by what Siegel called a “pretence of tricksterism”.
Such a pretence falls away when someone decides to take their “shitposting” from the online realm and turn it into “effortposting”, a phrase used by members of the /pol/ board on the anonymous discussion group website, 8chan. Indeed, the man who walked into two mosques on 15 March in Christchurch and gunned down 50 worshippers expressed this very sentiment. Before he started the attack, streamed live on Facebook, the shooter called on the “anons” (8chan users) “to stop shitposting” and “make a real life effort”, much to the delight of /pol/ onlookers. Before the massacre, the shooter told his viewers to “subscribe to PewDiePie”, referencing the Swedish YouTube star with 95 million subscribers and a history of controversy. His weapon was marked with a host of memes and neo-Nazi symbols, whilst the tribute song to the war criminal Radovan Karadžić, “Remove Kebab”, played in the background. The boundaries between meme and murder were blurred beyond recognition, revealing the power of the former to carry meaning and, ultimately, inspiration to do the latter.
The complicated relationship between memes and far-right extremism is very much a deliberate creation of the culture that emanates from discussion boards like /pol/. It can be seen in the hand gesture made by the Christchurch shooter upon entering court to be charged. Placing his index finger on his thumb and turning his hand upside down, the shooter inverted the “OK” gesture so that his fingers formed a W and a P, for “white power”. This appropriation of a common hand signal began in 2017, when 4chan users encouraged its use as a white supremacist symbol, calling it “Operation O-KKK”. It was a hoax – an attempt to load a gesture with a meaning that it had never before been associated with. The purpose was to troll “Leftists”, producing outrage at the use of the symbol and exemplifying the absurdity of left-wing “snowflakes”.
The author of Troll Nation, Amanda Marcotte, explained on Twitter last year how the “OK” gesture served both as a “white supremacist symbol and also one that is just ordinary enough looking that when liberals expressed outrage, the white supremacist could play the victim of liberal hysteria.” And it worked. In July 2018 an investigation was launched into four police officers in Alabama who flashed the symbol in a newspaper photo. Likewise, in September, a member of the US Coast Guard was caught making the symbol in the background of a TV interview. On both occasions, the authorities were unable to identify the actual intent behind the gesture. Were they white supremacists who posed a lethal threat to minorities, or simply trolls in on the (tasteless) joke?
For a long time, such jokes were not taken seriously precisely because they are jokes, and that is what makes them so slippery. They are devices that let the trolls say, “lighten up, it’s just a joke, we don’t mean it seriously”. But now these jokes are being used as “content” to accompany mass murder (“effortposting”). The Christchurch shooter’s use of the symbol breaks down the pretence, showing how the façade of internet culture can hide neo-Nazism and co-opt those who think they are merely trolling the “normies” into its fold. It is likely that the member of the Coast Guard and the four police officers were simply conservatives looking to get in on “owning the libs”, but their use of the same “OK” symbol as the Christchurch shooter displays the blurring of the line between the right and the far-right.
Consider the “Subscribe to PewDiePie” meme: it gained prominence initially as an anti-corporate stance, ensuring that PewDiePie, a.k.a. Felix Kjellberg, retained his position as the most subscribed channel on YouTube against T-Series, an Indian film and music studio. But as devoted fans have spread the “Subscribe” message, often in extreme ways, it has become a code for something more – a nod to one’s understanding of internet culture. Now that the Christchurch shooter has used it, does it have an association with neo-Nazism? It seems particularly fitting that Kjellberg has a history of using racist slurs and has openly promoted an antisemitic YouTube channel. But this is the point. It is what Joan Donovan, Director of the Technology and Social Change Research Project at Harvard, has called “bait”, and it leads “down far-right rabbit holes”.
Writing for Bellingcat, the journalist Robert Evans has argued that the shooter’s use of memes, both on his stream and in his “manifesto”, was a provocation, a mix of ironic misinformation and real beliefs designed to send journalists eagerly seeking clues in often absurd directions. By referencing PewDiePie and crediting the right-wing activist Candace Owens with his radicalisation, the shooter sought to sow discord into mainstream discourse and “to distract attention from his more honest points”. As has been noted, being aware of a reference at the expense of someone else’s ignorance is part of the hilarity of shitposting. When memes are the vehicle through which both trolling and “honest” thought are disseminated, trying to identify the latter only becomes more complex.
This is the problem posed by the Christchurch shooter and his “manifesto”. The far-right share an internet culture with a large proportion of people who are “extremely online”. They inhabit the same space, which means that those extremely online individuals who are vulnerable to radicalisation will eventually come across this evil. Yet, it will be expressed in a language that they recognise, hidden behind a “pretence of tricksterism” that invites an insider’s tone of hilarity rather than a reaction of disgust. The danger of the “Subscribe to PewDiePie” meme is its mainstream nature; it originated from the fans of a YouTube channel, not the relative obscurity of the /pol/ board on 8chan. It bridges a gap between these two sites of online activity: the first, which attempts, however unsuccessfully, to police the bile of hatred; and the second, in which anything goes.
It is here that the network of online conservative and alt-right commentators plays a role, helping to bring far-right ideas into mainstream internet forums. A report by Data &amp; Society, written by Rebecca Lewis, has identified an “alternative influence network” of 65 scholars, media pundits and internet celebrities who promote a wide range of opinions, from the conservatism of Ben Shapiro and Dave Rubin, to the white nationalism of Richard Spencer. Lewis calls it “an alternative media system that adopts the techniques of brand influencers to build audiences and ‘sell’ them political ideology.” Members of this network operate in a similar way to neo-Nazi trolls, deliberately using provocation as a means of engagement. The issue is that they do so on platforms where engagement is the central component of the website’s business model.
Generating clicks is how YouTube makes money, and members of the “alternative influence network” do this very successfully. Provocation and internet-savvy irony sell well, and algorithms push such videos further into users’ recommendations. A Buzzfeed investigation found that users clicking on nonpartisan videos could reach videos with extremist content in just six clicks. Even channels that are not associated with this network have begun to resort to its methods to maintain a stream of profitable content. The popular YouTube personality Logan Paul, whose channel has over 19 million subscribers and whose fan base is mostly young teenagers, recently invited the right-wing conspiracy theorist Alex Jones onto his live-streamed chat show, currently online with over 700,000 views. Jones has repeatedly peddled the lie that the Sandy Hook shooting in 2012, in which 20 schoolchildren were killed, was a hoax. The “Super Chat” feature on the type of show hosted by Paul allows users to pay for their comments to be highlighted. As Lewis notes, this “often incentivizes ‘shocking’ content.”
The string of attacks perpetrated against minorities across the world has shown the inadequacy of big tech’s response to the content that accompanies them. The Christchurch shooter knew that his live stream would be removed from Facebook, but within the first 24 hours of the attack, the site had to remove 1.5 million individual posts of the video. Algorithms are a problem; they are built for engagement rather than for curbing the spread of hatred. Some have suggested incorporating a human editorial role on YouTube to combat this. Algorithms also work against those with the best intentions. Whitney Phillips, an academic at Syracuse University, has written about “The Oxygen of Amplification” for Data &amp; Society and the role that journalists play in spreading the ideology of mass murderers. One countermeasure is to avoid using a shooter’s name, as this article has done. Despite this, many of the examples referred to in this piece will lead down “far-right rabbit holes” upon further investigation.
The far-right is well established on the internet, its members embedded in a method of engaging with ideology that is either confusing or accessible, depending on one’s exposure to internet culture. It will never be possible to remove its influence from the internet entirely, especially from the obscure websites on which its most committed members foment hatred. When companies like Cloudflare, which keeps 8chan running, regard its continuation as a matter of free speech, it is clear that these corners are not where the battle against extremism will be fought. Rather, it must be fought against the creeping influence of the far-right on more mainstream internet forums and traditional mass media. This too has been debated widely through the lens of free speech and the power of rational thought over extremism. Whilst attempting to defeat ideas through robust debate may seem admirable, when one side’s presence hinges on the opportunity of accessing a wider audience, their goal will not be an intellectual discussion from which truth emerges, but a sound bite filled with controversy – greater engagement. When it becomes the norm to see a far-right mouthpiece on television, it will no longer be necessary for them to keep up the pretence, as we will have been tricked already.