On 30 December 2020, George Nkencho, an Irish-Nigerian man, was shot dead by the Gardaí following his assault on a shop worker earlier that day. His death was almost immediately seized upon by anonymous far-right users of the messaging app Telegram. Anti-immigrant and conspiracy groups falsely accused Nkencho of a violent and abusive criminal past, whilst instructing their members to incite political and racial tension by spreading disinformation and to “get trolling.”
To some extent, the trolling was successful. In their recent report on the Irish far-right for the Institute for Strategic Dialogue, Aoife Gallagher and Ciaran O’Connor describe how Nkencho’s family “did not believe the shooting was race-related but rather class-related”. Despite this, the targeted far-right campaign sought to inflame ethnic tensions around Nkencho’s death. Operating through anonymous online profiles, these actors ran a strategic campaign to “play both sides of the story,” masquerading as black rights activists whilst infusing the debate with racist and inflammatory memes. A protest movement quickly grew, fueled by further targeted racist abuse, and on 21 May the BBC published a report on Nkencho’s death likening him to George Floyd.
These developments highlight how profitable progressive protest movements can be for radical groups. The strategy is to stoke tension, then frame that tension as the inevitable product of a diverse society, using it as a lightning rod to build political support for nationalist and anti-immigrant groups.
Telegram’s role here is key, underscoring the need for more cohesive regulation of our online spaces. The capacity of poorly regulated anonymous messaging apps to galvanize radical far-right protests by facilitating disinformation is worrisome, and Telegram served as one of the key organizing spaces for the January 2021 attack on the U.S. Capitol. It is crucial to realize that much of Telegram’s recent popularity arose only as users migrated away from a more tightly controlled WhatsApp. Allowing those who spread disinformation simply to move on to the next available platform undermines the purpose of any regulation.
The challenge, of course, lies in balancing regulation with the right to free speech. While profiteers of inflammatory disinformation like Alex Jones and Katie Hopkins may unsurprisingly label digital regulation an attack upon freedom of expression, there are many shades of grey between ‘regulated’ and ‘unregulated’. One such middle path is to build ‘de-amplifying’ buffers into rapidly accelerating content, giving regulators time to fact-check potential disinformation.
However, for any of this to be effective, the regulation must be widely applied. The reaction to George Nkencho’s death captures in microcosm the mechanics of a far greater societal problem: the intersection of disinformation, digital messaging apps, and political tension. A “whack-a-mole” approach to regulating extremist content is clearly insufficient; regulation must instead be cohesive, applying to platforms old and new. As Western societies continue to deal with a rising far-right tide, a better understanding of how these groups operate is crucial.