Report warns that far-right extremists are using gaming platforms to radicalize teenagers.

A new report warns that far-right extremists are using gaming livestream platforms to target and radicalize teenage players. Published in Frontiers in Psychology, the research shows how extremist groups exploit platforms where users chat and stream while gaming, using them to recruit vulnerable young people, most of them boys.

UK counter-terrorism and crime agencies have urged parents to stay vigilant during the summer holidays, as online predators may take advantage of reduced supervision. Last week, Counter Terrorism Policing, MI5, and the National Crime Agency jointly warned that offenders could exploit the break to groom young people when support systems are weaker.

Dr. William Allchorn, who co-authored the study with Dr. Elisa Orofino, described these platforms as “digital playgrounds” for extremism. Teenagers are being deliberately steered from mainstream social media to gaming sites, where the sheer volume and nature of content make policing difficult. Far-right ideology dominates, with posts glorifying violence and school shootings.

A recent case involved Felix Winter, an 18-year-old jailed for six years after threatening a mass shooting at his Edinburgh school. The court heard he had been radicalized online, spending over 1,000 hours in a pro-Nazi Discord group.

Allchorn noted that far-right groups such as Patriotic Alternative have intensified recruitment through gaming since the pandemic. With many extremists banned from mainstream platforms, they now lurk in public groups on sites like Discord, scouting for potential recruits. Some teens engage with extreme content only for shock value, but doing so still marks them out as easy targets.

As platforms crack down, extremists have grown more subtle, focusing on building rapport rather than making overt ideological pitches. Moderators told the researchers they struggle with inconsistent enforcement policies and with identifying harmful content, such as coded symbols designed to evade automated filters (for example, characters arranged to form a swastika).
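
To make the evasion tactic concrete, here is a minimal Python sketch (purely illustrative, not any platform's actual system; the blocklist terms and messages are invented for the example) of a naive keyword filter and two character-substitution tricks that slip past it:

```python
# Illustrative sketch only: a naive substring blocklist of the kind
# extremists evade with character substitution. Terms are hypothetical.
BLOCKLIST = {"swastika", "1488"}

def naive_filter(message: str) -> bool:
    """Flag a message if any blocked term appears verbatim."""
    lowered = message.lower()
    return any(term in lowered for term in BLOCKLIST)

print(naive_filter("posting a swastika"))       # True: exact match is caught
print(naive_filter("posting a swast1ka"))       # False: digit-for-letter swap evades
print(naive_filter("posting a \u0455wastika"))  # False: Cyrillic lookalike 's' evades
```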

Allchorn stressed the need for better digital literacy among parents and law enforcement to understand these platforms and subcultures. MI5 Director General Ken McCallum recently revealed that 13% of terrorism suspects in the UK are under 18, a threefold increase in three years.

While AI helps moderate content, it struggles with ambiguous language, sarcasm, and memes, leaving gaps in detection.
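
One common partial mitigation, sketched below under the assumption of a simple text pipeline (the confusables table is a tiny invented subset, not a real library), is to fold lookalike characters to canonical forms before matching. This closes some character-substitution routes but, as the moderators' accounts suggest, does nothing for sarcasm, irony, or image-based memes:

```python
import unicodedata

# Hypothetical normalization step: fold compatibility forms, then map a
# deliberately tiny, invented subset of lookalike characters to ASCII.
CONFUSABLES = {"\u0455": "s", "\u0430": "a", "1": "i", "0": "o"}

def normalize(text: str) -> str:
    folded = unicodedata.normalize("NFKC", text).casefold()
    return "".join(CONFUSABLES.get(ch, ch) for ch in folded)

print(normalize("swast1ka"))       # "swastika": digit swap undone
print(normalize("\u0455wastika"))  # "swastika": Cyrillic lookalike undone
```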