Incels have become increasingly extreme online, posting “dehumanizing labels” and depictions of violence, research warns.
The largest study yet of online incel language, carried out by University of Exeter researchers, found that on the main incel forum one in every 100 words was either a dehumanizing label or a direct depiction of violence.
In 2016, the equivalent figure on the same site was zero in a hundred.
Incels are people who define themselves as unable to get a romantic or sexual partner despite desiring one.
Experts trawled through millions of incel posts between 2014 and 2022, aided by their own custom dictionary of violent extremist language.
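The study does not publish its code, but the core measurement described above, the share of all words in a corpus that match a lexicon of dehumanizing labels and depictions of violence, can be sketched roughly as follows. The lexicon entries, function names and sample text here are illustrative placeholders, not material from the team’s actual Incel Violent Extremism Dictionary.

```python
import re
from collections import Counter

# Illustrative placeholder lexicon; the real "Incel Violent Extremism
# Dictionary" is far larger and its entries are not reproduced here.
LEXICON = {"subhuman", "vermin", "slaughter", "massacre"}

def tokenize(text: str) -> list[str]:
    """Lower-case a post and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def lexicon_rate(posts: list[str]) -> float:
    """Return the proportion of all words that match the lexicon.

    A value of 0.01 corresponds to 'one in every 100 words'.
    """
    counts = Counter(tok for post in posts for tok in tokenize(post))
    total = sum(counts.values())
    hits = sum(n for word, n in counts.items() if word in LEXICON)
    return hits / total if total else 0.0

if __name__ == "__main__":
    sample_posts = ["an ordinary post about nothing in particular"]
    print(f"{lexicon_rate(sample_posts):.4f}")
```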
They found the self-proclaimed “involuntary celibates” are migrating from Reddit to alternative online forums, where violent extremist language can be four times more prevalent than in other online incel spaces.
The conglomeration of online spaces, known as the “incelosphere”, has become steadily more extreme over the past six years and has bled into other extremist territories.
Pandemic restrictions and a major incel terrorist attack were each followed by a drastic escalation in the violent language used.
The team’s “Incel Violent Extremism Dictionary” revealed the new forums often have site-specific lexicons featuring increasingly extreme misogyny, racism, dehumanization and violent aggression.
Professor Stephane Baele, of Exeter University, said: “We have found clear evidence of a greater volume of incel discussion online over time, including an increasing use of dehumanizing labels and words depicting violence.”
While the new forums have spurred violent extremist language, Reddit – the initial host of the extreme end of the community – has cracked down.
By implementing terms of use on hate speech and bullying, and removing the major subreddit r/Braincels, the social media platform has tamed the language used.
When it was closed in 2019, the Braincels forum had more than one million posts.
The incelosphere has subsequently migrated to alternative platforms such as standalone online forums and blogs, out of fear that its communities on mainstream sites will be shut down.
There they post far more extreme content.
New sites regularly pop up and are far more challenging to remove than subreddits.
The number of posts has decreased on some of the key channels, leading the researchers to suggest the incelosphere stretches beyond the scope of their investigation.
Dr. Lewys Brace, also of Exeter University, said: “Current activity on Reddit shows users have toned down the more extreme conversation to avoid having the board shut down.
“While Reddit initially hosted communities that increasingly adopted violent extremist language, the platform’s actions seem to have partially tamed the discussions.
“Reddit’s implementation of terms of use on hate speech and bullying and its quarantine policy have made it a more unstable place for incel communities.
“As a result, the Reddit region of the incelosphere has produced a series of different, shorter-lived communities with overlapping yet not identical membership.”
In 2018, in Toronto, Canada, a man plowed a van onto a pavement, killing 11 people.
The months following the attack, and the first three months after Covid restrictions were imposed in the UK, saw a sharp spike in violent extremist language.
This led experts to conclude the incelosphere responds to real-world events.
Baele said: “Discussions hosted by the incelosphere have displayed increasingly violent extremism over time at the ecosystem level, but this evolution has not been uniform.
“Forums now host significantly more violent extremist discussions than subreddits.
“The main lineage of incel online discussion has worryingly hosted, over time, an increasing proportion of dehumanizing outgroup labels and words depicting violence.”
He added: “The decrease in the daily number of posts for the platforms in our dataset, combined with an increase in daily posts to other sites, offers anecdotal evidence that there might now be other prominent online incel spaces that do not feature in our dataset.
“This suggests that the incelosphere is not just a series of online spaces dedicated purely to incel ideology and discussions, but a dynamic and continually evolving ecosystem connected to neighboring ones.”
For the study, published by Taylor & Francis, the team collected content using custom-built web scrapers.
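The scrapers themselves were custom-built for each site and are not described in detail in the coverage, but a minimal sketch of how forum posts might be collected is shown below. The URL and the CSS selector are hypothetical placeholders, not the sites or page structures used in the study.

```python
import requests
from bs4 import BeautifulSoup

def scrape_forum_page(url: str) -> list[str]:
    """Download one forum page and return the text of each post.

    The 'div.post-content' selector is a placeholder; a real scraper
    would be tailored to each site's HTML layout.
    """
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    return [div.get_text(" ", strip=True)
            for div in soup.select("div.post-content")]

if __name__ == "__main__":
    posts = scrape_forum_page("https://example.com/forum/thread/1")
    print(f"Collected {len(posts)} posts")
```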
Now the research team wants to evaluate violent extremist ideation based on images, such as avatars containing pictures of killers or Nazi iconography.
Produced in association with SWNS Talker