On one of the worst days in Toronto’s history — as victims lay in hospital beds, and families received devastating news about loved ones who didn’t survive — they cheered.
Across the internet, members of a deeply misogynistic subculture who call themselves incels, short for involuntary celibates, welcomed the news of what’s now known as the van attack.
One even hailed the killer as their “new saint,” according to images of posts provided to the Star by the Southern Poverty Law Center, a U.S. legal advocacy non-profit.
“Joyous day,” said another.
The 26-year-old driver of the white van who plowed down pedestrians on a busy North York stretch of Yonge Street that day in April 2018 identified himself as part of this “movement” without borders, and said he was “radicalized” online and inspired by its cult figures to complete his “mission,” according to a transcript of an interview with police recently made public. Alek Minassian will stand trial without a jury in Toronto on 10 counts of murder and 16 counts of attempted murder next year.
Incels, experts say, are a rising threat, part of a global far-right ecosystem of angry young men who have been radicalized online and committed a rash of recent attacks, from Christchurch, New Zealand, to El Paso, Texas. Not all of these men are incels, but they are part of this larger web.
And, experts argue, not enough has changed since the van attack on the part of the tech companies and platforms that provide the forums for this hatred.
The Southern Poverty Law Center started tracking incels, under the banner of male supremacy, in 2018. There have always been misogynists, says its intelligence project director Heidi Beirich. But this is something new.
“What has happened over recent years is that members of the white supremacist milieu or the alt-right, whatever you want to call it, have increasingly come to their extremism by an online pathway that almost always involves extreme misogyny,” she said over the phone from the centre’s head office in Montgomery, Alabama.
“You now have this embedded radicalization of largely, though not entirely, young white men who harbour deep hatred of women, and often that’s coupled with a lot of extremist beliefs,” she adds. “This is becoming an increasing domestic terrorism threat.”
Neo-Nazis in past eras, while not recognizing women as their equals, saw them as worthy of protection. “This new crop of extremists is definitely not doing that,” notes Beirich.
The term incel was coined by a Canadian woman in the early 1990s who created a community of lonely people struggling to find connection. But it’s now been overtaken by men with a deep hatred of women who find solace online.
They believe that women who refuse to have sex with them deserve violence. In their worldview, the Stacys (attractive women) prefer the Chads (attractive men) over the incels, who sit at the bottom of the hierarchy but, in their own estimation, deserve to be at the top. In the middle are the “normies,” regular people.
This ideology has “accelerated into multiple attacks” in recent years, starting with self-proclaimed incel Elliot Rodger, who killed six people and injured 14 near the University of California, Santa Barbara, in 2014, says Beirich.
Police also found the 29-year-old killer in the summer 2018 Danforth shooting had a possible interest in incel culture, though they didn’t find a clear motive or association with terrorist or hate groups. A nearly yearlong investigation into the shooting revealed Faisal Hussain had a copy of a misogynistic manifesto Rodger left behind.
There’s been a surge in attacks tied to the broader network of far-right internet hate over the last few months. These men did not call themselves incels, but are part of the larger trend of angry young men radicalized online.
The members of this diffuse global community meet, communicate and inspire each other online, on forums ranging from niche message boards to mainstream sites used by billions.
The Christchurch shooter — who killed 51 people — streamed part of his March attack on two mosques on Facebook Live. He also penned a white supremacist manifesto that was shared on Twitter and 8chan, an anonymous message board popular with racists and misogynists.
The following month a man in Poway, Calif., posted a racist anti-Semitic letter on 8chan before allegedly killing one and injuring three in a shooting at a synagogue on the last day of Passover.
Then in August 2019, a Dallas man drove to a busy Walmart in El Paso. It’s been reported in U.S. media that police believe he was inspired by Christchurch, deliberately targeted Hispanic people and posted a racist anti-immigrant manifesto on 8chan before the attack. Twenty-two people were killed and 24 injured, including a two-month-old baby whose parents died trying to shield him from the bullets. The FBI is investigating the shooting as a possible domestic terrorist attack and hate crime.
After El Paso, 8chan’s own founder called for it to be shut down, according to media reports. Cloudflare, an internet security company, cut off its support in August, and the site, described on its Twitter profile as “The Darkest Reaches of the internet,” is now offline.
Just a week later, a 21-year-old Norwegian man allegedly killed his sister and stormed a local mosque, wounding one person. It’s been reported that, this time, he left messages on a new message board called Endchan, saying he was inspired by Christchurch and El Paso. The attack is being investigated as an act of terrorism. Endchan has been offline in recent days. After the attack, its administrators tweeted they’d recently been hit by “a large influx of 8chan refugees ... drastically changing the pace in which the site operates.”
It can be hard to squash every smaller site that takes in “people who’ve been booted off Facebook and Twitter with hateful views,” Beirich says. As soon as one cracks down or goes dark, the worst people on it pop up somewhere else.
But smaller sites have fewer users and are often “preaching to the choir,” Beirich adds. More mainstream sites like Facebook, Twitter, and Google (which owns YouTube) can have a huge impact and reach billions.
It’s those bigger tech companies that need to step up, as they’re the places where new people will be recruited and radicalized, she says.
Until the August 2017 white supremacist rally in Charlottesville, Virginia, the tech companies were not recognizing white and male supremacist hate as a problem, Beirich adds. But “now the kind of conversation we’re having is, why is your implementation so terrible?”
The worst posts cheering on the killer on the day of the van attack were from a now-defunct niche website called incel.me. Minassian said in the police interview that he was “radicalized” on Reddit and 4chan. Hours before the attack, he posted on 4chan using coded incel language, announcing an imminent attack and hoping to inspire others. But it was on the much more mainstream Facebook that he left his last message.
Reddit took steps to curtail incels in November 2017, by taking down a forum devoted to them, and earlier that fall announced a new policy to ban content that incites, encourages, or glorifies violence.
“Communities focused on this content and users who post such content will be banned from the site,” added a spokesperson for the company.
Representatives from 4chan did not respond to requests from the Star for comment. Requests for comment to an administrator email and Twitter account associated with 8chan were not returned.
A Facebook spokesperson responded that “individuals and organizations who spread hate, attack, or call for the exclusion of others on the basis of who they are have no place on our services.”
The social network’s policy on dangerous individuals and organizations states that it does not allow those “who are engaged in ‘organized hate.’ ” It continues to review “individuals, pages, groups and content” that breach its community standards.
YouTube Canada spokesperson Nicole Bell wrote in an email that hate speech and content that promotes violence have “no place” on the platform, and that the company has “heavily invested” in both humans and technology to quickly detect, review and remove this content.
“Since the Toronto van attack in 2018, we’ve been taking a close look at our approach toward hateful content in consultation with dozens of experts in subjects like violent extremism, supremacism, civil rights, and free speech,” she added, saying that consultation has led to changes in how the company tackles these issues.
A spokesperson for Twitter referred the Star to their global policy strategist’s U.S. congressional testimony from June 2019, in which he explained that the company suspended more than 1.5 million accounts for violations related to terrorism between August 2015 and 2018, and has seen a steady decrease in terrorist organizations trying to use the service over the years.
Stephanie Carvin, an assistant professor of International Relations at Carleton University, acknowledges finding, reviewing and removing this content can be tough. But, she notes, it’s been done before.
“There’s always going to be dark corners of the internet, but we were pretty successful about taking down Islamic State propaganda,” she says.
“The far right is far more affluent and it’s far less cohesive. But still, it should be easy to identify the nodes of these networks.”
Carvin says anonymous online communities provide a forum for these men that pushes them toward violence.
“You’re daring each other to do more and more extreme things,” she says.
“These individuals are carrying out attacks. They’re killing lots of people, there’s transnational links, they’re inspiring each other.”
She says social media companies need to do a better job of enforcing their own terms and conditions. “It’s hard, but you run a business. Is this how you want your business being used?” she asks.
The companies depend on user reporting, artificial intelligence and human judgment calls by moderators to enforce their policies. But there have been multiple reports that this is not done consistently.
One investigation found “uneven” enforcement of Facebook’s hate speech policies; after the reporters asked the social media giant about its handling of 49 offensive posts, the company acknowledged its content reviewers had made the wrong call on almost half of them. Another investigation by a U.S. non-profit earlier that year found Facebook’s policies tend to favour governments and elites over individuals with less power. A separate report documented more than 1,000 examples of posts, comments and pornographic images attacking the Rohingya and other Muslims that were still on Facebook in 2018, despite Mark Zuckerberg’s assurances that the company was cracking down.
CNBC reported in August that Twitter users have been switching their country location to Germany, where local laws require companies to pull down Nazi content quickly, in order to escape online anti-Semitism and racism they are still experiencing on the site.
In his interview with police, Minassian said he did not deliberately target women. He said he just saw a crowded area and decided to “go for it.” But he referred to two incel mass killers in the interview transcript: Elliot Rodger and Chris Harper-Mercer, who killed 10 people at an Oregon community college in 2015. He said he communicated with both online, and that he himself inspired a man in Edmonton to commit an attack.
These claims have not been independently verified by the Star.
In the last year or so since the van attack, law enforcement agencies have begun to recognize incels as a new public safety threat, Carvin says.
CSIS referred to the van attack, as well as the 2017 Quebec mosque shooting, in its 2018 public report, published in June, under the heading of “Right Wing Extremism.”
The move, Carvin says, signals a new priority.
While Christchurch catalyzed this shift, the van attack “may have been the start of the momentum,” she says.
The incel ideology is not as clear-cut as that of other terrorist movements, “and more of a collection of random grievances aimed at women generally,” she adds. But, she notes, under Canada’s Criminal Code, terrorist acts can be committed “in whole or in part for a political, religious or ideological purpose, objective or cause … with the intention of intimidating the public.”
“These guys are drawing their ideas in part” online, she says.
The reason this threat is not taken more seriously, by companies and society as a whole, is the normalization of rape culture and violence against women, says Nicolette Little, a critical media studies researcher at the University of Calgary.
“If you look at some of the uproar that might happen around what people more standardly think of as a terrorist attack and compare it to the kind of uproar or lack thereof around this kind of thing, I think that’s a really interesting point to consider,” she says.
“It seems like these events happen, like this van attack, and there is sort of the willingness to do something but then it just fades so quickly.”
She worries about copycat attacks, and notes that the van attacker evoked Rodger in his Facebook post.
“Digital media is a beast that we’re really having trouble controlling and understanding how to control,” she says.
“It’s not any one fight, it’s not any one forum, it’s a rather vast linked network of women-haters online.”
At the same time, she cautions against giving these men too much oxygen, and even questions the continued use of the name they’ve given themselves. Some western leaders and media started using the term Daesh instead of the Islamic State or ISIS, because it’s a derogatory label that delegitimizes the terrorist group, she notes.
“I think we might want to step away from the term incel,” Little says.
“And start calling them what they are, which is really angry loathsome misogynists, who are doing terrible things out of a strange mix of self-loathing and hatred of women.”