Facebook’s announcement this week that it will remove any groups and accounts affiliated with the baseless conspiracy theory QAnon is a welcome move, experts say, but may not be enough to address the damage already done.
“They’ve allowed their platform to be used for the spread of this incredibly poisonous conspiracy theory,” Matthew McGregor, campaigns director of the British advocacy group Hope Not Hate, told CBC’s Thomas Daigle.
“So it is welcome, but it is incredibly frustrating that, yet again, Facebook is actually so slow in taking action against hate on their platform.”
QAnon followers promote an intertwined series of beliefs, based on anonymous web postings from a user identified as “Q,” who claims to have insider knowledge of the Trump administration. A core tenet of the conspiracy theory — which has been amplified on Twitter, Facebook, Instagram and YouTube — is that U.S. President Donald Trump is secretly fighting a cabal of child-sex predators that includes prominent Democrats, Hollywood elites and “deep state” allies.
“QAnon supporters use coded language to try and kind of filter their real beliefs,” McGregor said. “When it comes down to it, this is a poisonous, far-right conspiracy theory rooted in anti-Semitism. But when you see the content online, it’s really about opposing child abuse and hashtags like ‘Save Our Children.’ Those are attempts to get around these bans and attempts to kind of suck people into the conspiracy.”
Less than two months ago, Facebook said it would stop surfacing content from the group and its adherents, although enforcement proved spotty. At the time, it said it would only remove QAnon groups that promoted violence. That is no longer the case under a broader policy, which the company started enforcing Tuesday, aimed at rooting out all QAnon content.
The company cautioned that the effort “will take time and will continue in the coming days and weeks.”
Barbara Perry, director of Ontario Tech University’s Centre on Hate, Bias and Extremism in Oshawa, Ont., anticipates that it will be a challenging undertaking — especially when it comes to the more covert posts on Facebook.
“Those that are especially canny have been very careful in couching their language in ways that just fall short of the community standards or even the legal boundaries around hate speech or misinformation,” Perry said.
“I think we need to continue to engage experts, both internal to the organization and external, to help with that interpretation, if you will, that translation of the emerging terminology so that they know which new posters … which new phrases and terminology needs to be flagged.”
WATCH | What is QAnon?:
Ghayda Hassan, director of the Canadian Practitioners Network for Prevention of Radicalization and Extremist Violence (CPN-PREV), is wary of the commitment from companies like Facebook as well as the ripple effect from its announcement.
“Big tech companies have shown some interest so far, but to my belief, to my knowledge, have not engaged seriously in conversation and also in the efforts,” said Hassan, a psychologist and professor at L’Université du Québec à Montréal (UQAM).
“We know that strong censorship may also produce backlash. There needs to be a global initiative. Many players have to embark and not just one.…
“It’s definitely not the only way to go and not enough.”
Joan Donovan, research director at Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy in Cambridge, Mass., says Facebook’s announcement was aimed at targeting the wider QAnon network.
“I think they realized that these groups are able to really spawn more and more pages if you don’t remove practically the entire network at once,” she said.
“These groups are highly motivated to stay on broad social media platforms. So they already have places in which they gather and they talk that are off of Facebook and Twitter and YouTube. But if they’re going to recruit new folks or they’re going to reach new audiences, they have to be in these more public places.”
Taking hold amid the pandemic
Investigations have shown that social media recommendation algorithms can drive people who show an interest in conspiracy theories toward more material. A report by the Institute for Strategic Dialogue (ISD) in July found that the number of users engaging in discussion of QAnon on Twitter and Facebook has surged this year, with membership of QAnon groups on Facebook growing by 120 per cent in March.
In June, another ISD report identified more than 6,600 online pages, accounts or groups where Canadians were involved in spreading white supremacist, misogynistic or other radical views.
On Wednesday, Facebook Canada said it removed Radio-Québec, one of the province’s most prominent QAnon advocates, as part of its new efforts.
WATCH | Facebook targets QAnon group Radio-Québec:
The ongoing COVID-19 pandemic has given these groups an opening to take hold in mainstream online discourse: people are spending more time online, particularly on social media, and having fewer of the face-to-face conversations in which the more outlandish theories might be directly challenged.
“We have some very significant information challenges. And unfortunately, the internet, the way that it’s built, is built to surface things that are popular and fresh rather than are true and correct. And so truth is really at a disadvantage in this moment.”
Normalization of conspiracies
Donovan says it’s important to understand how conspiracy theories such as QAnon become normalized. At least one Republican candidate who espouses QAnon beliefs is on track to earn a seat in the U.S. House of Representatives: Marjorie Taylor Greene, who won a Georgia primary runoff in August for a heavily Republican congressional district.
“There are aspects of the QAnon conspiracy theory that haven’t been normalized, pieces of it that are at its root and where it got started that are incredibly anti-Semitic, that haven’t really broken through,” Donovan said.
The U.S. House of Representatives voted to condemn QAnon last week, with Republican Rep. Denver Riggleman, who co-sponsored the resolution, saying the anti-Semitic posts on social media “should cause concern for everyone.”
However, Donovan noted that other aspects of the conspiracy theory “have gripped people,” particularly with regard to the unfounded allegations of pedophilia and sex trafficking. Recently, QAnon proponents organized protests against child trafficking and were involved in a pro-police demonstration in Portland, Ore.
“That’s why we need platform companies to be thinking more strategically about what is the communication they want to support, what are the groups in the communities that they believe will benefit from using their products and how are they going to then moderate these other groups that are really acting almost like a parasite,” Donovan said.
“They’re attaching themselves to other groups and then over time really taking over the host.”
LISTEN | What’s needed to tackle conspiracies and extremism online:
Spark (12:48): Understanding online extremism begins with ‘whole society’ approach, expert says
‘Damaging impact’ on presidential election
McGregor says the onus is on social media platforms to clamp down on groups and accounts that spread QAnon conspiracies, as they have “allowed this to grow to a point where damage has already been done and damage will continue to be done.”
“To have done this three weeks out from the election rather than six months out from the election, it genuinely has made a very, very damaging impact on the election itself,” he said.
Hassan, meanwhile, says a multi-pronged approach, including fact-checking and media literacy initiatives by news organizations and groups such as the Canadian non-profit Media Smarts, is critical to properly addressing the issue.
“I think actions should be at different levels, and at a global prevention level, we must multiply initiatives around critical literacy, around checking information before sharing,” she said.
What other companies have done
Facebook’s announcement follows similar efforts by other social media companies to weed out QAnon content.
A spokesperson for the short-form video app TikTok told Reuters it has blocked dozens of QAnon hashtags. A Reddit spokeswoman said the site has removed QAnon communities that repeatedly violated its rules since 2018, when it took down forums such as r/greatawakening. A YouTube spokesperson said it has removed tens of thousands of Q-related videos and terminated hundreds of Q-related channels for violating its rules since updating its hate speech policy in June 2019.
“The disinformation around the election, around COVID, there are a number of things coming together at the same time … to finally sort of make them recognize the impact of not just hate speech but disinformation on community practice and community safety,” Perry said.
Beyond social media, e-commerce site Etsy said it was removing all QAnon merchandise for purchase. CBC News reached out to Amazon and eBay representatives on Tuesday to ask whether they would do the same but did not receive a response.
“Companies are incentivized by the number of people that visit their platforms. They make most of their money from advertising,” McGregor said. “And what they need is a counter to that — the counter of consumers, other users, activist groups, politicians saying this is not acceptable.
“If they’re not under pressure to act, the dollars will keep clicking up and the incentive really isn’t there.”