Washington: Facebook and Twitter promised to stop encouraging the growth of the baseless conspiracy theory QAnon, which fashions President Donald Trump as a secret warrior against a supposed child-trafficking ring run by celebrities and government officials, after it reached an audience of millions on their platforms this year. But the social media companies still aren’t enforcing even the limited restrictions they’ve recently put in place to stem the tide of dangerous QAnon material, a review by The Associated Press found.
Both platforms have vowed to stop suggesting QAnon material to users, a powerful way of introducing QAnon to new people. But neither has actually succeeded at that. On Wednesday, hours after a chaotic debate between Trump and Democratic presidential nominee Joe Biden, a video from a QAnon account that falsely claimed Biden wore a wire to cheat during the event was trending on Twitter, for example.
Twitter is even still running ads alongside QAnon material, in effect profiting off the type of tweets that it has vowed to limit. In some cases Facebook is still automatically directing users to follow public and secret QAnon pages or groups, the AP found.
“Their algorithm worked to radicalize people and really gave this conspiracy theory a megaphone with which to expand,” Sophie Bjork-James, an anthropologist at Vanderbilt University who studies QAnon, said of social platforms. “They are responsible for shutting down that megaphone. And time and time again they are proving unwilling.” The QAnon phenomenon sprawls across a patchwork of secret Facebook groups, Twitter accounts and YouTube videos.
QAnon has been linked to real-world harm, including criminal cases of kidnapping, and to dangerous claims that the coronavirus is a hoax. The conspiracy theory has also seeped into mainstream politics: several Republicans running for Congress this year are QAnon-friendly. Although relegated to the backwaters of the internet for years, QAnon posts reached millions of people via social media this year.
Interactions — primarily likes and comments — with public Facebook and Instagram posts that included QAnon terms began climbing in March. By July, they received more attention than at any other point in the last year, according to an AP analysis of data from CrowdTangle, a Facebook-owned tool that helps track material on the platforms.
That month, public posts on Facebook-owned Instagram featuring the #QAnon hashtag received an average of 1.27 million likes and comments every week, according to the analysis. Some of those posts included news stories about QAnon. But the majority of the most popular Instagram posts during July were expressing support for the conspiracy theory, President Donald Trump, or far-right conservative causes, the AP found.
One post that used the QAnon hashtag, which racked up nearly 20,000 likes, claimed that no one had died from the coronavirus. Another was a photo of Donald Trump that called him “One of God’s Finest Warriors.” Twitter didn’t limit the conspiracy theory until July 21, when it announced it was kicking off 7,000 QAnon accounts and promised to stop promoting or recommending QAnon. Facebook introduced its new rules on August 18, pledging to stop encouraging users to join QAnon groups, banning QAnon hashtags and removing thousands of QAnon groups that encouraged violence.
“Unfortunately, it was too late and not enough,” Bjork-James said. The AP also discovered more than a dozen popular QAnon accounts on Twitter that collectively maintain a following of nearly 1.5 million users, almost all of which were recommended to users who followed other QAnon accounts.
And Twitter appears to be profiting from those QAnon accounts. Nearly all of the accounts the AP identified had ads showing in their feeds for big brand names that sell everything from beer to toilet paper. That doesn’t mean the brands intentionally placed their ads in the feeds of the accounts, although it does suggest that Twitter isn’t preventing the ads from appearing next to QAnon material.