More than 1 in 50 tweets about voting in the 2020 elections in August and September were posted by QAnon accounts, according to research released Friday by Advance Democracy, a nonpartisan nonprofit.
The research also found that 2 in every 25 tweets using the hashtag #voterfraud originated from QAnon accounts, a sign of how ubiquitous the conspiracy theory has become on Twitter, one of the last remaining major social media platforms to allow QAnon content.
While still a small percentage of the broader discussion, the data shows how the QAnon movement has succeeded in becoming an outsize influence on election discourse. Researchers have warned that Trump-friendly social media networks and far-right media are poised to amplify any of the president's claims about the election.
QAnon is a conspiracy theory that falsely claims the world is run by a secret cabal of child-eating Democrats and Hollywood celebrities who worship Satan, and that President Donald Trump is fighting a secret war to stop them.
YouTube and Facebook have both banned QAnon content from their services in the last two weeks, citing a steady stream of violent real-world incidents caused by the conspiracy theory. Some QAnon believers have recently begun joining local armed militant groups, and one of the 13 men who plotted to kidnap Michigan Gov. Gretchen Whitmer posted about his belief in the conspiracy theory.
“The fact that one in 50 tweets about voting in the 2020 elections are coming from QAnon accounts illustrates just how far social media companies still need to go to combat this dangerous conspiracy theory,” said Daniel J. Jones, the president of Advance Democracy and a former investigator for the United States Senate and the FBI.
“I hope this is a wake-up call for those doubting the breadth and depth of QAnon-related activity on social media,” he added.
Advance Democracy conducted the research using the social media analytics tool Zignal, which combs the platform for tweets matching specific parameters. It defined QAnon-related accounts as any account with “QAnon” or QAnon-specific catchphrases in its bio, like #WWG1WGA, an acronym for the QAnon motto “where we go one, we go all.”
The data included tweets featuring terms like “vote” and “vote by mail” and excluded accounts discussing voting in contests unrelated to the presidential election, such as fans of the K-pop band BTS, who regularly game online polls.
The research was conducted from Aug. 15 to Sept. 30, two months after Twitter said it would take steps to limit the reach of QAnon accounts on its platform.
On June 21, Twitter said it would work to limit QAnon accounts from appearing prominently in searches and trending topics, and ban QAnon accounts that participated in online harassment. The company said last month that the new policy cut the number of tweets about QAnon in half.
Still, QAnon accounts are appearing prominently on Twitter and their conspiracy theories are reaching the president. On Wednesday, Trump retweeted a false conspiracy theory from a QAnon account that former President Barack Obama had accidentally ordered the killing of Osama bin Laden’s body double, and that former Vice President Joe Biden ordered the killing of SEAL Team 6 to cover it up. Robert J. O’Neill, a member of the team that found bin Laden, immediately disputed the theory on Twitter and in an appearance on CNN.
Twitter banned that QAnon account after the president retweeted it. During Thursday’s town hall, Trump declined to denounce QAnon.
Advance Democracy said that at least 100,000 accounts on Twitter with QAnon-related terms in their bios remain on the platform.
“It’s important that Facebook and Twitter have taken action to curb the spread of conspiracy groups like QAnon that are promoting disinformation -- and, in some cases, violence -- on their platforms,” Jones said. “But as this research shows, the social media platforms need to do more.”