Archived link.

On Jan. 6, 2021, QAnon conspiracy theorists played a significant role in inciting Donald Trump supporters to storm the Capitol building in D.C., hoping to overturn the 2020 election in favor of Trump.

Days later, Twitter suspended tens of thousands of QAnon accounts, effectively banning most users who promote the far-right conspiracy theory.

Now, a new study from NewsGuard has found that QAnon has had a resurgence on X, formerly Twitter, over the past year under Elon Musk’s ownership.

QAnon grows on X

Tracking commonly used QAnon phrases like “QSentMe,” “TheGreatAwakening,” and “WWG1WGA” (which stands for “Where We Go One, We Go All”), NewsGuard found that these QAnon-related slogans and hashtags have increased by a whopping 1,283 percent on X under Musk.

From May 1, 2023, to May 1, 2024, there were 1.12 million mentions of these QAnon supporter phrases on X. This was a huge uptick from the 81,100 mentions recorded just one year earlier, from May 1, 2022, to May 1, 2023.
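
The reported growth rate is consistent with those rounded counts. As a quick sanity check, here’s a minimal sketch (NewsGuard’s exact, unrounded totals would shift the result slightly):

```python
# Sanity-check NewsGuard's reported ~1,283 percent growth figure
# using the rounded mention counts quoted in this article.
mentions_2022_2023 = 81_100      # May 1, 2022 - May 1, 2023
mentions_2023_2024 = 1_120_000   # May 1, 2023 - May 1, 2024

increase = (mentions_2023_2024 - mentions_2022_2023) / mentions_2022_2023 * 100
print(f"Increase: {increase:.0f}%")  # ~1281%, in line with the reported 1,283%
```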

One of the most viral QAnon-related posts of the year, promoting the “Frazzledrip” conspiracy theory, has received more than 21.8 million views, according to the report. Most concerning, however, is that it was posted by a right-wing influencer who has received direct support from Musk himself.

The Jan. 2024 tweet was posted by @dom_lucre, a user with more than 1.2 million followers who commonly posts far-right conspiracy theories. In July 2023, @dom_lucre was suspended on then-Twitter. Responding to @dom_lucre’s supporters, Musk shared at the time that @dom_lucre was “suspended for posting child exploitation pictures.”

Sharing child sexual abuse material, or CSAM, would result in a permanent ban on most platforms. However, Musk personally intervened in favor of @dom_lucre and reinstated his account.

Since then, @dom_lucre has posted about how he earns thousands of dollars directly from X. The company allows him to monetize his conspiratorial posts via the platform’s official creator monetization program.

Musk has also previously voiced his support for Jacob Chansley, a QAnon follower known as the “QAnon Shaman,” who served prison time for his role in the Jan. 6 riot at the Capitol.

The dangers of QAnon

QAnon’s adherents follow a number of far-right conspiracy theories, but broadly (and falsely) believe that former President Trump has been secretly battling a global cabal of Satanic, baby-eating child traffickers made up primarily of Democratic Party politicians and Hollywood elites.

Unfortunately, these beliefs have too often turned deadly, and numerous QAnon followers have been involved in killings fueled by the conspiracy theory. In 2022, one Michigan man killed his wife before being fatally shot in a standoff with police; his daughter said he had spiraled out of control as he fell into QAnon conspiracies. In 2021, another QAnon conspiracy theorist killed his two young children, claiming that his wife had “Serpent DNA” and that his children were monsters.

Of course, QAnon never completely disappeared from social media platforms. Over the past few years, its followers continued to espouse their beliefs, albeit in a more coded manner, to circumvent platforms’ policies. Now, though, QAnon believers are once again being more open about their radical theories.

The looming November 2024 presidential election likely plays a role in QAnon’s sudden resurgence on X, as QAnon-believing Trump supporters look to help their chosen candidate. However, Musk and X have actively welcomed these users to their social media service, eagerly providing them with a platform to spread their dangerous falsehoods.

  • kbal@fedia.io

    I don’t know, it just seemed to me they might have had in mind that whoever is trying to “control the narrative” would find competing disinformation campaigns just as unwelcome.

    • mozz@mbin.grits.dev

      I have this surreal experience sometimes where I’ll say something like “I don’t think the US government is alone in wanting that gone so they can control the narrative instead” and then find someone lecturing me about how exactly what I just got done saying might be true.

      That said, the person I was talking to was clearly implying that banning TikTok would be a bad thing because people can get unfiltered information through it. You can try to say they were saying something else that’s more sensible, if you want. I won’t stop you. They don’t seem to want to clarify it themselves, so it’s hard to say.

      • orca@orcas.enjoying.yachts

        The TikTok ban is a weird gray area. I don’t like any of these tech companies, or the fact that they are essentially monetizing terror, and I will never have allegiance to them or to the entities that influence them to deceive for their own profit. But TikTok is also a pipeline for information either way, which is something we are losing more of every day with the constant deaths of journalists and journalism outlets. People need to remember that no one is immune to propaganda, and we should take all of the information with a grain of salt.

        But if we’re talking about the things happening in Gaza, any eyes we can get add value. It’s not like this is some new event that we don’t have documented history on. We’re witnessing the next stage of the genocide, and TikTok (hate it or love it) has helped expose a lot of the atrocities (in the same way that Twitter used to).

        It’s really not a stretch to say that anti-imperialist governments and individuals benefit from this, because it exposes the atrocities the US and Britain have been perpetrating for years. Of course that’s a benefit to them. It’s also a benefit to the people forced to live under the governments committing those atrocities.