The QAnon conspiracy theory was born on the internet, and while it has spread to real-life rallies in the US and abroad, it has continued to thrive online. And yet social-media platforms, where the conspiracy theory gains power and radicalizes people in the US and abroad, have generally been slow to act on banning it.
The conspiracy-theory movement made news again last week, when many of its followers were seen participating in the deadly riot at the US Capitol. The insurrection was fueled by voter-fraud conspiracy theories popularized by the president himself, but QAnon influencers, among other far-right figures, helped spread the theories.
QAnon is a baseless far-right conspiracy theory that claims President Donald Trump is fighting a deep-state cabal of elite figures who are involved with human trafficking. It is unfounded, and yet its followers — estimated to be in the millions — have reportedly been linked to several alleged crimes, including killings and attempted kidnappings. In 2019, a bulletin from the FBI field office in Phoenix warned that the conspiracy theory movement could become a domestic terrorism threat.
Here’s how major tech companies have handled the spread of the QAnon conspiracy theory online.
This article was updated to include how companies adjusted their policies in the wake of the Capitol siege, which was linked to QAnon.
Facebook said it is cracking down on QAnon across its platforms.
On October 6, Facebook announced it would remove all pages, groups, and Instagram accounts that promoted QAnon.
The ban, which the company said would be enacted gradually, comes after the platform previously announced over the summer that it had removed 790 QAnon Facebook groups.
Extremism researchers are tracking how the new ban will play out, as the movement has spread rapidly on Facebook and on Instagram, where many are using “Save the Children” rhetoric to further propagate the movement’s misguided focus on human trafficking conspiracy theories.
Facebook has been criticized for its slowness in acting against QAnon.
Twitter announced a moderation plan on QAnon in July. After the Capitol riot, the platform said it removed 70,000 QAnon-associated accounts.
The New York Times reported on Monday that Twitter had removed more than 70,000 accounts that had recently promoted QAnon.
“These accounts were engaged in sharing harmful QAnon-associated content at scale and were primarily dedicated to the propagation of this conspiracy theory across the service,” Twitter said in a blog post.
Twitter announced its first sweeping ban on QAnon in July, suspending accounts that were “engaged in violations of our multi-account policy, coordinating abuse around individual victims, or are attempting to evade a previous suspension.”
The platform said it would also stop recommending QAnon-related accounts and trends and block URLs associated with QAnon from being shared on Twitter.
In the past, critics have said the platform was slow to act on the movement and hadn't moderated the community enough. On October 3, The Washington Post reported that there were still 93,000 active Twitter accounts referencing QAnon in their profiles, citing research from Advance…
Source: Business Insider