    QAnon: The virality of a baseless conspiracy theory

    A certain look has taken over everything: a colour palette, a font, a particular design. This visual style sells meal kits, clothes, newspaper subscriptions, and just about anything else, and it is especially visible on social media. Such is the story of QAnon. What began as a seemingly harmless, trendy Instagram hashtag about child trafficking, #savethechildren, now has over 800,000 posts from influencers and regular users alike.

    Following this, membership in Facebook pages and groups branded as anti-child-trafficking grew by 3,000% between July and September 2020, and by the end of August, in-person rallies were taking place in cities across the world. Ending child trafficking is not controversial, but behind that surge in growth is the baseless conspiracy theory known as QAnon. An ideology once confined to the more obscure parts of the Internet is now finding its way into the mainstream, one Instagram post at a time.

    On 28 October 2017, the first of a series of posts by an anonymous user nicknamed Q appeared on the 4chan message board /pol/. The community that followed and believed those anonymous postings became known as QAnon, and it developed an elaborate conspiracy theory: that Donald Trump is fighting a global child-trafficking network led by satanic, cannibalistic left-wing paedophile elites. The theory's claims have repeatedly been debunked, but that has not stopped the community from spreading from 4chan to other forums. As it spread, the posts, videos, and memes explaining its ideology became more accessible and digestible. QAnon is now a sprawling tangle of conspiracy theories, with several offshoots, that invites many kinds of conspiratorial thinking.

    On Facebook, QAnon-related activity grew steadily for years without consequence. In March 2019, three leading QAnon Facebook groups saw their membership rise from under 50,000 to over 300,000. By August 2020, an internal investigation at Facebook had found that QAnon groups and pages had amassed more than 3 million followers between them. On 19 August, Facebook announced that it would ban hundreds of QAnon pages and groups, and traffic for QAnon phrases and hashtags duly fell. But membership in groups posing as anti-child-trafficking groups exploded, and users in those groups were still largely spreading QAnon content. QAnon followers had simply moved to another hashtag to improve their image, one already in use for a fundraising campaign by the UK-based charity Save the Children.

    One cannot accurately trace how the #savethechildren hashtag jumped from QAnon Facebook groups to mainstream accounts on Instagram, but by July 2020, high-profile accounts were boosting the hashtag with inaccurate and misleading statistics. Perhaps the most damaging part of the #savethechildren movement is that this misinformation is making it harder to fight actual trafficking. There is no reliable data on how many people are trafficked in the US each year, but stereotypical kidnappings are not what trafficking usually looks like. It more often takes the form of forced labour or wage theft, most commonly in agriculture, domestic work, or sex work. The people most at risk are those who are already vulnerable: youth experiencing homelessness, in foster care, or in unstable housing; LGBTQ+ youth who have been ostracised from their homes or communities; and young migrants. The hysteria caused by distorted numbers has led to a deluge of calls and outreach from concerned QAnon followers, overwhelming the organisations fighting actual child trafficking.

    As of October 2020, the number of QAnon supporters remains unclear, but the theory still has a substantial online following. In June 2020, Q urged adherents to take a “digital soldiers oath,” which many did through the Twitter hashtag #TakeTheOath. In July 2020, Twitter banned thousands of accounts and hashtags affiliated with QAnon and changed its algorithms to reduce the spread of the conspiracy theory. Facebook’s internal analysis from August found millions of followers across thousands of pages and groups, and after the platform took action against QAnon activity, followers began using other message boards, such as 8chan and EndChan, where they planned to use information warfare to influence the 2020 United States presidential election.

    Hashtags using typical QAnon language can no longer be found through Instagram search. But because #savethechildren is not an inherently harmful tagline, moderating it has proven particularly difficult. Tackling misinformation on Instagram can also be harder than on Facebook: it is more difficult to train an algorithm to recognise misleading text in slideshow images than in Facebook posts or comments. The real danger is that people need not believe, or even be aware of, the entirety of a conspiracy theory for it to start influencing their decisions. While the movement itself is not particularly organised, it risks inspiring and inciting vigilante violence and deepening divisions within the country.
