I get asked quite a lot about the difference between the original US QAnon movement and the recent rise of a new, soft, global QAnon. Here's how I understand it: The original QAnon, until this year, was primarily an American movement deeply steeped in US culture and politics.
President Trump, the US culture war, partisan party politics and religious narratives of good vs evil and God vs Satan were central to the original QAnon movement. But the Covid-19 pandemic was a game changer. Suddenly, millions of people who'd previously barely heard of QAnon
found themselves in lockdown with hours and hours of time to spend on social media. Some people lost their jobs, were frightened by the impact of a virus about which we knew little, anxious about their loved ones, the wider community and the economy. And they found online content
that acknowledged their fears about lockdown, vaccines, masks, social distancing, jobs, civil liberties and the economy. Naturally, some of that content came from the US QAnon movement, who believed the virus was a plot by the deep state cabal and/or hostile enemies like China
to put an end to the Q operation, Trump presidency and the ensuing "storm". So US QAnon suddenly found a whole new global audience of Covid sceptics who might not necessarily have been interested in internal US politics and culture. And then the big shift happened in June/July,
when social media companies began restricting the famous QAnon terms, phrases and hashtags on their platforms. Suddenly, the reach of QAnon narratives and their ability to recruit new believers was weakened, so believers hit on the idea of hijacking some
well-known, established hashtags and phrases like #SaveTheChildren and #SaveOurChildren. This was such a clever move. Millions of people around the world saw these hashtags pop up on their social media feeds. Who can possibly disagree with the idea of saving children and
opposing child abuse and trafficking? That's something literally all of us, regardless of our politics and personal views, can get behind. This is precisely why global "Save Our Children" marches have become popular, drawing diverse crowds from all walks of life and backgrounds.
Posts, memes and videos about the plight of hundreds of thousands of children around the world resonated with ordinary people in different countries. While some political, religious or cultural aspects of US QAnon might not have been too appealing to these people,
the secret paedo global elite aspect, brought to their attention by #SaveOurChildren, was. This is what I would describe as soft QAnon. And it probably explains why women and young people are heavily involved in these new rallies we are seeing in different parts of the world.
I spoke to people in a London "Save Our Children" march. While most were QAnon followers, some knew little about it or the nitty gritty of US politics, and were only there to campaign for children being trafficked by elites. However, the organisers are proper QAnon believers.
This is a distinction we need to make in our reporting if we want to understand the movement better. Not everyone who posts #SaveOurChildren on social media is necessarily a hardcore QAnon believer. And as QAnon spreads globally, the specifics will differ from one country
to another. So to sum up, two things happened this year which boosted US QAnon and turned it into a global movement with soft QAnon marches around the world: Covid-19 and the hijacking of #SaveOurChildren after social media companies clamped down on original QAnon terms.
Immediately after the Southport attack, baseless rumours began spreading online.
The main source of rumours has been a report by an obscure US "news" website that falsely claims the suspect is an "asylum seeker" named "Ali Al-Shakati", who "arrived in the UK by boat last year".
Merseyside Police has confirmed that the suspect was born in Cardiff, but has yet to name the 17-year-old.
The report also adds that the suspect was "on MI6 watch list", despite the fact that it is MI5, not MI6, that deals with domestic counter-terrorism cases.
The name "Ali Al-Shakati" has since been widely shared online in misleading posts viewed by millions.
Some other outlets, including Russia's RT news channel, have also reported this name, citing the US-based website.
Pro-Kremlin influencers claim the captain of the Dali ship is a Ukrainian.
But online records show a Ukrainian man was the Dali's captain from March to July 2016. The ship that hit the bridge reportedly had an all-Indian crew.
Claims by influencers such as Alex Jones and Andrew Tate that the Baltimore Bridge collapsed due to a "cyber-attack" have been viewed millions of times.
Maryland Governor Wes Moore has said the early investigation points to an accident, with "no evidence of a terrorist attack".
This video, viewed 1.4 million times, claims to show evidence of pre-installed explosives causing the Baltimore Bridge collapse.
What the video shows is not explosives, but most likely sparks from electrical wires.
DC Weekly, a website founded by a former US Marine now living in Russia, has fuelled disinformation stories about Zelensky and Ukraine, including a fake story that he bought two luxury yachts with US aid money, later repeated by some members of Congress.
These are just a few of the disinformation stories published by DC Weekly about Zelensky and Ukraine recently.
They all follow the same pattern: an obscure YouTube video featuring false claims, an article on DC Weekly referring to that video, and viral posts on social media.
All of those articles featuring false claims about Zelensky and Ukraine are written by Jessica Devlin. According to DC Weekly, she's a "highly acclaimed journalist" from NYC.
Except that's actually an image of the author Judy Batalion. Jessica Devlin is a fake persona. She doesn't exist.
A vast Russian influence operation on TikTok involving 12,800 fake accounts spreading disinformation about the war in Ukraine to millions of users in Germany, France, Italy, Poland, Israel and Ukraine, has been uncovered by BBC Verify and @DFRLab.
Back in the summer, this video, featuring an AI-generated voice, racked up millions of views on TikTok and later on Twitter.
It falsely accused Ukraine's former defence minister Oleksiy Reznikov and his daughter Anastasiya Shteinhauz of buying a €7m villa in Cannes, France.
We debunked the viral video back in July. The villa seen in the video wasn't bought by Reznikov, and was actually up for sale.
So, @O_Rob1nson, @adkrobinson and I tried to find out more about the account that originally posted that video to TikTok.