Thread: Online misinformation is rampant following the escalation of violence between Israel and Hamas today.
This video of a tower block in Gaza being hit by a missile is from May 2021, not today. It was captured live during a BBC Arabic broadcast at the time.
This video of a house in Gaza being destroyed by a strike is genuine, but it's actually from May of this year, and not from the fresh escalation in hostilities today.
This video of a house being struck by Israel is also from May of this year, not today.
It was filmed in Beit Hanoun, Gaza, as geolocated by @ChrisOsieck at the time.
This video, shared by right-wing influencers Charlie Kirk and Ian Miles Cheong and viewed millions of times, actually shows Israeli police and special forces outside a house, as is easily identifiable by their uniforms, not Hamas militants.
This is another video of a house in Gaza being targeted by an IDF strike in May of this year. It's not from today.
This video certainly doesn't show Hamas shooting down two Israeli helicopters, because it's actually from the video game Arma 3.
This is complete and utter nonsense, shared for nothing but engagement. Israel hasn't authorised a nuclear strike on Gaza, and the footage shows a US nuclear test from the 1950s.
This video, viewed 230,000 times, is not footage of a Hamas militant shooting down an Israeli helicopter. It's from the video game Arma 3.
This video, viewed more than 3 million times, does not show a building in Israel.
It shows an Israeli strike earlier today on Gaza's Palestine Tower, which houses Hamas radio stations on its rooftop and also holds a cinema.
A fake document is being widely shared online claiming to show the authorisation of $8bn in military aid to Israel by President Biden.
It's a doctored version of a 25 July document detailing $400m in aid to Ukraine authorised by President Biden, fact-checked by @Info_Rosalie.
This is a fake Jerusalem Post account falsely reporting that Prime Minister Benjamin Netanyahu has been taken to hospital, a post that has somehow racked up over 600,000 views.
The post has now got a Community Note attached to it.
If you've come across reports that the Taliban has asked Iran and Iraq for permission to send fighters to Israel, the source for those reports is this viral tweet from what is most likely a fake Taliban PR account with a blue tick.
I particularly like that hand-drawn arrow.
Pro-Kremlin influencers claim the captain of the Dali ship is a Ukrainian.
But online records show a Ukrainian man was the Dali's captain from March to July 2016. The ship that hit the bridge reportedly had an all-Indian crew.
Claims by influencers such as Alex Jones and Andrew Tate that the Baltimore Bridge collapsed due to a "cyber-attack" have been viewed millions of times.
Maryland Governor Wes Moore has said the early investigation points to an accident, with "no evidence of a terrorist attack".
This video, viewed 1.4 million times, claims to show evidence of pre-installed explosives causing the Baltimore Bridge collapse.
What the video shows is not explosives, but most likely sparks from electrical wires.
DC Weekly, a website founded by a former US Marine now living in Russia, has fuelled disinformation stories about Zelensky and Ukraine, including a fake story that he bought two luxury yachts with US aid money, later repeated by some members of Congress.
These are just a few of the disinformation stories published by DC Weekly about Zelensky and Ukraine recently.
They all follow the same pattern: an obscure YouTube video featuring false claims, an article on DC Weekly referring to that video, and viral posts on social media.
All of those articles featuring false claims about Zelensky and Ukraine are written by Jessica Devlin. According to DC Weekly, she's a "highly acclaimed journalist" from NYC.
Except, that's the image of author Judy Batalion. Jessica Devlin is a fake persona. She doesn't exist.
A vast Russian influence operation on TikTok involving 12,800 fake accounts spreading disinformation about the war in Ukraine to millions of users in Germany, France, Italy, Poland, Israel and Ukraine, has been uncovered by BBC Verify and @DFRLab.
Back in the summer, this video, featuring an AI-generated voice, racked up millions of views on TikTok and later on Twitter.
It falsely accused Ukraine's former defence minister Oleksiy Reznikov and his daughter Anastasiya Shteinhauz of buying a €7m villa in Cannes, France.
We debunked the viral video back in July. The villa seen in the video wasn't bought by Reznikov, and was actually up for sale.
So, @O_Rob1nson, @adkrobinson and I tried to find out more about the account that originally posted that video to TikTok.
The meme shared by Elon Musk about the pizzagate conspiracy theory is itself based on the completely false claim that James Gordon Meek, a journalist who recently pleaded guilty to possessing child pornography, had debunked pizzagate. Meek never reported on pizzagate.
The completely false claim that James Gordon Meek had debunked pizzagate was spread back in the summer by QAnon followers, like this blue tick account.
The New York Post has never published such a story about Meek. It's a totally fake image and a made up headline.
Elon Musk has once again fallen for a completely false claim, this time based on a fabricated New York Post headline pushed months ago by conspiracy theorists on his own platform.
If he'd done a simple check before tweeting, he'd have found out the whole thing was false.
This video, viewed over 3 million times, claims to show an Israeli settler running over protesters.
The video's from 9 September, during a protest in Tel Aviv against the government's judicial reforms, and involves no settlers.
This video claims to show two "terrorist" Palestinian journalists reporting near a rocket launcher.
The two are in fact Syrian journalists and the video is from 7 October. They were reporting on retaliatory strikes against the Syrian government after it killed 65 civilians in Idlib.
While Gaza's Al-Shifa hospital has been described by the WHO as a "death zone", the claim that all the premature babies there have died is inaccurate.
Two premature babies tragically died over the weekend, while 31 have now been transferred to an Emirati hospital in Rafah.
Hannah Abutbul, an Israeli influencer, is being falsely identified as the woman in the misleading video of a supposed nurse at Al-Shifa hospital speaking out against Hamas.
I spoke to Hannah earlier. She's not the woman in the video.
This digitally altered video, viewed 13 million times, falsely claims to show an advertisement in New York in which "support Ukraine" is replaced by "support Israel".
No such ad exists. The real ad right now is about the upcoming Trolls film, via @macrinawang @RitornellaNYC.
WARNING: GRAPHIC
This image, viewed 370,000 times, falsely claims to show a Palestinian child shot dead by Israeli troops while fetching water.
This tragic incident happened in Yemen in 2020, and the child was allegedly shot by a Houthi sniper. She reportedly survived.