Social media is awash with false or misleading images, some of which get millions of engagements.
So, here's a simple guide on ways you can quickly check the veracity of an image you see on your social media feeds.
Reverse image search is the most fundamental part of content verification: searching to find out if, when and in what context an image has appeared on the internet before.
Google Lens, Yandex, TinEye and Bing are among the free tools that allow you to do this.
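If you verify images often, you can save a step by opening each engine's search page directly with the image URL filled in. Here's a minimal Python sketch that builds those links; the URL patterns are unofficial assumptions based on each site's public search pages and may change without notice.

```python
from urllib.parse import quote

def reverse_search_urls(image_url: str) -> dict:
    """Build reverse-image-search links for the four engines above.

    NOTE: these query patterns are not documented APIs; they are
    assumptions based on each engine's web interface.
    """
    encoded = quote(image_url, safe="")  # percent-encode the whole URL
    return {
        "google_lens": f"https://lens.google.com/uploadbyurl?url={encoded}",
        "yandex": f"https://yandex.com/images/search?rpt=imageview&url={encoded}",
        "tineye": f"https://tineye.com/search?url={encoded}",
        "bing": f"https://www.bing.com/images/search?view=detailv2&iss=sbi&q=imgurl:{encoded}",
    }

urls = reverse_search_urls("https://example.com/photo.jpg")
for engine, link in urls.items():
    print(engine, link)
```

Paste any of the printed links into a browser to jump straight to that engine's results for the image.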
Lens is Google's excellent tool for checking online content.
Here's a tweet by US conspiracy theorist Stew Peters claiming this cloud was seen in Turkey just before the recent earthquake.
On Chrome, simply right-click on the image and select "search image with Google".
Google Lens will bring up a range of relevant results.
If you click on the first link, you will see a Guardian report clarifying this was a lenticular cloud spotted in Bursa, Turkey, on 19 January - almost three weeks before the earthquake.
You've got your answer. But if you're keen to find the first source of the image, you can keep looking through Google Lens results for the earliest date.
You'll find this Instagram post from 19 January. The user confirms she took the image at 8 o'clock that day.
Look for notable signs or landmarks in an image and use the crop feature to narrow down your search.
In this image, the tower blocks in blue, yellow and green are a distinctive feature. Crop the image in Lens and it'll quickly tell you this is Bogotá, Colombia.
Google Lens results are not chronological, which means you sometimes have to scroll through several pages of results to find the earliest date.
This image went viral days after the Russian invasion, claiming to show children seeing off Ukrainian troops.
Let's check it on Lens.
Crop the image and click on "find image source".
The first few pages of results are all from 2022 and 2023, but on page four you'll see the date 13 May 2016. Click on the link.
You'll see the image on Flickr posted by Ukraine's Ministry of Defence, meaning it's an old photo.
Lens also helps you quickly translate text in another language, say a street or shop sign in an image, or spot text added to a doctored image.
If you click on text in Lens for this alleged Zelensky image, it'll bring up fact-checks which show the photo has been manipulated.
The other platforms work pretty much the same.
Yandex, a Russian search engine, used to be regarded as the most powerful of all reverse search tools. That's no longer the case.
But it's still widely used by journalists, particularly for the Ukraine war. yandex.com/images/
Here's a viral tweet claiming a Kyiv tower block was never hit.
Yandex works best if you save an image on your hard drive and upload it.
Once you've done that, it'll provide you with a series of images and web pages confirming the block was indeed hit on 26 February 2022.
TinEye is another well-known, free reverse image search engine.
While it may not be as comprehensive as Google Lens, TinEye has one genuinely useful feature: it allows you to sort results chronologically.
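That oldest-first sort can also be requested straight from the URL. A small sketch below; the `sort` and `order` parameter names are an assumption taken from TinEye's web interface, not a documented API, so verify the link in a browser before relying on it.

```python
from urllib.parse import quote

def tineye_oldest_first(image_url: str) -> str:
    """Build a TinEye search link sorted oldest-first.

    NOTE: 'sort=crawl_date&order=asc' is an assumption based on
    TinEye's web UI and may change without notice.
    """
    return ("https://tineye.com/search?url="
            + quote(image_url, safe="")
            + "&sort=crawl_date&order=asc")

print(tineye_oldest_first("https://example.com/photo.jpg"))
```

Opening that link should list the earliest crawled copies of the image first, which is exactly what you want when hunting for the original source.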
Install the @InVID_EU Chrome extension, right-click on any online image and select the InVID debunker: it'll bring up a range of direct reverse-search options across multiple tools.
We'll talk more about @InVID_EU in my next thread about video verification.
If you liked this thread, please feel free to check my other thread on verifying fake or manipulated screenshots of tweets and social media posts.
We'll try to learn how to verify online videos in an upcoming thread.
Pro-Kremlin influencers claim the captain of the Dali ship is a Ukrainian.
But online records show a Ukrainian man was the Dali's captain from March to July 2016. The ship that hit the bridge reportedly had an all-Indian crew.
Claims by influencers such as Alex Jones and Andrew Tate that the Baltimore Bridge collapsed due to a "cyber-attack" have been viewed millions of times.
Maryland Governor Wes Moore has said the early investigation points to an accident, with "no evidence of a terrorist attack".
This video, viewed 1.4 million times, claims to show evidence of pre-installed explosives causing the Baltimore Bridge collapse.
What the video shows is not explosives, but most likely sparks from electrical wires.
DC Weekly, a website founded by a former US Marine now living in Russia, has fuelled disinformation stories about Zelensky and Ukraine, including a fake story that he bought two luxury yachts with US aid money, later repeated by some members of Congress.
These are just a few of the disinformation stories published by DC Weekly about Zelensky and Ukraine recently.
They all follow the same pattern: an obscure YouTube video featuring false claims, an article on DC Weekly referring to that video, and viral posts on social media.
All of those articles featuring false claims about Zelensky and Ukraine are written by Jessica Devlin. According to DC Weekly, she's a "highly acclaimed journalist" from NYC.
Except, that's the image of author Judy Batalion. Jessica Devlin is a fake persona. She doesn't exist.
BBC Verify and @DFRLab have uncovered a vast Russian influence operation on TikTok: 12,800 fake accounts spreading disinformation about the war in Ukraine to millions of users in Germany, France, Italy, Poland, Israel and Ukraine.
Back in the summer, this video, featuring an AI-generated voice, racked up millions of views on TikTok and later on Twitter.
It falsely accused Ukraine's former defence minister Oleksiy Reznikov and his daughter Anastasiya Shteinhauz of buying a €7m villa in Cannes, France.
We debunked the viral video back in July. The villa seen in the video wasn't bought by Reznikov, and was actually up for sale.
So, @O_Rob1nson, @adkrobinson and I tried to find out more about the account that originally posted that video to TikTok.
The meme shared by Elon Musk about the pizzagate conspiracy theory is itself based on the completely false claim that James Gordon Meek, a journalist who recently pleaded guilty to possessing child pornography, had debunked pizzagate. Meek never reported on pizzagate.
The completely false claim that James Gordon Meek had debunked pizzagate was spread back in the summer by QAnon followers, like this blue tick account.
The New York Post has never published such a story about Meek. It's a totally fake image and a made up headline.
Elon Musk has once again fallen for a completely false claim, this time based on a fabricated New York Post headline pushed months ago by conspiracy theorists on his own platform.
If he'd done a simple check before tweeting, he'd have found out the whole thing was false.
This video, viewed over 3 million times, claims to show an Israeli settler running over protesters.
The video's from 9 September, during a protest in Tel Aviv against the government's judicial reforms, and involves no settlers.
This video claims to show two "terrorist" Palestinian journalists reporting near a rocket launcher.
The two are in fact Syrian journalists and the video is from 7 October. They reported retaliatory strikes against the Syrian government, after it killed 65 civilians in Idlib.
While Gaza's Al-Shifa hospital has been described by the WHO as a "death zone", the claim that all the premature babies there have died is inaccurate.
Two premature babies tragically died over the weekend, while 31 have now been transferred to an Emirati hospital in Rafah.
Hannah Abutbul, an Israeli influencer, is being falsely identified and doxxed as the woman in the misleading video of a supposed nurse at Al-Shifa hospital speaking out against Hamas.
I spoke to Hannah earlier. She's not the woman in the video.
This digitally altered video, viewed 13 million times, falsely claims to show an advertisement in New York in which "support Ukraine" is replaced by "support Israel".
No such ad exists. The real ad right now is about the upcoming Trolls film, via @macrinawang @RitornellaNYC.
WARNING: GRAPHIC
This image, viewed 370,000 times, falsely claims to show a Palestinian child shot dead by Israeli troops while fetching water.
This tragic incident happened in Yemen in 2020, and the child was allegedly shot by a Houthi sniper. She reportedly survived.