Reverse image search using Google Lens is often superior to Google's classic reverse image search. Officially, Google Lens is only available in the iOS and Android apps, not on the desktop. So how do you get it running on your PC or Mac? Next tweets! (1/....)
For Linux and Windows users, download Chrome Beta from google.com/chrome/beta/ and open Experiments. Next to 'Search your screen with Google Lens', select Enabled. Then right-click and choose 'Search Part of the Page with Google Lens' from the context menu. (2/....)
Follow the rest of the instructions here: bit.ly/3l25lXW . For Mac, it's more complicated: first download an Android emulator via bluestacks.com/index.html, install the Google app, and click the Lens symbol. (3/3)
Forget what I just said! Found a way, by accident, that is way faster and works for both Windows and Mac! Will post it in a few minutes. (4/4)
Yup, it works! Glad to announce a very simple way to use Google Lens on the desktop.
Step 1: type chrome://flags straight into Chrome's address bar
(5/...)
Now search with Command-F or Control-F for the word LENS and ENABLE it. Restart the browser. (6/..)
Right-click on any photo on a webpage!!! Have fun.
(7/7)
Oh, wow, now something else to share with you. In Chrome, the search will lead to a link. That link works in many browsers! Can you help me and tell me if it works for you? (8/..) lens.google.com/search?p=ASQ0R…
Found something else to share. Now that you have Google Lens, you may notice you can't upload your own photos, only search images that are already on other webpages. How do you fix that? (9/...)
Save your photo on your hard disk. Open it with Google Chrome like this: (10/...)
Voila! From now on, you can use any photo, including your own, to be scrutinised by Google Lens on your desktop. (11/11)
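If you would rather script that step than click around, here is a minimal sketch in Python. It simply opens a locally saved photo in your default browser so you can right-click it and search with Lens; the file name is hypothetical, and it assumes your default browser is Chrome with the Lens experiment enabled.

```python
# Minimal sketch: open a locally saved photo in the browser so it can be
# right-clicked and searched with Google Lens on the desktop.
# Assumption: the default browser is Chrome with the Lens experiment enabled.
import webbrowser
from pathlib import Path

photo = Path("my_photo.jpg").resolve()   # hypothetical file name
webbrowser.open(photo.as_uri())          # opens file:///... in the default browser
```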
However, bear in mind that neither Google Lens nor Google Image Search is good at face recognition anymore, as a result of GDPR. Use Lens for objects, landscapes, and any other kind of geolocation work. Have fun, you all! (12/12)
Oh my, we can even improve the last tip (how to use Google Lens for your own photos). Thanks to @StefanVoss, I adapted the tip slightly. Here is the shortcut: type lens.google.com/search?p in your browser, ignore the error, and just drag your image onto the page. (13/13)
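And if the image you want to check is already hosted online, you can skip the drag-and-drop entirely. The sketch below builds a Lens lookup URL from an image URL; it relies on the lens.google.com/uploadbyurl endpoint, which is undocumented and may change, so treat that part as an assumption. The image URL is a placeholder.

```python
# Sketch only: build a Google Lens lookup URL for an image that is already
# hosted online. The uploadbyurl endpoint is undocumented and may change.
import webbrowser
from urllib.parse import quote

image_url = "https://example.com/photo.jpg"   # hypothetical image URL
lens_url = "https://lens.google.com/uploadbyurl?url=" + quote(image_url, safe="")
webbrowser.open(lens_url)
```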
Why bother? You can always ask me. This is why. Google reverse image search does not recognize this still from a stream on Syrian state TV (grainy, poor quality). Google Lens, however, correctly identifies the city: Rojava! (14/14)
So the moment @StefanVoss pointed out that you can upload it via 'Computer', I tried some other stuff and found the fastest way: you can just drag and drop your photo onto the error screen.
The only downside is that Google Lens on the desktop doesn't recognize text as well as the Google Lens app on mobile or tablet does. So if you want to research text, go for the app. But for geolocation, it's great on all platforms. (15/15)
Introducing the first public version of Detectai.Live, a versatile, still experimental tool for image detection. What can you do with it? It's aimed at reporters, writers and investigators. (1/6)
Upload a picture and get web links, when found. A nice feature is that it will also detect public faces, something that Google doesn't do that well anymore. Under the hood we run Google Vision for this, via Vertex AI. (2/6)
If the picture could be fake (i.e. AI-generated), you will get a verdict after about 15 seconds that helps you judge it. If we find links, we want you to have a look yourself. If not, it will try to give a verdict as a percentage and explain why. (3/6)
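For readers who want to peek at the kind of call that sits behind a tool like this, here is my own minimal sketch using the publicly documented Google Cloud Vision Python client (web detection for links, face detection for public faces). It is not Detectai.Live's actual code, the file name is hypothetical, and it needs a Google Cloud project with the Vision API enabled.

```python
# Minimal sketch (not Detectai.Live's code): web + face detection with the
# Google Cloud Vision client. Requires GOOGLE_APPLICATION_CREDENTIALS.
from google.cloud import vision

def inspect(path: str) -> None:
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as handle:
        image = vision.Image(content=handle.read())

    # Web detection: pages on the open web that contain a matching image.
    web = client.web_detection(image=image).web_detection
    for page in web.pages_with_matching_images[:5]:
        print("match:", page.url)

    # Face detection: how many faces Vision finds in the picture.
    faces = client.face_detection(image=image).face_annotations
    print(f"{len(faces)} face(s) detected")

inspect("upload.jpg")   # hypothetical file name
```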
ChatGPT today quietly scrubbed nearly 50,000 shared conversations from Google's index after our investigation. They thought they'd solved the problem. They were wrong. (1/5)
A new Digital Digging investigation, conducted with @osint77760, has uncovered 110,000 ChatGPT conversations preserved in Archive.org's Wayback Machine, a digital time capsule OpenAI can't touch. (2/5)
@osint77760 While OpenAI scrambled to de-index conversations from Google, they forgot the internet's most basic rule: nothing truly disappears. Archive.org had already captured everything. (3/5)
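You can check this yourself. The sketch below is my own illustration, not the investigation's tooling: it asks the Internet Archive's CDX API for archived ChatGPT share links. The chatgpt.com/share/ path is an assumption about where shared conversations live.

```python
# Sketch: list archived ChatGPT share links via the Wayback Machine CDX API.
# Assumption: shared conversations sit under chatgpt.com/share/.
import requests

params = {
    "url": "chatgpt.com/share/*",    # trailing * = prefix match
    "output": "json",
    "fl": "timestamp,original",
    "collapse": "urlkey",            # one row per unique URL
    "limit": "50",
}
rows = requests.get("https://web.archive.org/cdx/search/cdx",
                    params=params, timeout=30).json()
for timestamp, original in rows[1:]:   # first row is the header
    print(f"https://web.archive.org/web/{timestamp}/{original}")
```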
New investigation: when I asked chatbots for something behind a paywall, they delivered the complete story, including quotes that should have been impossible to access without paying. This isn't a bug. It's a feature. digitaldigging.org/p/how-ai-bots-…
ChatGPT, Perplexity & Grok systematically access subscriber-only content from major publications—often requiring just 2-5 follow-up questions to extract complete articles. Gemini and Claude are less intrusive. (2/5)
Another finding: @Musk's new Grok 4 (SuperGrok) has paywall-busting direct X integration. It systematically mines social media discussions, screenshots, and quoted excerpts that users share from the paywalled content. (3/5)
I decided to use my lunch time to show you how easy it is to make a fake news story in 30 minutes with Veo3 (I didn't try to perfect it). First: the footage. A mayor comes up with a crazy idea and people hate it: (1/10) #verification #ai
What's needed to create this fake uproar in 28 minutes? Access to Google's Veo3. The same AI tool Google promotes for 'creative content' can fabricate creative misinformation. 🧵2/10
The 7-second limit creates a huge challenge: how do you maintain continuity? Solution: Split single interviews across multiple clips, use different camera angles of the same 'event,' create matching audio that bridges scenes. 🧵3/10
Volgograd Oil Refinery is bombarded by Ukraine. Actual footage generated by ….. #AI via vidu.studio, based on just one picture (2) I gave it and the prompt “add explosions”. It’s not that easy to detect that this is a fake video. #verification #geolocation #fakenews (1/2)
Sharpness: The lowest sharpness is 4.76, which is a very low value. This suggests some frames may have been smoothed or blurred, possibly as a result of manipulation or AI generation. Natural textures may have been lost in these frames.
Texture Changes: The variation between the highest (597.85) and lowest (166.14) gradient magnitudes indicates some frames have significantly less texture, further supporting the possibility of manipulation.
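If you want to reproduce numbers like these yourself, here is a rough sketch with OpenCV. It is my own reimplementation, not the exact tool behind the figures above, and the video file name is a placeholder: it computes a sharpness proxy (Laplacian variance) and a texture proxy (mean Sobel gradient magnitude) for every frame.

```python
# Rough sketch of per-frame metrics like the ones quoted above (my own
# reimplementation, not the original tool): Laplacian variance as a sharpness
# proxy and mean Sobel gradient magnitude as a texture proxy.
import cv2
import numpy as np

def frame_metrics(video_path: str):
    cap = cv2.VideoCapture(video_path)
    metrics = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
        gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0)
        gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1)
        texture = float(np.mean(np.hypot(gx, gy)))
        metrics.append((sharpness, texture))
    cap.release()
    return metrics

results = frame_metrics("clip.mp4")   # hypothetical file name
if results:
    print("lowest sharpness:", min(s for s, _ in results))
    print("gradient magnitude range:",
          min(t for _, t in results), "-", max(t for _, t in results))
```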
In this Twitter storm of 30 tweets, I'll unveil how Google's algorithms operate. Just confirmed: the March 2024 #googleleaks of Google's API docs on GitHub & Hexdocs are real. They reveal Google's tactics in market control, search influence, and data handling, raising privacy and ethical concerns. (1/30)
1. An API called NavBoost uses click data to adjust rankings. Popular results get boosted, distorting relevance for anyone who wants to go deeper (a toy sketch of this effect follows below). #GoogleLeaks #MarketManipulation
2. Google scores image aesthetics to influence search results. Visual appeal can outweigh relevance. #GoogleLeaks
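To make the NavBoost point above concrete, here is a toy illustration. This is emphatically not Google's code, just the general mechanism the leaked descriptions suggest; the URLs, scores, click-through rates, and the weight parameter are all made up for the example.

```python
# Toy illustration only: NOT Google's code, just the general idea that click
# data can nudge rankings, so heavily clicked pages leapfrog more relevant ones.
def reorder(results, clicks, weight=0.3):
    """results: list of (url, relevance_score); clicks: {url: click_through_rate}."""
    boosted = [(url, score + weight * clicks.get(url, 0.0)) for url, score in results]
    return sorted(boosted, key=lambda pair: pair[1], reverse=True)

ranked = reorder(
    [("deep-analysis.example", 0.80), ("popular-summary.example", 0.72)],
    clicks={"popular-summary.example": 0.40, "deep-analysis.example": 0.05},
)
print(ranked)  # the heavily clicked page now outranks the more relevant one
```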