Last week I spent 5 hours digging into how ChatGPT stores your conversations.
What I found is absolutely terrifying.
Here's a breakdown of what you're actually handing over when you use ChatGPT (with examples): 🧵
Everyone assumes ChatGPT conversations are secure.
If you think your business data, passwords, and private thoughts stay between you and the AI, you're about to learn otherwise...
But first, here's what most people don't realize about AI data storage.
First, what exactly does ChatGPT collect about you?
According to their own privacy policy:
• Every conversation you have (stored indefinitely)
• Your email address and payment info
• Your IP address and device information
• Everything you upload (documents, images, files)
• Your location data
That's a lot more than most people realize.
But the data collection goes much deeper.
ChatGPT also tracks:
• How you phrase questions
• Your response times and reading patterns
• When you're most active online
• What topics you discuss most
• Your conversation deletion habits
They're building a comprehensive behavioral profile of every user.
2024 EU audit:
63% of ChatGPT conversations contain personal info, but only 22% of users know they can opt out.
People unknowingly share intimate details about their lives, work, and relationships.
Google DeepMind researchers proved just how vulnerable this data is.
They spent $200 and extracted over 10,000 real examples of ChatGPT's training data.
Real email addresses. Phone numbers. Personal conversations.
All by simply asking ChatGPT to repeat the word "poem" over and over.
The scope of data sharing is staggering:
• Shares user data with vendors & service providers
• Can aggregate your info for third parties
• Must honor government data requests
• Deleted data stays 30+ days
• Your private thoughts aren't private
• Can use it in court
For business users, it's even worse:
• 11% of ChatGPT inputs contain confidential data
• 4% of employees share sensitive info with AI weekly
• Source code, client data, trade secrets leak regularly
• One conversation could expose everything
According to our security assessments:
🥇 Moonshot AI Kimi K2 ranks highest for security (99/100)
🥈 OpenAI GPT-4o comes in second (87.3/100)
🥉 Claude Sonnet 4 takes third place (86.2/100)
Even the most "trusted" AI platforms carry significant risks.TrustModel.ai
• Every major AI company is racing to collect as much user data as possible
• Your conversations are their competitive advantage
• Your personal information trains their next billion-dollar model
• And most users have no idea this is the real business model
The truth:
• Every major AI company is racing to collect as much user data as possible
• Your conversations are their competitive advantage
• Your personal information trains their next billion-dollar model
• And most users have no idea this is the real business model
The bigger picture:
This isn't about demonizing AI, it's about informed consent.
ChatGPT provides incredible value, but users deserve to know exactly what they're trading their data for.
Most privacy violations happen because people don't understand the risks.
Thanks for reading.
If you enjoyed this post, follow @karlmehta for more content on AI and politics.
Repost the first tweet to help more people see it: