I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world.
People have asked if we'll adopt this system for WhatsApp. The answer is no.
Child sexual abuse material and the abusers who traffic in it are repugnant, and everyone wants to see those abusers caught.
We've worked hard to ban and report people who traffic in it, using appropriate measures like making it easy for people to report when it's shared. We reported more than 400,000 cases to NCMEC last year from @WhatsApp, all without breaking encryption. faq.whatsapp.com/general/how-wh…
Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world.
Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone. That's not privacy.
We’ve had personal computers for decades and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content. It’s not how technology built in free countries works.
This is an Apple-built and -operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions of what is acceptable.
Will this system be used in China? What content will they consider illegal there and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?
Can this scanning software running on your phone be error-proof? Researchers have not been allowed to find out. Why not? How will we know how often mistakes violate people's privacy?
What will happen when spyware companies find a way to exploit this software? Recent reporting showed how much vulnerabilities in iOS software already sell for. What happens if someone figures out how to exploit this new system?
There are so many problems with this approach, and it's troubling to see them act without engaging experts who have long documented their technical and broader concerns with this.
Apple once said “We believe it would be in the best interest of everyone to step back and consider the implications …” apple.com/customer-lette…
…”it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.” Those words were wise then, and worth heeding here now.
This is a really important article from @DarrenLoucaides at @WIRED about Telegram. If you think Telegram is secure, you should read this article and understand the truth - especially before you use it for anything private.
Telegram is not end-to-end encrypted by default and offers no e2ee for groups. From the article: “Telegram has the capacity to share nearly any confidential information a government requests”
Their e2ee protocols lack independent verification: “most disturbingly, some activists have found their “secret chats”—Telegram’s purportedly ironclad, end-to-end encrypted feature—behaving strangely, in ways that suggest an unwelcome third party might be eavesdropping.”
Communities on @WhatsApp is starting to roll out globally! With Communities, admins can connect related groups for their organizations, neighborhoods, and workplaces. To get started, tap on the brand-new Communities tab. Check it out:
We’re also adding more great features: polls, 32-person video calls, and groups with up to 1,024 people. As with reactions, larger file sharing, and admin delete, we think these features will be particularly helpful for Communities, but they are available in any group!
We’ve made significant upgrades to the chat experience while maintaining end-to-end encryption. This is a huge improvement over other apps and services that force organizations to share a copy of their private messages.
Three great new privacy features for @WhatsApp users coming soon:
You'll be able to leave group chats without having to tell everyone. Only the admins will be notified. Almost like leaving a party quietly and only informing the host 😀.
Being online may not always mean being available 💬, so you'll now be able to control who can and can't see when you are online.
Reminder to @WhatsApp users that downloading a fake or modified version of WhatsApp is never a good idea. These apps sound harmless, but they may work around WhatsApp's privacy and security guarantees. A thread:
Recently our security team discovered hidden malware within apps, offered outside of Google Play, from a developer called "HeyMods" that included "Hey WhatsApp" and others.
These apps promised new features but were just a scam to steal personal information stored on people’s phones. We’ve shared what we found with Google and worked with them to combat the malicious apps.
Today we’re very excited to share our vision for a feature we’re calling WhatsApp Communities. This is new functionality we’re building to support the many organizations that use WhatsApp to communicate in a private and secure way.
We’ve heard from many workplaces, non-profits, and local organizations that have been using WhatsApp for their private communication, and there’s a lot we can improve to make WhatsApp work better for groups like these.
Communities will make it possible for admins to organize groups under one umbrella, send announcements, and decide which groups can be part of their community to help make group conversations work for their organization.
This paper is definitely worth reading. It's from some of the leading minds in computer security, and it goes into great detail on why client-side scanning (CSS) -- which @WhatsApp opposes -- would be very dangerous for us all.
They explain in clear terms the many problems with client-side scanning proposals, concluding that the security risks they would create for everyone would make "us all less safe and less secure."
"CSS has been promoted as a magical technological fix for the conflict between the privacy of people’s data and communications and the desire by intelligence and law enforcement agencies for more comprehensive investigative tools...