I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world.

People have asked if we'll adopt this system for WhatsApp. The answer is no.
Child sexual abuse material and the abusers who traffic in it are repugnant, and everyone wants to see those abusers caught.
We've worked hard to ban and report people who traffic in it based on appropriate measures, like making it easy for people to report when it's shared. We reported more than 400,000 cases to NCMEC last year from @WhatsApp, all without breaking encryption. faq.whatsapp.com/general/how-wh…
Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world.
Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone -- even photos you haven't shared with anyone. That's not privacy.
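For context on what "scan all the private photos on your phone" means mechanically: Apple's published design reduces each photo to a fingerprint on the device and compares it against a database of hashes of known CSAM, flagging the account once matches cross a threshold. Here is a minimal runnable sketch of that matching step only — it uses a cryptographic hash as a stand-in for Apple's NeuralHash (a real perceptual hash matches near-duplicates, not just exact bytes), skips the private-set-intersection and secret-sharing layers, and all data is hypothetical:

```python
import hashlib

def fingerprint(photo_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as NeuralHash.
    A cryptographic hash is used here only to keep the sketch runnable;
    a real system hashes image features so edited copies still match."""
    return hashlib.sha256(photo_bytes).hexdigest()

def scan_library(photos, known_hashes, threshold=2):
    """Compare every local photo's fingerprint against a database of
    known hashes; report the account once matches reach a threshold,
    as Apple's published design describes."""
    matches = [p for p in photos if fingerprint(p) in known_hashes]
    return len(matches) >= threshold, len(matches)

# Hypothetical hash database shipped to the device by the operator.
banned = {fingerprint(b"known-bad-image-1"),
          fingerprint(b"known-bad-image-2")}

# Hypothetical on-device photo library -- note the scan touches
# every photo, shared or not.
library = [b"private-vacation-photo",
           b"known-bad-image-1",
           b"known-bad-image-2"]

flagged, n_matches = scan_library(library, banned)
```

The thread's concern maps directly onto this sketch: whoever supplies `known_hashes` decides what every device searches for, and the scan runs over the whole library regardless of whether anything was shared.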
We’ve had personal computers for decades and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content. It’s not how technology built in free countries works.
This is an Apple-built and -operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions of what is acceptable.
Will this system be used in China? What content will they consider illegal there and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?
Can the scanning software running on your phone be guaranteed error-free? Researchers have not been allowed to find out. Why not? How will we know how often mistakes violate people's privacy?
What will happen when spyware companies find a way to exploit this software? Recent reporting showed the cost of vulnerabilities in iOS software as it stands. What happens if someone figures out how to exploit this new system?
There are so many problems with this approach, and it's troubling to see them act without engaging the experts who have long documented the technical and broader concerns with it.
Apple once said “We believe it would be in the best interest of everyone to step back and consider the implications …” apple.com/customer-lette…
…”it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.” Those words were wise then, and worth heeding here now.
