The thesis of this article is that Britain “tamed big tech,” but the actual legislation seems to require a few privacy switches for kids — switches that should be on for everyone under a reasonable regulatory regime. wired.co.uk/article/age-ap…
“Strange women lying in ponds is no basis for a system of government.”
The major observation here is that tech firms will do all sorts of things to “protect children” as long as those things (1) are relatively inexpensive, and (2) don’t substantially harm their own financial interests. Which generally means doing ineffective things.
This is the kind of statement that might have been laughable 15 years ago. But if you replace “the Internet” with “ten popular platforms and two app stores,” it doesn’t sound so crazy.
A lot of pro-CSAM scanning arguments take the following form: “phones already have lots of opportunities (real and potential) for privacy abuse, so you’re stupid for minding when we try to introduce a new and highly-scalable one.”
And to some extent this argument is correct! It’s hard to believe this, but the iPhone (and mass-market smartphones) only launched in 2007. In 14 short years these devices have done more to erode user privacy than 114 years of all other electronic technologies combined.
From my perspective the only reasonable reaction for any technologist observing this incredible privacy collapse is: to wake up in the morning trying to fix it, and collapse into bed at night having spent the entire day trying to undo the damage.
The story here, for those who may have forgotten 2015 (it was a long time ago!), is that the NSA inserted a backdoor into a major encryption standard and then leaned on manufacturers to install it. Thread. 1/
The backdoor was in a pseudorandom number generator called Dual EC. It wasn’t terribly subtle but it was *deniable*. You could say to yourself “well, that could be horribly exploitable but nobody would do that.” Lots of serious people said that, in fact. But they did. 2/
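To see why the choice of constants matters, here’s a toy sketch of a Dual-EC-style generator in Python. It uses a tiny textbook curve instead of the real NIST P-256, skips the output truncation, and the specific values (the curve, the secret scalar d, the seed) are all illustrative assumptions, not the standard’s parameters:

```python
# Toy Dual-EC-style DRBG over the textbook curve y^2 = x^3 + 2x + 2 (mod 17),
# whose points form a group of prime order 19 generated by (5, 1).
# All parameters here are illustrative; the real standard uses NIST P-256
# and truncates the top bits of each output.

p, a, b, n = 17, 2, 2, 19          # field prime, curve coefficients, group order

def ec_add(A, B):
    """Add two points on the curve (None is the point at infinity)."""
    if A is None: return B
    if B is None: return A
    (x1, y1), (x2, y2) = A, B
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if A == B:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x3 - x1) - y1) % p)

def ec_mul(k, A):
    """Scalar multiplication by double-and-add."""
    R = None
    while k:
        if k & 1: R = ec_add(R, A)
        A = ec_add(A, A); k >>= 1
    return R

P = (5, 1)
d = 7                               # the secret relating the two constants
Q = ec_mul(d, P)                    # honest users only ever see P and Q

def dual_ec_step(s):
    """One generator step: next state and one output word."""
    s_next = ec_mul(s, P)[0]        # new state = x(s * P)
    output = ec_mul(s, Q)[0]        # output    = x(s * Q); real Dual EC truncates this
    return s_next, output

def lift_x(x):
    """All curve points with the given x-coordinate."""
    return [(x, y) for y in range(p) if (y * y - (x**3 + a * x + b)) % p == 0]

# Victim generates two outputs from an unknown seed.
s0 = 11
s1, out1 = dual_ec_step(s0)
s2, out2 = dual_ec_step(s1)

# Attacker who knows d recovers the internal state from out1 alone:
# out1 = x(s0 * Q), so d^-1 * (s0 * Q) = s0 * P, whose x-coordinate is s1.
d_inv = pow(d, -1, n)
for R in lift_x(out1):              # two candidate lifts; check against out2
    guess = ec_mul(d_inv, R)[0]
    if dual_ec_step(guess)[1] == out2:
        print("recovered state:", guess, "actual:", s1)
        break                       # all future output is now predictable
```

This is exactly the deniability in the tweet above: P and Q just look like two arbitrary public points, and only someone who knows the discrete log d relating them can run the recovery loop at the end.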
My 11 y/o is making friendship bracelets to support Canine Partners for Life. Also happy to take donations direct or at the Venmo below, since it would make her happy and we’re getting zero foot traffic on this 95 degree Baltimore day :) k94life.org
You can also send Zcash to zs1ztg7nnjqr99k4xn0g8fjw24at3nm95w864hlfk2ujq9mpumrwal2mtqe54985774whk9vvv9js8 but I can’t promise it will be tax deductible :)
Wow. $464.33 total raised for Canine Partners for Life. Thanks to everyone who donated, including the Zcash folks :)
I’m glad that Apple is feeling the heat and changing their policy. But this illustrates something important: in building this system, the *only limiting principle* is how much heat Apple can tolerate before it changes its policies. reuters.com/technology/aft…
I’m grateful that Apple has been so open and positive toward the technical community. I wish they’d done this before they launched their unpopular service, not after. Some of us have been talking about these issues for two years.
Everyone keeps writing these doomed takes about how “the US government is going to force tech companies to comply with surveillance, so they might as well just give in preemptively.” Like it’s inevitable and we should just salvage whatever scraps of privacy we can.
Even I was pessimistic last week. What I’ve seen in the past week has renewed my faith in my fellow countrymen, or at least made me realize how tired of and fed up with invasive tech surveillance they really are.
People are really mad. They know that they used to be able to have private family photo albums and letters, and they could use computers without thinking about who else had their information. And they’re looking for someone to blame for the fact that this has changed.
I’m not denying that there’s a CSAM problem in the sense that there is a certain small population of users who promote this terrible stuff, and that there is awful abuse that drives it. But when we say there’s a “problem”, we’re implying it’s getting rapidly worse.
The actual truth here is that we have no idea how bad the underlying problem is. What we have are increasingly powerful automated tools that detect the stuff. As those tools get better, they generate overwhelming numbers of reports.
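To make that concrete, here’s a back-of-the-envelope sketch in Python with entirely hypothetical numbers: hold the prevalence of abusive material constant and improve only detection coverage, and report volume still grows by orders of magnitude.

```python
# Back-of-the-envelope sketch (all numbers hypothetical): if the amount of
# abusive material is fixed but scanning coverage improves, report volume
# climbs anyway -- rising reports don't tell us the underlying problem grew.
uploads_per_year = 10_000_000_000   # assumed total uploads scanned
prevalence = 1e-6                   # assumed *fixed* fraction that is abusive

for coverage in (0.01, 0.10, 0.50, 0.90):   # fraction the tools actually catch
    reports = uploads_per_year * prevalence * coverage
    print(f"detection coverage {coverage:>4.0%} -> {reports:>8,.0f} reports/year")
```

Under these assumed numbers, reports grow 90x between the first and last rows while the underlying prevalence never changes.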