After further reflection I think Twitter has made a mistake censoring the NY Post article. It's garbage journalism, but that's not why they censored it.
Twitter is claiming that it contains hacked content and linking to it violates its policies. 1/n
First, let's note that Twitter has consistently penalized accounts for linking to hacked content. Their actions are at least consistent when viewed at face value.
The question for me then is this: does this constitute "hacked content?" I really don't think it does. 2/
If you take the story at face value, this is data recovered from abandoned property. Imagine you see a computer in a public trash can. You take it and extract data from the drive. Is that hacked data? More importantly, would Twitter censor a story with that data? 3/
But this story is crazier than that. If the computer was dropped off for service, the transfer of the property was certainly governed by the terms and conditions of the shop. Those CERTAINLY dictate that abandoned equipment becomes their property. 4/
Following that logic, the claimed computer repair shop owner now owns the equipment, presumably including any data on the devices.
Now I'm sure the former owner could file suit claiming damages based on the transfer of the data to others. But we're talking about censorship. 5/
From where I sit, this article shouldn't be censored. There's a non-zero chance that the evidence is fabricated. But I don't want to see platforms censoring content based on the mere possibility of manufactured evidence. That's a slippery slope that should scare us all. 6/
A few closing thoughts: 1. NY Post took evidence from known shady actors and failed to forensically validate it. Then they failed to alert the reader to that fact (and its significance). That's bad journalism in my opinion. 2. The computer repair shop owner is a bad actor as well. 7/
Even at face value, this is not a national security issue. Assuming the story is true, I can't get behind handing over someone's personal data (as clearly happened) to anyone other than law enforcement (and I have some issues even there). Bottom line: he acted in bad faith. 8/
3. Consume media critically. Especially with misinformation campaigns, you have to ask the hard questions. In this case, one question would be "why has no other outlet run with the story?" The answer is that it's a credibility issue. /FIN
This is also several degrees of bad. It's not "swastika might mean something else" (what???) when you are putting someone wearing a Jewish symbol in an oven. I don't think this has any place on the platform, but that's up to the platform and advertisers who support it. 2/
On the broader question of censorship, content platforms have a choice about what they wish to allow.
But they have a responsibility not to push offensive and radicalizing content to those who didn't ask to see it. Promoting dangerous content simply because people engage with it is unacceptable. 3/
Garmin is in a unique position with their ransomware incident. They are both a manufacturer AND hold regulated data. The value of their devices is directly tied to the availability of their apps and the personal data they hold.
I don't see that Garmin has a choice but to pay. 1/
The fact that a single incident seems to have taken down both their data service AND their manufacturing indicates very loose trust relationships or very flat networks. Neither is good from a security perspective, but I'm confident either will be corrected quickly; no big deal to me. 2/
What IS a big deal to me is my personal data. Many ransomware groups exfiltrate data before encrypting and demand extortion payments from victims, lest they release this data. That's almost certainly the case here.
If Garmin refuses to pay, I don't see things going well. 3/
All right stop
Just intubate and listen
Ice is back with misguided intention
COVID grabs a hold of me tightly
Shortness of breath both daily and nightly
Will it ever stop?
Yo, I don't know
Turn off the lights, put a tag on my toe
To the extreme, I rock ICU like I'm comatose
Light up a room, I'm a chump, experiencing new lows
Cough
Heck yeah, the ventilator goes woosh
Hypoxia killing my brain like I'm a selfish douche
Deadly, the EKG beeps a dope melody
Performing this concert should have been a felony
Large groups in public?
That's not okay
You better wear a mask
'cause COVID don't play
I'm creating a problem
But I won't solve it
Check out my vent while the respiratory therapist resolves it
Facebook paid a third party firm to develop an 0-day exploit customized for Tails and then gave it to the FBI to target a cyber criminal operating on their platform. I've been thinking about this all morning and I think I support the action. 1/ vice.com/en_us/article/…
But this one is really thorny, to be sure. A critical point is that Tails was removing the vulnerable feature in a not-yet-released version, so that limited the time the vuln could be used by the FBI. There's no question this monster was targeting children and had to be stopped. 2/
If Facebook had capitalized on an OPSEC mistake and turned that data over to the FBI, this would be a non-story. The only reason we care is that Facebook subsidized the exploitation of another platform. Critically, FB notes they wouldn't have introduced risk for all users to aid the FBI. 3/
24 hours later and I've heard lots of "leaders" say they want to start community review boards and the like. Review/advisory boards are not what's needed.
At a minimum, these are needed:
State-level registries of officer violence/complaints
Citizen recertification boards 1/
Abuse lawsuit awards paid out of pension or department discretionary funds
Residency requirements for at least the majority of the force
Body cameras worn by all officers and on at all times during interactions with the public, video and audio
Review of current officers' records
2/
Should the registry of abuse be national? Sure, that would be better. But governors can't do that. They CAN each do it at the state level. They could even share that data...
What about body cameras? Yeah, not having privacy sucks. But lots of people give up privacy for a job. 3/
The open vs. closed debate really boils down to whether you think there's a vaccine or effective prophylactic coming in the next 12-18 months.
If a vaccine never comes and hospitals aren't overwhelmed and immunity is achieved post-infection, there's some first-mover advantage. 1/
Unfortunately, today this relies on unknowns. First, we don't understand the role of antibodies or the long-term impacts yet (I think more is known about this than some are letting on, but...). If longer-term immunity isn't a thing, herd immunity can't be reached. 2/
Then there's the question of long-term impacts. If herd immunity means that 75%+ of the population loses a percentage of lung capacity or risks other serious complications, are we okay with that? I don't think we should be if we have a choice. 3/