Google has a new concept called "known victims" for victims of revenge porn and people serially attacked on slander sites. Once a person requests removal of these results from a search of their name, Google will automatically suppress similar content from resurfacing. nytimes.com/2021/06/10/tec…
One of the surprising things about working on the slander series is how few people in the field, even experts, know that Google voluntarily removes some search results. (No court order needed!) You have to visit this generic URL: support.google.com/websearch/trou…
Because so few people know about it, "reputation managers" are charging people like $500 a pop to "remove damaging information from Google results." And ALL THEY DO is fill out that form for free. Someone tried to hawk this service to my husband after he came under attack.
The reason Google's changes are meaningful (assuming they work) is that when someone comes after you online, it tends to happen repeatedly. Plus sites scrape each other, compounding the damage, so you fill out these forms again & again & again. "Known victims" theoretically fixes that.
Google blog post on the changes: "Improving Search to better protect people from harassment" blog.google/products/searc…

It is WILD how much the internet has changed in a decade.

More from @kashhill

25 Apr
There is a constellation of sites online that exist for the sole purpose of destroying people's reputations. @Aaron_Krolik and I wanted to figure out who was making money off them and how. nytimes.com/interactive/20…
Investigating the reputation extortion industry was incredibly challenging! The reportorial equivalent of walking into a dark room, turning on the lights, and watching roaches scatter. Fake companies, false identities, and lies lies lies.
We had an idea to help make sense of it. We'd slander a made-up person. But that had ethical issues. So instead the fearless @Aaron_Krolik volunteered to destroy his own reputation. I kept asking, ARE YOU SURE???? "Will do anything for attention" he wrote about himself.
18 Mar
After I reported the existence of Clearview AI in January 2020, the company's world exploded: lawsuits, international investigations, letters from senators. I've been talking to company CEO Hoan Ton-That through it all for this @nytmag cover story: nytimes.com/interactive/20…
Clearview is still attracting new customers and new funding but it is under siege in Illinois, which has a law that says you can't use people's faceprints without consent. Clearview’s lawyer Floyd Abrams is arguing the law violates the company’s First Amendment rights.
I've also continued to investigate Clearview AI and how its facial recognition app is being used, such as one of the first times ICE used it to solve a crime, eventually leading to the agency's $224,000 contract with the company.
30 Jan
I've been working on this internet horror story for months, but the people in it have been living it for more than a decade. nytimes.com/2021/01/30/tec…
In September 2018, Guy Babcock discovered that he and his entire extended family had been branded pedophiles, scammers, thieves and sexual deviants online. When he investigated, he discovered a 25-year-old grudge. nytimes.com/2021/01/30/tec…
How one woman destroyed the online reputations of more than 145 people and why it's been impossible to fix them: nytimes.com/2021/01/30/tec…
7 Aug 20
Given how sensitive therapy sessions are, I started looking into start-ups that do therapy via text & then keep the transcripts. But the investigation turned up so much more than I expected: nytimes.com/2020/08/07/tec…
Talkspace, a text therapy app made famous by Michael Phelps ads, keeps transcripts for about 7 to 10 years because they're medical records—and data-mines them, of course. But all the other stuff going on there was WILD. nytimes.com/2020/08/07/tec…
A Talkspace employee who later sued the company had his therapy logs read aloud at an all-hands meeting "anonymously" but soon everyone knew it was him. Basically your workplace nightmare. nytimes.com/2020/08/07/tec…
24 Jun 20
In January, in the first known case of its kind, a man in Michigan was arrested for a crime he did not commit due to a flawed algorithmic facial recognition match. I told his story here: nytimes.com/2020/06/24/tec…
Robert Williams initially thought the call he received at work from the police, telling him to come in to be arrested, was a prank. But when he got home, he was handcuffed on his front lawn in front of his wife and two young, distraught daughters.
He was held overnight, had fingerprints, mugshot, DNA taken. During interrogation, detectives showed him a surveillance still of a shoplifter who stole 5 watches, asking if it was Robert. “No, this is not me,” he said, holding it to his face. “You think all Black men look alike?”
18 Jan 20
The privacy paranoid among us have long worried that all of our online photos would be scraped to create a universal face recognition app. My friends, it happened and it’s here: nytimes.com/2020/01/18/tec…
I'm not sure which is scarier/more desirable. An app that puts a name to a face in seconds, or an app that shows you all the online photos of you that you didn't realize were there. This app does both, but only law enforcement has access to it, for now.
When I first started looking into Clearview AI, it had a nonexistent office address on its website & one fake employee on LinkedIn, and no one from the company would return my calls. But they knew about me and were monitoring for cops who uploaded my photo to their app.