This story—about a Substack publication outing a priest with location data from Grindr—shows how easy it is for anyone to take advantage of data brokers’ stores to cause real harm. washingtonpost.com/religion/2021/…
This is not the first time Grindr has been in the spotlight for sharing user information with third-party data brokers. The Norwegian Consumer Council singled it out in a 2020 report… forbrukerradet.no/undersokelse/n…
...before the Norwegian Data Protection Authority fined Grindr earlier this year, specifically warning that the app’s data-mining practices could put users at serious risk in places where homosexuality is illegal. nytimes.com/2021/01/25/bus…
But Grindr is just one of countless apps engaging in this exact kind of data sharing. The real problem is the many data brokers and ad tech companies that amass and sell this sensitive data without anything resembling real users’ consent. eff.org/deeplinks/2020…
Apps and data brokers claim they are only sharing so-called “anonymized” data. But that’s simply not possible. Data brokers sell rich profiles with more than enough information to link sensitive data to real people, even if the brokers don’t include a legal name.
In particular, there’s no such thing as “anonymous” location data. Datapoints like one’s home or workplace are identifiers themselves, and a malicious observer can connect movements to these and other destinations. In this case, that includes gay bars and private residences.
Another piece of the puzzle is the ad ID, another so-called “anonymous” label that identifies a device. Apps share ad IDs with third parties, and an entire industry of “identity resolution” companies can readily link ad IDs to real people at scale. vice.com/en/article/epn…
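To make the re-identification point concrete, here is a minimal sketch of how "anonymized" location pings give away a person's home and workplace. The data, coordinates, and the `infer_anchor` helper are all hypothetical illustrations, not any broker's actual method: the idea is simply that the most frequent nighttime location is almost certainly home, and the most frequent workday location is almost certainly work.

```python
from collections import Counter

# Hypothetical "anonymized" pings: (hour_of_day, rounded lat/lon).
# No name or ad ID is needed -- repeated nighttime and daytime
# locations effectively identify home and workplace on their own.
pings = [
    (2,  (38.8977, -77.0365)),  # night -> likely home
    (3,  (38.8977, -77.0365)),
    (23, (38.8977, -77.0365)),
    (10, (38.8895, -77.0353)),  # workday -> likely workplace
    (11, (38.8895, -77.0353)),
    (14, (38.8895, -77.0353)),
    (19, (38.9072, -77.0369)),  # evening -> some other destination
]

NIGHT_HOURS = set(range(22, 24)) | set(range(0, 6))
WORK_HOURS = set(range(9, 17))

def infer_anchor(pings, hours):
    """Return the most frequent location seen during the given hours."""
    counts = Counter(loc for hour, loc in pings if hour in hours)
    return counts.most_common(1)[0][0] if counts else None

home = infer_anchor(pings, NIGHT_HOURS)
work = infer_anchor(pings, WORK_HOURS)
print("inferred home:", home)
print("inferred work:", work)
```

Once home and work are pinned down, every other point in the trace (a gay bar, a clinic, a private residence) attaches to a specific person, which is exactly what happened in the story above.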
All of this underlines just how harmful a collection of mundane-seeming data points can become in the wrong hands.
That’s why the U.S. needs comprehensive data privacy regulation more than ever. This kind of abuse is not inevitable, and it must not become the norm. eff.org/deeplinks/2019…
• • •
One year ago, we launched the Atlas of Surveillance. It’s the largest public database of known police surveillance technologies that have been used across the country. Check it out: atlasofsurveillance.org
Have you used the Atlas of Surveillance to look up what surveillance tech police in your area are using?
Do local police have Real-Time Crime Centers? Face Recognition? Cell-site simulators? Find out at atlasofsurveillance.org!
In Nestlé v. Doe, the Supreme Court narrowed one of the few tools that can be used to bring justice to victims of human rights abuses enabled by surveillance and censorship tools sold by U.S. companies.
The Court held that making “operational decisions” in the U.S. is not a sufficient basis for jurisdiction against a company under the Alien Tort Statute, a law that specifically incorporates international law into U.S. law.
EFF filed an amicus brief warning that governments around the world have relied on U.S. technology companies like Cisco and Sandvine to build the tools of repression.
Victory! The Supreme Court ruled today that California can’t require organizations to hand donor names over to the state, saying doing so threatens groups’ First Amendment rights. We filed a brief with the court making that argument. eff.org/deeplinks/2021…
Groups that challenge or oppose state policies have legitimate fears that members and donors, or their businesses, could become targets of harassment or retaliation by the government itself, we said. @EFF
The court's ruling today was the right call. Everyone should be able to express and promote their viewpoints through associational affiliations without personally exposing themselves to a political firestorm or even governmental retaliation. @EFF
Victory! A federal court has prevented Florida from enforcing a law that prohibits online services from deplatforming political candidates, ruling that it violates the First Amendment in several respects. eff.org/document/netch…
The Florida law prohibited platforms from engaging in content moderation of politicians' speech, privileging their voices over those of other users. We filed a brief in the case explaining why the law violated the First Amendment. eff.org/press/releases…
As our brief explains, we're no fans of online services' content moderation practices, which often remove legitimate speech and disproportionately harm disempowered voices and communities. eff.org/press/releases…
The built environment of surveillance—in, over, and under our cities—makes it an entwined problem that must be combated through entwined solutions. To make it easier to see, we've visualized it in a cross-section of the average city block: eff.org/deeplinks/2021…
We've created downloadable, shareable graphics to show how the varying surveillance technologies and legal authorities overlap, how they disproportionately impact the lives of marginalized communities, and what tools we have at our disposal to halt or mitigate their harms.
It's hard to understand what the many types of surveillance can mean for your daily life—from local and state police, to federal law enforcement, to the growing cooperation between private tech companies and the government.
Share these images to help get the message across.