"You don't give us any data. We don't give anyone your data. We connect data without sharing it" ...magic! hello.infosum.com/hubfs/Webinar/…
Personal data laundering via 'clean rooms'
The claimed thresholds and limits don't do much; noise could help in theory. But first, I don't buy it. Second, what is it about? Reducing already bad match rates by another 1%? And third, senders and recipients still process extensive personal data, co-controlled by the 'clean room'.
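To make the thresholds-and-noise claim concrete, here is a minimal sketch of how a 'clean room' might report an audience-overlap count: suppress counts below a minimum audience size and add Laplace noise. The function name, threshold value, and epsilon are illustrative assumptions, not any vendor's documented behavior.

```python
import random

def noisy_overlap(true_count, threshold=50, epsilon=1.0):
    """Hypothetical clean-room-style reporting of an overlap count.
    Suppresses small audiences, then perturbs the count with Laplace
    noise (sensitivity of a count query is 1 per person)."""
    if true_count < threshold:
        return None  # suppressed: audience too small to report
    # A Laplace draw is an exponential draw with a random sign.
    noise = random.choice((-1, 1)) * random.expovariate(epsilon)
    return round(true_count + noise)
```

Note that a Laplace draw at epsilon = 1 almost never shifts a count of thousands by more than a handful, which is the point above: the noise changes very little either way, while the underlying matching still requires processing extensive personal data.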
'Virtual' household matching via a 'virtual' household mapping file. Perhaps even the IDs and IPs and timestamps are very virtual, too? #personaldata #laundering
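For illustration, household matching from IDs, IPs and timestamps is typically as mundane as grouping devices that repeatedly appear on the same residential IP. This is a hypothetical sketch of the general technique; the field names and the `min_days` heuristic are my assumptions, not the vendor's method.

```python
from collections import defaultdict

def build_household_map(observations, min_days=3):
    """Group devices repeatedly seen on the same IP into a 'household'.
    observations: iterable of (device_id, ip, day) tuples."""
    days_on_ip = defaultdict(set)
    for device, ip, day in observations:
        days_on_ip[(device, ip)].add(day)
    households = defaultdict(set)
    for (device, ip), days in days_on_ip.items():
        if len(days) >= min_days:  # stable IP, treated as a home network
            households[ip].add(device)
    # Keep only IPs with multiple stable devices: the 'household'
    return {ip: devs for ip, devs in households.items() if len(devs) > 1}
```

Nothing 'virtual' about the inputs: the mapping file is derived from observed identifiers, network addresses and timestamps, i.e. personal data.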
So, Microsoft exploits activity data from Outlook, Teams, Word etc across customers for its own promotional purposes, including on meetings, file usage and the seconds until emails are read.
Microsoft states that the analysis on the seconds until emails were read excludes EU data. Activity data from Outlook, Teams, Word etc, however, seems to include EU data.
What's their legal basis? This is also personal data on employees. And, are business customers fine with it?
Should cloud-based software vendors exploit personal data on users of their services, including private persons and employees of business customers, however they see fit?
I don't think so.
Not even for public-interest research, at least not without academic process and IRB review.
Some more findings from our investigation of LiveRamp's ID graph system (), which maintains identity records about entire populations in many countries, including name, address, email and phone, and aims to link these records with all kinds of digital IDs: crackedlabs.org/en/identity-su…
Identity data might seem boring, but if a company knows all kinds of identifying info about everyone, from home address to email to device IDs, it is in a powerful position to recognize persons and link profile data scattered across many databases, and this is what LiveRamp does.
LiveRamp aims to provide clients with the ability to recognize a person who left some digital trace in one context as the same person who later left some trace elsewhere.
It has built a sophisticated system to do this and to recognize the person as comprehensively as possible.
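The linking described above is, at its core, a record-linkage problem: records sharing any identifier (email, phone, device ID) get merged into one person cluster. Here is a hedged sketch of that general technique using union-find; it illustrates the idea, not LiveRamp's actual system.

```python
from collections import defaultdict

class UnionFind:
    """Minimal union-find with path halving."""
    def __init__(self):
        self.parent = {}
    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x
    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def link_records(records):
    """records: list of dicts mapping identifier type -> value.
    Returns clusters of record indices believed to be one person."""
    uf = UnionFind()
    seen = {}  # (identifier type, value) -> index of first record with it
    for i, rec in enumerate(records):
        uf.union(i, i)  # make sure every record is present
        for key, val in rec.items():
            tag = (key, val)
            if tag in seen:
                uf.union(i, seen[tag])  # shared identifier: same person
            else:
                seen[tag] = i
    clusters = defaultdict(list)
    for i in range(len(records)):
        clusters[uf.find(i)].append(i)
    return list(clusters.values())
```

Transitivity is what makes this powerful: a record with only an email and a record with only a device ID end up in the same cluster if a third record bridges them, which is how scattered traces get tied back to one person.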
As part of our new report on RTB as a security threat and previously unreported, we reveal 'Patternz', a private mass surveillance system that harvests digital advertising data on behalf of 'national security agencies'.
5 billion user profiles, data from 87 adtech firms. Thread:
'Patternz' in the report by @johnnyryan and me published today:
Patternz is operated by a company based in Israel and/or Singapore. I came across it some time ago, received internal docs. Two docs are available online.
Here's how Patternz can be used to track and profile individuals, their location history, home address, interests, information about 'people nearby', 'co-workers' and even 'family members', according to information available online:
, a 'social risk intelligence platform' that provides digital profiles about named individuals regarding financial strain, food insecurity, housing instability etc for healthcare purposes.
"It calculates risk scores for each risk domain for each person", according to the promotional video, and offers "clarity and granularity for the entire US".
Not redlining, though. They color it green.
Making decisions based on these metrics about individuals and groups seems to be highly questionable and irresponsible bs.
Bazze, a US data broker that purchases smartphone location data from mobile apps and advertising firms, and sells to the US Dept of Defense, according to the WSJ (), openly promotes a commercial location mass surveillance system for 'government customers'. wsj.com/tech/cybersecu…
I extracted information about mobile location data they claim to sell per country from their website:
New WSJ report found that 'Near', a consumer data broker based in India, Singapore and the US with an office in France, obtained massive location data via digital advertising firms like OpenX, Smaato and AdColony and sold it to US defense/intel agencies: wsj.com/tech/cybersecu…
Near's general counsel and chief privacy officer:
The US govt "gets our illegal EU data twice per day", a "massive illegal data dump".
"We sell geolocation data for which we do not have consent to do so", "we sell data outside the EU for which we do not have consent to do so"
If this isn't reason for EU data protection authorities to take urgent action, then I don't know what is.