- How many install it without a MS account? 5%? Less?
- Lots of #darkpattern 'choices' regarding personal data
- 'Tailored experiences with diagnostic data'
Digital profiling for personalization+ads based on 'diagnostic data' …seriously?
'If you have selected Full, personalization is also based on information about the websites you browse, how you use apps and features' #preselected
So, Win10 tricks users into massive digital profiling based on everything they do? This is not what an operating system should do.
And it gets even better. Ultimately, Microsoft asks users to 'let apps use' their 'advertising ID', which is a unique identifier for each person using a Win10 device.
This identifier is much better suited to link profile data across many different companies than a name.
'App developers' and 'advertising networks' (= all kinds of real-time data brokers) can 'associate personal data they collect about you with your advertising ID' to provide 'more relevant advertising' (=ads based on massive digital profiling) and 'other personalized experiences'.
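To see why a stable advertising ID is so much more useful to data brokers than a name, consider this toy sketch with made-up data: names vary in spelling and collide across people, while the advertising ID is unique per Win10 device and identical in every company's dataset.

```python
# Toy illustration (invented data): joining two companies' records.
# A join on the name fails ("J. Smith" != "John Smith"), but a join on
# the shared advertising ID links the profiles immediately.

app_developer_data = [
    {"ad_id": "ab12-ffff-0001", "name": "J. Smith", "app_usage": "fitness"},
]
ad_network_data = [
    {"ad_id": "ab12-ffff-0001", "name": "John Smith", "sites": ["news", "loans"]},
]

merged = {}
for row in app_developer_data:
    match = next((r for r in ad_network_data if r["ad_id"] == row["ad_id"]), None)
    if match:
        # Profile data from two unrelated firms, now one combined record:
        merged[row["ad_id"]] = {**row, "sites": match["sites"]}

print(merged["ab12-ffff-0001"])
```

Any party holding the same ID can repeat this join, which is exactly what "associate personal data they collect about you with your advertising ID" enables.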
Now take a look at this video by a major Australian media network that claims to use data from '15 million Microsoft registered users to collect over 6 million behavioral markers …every minute'. Does this even include data from Win10 or MS Office users?
What about Microsoft's recent acquisition of Drawbridge, a company that helps other companies spy on 1 billion consumers and 3 billion devices across everyday life?
"The Microsoft audience graph consists of 120 million Office365 subscribers, 1.5 billion Windows users and 500 million LinkedIn users. LinkedIn professional data is a unique element in the mix. There’s also data from Outlook and Skype users"
"We do not share your personal data with any [third parties] except for …hashed or device identifiers" or "data already visible to any users of the Services".
So, Microsoft may actually share nearly any kind of personal data with others? Totally shady.
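Worth noting why "hashed … identifiers" are a weak safeguard: hashing is deterministic, so any two parties who hash the same email address get the same token and can still match their records. A minimal sketch, assuming the common industry practice of SHA-256 over a normalized email (the exact hashing scheme used here is an assumption, not from Microsoft's policy):

```python
import hashlib

def hashed_id(email: str) -> str:
    # Normalize, then hash: lowercase + strip whitespace + SHA-256.
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Two unrelated companies hashing the same address get the same token,
# so the "hashed identifier" still works as a cross-party join key:
company_a = hashed_id("Jane.Doe@example.com")
company_b = hashed_id(" jane.doe@example.com ")
print(company_a == company_b)
```

A hashed email is therefore pseudonymous at best: it hides the address itself but preserves linkability.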
When MS acquired LinkedIn in 2016, TC outlined how MS might integrate LinkedIn and its data into Office and other products, including for CRM/sales, recruitment and 'talent management' purposes; to increase engagement and subscriptions and to 'open the door' for advertising: techcrunch.com/2016/06/13/how…
This new examination commissioned by the Dutch government found that while the April 2019 (enterprise!) version of Office 365 ProPlus doesn't routinely scan Word docs to detect resumes anymore, its transmission of 'diagnostic' data is still concerning: rijksoverheid.nl/binaries/rijks…
There's a whole set of brand new data protection impact assessments of Microsoft enterprise products commissioned by the Dutch government.
The Win10 assessment examines the risks of diagnostic data transfer via telemetry at the 'Security' level, a setting that is not available to many users.
We urgently need similar data protection assessments for standard MS products with Basic/Full telemetry and other settings enabled…
This inspection of 'Office 365 Online and mobile Office apps' for the Dutch government found that 3 "iOS apps (Word, PowerPoint and Excel) secretly send diagnostic data to the US-based marketing company Braze, without any information about the existence …of this data processing"
Some more findings from our investigation of LiveRamp's ID graph system, which maintains identity records about entire populations in many countries, including name, address, email and phone, and aims to link these records with all kinds of digital IDs: crackedlabs.org/en/identity-su…
Identity data might seem boring, but if a company knows all kinds of identifying info about everyone, from home address to email to device IDs, it is in a powerful position to recognize persons and link profile data scattered across many databases. This is what LiveRamp does.
LiveRamp aims to provide clients with the ability to recognize a person who left some digital trace in one context as the same person who later left some trace elsewhere.
It has built a sophisticated system to do this, no matter where or in what fragmentary form the person left those traces.
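The general technique behind such an ID graph can be sketched as grouping records that share any identifier (a connected-components pass via union-find). This is an illustration with invented data and is not LiveRamp's actual algorithm:

```python
# Hypothetical traces left by the same person in three different contexts.
records = [
    {"rec": "retail_loyalty", "email": "jane@example.com"},
    {"rec": "mobile_app", "phone": "+1-555-0100", "device": "idfa-42"},
    {"rec": "web_signup", "email": "jane@example.com", "phone": "+1-555-0100"},
]

# Union-find: records sharing any identifier end up in the same cluster.
parent = {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path compression
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

for i, r in enumerate(records):
    for key in ("email", "phone", "device"):
        if r.get(key):
            union(("rec", i), ("id", r[key]))

# Collect records by their cluster root.
clusters = {}
for i, r in enumerate(records):
    clusters.setdefault(find(("rec", i)), []).append(r["rec"])

# All three fragments resolve to a single identity:
print(list(clusters.values()))
```

No single record contains all identifiers, yet the shared email and phone are enough to stitch the three contexts into one profile, which is the core capability described above.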
As part of our new report on RTB as a security threat and previously unreported, we reveal 'Patternz', a private mass surveillance system that harvests digital advertising data on behalf of 'national security agencies'.
5 billion user profiles, data from 87 adtech firms. Thread:
'Patternz' in the report by @johnnyryan and me published today:
Patternz is operated by a company based in Israel and/or Singapore. I came across it some time ago, received internal docs. Two docs are available online.
Here's how Patternz can be used to track and profile individuals, their location history, home address, interests, information about 'people nearby', 'co-workers' and even 'family members', according to information available online:
…, a 'social risk intelligence platform' that provides digital profiles about named individuals regarding financial strain, food insecurity, housing instability etc. for healthcare purposes.
"It calculates risk scores for each risk domain for each person", according to the promotional video, and offers "clarity and granularity for the entire US".
Not redlining, though. They color it green.
Making decisions about individuals and groups based on these metrics seems highly questionable and irresponsible.
Bazze, a US data broker that purchases smartphone location data from mobile apps and advertising firms and sells it to the US Dept of Defense, according to the WSJ, openly promotes a commercial location mass surveillance system for 'government customers': wsj.com/tech/cybersecu…
I extracted information about mobile location data they claim to sell per country from their website:
New WSJ report found that 'Near', a consumer data broker based in India, Singapore and the US with an office in France, obtained massive location data via digital advertising firms like OpenX, Smaato and AdColony and sold it to US defense/intel agencies: wsj.com/tech/cybersecu…
Near's general counsel and chief privacy officer:
The US govt "gets our illegal EU data twice per day", a "massive illegal data dump".
"We sell geolocation data for which we do not have consent to do so", "we sell data outside the EU for which we do not have consent to do so"
If this isn't reason enough for EU data protection authorities to take urgent action, then I don't know what is.
Yesterday, I published a case study that examines enterprise software for process mining, workflow automation and algorithmic management.
I identified a list of mechanisms that involve personal data processing and can affect workers individually (right) or collectively (center).
I guess hardly anyone has ever examined this kind of software at such a level of detail, from a worker perspective.
The case study explores how employers can exploit worker data based on enterprise software docs. The chart is an excerpt from section 7: crackedlabs.org/en/data-work/p…
The case study is largely based on an analysis of enterprise software docs from a single vendor and its partners, which has its limitations. It's the third in a series of case studies, which are part of a larger project that aims to map how employers use personal data on workers.