Android apps, from dating and fertility trackers to selfie editors, share personal data with the Chinese company Jiguang via its SDK embedded in the apps, including GPS locations, immutable device identifiers and info on all apps installed on a phone.
Jiguang, also known as Aurora Mobile, claims to be present in >1 million apps and on >26 billion mobile devices, which seems wildly exaggerated. jiguang.cn/en/
Anyway, researchers found Jiguang's SDK in about 400 apps, some of them with hundreds of millions of installs.
According to the paper, Jiguang’s SDK is "particularly concerning because this code can run silently in the background without the consumer ever using the app in which it is embedded". Also, the SDK uses several methods to "obfuscate and hide" its "behavior and network activity".
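To make concrete what an SDK like this can technically do: the sketch below is a minimal, purely illustrative Kotlin example of how an embedded third-party SDK could read exactly these data categories (installed apps, a persistent device identifier, the last GPS fix) through ordinary Android APIs from a background context, without the user ever opening a screen. All names here (CollectorSdk, harvest) are hypothetical; this is not Jiguang's actual code.

```kotlin
import android.content.Context
import android.location.LocationManager
import android.provider.Settings

// Hypothetical SDK object, not Jiguang's actual code.
object CollectorSdk {

    // Could be invoked from a background service or broadcast receiver the SDK
    // registers itself, so it runs without the user ever opening the host app's UI.
    fun harvest(context: Context): Map<String, Any?> {
        // Persistent device identifier.
        val androidId = Settings.Secure.getString(
            context.contentResolver, Settings.Secure.ANDROID_ID
        )

        // Full list of installed packages: reveals dating, health, finance, religion apps etc.
        val installedApps = context.packageManager
            .getInstalledPackages(0)
            .map { it.packageName }

        // Last known GPS fix, available if the host app holds a location permission.
        val locationManager =
            context.getSystemService(Context.LOCATION_SERVICE) as LocationManager
        val lastFix = try {
            locationManager.getLastKnownLocation(LocationManager.GPS_PROVIDER)
        } catch (e: SecurityException) {
            null // host app has no location permission
        }

        // A real SDK would serialize and upload this payload, possibly obfuscated.
        return mapOf(
            "android_id" to androidId,
            "installed_apps" to installedApps,
            "lat" to lastFix?.latitude,
            "lon" to lastFix?.longitude
        )
    }
}
```

An embedded SDK needs no permissions of its own; it simply inherits whatever the host app has been granted, which is part of why this kind of collection is so invisible to users.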
While many "previous research efforts focused on SDKs specialized in analytics and advertising services, the results of our analysis call for the need of analyzing and regulating …the whole third-party SDK ecosystem due to their privacy and consumer protection implications"
Please note that there are hundreds, if not thousands of data companies based in the US, Europe, Russia, Singapore, India or in other countries that are doing similar stuff.
Nevertheless, Jiguang/Aurora is special in some way.
Like many other mobile data brokers all across the world, they sell so-called audience data, i.e. extensive user profiles with hundreds of attributes that can be used for surveillance-based advertising: jiguang.cn/en/iaudience
On the same website, they openly suggest that customers can also use eight years of accumulated data on '1 billion+ monthly active mobile users' for 'financial risk control': jiguang.cn/en/fintech
They say they identify the 'user's risk level' for 'corporate lending', 'relying on years of data accumulation'.
'JIGUANG’s data service objectively reflects user’s repayment willingness' ... 'highly correlative to the overdue behavior of loan clients' jiguang.cn/en/anti-fraud
They also sell data for other purposes not related to advertising. jiguang.cn/en/izone
According to a press release, they provide data solutions for 'targeted marketing, financial risk management, market intelligence and location-based intelligence': globenewswire.com/news-release/2…
Is this a purely Chinese enterprise? Not at all. Aurora Mobile is listed on Nasdaq.
According to an SEC filing, "Jiguang" is the brand, and Aurora consists of a network of companies based in China, the British Virgin Islands, the Cayman Islands and Hong Kong: ir.jiguang.cn/static-files/c…
(like many of their 'western' counterparts)
There's lots of interesting info in the SEC filing. Aurora emphasizes that it aims to 'assist financial institutions and financial technology companies in making informed lending and credit decisions' based on data from its 'developer services'.
"We develop the risk features based on anonymous* device-level mobile behavioral data… We believe [the] risk features we offer, such as those relating to payment behaviors and usage of consumer finance mobile apps, are most relevant to credit assessments"
*most likely not true
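The filing doesn't say how these 'risk features' are computed. As a purely hypothetical sketch (names and weights invented, not Aurora's actual model), features of this kind could be little more than counts and frequencies derived from the harvested app list and usage data:

```kotlin
// Toy illustration of device-level "risk features" of the kind the filing describes.
// Feature names and weights are invented for this sketch.
data class DeviceProfile(val installedApps: List<String>, val loanAppOpensPerWeek: Int)

fun riskFeatures(profile: DeviceProfile): Map<String, Double> {
    // Count installed apps whose package names suggest consumer lending.
    val lendingApps = profile.installedApps.count {
        it.contains("loan") || it.contains("credit")
    }
    return mapOf(
        "lending_app_count" to lendingApps.toDouble(),
        "loan_app_opens_per_week" to profile.loanAppOpensPerWeek.toDouble()
    )
}

// A downstream "risk score" could then be as crude as a weighted sum of such features.
fun riskScore(features: Map<String, Double>): Double =
    0.6 * (features["lending_app_count"] ?: 0.0) +
    0.4 * (features["loan_app_opens_per_week"] ?: 0.0)
```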
Listed companies must disclose all kinds of information including business risks.
Here they disclose to the SEC that they may have to 'obtain approval or license for personal credit reporting business' in China to 'continue offering its financial risk management solutions'.
"Due to the lack of further interpretations of the current regulations governing personal credit reporting businesses, the exact definition and scope of 'information related to credit standing' and 'personal credit reporting business' ... are unclear"
…same issues everywhere 🤖
One more SEC filing tidbit: Aurora states it has "accumulated data from over 33.6 billion installations of [its] software development kits (SDKs)". So they're counting every app install that contained their SDKs. And they claim to harvest data from 90% of Chinese mobile devices.
Enough for today.
TL;DR Chinese data companies provide cheap services to app vendors, harvest personal data on hundreds of millions without their knowledge, and exploit it for all kinds of business purposes, in many ways very similar to companies in the US and in other regions.
Btw. Aurora/Jiguang lists the huge US-based data broker Nielsen as a customer: jiguang.cn/en/izone
I took another look at Snowden docs that mention browser/cookie IDs.
It's breathtaking how the surveillance marketing industry managed to claim for many years that unique personal IDs processed in the web browser are somehow 'anonymous', and sometimes still does.
Another 2011 doc indicates that GCHQ operated a kind of probabilistic ID graph that aimed to link cookie/browser IDs, device IDs, email addresses and other 'target detection identifiers' (TDIs) based on communication, timing and geolocation behavior.
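The slides don't show the matching logic itself, but the underlying idea of probabilistic linking is easy to sketch. The toy code below is my illustration, not GCHQ's system: two identifiers are scored as likely belonging to the same person when their sightings repeatedly co-occur in time and place.

```kotlin
import kotlin.math.abs

// A single observation of an identifier (cookie ID, device ID, email hash, ...).
data class Sighting(val identifier: String, val epochSeconds: Long, val lat: Double, val lon: Double)

// Count how often sightings of two identifiers co-occur within a time window
// and a coarse distance threshold (degrees as a crude stand-in for distance).
fun cooccurrenceScore(
    a: List<Sighting>, b: List<Sighting>,
    maxSecondsApart: Long = 300, maxDegreesApart: Double = 0.01
): Int = a.sumOf { sa ->
    b.count { sb ->
        abs(sa.epochSeconds - sb.epochSeconds) <= maxSecondsApart &&
        abs(sa.lat - sb.lat) <= maxDegreesApart &&
        abs(sa.lon - sb.lon) <= maxDegreesApart
    }
}

// Above some threshold, the two identifiers get linked as one probable person.
fun probablySamePerson(a: List<Sighting>, b: List<Sighting>, threshold: Int = 5): Boolean =
    cooccurrenceScore(a, b) >= threshold
```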
Btw. What inspired me to revisit these docs is @ByronTau's book Means of Control, which not only details how US agencies buy commercial data from digital marketing but also provides deep historical context, tracing back to early-2000s debates on Total Information Awareness (TIA).
The digital advertising industry is selling smartphone location data and movement profiles of millions of people in Germany, including private individuals and sensitive personnel.
A major investigation by netzpolitik.org and BR, who obtained a huge dataset as a free 'sample'. netzpolitik.org
They identified people who had visited addiction clinics, swinger clubs or brothels, as well as personnel from ministries, the Bundeswehr, the BND and the police.
Almost all smartphone apps today are 'bugged' with shady data-collection technologies.
Completely unchecked data marketplaces, including the Berlin-based company Datarade, offer location data and other behavioral data on entire populations in many countries for sale.
So, Microsoft exploits activity data from Outlook, Teams, Word etc. across customers for its own promotional purposes, including data on meetings, file usage and the seconds until emails are read.
Microsoft states that the analysis of the seconds until emails were read excludes EU data. Activity data from Outlook, Teams, Word etc., however, seems to include EU data.
What's their legal basis? This is also personal data on employees. And, are business customers fine with it?
Should cloud-based software vendors exploit personal data on users of their services, including private persons and employees of business customers, however they see fit?
I don't think so.
Not even for public-interest research, at least not without academic process and IRB review.
Some more findings from our investigation of LiveRamp's ID graph system (crackedlabs.org/en/identity-su…), which maintains identity records about entire populations in many countries, including name, address, email and phone, and aims to link these records with all kinds of digital IDs:
Identity data might seem boring, but if a company knows all kinds of identifying info about everyone, from home address to email to device IDs, it is in a powerful position to recognize persons and link profile data scattered across many databases, and this is what LiveRamp does.
LiveRamp aims to provide clients with the ability to recognize a person who left some digital trace in one context as the same person who later left some trace elsewhere.
It has built a sophisticated system to do this, no matter how complete or fragmentary the identifying information available in each context is.
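LiveRamp doesn't publish its matching code, so the following is only a generic sketch of the identity-resolution pattern such graphs rely on: any two records that share an identifier (email, phone number, device ID, ...) get merged into one person cluster, so a single known identifier can pull in everything else ever linked to that person.

```kotlin
// Generic identity-resolution sketch, illustrative only (not LiveRamp's code).
data class Record(val source: String, val identifiers: Set<String>) // e.g. "email:a@b.c", "device:1234"

// Minimal union-find to merge records into clusters.
class UnionFind(size: Int) {
    private val parent = IntArray(size) { it }
    fun find(x: Int): Int = if (parent[x] == x) x else find(parent[x]).also { parent[x] = it }
    fun union(a: Int, b: Int) { parent[find(a)] = find(b) }
}

// Records sharing any identifier end up in the same cluster ("one person").
fun clusterRecords(records: List<Record>): Collection<List<Record>> {
    val uf = UnionFind(records.size)
    val firstSeen = HashMap<String, Int>() // identifier -> index of first record carrying it
    records.forEachIndexed { i, record ->
        record.identifiers.forEach { ident ->
            firstSeen.putIfAbsent(ident, i)?.let { earlier -> uf.union(i, earlier) }
        }
    }
    return records.indices
        .groupBy { uf.find(it) }
        .values
        .map { indices -> indices.map { records[it] } }
}
```

In a model like this, obtaining one stable identifier, say a hashed email from a newsletter signup, is enough to connect an otherwise 'anonymous' cookie or device ID back to a named identity record.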
As part of our new report on RTB as a security threat and previously unreported, we reveal 'Patternz', a private mass surveillance system that harvests digital advertising data on behalf of 'national security agencies'.
5 billion user profiles, data from 87 adtech firms. Thread:
'Patternz' in the report by @johnnyryan and me published today:
Patternz is operated by a company based in Israel and/or Singapore. I came across it some time ago, received internal docs. Two docs are available online.
Here's how Patternz can be used to track and profile individuals, their location history, home address, interests, information about 'people nearby', 'co-workers' and even 'family members', according to information available online:
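The available documents don't spell out Patternz's algorithms. But to see how a 'home address' can fall out of raw ad-tech location pings, here is one crude, commonly described heuristic, sketched with invented names: take the most frequent coarse location observed during night-time hours.

```kotlin
import java.time.Instant
import java.time.ZoneOffset

// One location ping, e.g. extracted from an ad-exchange bid request.
data class LocationPing(val epochSeconds: Long, val lat: Double, val lon: Double)

// Most frequent night-time grid cell as a crude "likely home" estimate.
fun likelyHomeLocation(pings: List<LocationPing>): Pair<Double, Double>? =
    pings
        .filter {
            val hour = Instant.ofEpochSecond(it.epochSeconds).atZone(ZoneOffset.UTC).hour
            hour >= 22 || hour < 6 // rough "at home" hours; a real system would use local time zones
        }
        // Round coordinates to roughly 100 m grid cells so repeat visits group together.
        .groupBy { Math.round(it.lat * 1000) to Math.round(it.lon * 1000) }
        .entries
        .maxByOrNull { it.value.size }
        ?.let { (cell, _) -> cell.first / 1000.0 to cell.second / 1000.0 }
```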
…, a 'social risk intelligence platform' that provides digital profiles about named individuals regarding financial strain, food insecurity, housing instability etc. for healthcare purposes.
"It calculates risk scores for each risk domain for each person", according to the promotional video, and offers "clarity and granularity for the entire US".
Not redlining, though. They color it green.
Making decisions based on these metrics about individuals and groups seems to be highly questionable and irresponsible bs.