Are there any proposals to sandbox the mobile address book via iOS or Android so wild mobile apps like Clubhouse can't "go viral" and then encourage millions of Americans to share their personal user graphs and personally harvested contact information of friends/colleagues? ⚖️🧵
There are odd legal-exposure issues when a For-Profit Business requests access to a Personal Contact Book from a non-business / person -- here's the flow imo:

A Data Controller requests consent (with a marketing purpose) to ingest a Contact Address Book from a non-covered entity.
A Data Controller requesting 100% access to a personal Address Book has ingested *user data without consent from the users the data actually belongs to*.

imo the phone APIs from iOS / Android that ingest + share address books violate Data Controller Frameworks
Both Apple and Google grow the pool of developers building "Social Network" mobile apps for their App Stores (Clubhouse is iOS-only) by making it trivially easy to ingest an address book. This maps friend-of-friend + biz networks in extremely valuable and dangerous ways.
Apple and Google have both broken any semblance of "Contact Book Consent": on both platforms, as soon as you add an email account to the phone, it starts growing the phone's address book from emails, calls, and texts. Users "leak themselves" by engaging w/ these flows.
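To make the structural problem concrete, here's a minimal sketch (hypothetical Python model, not any real platform API -- `AddressBook`, `ContactRecord`, and the method names are all invented for illustration) of how an all-or-nothing contacts permission works: one tap by the device owner exposes every record, no matter who each record describes.

```python
# Hypothetical model of an all-or-nothing contacts permission grant.
# All names here are illustrative, not a real iOS/Android API.
from dataclasses import dataclass

@dataclass
class ContactRecord:
    owner: str   # the person the data actually describes
    email: str

class AddressBook:
    def __init__(self, device_owner: str, records: list[ContactRecord]):
        self.device_owner = device_owner
        self.records = records
        self.permission_granted = False

    def grant_contacts_permission(self) -> None:
        # A single tap by the device owner...
        self.permission_granted = True

    def export_all(self) -> list[ContactRecord]:
        # ...exposes every record, regardless of who it describes.
        if not self.permission_granted:
            raise PermissionError("contacts access not granted")
        return self.records

book = AddressBook("alice", [
    ContactRecord("bob", "bob@example.com"),
    ContactRecord("carol", "carol@example.com"),
])
book.grant_contacts_permission()
exported = book.export_all()

# Every exported record describes someone other than the person who consented.
non_consenting = [r for r in exported if r.owner != book.device_owner]
print(len(non_consenting))  # → 2
```

The point of the model: the permission prompt asks the wrong person. Only the device owner is ever consulted, yet most of what leaves the device is other people's data.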
To continue the exposure flow:
1) Device manufacturer makes address book syncing easy & unsafe
2) Apple / Android build address book sharing grease fires to encourage app developers to build social networks for their mobile platforms
3) Apps ingest address books via APIs
4) Apps that ingest address books become de facto Controllers & Processors for illegally obtained data, because the app user cannot "Consent for Other Users" -- there is no legal concept where user XYZ can "consent for other users" and then share their info with a for-profit business.
5) Imagine if the CEO of a business could "consent for other people": they could buy 100 phones, load 50k contact records into each, sync them as "personal devices," and then "legally ingest contact records for business use" - this breaks consent flows...
6) So if a CEO can't use their own personal devices as a data-mining play to ingest millions of records they loaded onto their own phones, how can other users of a platform do it? How can you incentivize an end user to breach other users, and then use that data legally?
7) Clubhouse ingested hundreds of millions of non-Clubhouse user records through the Apple iOS Address Book importing process -- which provides no notice or consent for users who were added to an address book before their info was shared with a for-profit social network.
8) How can Apple create data flow systems where emails & text messages, or just uploaded contact lists, can then be "legally sent" to an app maker and social network, when the vast majority of the end-users did not consent to have their data collected through this Apple API?
9) If Apple sends user data to a new company, Apple is a joint controller with that company - or Apple is the Data Controller and the app is the Data Processor (w/ the app given no restrictions -- a total portability breach by Apple).
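The core failure in steps 3-4 above can be sketched in a few lines. This is an illustrative model (the function and field names are invented, not any platform's real API): consent is personal, so an uploader's consent only covers records that describe the uploader, and everything else in the upload is unconsented third-party data.

```python
# Illustrative sketch of why "consent for other users" fails:
# the uploader can only validly consent for records about themselves.
def partition_by_valid_consent(uploader: str, records: list[dict]):
    consented, unconsented = [], []
    for record in records:
        # Consent is personal: it only covers data about the consenting user.
        if record["subject"] == uploader:
            consented.append(record)
        else:
            unconsented.append(record)
    return consented, unconsented

uploaded = [
    {"subject": "alice", "email": "alice@example.com"},  # the uploader's own card
    {"subject": "bob",   "email": "bob@example.com"},    # a third party
    {"subject": "carol", "email": "carol@example.com"},  # another third party
]
ok, breach = partition_by_valid_consent("alice", uploaded)
print(len(ok), len(breach))  # → 1 2
```

In a typical address book the ratio is far worse than 1:2 -- nearly every record describes someone other than the person tapping "Allow."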

Apple's contact book API creates risks for Apple.
10) If Apple is the data controller for data it scrapes from a user integrating an inbox or phone service through their hardware, the controller must provide revocability in its data supply chain - portable exports of non-consented user data violate GDPR & other frameworks.
11) As it stands, neither Apple nor Google puts any proper control or consent around an address book. Users who engage with another unsafe user are automatically "opted into" any social-network address book sharing that the unsafe user engages w/ on that device/platform. (wtf)
12) Mobile address book sharing is the #1 source of "free user data" for app makers - and it's among the largest user data breaches outside of online advertising systems.

A friend of friend user graph is step #1 in intelligence + social network building. Don't be flippant w/ it.
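The friend-of-friend graph above falls out of pooled uploads almost for free. A minimal sketch (hypothetical data, plain Python): once two users' uploaded address books share a contact, the app can link those users through that contact -- even when the contact never installed the app.

```python
# Sketch of how pooled address-book uploads yield a friend-of-friend
# graph that includes people who never joined the app (hypothetical data).
from collections import defaultdict

uploads = {
    "alice": ["bob", "carol"],   # alice is a user; bob and carol may not be
    "dave":  ["carol", "erin"],  # dave is a user; carol appears again
}
users = set(uploads)

# Build an undirected contact graph from the uploads.
graph = defaultdict(set)
for uploader, contacts in uploads.items():
    for contact in contacts:
        graph[uploader].add(contact)
        graph[contact].add(uploader)

# Everyone in the graph who never consented to (or joined) the app:
non_users_in_graph = set(graph) - users
print(sorted(non_users_in_graph))  # → ['bob', 'carol', 'erin']

# alice and dave are linked as friends-of-friends through carol,
# who never installed the app or consented to anything.
mutual = graph["alice"] & graph["dave"]
print(sorted(mutual))  # → ['carol']
```

Two small uploads already produce a graph where most nodes are non-users; at hundreds of millions of records, that's an intelligence asset built almost entirely from people who never consented.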
13) Finally, if Apple & Google don't dramatically change the permissions & portability of their address contact books, both companies will face the largest fines and investigations either of them have ever seen. They both are flagrantly ignoring data privacy laws & real concerns.
14)* I've had my corporate email loaded into a Clubhouse account for the past ~month, and many folks have added me as a friend through an obvious contact-importing process (not by proactively searching for me or engaging with me in any other way).

🌩️Stop sharing your address book.

