#Facebook facilitates "human exploitation and trafficking" according to its own internal documentation!
The "three stages of the human exploitation lifecycle" enabled by Facebook are: recruitment, facilitation and exploitation. #FacebookFiles 5/9
When Facebook says it prioritises “meaningful social interactions”, it means that the algorithm prioritises content with a higher probability of getting any interaction (like reactions & shares).
However, it is precisely these priorities that increase divisive content. 6/9
If you thought that Facebook's "community guidelines" applied equally to all its users, irrespective of reach or prominence, you were wrong.
In a document titled “Whitelisting”, Facebook admits that many "pages, profiles and entities [were] exempted from enforcement”. 7/9
Now revealed beyond doubt: Facebook directs most of its attention and its global budget against misinformation and hate speech at English-speaking countries in the Western world, *even though* most of its users live elsewhere around the globe. #FacebookFiles 8/9
And if these issues weren't enough, Facebook also lied to its investors & shareholders about its user base, advertising reach, and content production on the platform.
Follow us to stay updated on the latest analyses around #FacebookFiles in India. With its largest user base in the country, Facebook owes answers to India and Indian authorities.
Keep our work going! Invest in India's digital rights today 🗣
We filed RTIs with every state and Union Territory to find out how they’re dealing with online “misinformation.”
What we found is deeply disturbing: police crackdowns, secretive fact-check bodies, and widespread use of RTI exemptions, revealing a quiet but dangerous escalation in digital censorship. (1/5)
In several states, it’s the police who are deciding what counts as “misinformation.” Some districts in Karnataka run proactive surveillance while others wait for complaints. With no publicly available criteria outlining what constitutes "harmful" or "misleading" content, there are risks of subjective interpretation and misuse. (2/5)
In Faridabad, authorities admitted to removing content but said they don’t keep any records. Arunachal Pradesh refused to explain why 55 social media profiles were taken down. Andaman and Nicobar authorities opt for internet shutdowns to stifle public discourse. (3/5)
🚨 Massive Victory! 🚨
@CCI_India has imposed a historic penalty of ₹213.14 cr (approx. $25.25 M) on Meta for abusing its dominant position via WhatsApp's 2021 Privacy Policy. IFF submitted expert information as an informant. Let’s break it down 🧵👇1/10 internetfreedom.in/statement-cci-…
The 2021 policy update by WhatsApp was implemented as a 'take-it-or-leave-it' change, forcing users to accept expanded data collection & sharing within the Meta group—without any real opt-out option. 2/10
The CCI concluded that this constituted:
✅ Unfair conditions under Indian competition law
✅ A violation of user autonomy, given the lack of effective alternatives to WhatsApp
✅ An abuse of Meta’s dominant position, contravening Section 4(2)(a)(i) of the Competition Act. 3/10
Here’s how your beloved DigiYatra uses facial recognition technology (FRT). Content warning: ***DYSTOPIAN USES*** ⚠️⚠️ 1/10
Now that we have your attention, here are the recent ways in which Indian public authorities and police forces used (and abused) facial recognition systems, jeopardising the human rights and data privacy of millions of Indian citizens without much accountability. 2/10 🧵
1️⃣ @tnpoliceoffl suffered a massive data leak in its FRT portal, making 8,00,000 lines of data vulnerable. This incl. personal data of policemen & FRT reports on thousands of accused persons. IFF called for a total ban on use of FRT by police forces. 3/10
🚨 On May 4, 2024, a massive breach in @tnpoliceoffl’s Facial Recognition (FRT) Portal exposed over 8,00,000 lines of data—which include 50,000 facial IDs, personal information of police officers, & details of crimes, police stations, & FIRs filed. 🚨🧵1/8
The FRT software, developed by CDAC-Kolkata and hosted on TNSDC, which was storing facial images alongside personal details of suspected, accused, & incarcerated persons, was compromised—and the list of data leaked from it is disturbingly long. ⬇️ 2/8
FRT is an extremely invasive & dangerous surveillance tool which poses direct threats to privacy, especially at the hands of law enforcement. Police forces are able to amass & process large volumes of sensitive facial data without any checks, consent, transparency, or procedural safeguards. 3/8
Been hearing some chatter around #DigiYatra? As scary questions about ownership, transparency, and data flow emerge, here is a quick rundown of everything we know about the service, and more importantly, everything we don’t. 😶🌫️🧵1/7
1️⃣Who owns DigiYatra?
In 2019, @MoCA_GoI passed on DigiYatra's operations & data ecosystem to a *private company* created for this very purpose – DigiYatra Foundation. DYF is a joint venture between 5 Indian airports (public-private, 74% stake) & @AAI_Official (public, 26%). 2/7
2️⃣ Surely such a public-private venture must be answerable to citizens?
Not exactly. Neither DYF nor its security audit agency @IndianCERT falls under the RTI Act. Neither can, technically, be forced to disclose any information on its data practices & security. 3/7 medianama.com/2023/03/223-ci…
Were you among the millions of @WhatsApp users who got a DM from ‘Viksit Bharat Sampark’? 🫠🫠
The account, seeking feedback on government initiatives, is now barred by the Election Commission from sending messages.
But several concerns persist… (1/10) internetfreedom.in/whatsapp-messa…
The message, accompanied by a letter from the PM, listed the various schemes and initiatives introduced by the incumbent government and was, in many cases, sent after the ECI released its Model Code of Conduct for upcoming elections. (2/10)
It stirred a storm and how…
First, we wonder how exactly MeitY secured the contact information of such a large number of people, and when/how it began using this information for outreach purposes? (3/10)