As part of IFF’s RTI work on facial recognition projects in India, here is a thread of responses we have received on the use of facial recognition for voter verification projects in Telangana. #FacialRecognition #ProjectPanoptic
(1/19)
On 24.1.2020, IFF filed RTI requests with the Telangana State Election Commission (TSEC) and Telangana State Technology Services (TSTS) pursuant to the use of Facial Recognition Technology for voter verification in the urban local body elections in Kompally on 22.1.2020.
(2/19)
We asked them the following questions:
1. What is the legislation under which the authorities are using FRT?
2. Was there a standard operating procedure for the pilot project?
(3/19)
3. What were the results of the pilot projects by TSTS including error rate and bugs reported?
4. What is the total expenditure on the pilot project?
(4/19)
On 5.2.2020, IFF filed additional RTI requests with TSEC and TSTS in which we asked for the project report and questioned the accuracy of the technology. We also asked whether any future use of the technology was planned.
(5/19)
We’ve received multiple replies from both these authorities. In a reply dated 18.2.2020, TSEC states that it is authorised by Article 243-ZA of the Constitution of India to use FRT. It also sent a copy of the letter detailing the SOP of the project.
(6/19)
Article 243-ZA of the Constitution states that the State Election Commission is in charge of all matters related to the conduct of elections to Municipalities. This, however, is not a sufficient legal basis for the use of FRT for voter verification.
(7/19)
Such FRT projects are being implemented in India in a legal vacuum. This violates the Supreme Court’s decision in Puttaswamy v. Union of India, which holds that certain standards must be met to justify State intrusion into the right to privacy.
(8/19)
These standards are:
✅ legality (existence of a law)
✅ legitimate goal/state aim
✅ proportionality between the objects and the means adopted to achieve them
✅ procedural guarantees to check against the abuse of state interference
(9/19)
FRT is a highly invasive and dangerous technology that should not be deployed without proper public dialogue on its consequences, since a flawed implementation can lead to exclusion and bias.
(10/19)
According to the SOP, TSTS sent its officials to the selected polling stations in Kompally to assist in implementing the pilot project. Identification and authentication were carried out through a mobile app running the FRT software.
(11/19)
TSTS assured us that all data would be deleted and would not be used for any other purpose.
(12/19)
The total expenditure incurred was Rs. 10,200 per polling station. The pilot project was carried out in 10 polling stations. Thus, the total expenditure is Rs. 1,02,000, i.e., one lakh and two thousand rupees.
(13/19)
In a reply dated 27.2.2020, TSEC stated that no decision has been taken on the future use of FRT. Also included was a letter from TSTS dated 4.2.2020 providing the results of the project, including the accuracy rates of the FRT. The average accuracy rate was 78%.
(14/19)
A 78% accuracy rate is not satisfactory, because exclusion as a result of non-verification would lead to a loss of rights: in this case, the right to vote.
(15/19)
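To give a rough sense of what a 78% average accuracy rate could mean in practice, here is a back-of-the-envelope sketch. The turnout figure of 1,000 voters is our hypothetical assumption for illustration; the RTI replies do not specify turnout, and the sketch assumes every failed match counts as a non-verification.

```python
# Sketch: scale of potential exclusion at the 78% average accuracy
# reported by TSTS. The 1,000-voter turnout is a hypothetical
# assumption, not a figure from the RTI replies.
accuracy = 0.78                          # average accuracy reported by TSTS
voters = 1000                            # hypothetical turnout (assumption)
failed = round(voters * (1 - accuracy))  # voters who could fail verification
print(f"At {accuracy:.0%} accuracy, roughly {failed} of {voters} "
      f"voters could fail facial verification.")
```

Even under these simplified assumptions, roughly one in five voters at a polling station could face a failed match, which is why a fallback verification process matters so much.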
The letter also attributed the poor results at some polling stations to bad lighting and network issues, and stated that all the data, including the photos, had been deleted from the servers.
(16/19)
In a reply dated 11.3.2020, TSTS provided information about its previous use of FRT for the authentication of pensioners, reporting a 94% accuracy rate and no bugs. Regardless of accuracy, however, the use of FRT in Telangana lacks a legal basis and is therefore illegal.
(17/19)
IFF has called for a 3-year moratorium on such use by central and state governments.
(18/19)
In the meantime, we suggest a comprehensive public consultation on the use of such technologies informed by international debate on the topic, which can balance technological advancement with people’s civil liberties. #projectpanoptic
(19/19)
Here’s how your beloved DigiYatra uses facial recognition technology (FRT).
Content warning: ***DYSTOPIAN USES*** ⚠️⚠️ 1/10
Now that we have your attention, here are the recent ways in which Indian public authorities and police forces used (and abused) facial recognition systems, jeopardising the human rights and data privacy of millions of Indian citizens without much accountability. 2/10 🧵
1️⃣ @tnpoliceoffl suffered a massive data leak in its FRT portal, leaving 8,00,000 lines of data vulnerable. This included personal data of policemen & FRT reports on thousands of accused persons. IFF called for a total ban on the use of FRT by police forces. 3/10
🚨 On May 4, 2024, a massive breach in @tnpoliceoffl’s Facial Recognition (FRT) Portal exposed over 8,00,000 lines of data, including 50,000 facial IDs, personal information of police officers, & details of crimes, police stations, & FIRs filed. 🚨🧵1/8
The FRT software, developed by CDAC-Kolkata and hosted on TNSDC, stored facial images alongside personal details of suspected, accused, & incarcerated persons. It was compromised, and the list of data leaked from it is disturbingly long. ⬇️ 2/8
FRT is an extremely invasive & dangerous surveillance tool which poses direct threats to privacy, especially at the hands of law enforcement. Police forces are able to amass & process large volumes of sensitive facial data without any checks, consent, transparency, or procedural safeguards. 3/8
Been hearing some chatter around #DigiYatra? As scary questions about ownership, transparency, and data flow emerge, here is a quick rundown of everything we know about the service, and more importantly, everything we don’t. 😶🌫️🧵1/7
1️⃣Who owns DigiYatra?
In 2019, @MoCA_GoI passed on DigiYatra's operations & data ecosystem to a *private company* created for this very purpose – DigiYatra Foundation. DYF is a joint venture between 5 Indian airports (public-private, 74% stake) & @AAI_Official (public, 26%). 2/7
2️⃣ Surely such a public-private venture must be answerable to citizens?
Not exactly. Neither DYF nor its security audit agency @IndianCERT fall under the RTI Act. It cannot, technically, be forced to disclose any information on its data practices & security. 3/7 medianama.com/2023/03/223-ci…
Were you among the millions of @WhatsApp users who got a DM from ‘Viksit Bharat Sampark’? 🫠🫠
The account, seeking feedback on government initiatives, is now barred by the Election Commission from sending messages.
But several concerns persist… (1/10) internetfreedom.in/whatsapp-messa…
The message, accompanied by a letter from the PM, listed the various schemes and initiatives introduced by the incumbent government and was, in many cases, sent after the ECI released its Model Code of Conduct for upcoming elections. (2/10)
It stirred a storm and how…
First, how exactly did MeitY secure the contact information of such a large number of people, and when/how did it begin using this information for outreach purposes? (3/10)
@GoI_MeitY has notified the @PIBFactCheck of the @MIB_India as the fact-checking unit (FCU) under the IT Amendment Rules, 2023.
The notified FCU will be empowered to flag online “false”, “fake”, or “misleading” information related to the Union govt. 1/9 🧵
The establishment of the FCU less than a month before the country heads for the #GeneralElections2024 could vastly affect the nature of free speech on the internet as it holds the potential to be (mis)used for proactive censorship, most importantly in the context of dissent. 2/9
This notification follows the March 13 decision of the Bombay HC, where the Bench refused to restrain the setting up of an FCU until the third Judge decides on the constitutionality of the 2023 Amendment.
This effectively allowed the Union govt to operationalise the FCU, despite its constitutionality being under deliberation before the High Court. 3/9
‼️Indian Railways has floated a tender for the installation of 3.3 lakh facial recognition-enabled CCTV cameras inside railway coaches, with a central face-matching server to surveil & identify passengers (adults & children) to ‘curb crime’. (1/7)
@amofficialCRIS invited tech providers to install FRT-enabled tech with ‘video analytics’ to identify faces & send them to a ‘face matching server’. This lacks legal safeguards, creates fertile ground for misidentification & staff bias, & fails the Puttaswamy criteria. (2/7)
CCTV systems enabled with FRT operate in a regulatory limbo, as there is no legislative basis for their use to combat crime. Moreover, the DPDPA, which is not yet in force, may not adequately regulate CCTV and FRT, leaving individuals’ sensitive facial data vulnerable to misuse. (3/7)