We will be live-tweeting the roundtable on Algorithmic Accountability in India, hosted by Divij Joshi (@divijualsuspect). You can also tune into the livestream on our YouTube channel:
.@divijualsuspect said today's discussion is on what algorithms imply for our lives and for our democracy.
First, implementation and use of AI vary across states. Second, the private sector is involved in all govt projects around AI. For instance, Punjab's AI system is being developed by a Gurgaon firm.

@BasuArindrajit of @cis_india
We mapped several sectors where the govt is trying to use AI. The first, law enforcement, is covered by @VidushiMarda and Shivangi Narayan's paper on the Delhi Police. The other sectors are:

— Education: school dropouts are monitored using AI in Andhra Pradesh, and the govt has been enthusiastic about using AI in ed-tech.
— Defence.
— Agriculture: Karnataka has partnered with Microsoft to alert farmers on soil quality and when they should sow.

@BasuArindrajit
I define "digital" as data-based decision making. Is there a need for a digital state? If not, it means that the state remains in the industrial age while the rest of the world moves on.

— Parminder Jeet Singh from @ITforChange
Public education is considered a state responsibility. We need to save some of these spaces for the public sector, instead of letting the private sector in, Singh said.
The non-personal data framework is badly framed right now. Some kind of NPD is required for the govt to carry out some decisions.

— Parminder Jeet Singh
Q. What is the govt thinking when it tries to implement automated or data-based systems?

Dr Anupam Saraph has worked on Smart Cities. He says that people engage with markets, banking systems, and education systems. These bring together different people for common purposes.
If those who govern these relationships do not share the consequences equally, you are laying the ground for injustice. Life is about these relationships, not about dividing people between the governed and the government, says Dr Saraph.
Current decisions are driven by businesses rather than by people choosing to help each other. The plight of people during COVID-19 cannot be captured in govt orders...
...It's important to recognise that relationships with migrants have failed. Those who govern us have failed to understand the relationship of migrant workers with their employers and with the places where they live. Those who govern us do not share the same consequences...
...When we give decisions away to algorithms, we are oblivious to the ethics of the participants. We are writing away the dignity of the people for whom we are taking decisions...
We don't recognise that we're creating an unforgiving and irreversible system, not subject to appeal or human compassion. But somehow we feel that we are making progress.

— Dr Anupam Saraph (@AnupamSaraph)
.@VidushiMarda: While working on my and Shivangi's paper on the Delhi Police, we learnt that you cannot study a system that is not open to sharing. We wanted to understand the models, how they were optimised, their accuracy rates. But... (1/n)
...we realised that there is a breakdown of accountability (RTIs), so we had to study the institution itself. We studied the automation of toilet cleaning in the Pune Smart City. We found that it's just a surveillance mechanism designed by the company. (2/n)
When we approached the municipal corporation, we were told that the company decides that because it has to serve its bottom line. This leaves us trying to understand where transparency comes from; we can try to understand what transparency looks like in the field. (3/n)
By the time the project is rolled out, it's already too late to build accountability into the system. (END)

@VidushiMarda
.@urvashi_aneja of @tandem_research: There’s a lot of tension where you have local governments with problems of capacity and then you have the private sector defining problems and developing solutions for the state. (1/n)
We need to build state capacity; in the interim, the pvt sector is defining a lot of the problems. The lines between state governments and the private sector are becoming much blurrier. (2/n)

@urvashi_aneja
Algorithms should be held accountable, but more than that, we need the entire system that deploys these algorithms to be held accountable as well. (END)

@urvashi_aneja
Watch the livestream of the discussion on Algorithmic Accountability here:
If we *really* speak of AI/ML, the rules have changed, which is why we need transparency and accountability. Coding a neural network 6-7 yrs ago was a pain. Now anyone with a little knowledge can download a package and start a neural network. (1/n)

@DeoSahil of @cpceu
It's not as if the product teams who develop this think about these issues. We feel that our interventions are trying to equip engineers, who stay away from social science and law, to think about transparency issues. (END)
@DeoSahil of @cpceu
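As an illustration of Deo's point about how low the barrier now is, here is a minimal sketch of "download a package and start a neural network". The library (Keras), the toy data, and the model are assumptions chosen for the example, not anything named in the discussion.

# Minimal sketch: a neural network in a few lines with an off-the-shelf package.
# Assumes TensorFlow/Keras is installed (pip install tensorflow); the data is random.
import numpy as np
from tensorflow import keras

X = np.random.rand(1000, 20)                 # 1,000 toy samples, 20 features
y = np.random.randint(0, 2, size=1000)       # binary labels

model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))       # [loss, accuracy] on the toy data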
The BMTC's transport algorithm tells you when the bus will come, and claims to be accurate. When you do a comparative analysis, you realise when the bus actually arrives. This is a simple example. (1/n)

@digitaldutta
But imagine when it is used for penalising people for speeding. People have found that they are penalised when they did NOT speed. Autowallas copy registration numbers so that another person gets fined. The only way to find out what's going on is to reverse engineer some of it. (2/n)
These systems are illegal; they have no legal backing. And there is no mechanism to challenge this. (3/n)
Aadhaar data was being used to link Voter IDs and delete duplicates, starting in Telangana. The state had all the data of its citizens, collected without informing citizens... directly and from other sources. (4/n)
Eventually, 70 lakh people were deleted from voter rolls. People only found out when they could not vote. Everyone, including @UIDAI, came out and passed the buck. (5/n)
In Telangana, they assumed that people who did not have Aadhaar but had a Voter ID were dead, ghosts, or duplicates. And they deleted all of them. So you have accuracy, but what does accurate mean? Nobody knows what algorithms or software they used. (END)

@digitaldutta
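The deletion rule Dutta describes (no Aadhaar linked to a Voter ID, therefore treat the voter as dead, a ghost, or a duplicate) can be sketched hypothetically. The actual algorithms and software are unknown, so the records and logic below are invented purely to show how such a rule removes a genuine voter.

# Hypothetical illustration only: the real system's code and data are not public.
voters = [
    {"voter_id": "V001", "name": "Voter A", "aadhaar_linked": True},
    {"voter_id": "V002", "name": "Voter B", "aadhaar_linked": False},  # real voter, never enrolled in Aadhaar
    {"voter_id": "V003", "name": "Voter C", "aadhaar_linked": True},
]

# The rule as described in the thread: no linked Aadhaar => presumed dead/ghost/duplicate.
flagged_for_deletion = [v for v in voters if not v["aadhaar_linked"]]

# The rule is applied "accurately" on its own terms, yet V002 is disenfranchised
# and only finds out at the polling booth.
print([v["voter_id"] for v in flagged_for_deletion])  # ['V002']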
While tech may or may not be neutral, there are always intentions behind how it's designed and installed. It's important to understand how the bureaucrats think; the best way to do that is via RTIs and govt sources.

@Kum_Sambhav
It took me another 3 months to figure out the idea behind it. Even bureaucrats did not understand the implications of using databases and algorithms. The bureaucrat who designed and proposed it himself said that the local bureaucracy fudges data, and that the National Social Registry...
...will be a good idea. Eventually he realised that this was not the case. Now the hypothetical fears around Aadhaar have become a reality with the National Social Registry.

@Kum_Sambhav
Journalist @gopalsathe, formerly with @HuffPostIndia, said that these black boxes affect people around lending and insurance.

"You realise that everyone is harvesting as much as they can .. there are also brokerages. There is an economy of you data out there." (1/n)
"The govt systems is at least open to some public scrutiny. The pvt sector can clam up, and *they do*. It becomes harder to get the info you need since the govt is now working with the pvt sector more and more. It takes months to uncover info, establish links, and so on." (2/n)
"With news rooms being gutted, its more and more difficult to justify doing this kind of work."

@gopalsathe
RTIs, parliamentary questions, and expert affidavits are how to communicate the tech and automation to the court. In the Aadhaar case, the court did not pay much attention to the petitioner's expert's submission. (1/n)

@VrindaBhandari
...The petitioner was also not allowed to cross-examine Ajay Bhushan Pandey (then UIDAI head) on his Aadhaar presentation. (2/n)
In the J&K 4G restoration case, it came down to the impact of 4G v 2G. @prateekwaghre explained the difference by running simulated tests. The conclusion he gave what how long does it take to stream videos, if you can access WHO dashboards. (3/n)
Judges are not experts. We have to apply the facts to the legal principles so that they understand. Understating helps more than overstating. There is an inherent bias among judges and people towards the technologists. They think tech is the answer.

— Rahul Narayan
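A rough sketch of the kind of 4G-versus-2G comparison mentioned above. The throughput figures and file size are assumptions for illustration, not the actual parameters of @prateekwaghre's simulated tests.

# Back-of-the-envelope download-time comparison for 2G vs 4G.
# Speeds and video size below are assumed values, chosen only to illustrate the gap.
SPEEDS_MBPS = {"2G": 0.1, "4G": 20.0}   # assumed typical throughput, megabits per second
VIDEO_MB = 50                            # assumed size of a short health-advisory video

for network, mbps in SPEEDS_MBPS.items():
    seconds = (VIDEO_MB * 8) / mbps      # megabytes -> megabits, divided by throughput
    print(f"{network}: roughly {seconds / 60:.1f} minutes for a {VIDEO_MB} MB video")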
And that's a wrap! Thanks @divijualsuspect for this discussion. Watch the video here:

Until next time!