Ángel Díaz

Apr 4, 2019, 36 tweets

I'm live-tweeting from NY City Council this afternoon for an update hearing on the #NYCAlgorithms Transparency Task Force.

CM Koo is walking through the problems with government use of algorithms: the public doesn't know when they're being used, and they depend on biased assumptions and flawed data sets.

The ADS Task force was empaneled to study government use of algorithms, and to come up with recommendations for minimizing harms. But this is the first public hearing since the law passed a year ago.

The first panel features the task force's co-chairs; you can see the whole task force here: www1.nyc.gov/site/adstaskfo…

The chair says it's been more difficult than anticipated to determine which "automated decision systems" the task force will work on.

Chair says the task force won't create a list of algorithms in use by the city. It'll empower agencies to do citywide assessments.

Task force is kicking off public engagement by holding two hearings, on April 30 and May 30 (both in Manhattan), along with events during the summer. These are intended to be forums for impacted individuals to present their concerns.

CM Koo is asking why the Task Force isn't making its minutes public, and points to Vermont's approach to posting public agendas and minutes. Chairs say they're trying to create a safe space for frank conversations.

Koo is asking why City Council isn't allowed to attend. The chairs give similar answers, pointing to the public forums as the space for the public to be involved, and offer to schedule City Council briefings as well.

Chairs are saying the Task Force isn't looking at individual algorithms specifically.

Basically: the task force can't agree on what an automated decision system is. Doesn't speak well for what their written report will address.

The term “automated decision system” means computerized implementations of algorithms, including those derived from machine learning or other data processing or artificial intelligence techniques, which are used to make or assist in making decisions.

The report is due in November of this year, but the Task Force is holding its first public forums this summer. That's a short timeline to meaningfully incorporate concerns from impacted communities.

For now, the only way the public can participate is by filling out a web form: www1.nyc.gov/site/adstaskfo…

The Task Force says the advocacy letters from our #NYCAlgorithms Coalition informed the membership of the task force and led to increased public engagement.

CM Holden says the task force needs to say what kind of transparency is expected, because leaving it up to agencies themselves won't be enough.

Next up is a panel with many of our coalition members: @datasociety, @AINowInstitute, and @STOPSpyingNY

@janethaven says the task force needs direct access to the algorithms used by agencies. What's fair, accountable, and transparent means different things in different settings (e.g. criminal justice, education, housing)

🚨If the city was just going to get generalities, there'd be no need for a task force.

Recs from Rashida Richardson of @AINowInstitute:
- this committee needs to be an oversight body because of how little engagement we've received
- our letters have specific recommendations; if the process doesn't go well, we hope this is a model for moving forward
- concerned that the task force is proceeding without context; without addressing specific algorithms, you can't make meaningful recommendations

For example, we know the city is looking at pretrial risk assessment, but the issues that attach to every product are different.

@CahnLawNY: you cannot build a roadmap for the future if you don't know where you are today. The Task Force and the public need an understanding of how these systems are being used.

This is a tough subject: we need better ways of making it clear how much this touches on so many aspects of New Yorkers' lives. The Task Force hasn't taken enough steps to engage the public and make the stakes clear.

.@eric_ulrich is asking about the pretrial risk assessment. Rashida says mass incarceration of black and brown NYers can impact how a given algorithm views individual defendants as a "risk."

Rashida also says the costs of these systems can be significant. For example, public benefit algorithms can help determine who gets SNAP. There've been lawsuits in other localities, and they're scrambling to fix the problem. omaha.com/news/nebraska/…

.@CahnLawNY: all AI is biased, the goal is to reduce it. AI as a class is no different from human decision-makers.

CM Ulrich is asking which vendors the city is using. Wouldn't it be great if the Task Force could tell us?

.@bradlander is asking about using algorithms to fight the problem of reckless driving. @CahnLawNY says that a system based on camera footage begs the question of where those cameras are, and who's being surveilled. This is the issue spotting we need agencies to consider.

We need validation or bias studies for the government use of algorithms. City agencies can require this of every vendor.
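To make the "validation or bias study" idea concrete, here's a minimal sketch (my illustration, not anything the Task Force or a vendor has proposed) of one check such a study might include: comparing favorable-outcome rates across demographic groups and flagging a large gap, sometimes called the disparate impact ratio. The function names, sample data, and 0.8 threshold are hypothetical.

```python
from collections import defaultdict

def selection_rates(decisions):
    """decisions: list of (group, outcome) pairs, outcome 1 = favorable."""
    totals, favorable = defaultdict(int), defaultdict(int)
    for group, outcome in decisions:
        totals[group] += 1
        favorable[group] += outcome
    return {g: favorable[g] / totals[g] for g in totals}

def disparate_impact_ratio(decisions):
    """Ratio of the lowest group selection rate to the highest (1.0 = parity)."""
    rates = selection_rates(decisions)
    return min(rates.values()) / max(rates.values())

# Hypothetical example: outcomes from an automated benefits screener.
sample = [("group_a", 1), ("group_a", 1), ("group_a", 0),
          ("group_b", 1), ("group_b", 0), ("group_b", 0)]
print(f"Disparate impact ratio: {disparate_impact_ratio(sample):.2f}")
# A ratio well below ~0.8 is often treated as a flag for further review.
```

A real study would go further (calibration, error rates, validation against outcomes), but even a simple rate comparison like this is the kind of artifact agencies could require every vendor to publish.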

CM Koo is asking about similar task forces. Vermont created a statewide task force, and there are comparable groups in Washington, Massachusetts, Pennsylvania, and California. Where we were once a leader, we've fallen behind other municipalities regarding public engagement.

.@BetaNYC raising similar concerns about lack of public engagement. Advising the task force to update the website, share more info (agendas, timelines, public events calendar). Also recommending a public glossary of terms so the public can better engage during their meetings.

Next up @ITI_TechTweets calling for sustained engagement across public and private sector, including beyond the scheduled public engagement meetings.

CM Koo asks if source code should be made available. @ITI_TechTweets doesn't like that idea, thinks source code should be protected. @BetaNYC pushes back, and wants algorithms to be accountable.

Next up, two task force members voicing their frustrations with not being able to analyze specific algorithms. Recs:
- if needed, Council should change the law to allow the task force to demand access to specific systems
- amend the law to give the task force more time

Above was joint testimony from Julia Stoyanovich and Solon Barocas.

Today's hearing is a reality check that meaningful transparency and accountability from government will never come easily.

Without more pressure from the public and City Council, we will have no better understanding of any specific algorithm being used by city agencies.
