Australia's competition regulator found:
- Google engages in anti-competitive behavior in digital advertising, which harms consumers & businesses accc.gov.au/media-release/…
Research institutions & universities challenge Big Tech, but are desperate for funds due to austerity... It may be difficult to say no to Google funding, but we don't need to look far to see how Google treats academics & lines of inquiry it doesn’t like abc.net.au/religion/googl… 4/
Big Tech's control of resources & funding is "creating a web of conflicted relationships that threaten academic freedom and our ability to understand and regulate these corporate technologies." @mer__edith 5/
"Are we willing to trade away knowledge & capabilities of future tech to a company we know wants to monopolize these? By allowing Google to cozy up to researchers, are we providing them legitimacy over their claims to our scientific & tech future?" 6/
Compared to ethics principles in medicine, AI ethics principles lack:
1. common aims & fiduciary duties
2. professional history & norms
3. proven methods to translate principles into practice
4. robust legal & professional accountability mechanisms
"The truly difficult part of ethics—actually translating theories, concepts & values into good practices AI practitioners can adopt—is kicked down the road like the proverbial can." @b_mittelstadt 2/
"Ethics has a cost. AI is often developed behind closed doors without public representation... It cannot be assumed that value-conscious frameworks will be meaningfully implemented in commercial processes that value efficiency, speed and profit." 3/
Many people hold to a false dichotomy: you are either FOR or AGAINST covid restrictions, with no nuance about the TYPE of restrictions or their effectiveness, much less recognition that eschewing all restrictions → hospitals collapse & lockdowns become more likely. 1/
There has been a lot of terrible public health messaging & many contradictory government policies in the West, from the start of the pandemic continuing to now. These erode public trust, create false expectations, & contribute to “pandemic fatigue” 2/
The “only the elderly & chronically ill are at risk” messaging was both false AND ineffective. This has been clear from the VERY START of the pandemic. (I RTed @jenbrea on this at the time) 3/
The false hope of current approaches to explainable AI in health care: current explainability approaches can produce broad descriptions of how an AI system works in general, but for individual decisions, the explanations are unreliable or superficial 1/ thelancet.com/journals/landi…
Explainability methods for complex AI systems can provide some insight into the decision-making process on a global level. However, on an individual level, the explanations we can produce are often confusing or even misleading. @MarzyehGhassemi @DrLaurenOR @AndrewLBeam 2/
Increased transparency can hamper users’ ability to detect sizable model errors and correct for them, "seemingly due to information overload." 3/
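The global-vs-local distinction above can be made concrete with a small sketch. This is my own illustration, not code from the paper: the "model" is a toy stand-in black box (not a clinical system), and the interaction between features 0 and 1 is a deliberate assumption to show why a feature's local effect depends on the particular instance, one source of unreliable per-decision explanations.

```python
# Toy sketch: global permutation importance vs a naive local explanation.
# Pure numpy; the "model" is a stand-in black box, not a real system.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))

def model(X):
    # Black-box stand-in: relies on feature 0, modulated by feature 1;
    # feature 2 is ignored entirely.
    z = X[:, 0] * (2.0 + 0.5 * X[:, 1])
    return 1.0 / (1.0 + np.exp(-z))

y = (model(X) > 0.5).astype(int)  # labels the model fits perfectly

def permutation_importance(X, y):
    # Global view: accuracy drop when one feature's column is shuffled.
    base_acc = ((model(X) > 0.5).astype(int) == y).mean()
    imp = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        imp[j] = base_acc - ((model(Xp) > 0.5).astype(int) == y).mean()
    return imp

global_imp = permutation_importance(X, y)

# Local view: how much does nudging each feature of ONE instance move
# its predicted probability? This ranking varies from instance to
# instance because of the interaction between features 0 and 1.
x = X[0:1]
base = model(x)[0]
local_imp = np.zeros(X.shape[1])
for j in range(X.shape[1]):
    xp = x.copy()
    xp[0, j] += X[:, j].std()  # nudge feature j by one std dev
    local_imp[j] = abs(model(xp)[0] - base)
```

Here the global ranking is stable across the dataset, while the local sensitivities shift with the instance chosen, which is the gap the thread is pointing at.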
"Who benefits from data sharing in Africa? What barriers exist in the data sharing ecosystem, and for whom? If much of the data sharing practice is shaped by the Global North, how can we ensure that the narrative for Africa is controlled by Africans?" 1/
Stakeholders in the African data sharing ecosystem: those at the top of the iceberg hold significant power & leverage in guiding data sharing practices & policy, while those in the hidden part of the iceberg hold far less. 2/
Dominant narratives around data sharing in Africa often focus on lack, insufficiency, deficit.
This framing minimizes the strength, agency, and scientific & cultural contributions of communities within the continent, and overlooks community norms, values, & traditions. 3/
🧵automation of gov social services (eg food benefits, disability services, unemployment, etc) can:
- be implemented with no way to correct errors (software treated as error-free)
- serve as a smokescreen for policy changes
- justify austerity under the guise of efficiency
- operate at scale 1/
In France, updates to an automated system for benefit payments caused errors, delays, & incorrect debts for at least 60,000 people.
Case workers were unable to correct errors in the system. Some victims coped by *cutting back on food* 2/
A flawed algorithm in the UK ignores how often people get paid and has led to people going hungry & falling into debt.
This is not just a technical error: the government deliberately chose this method of calculation because it was easier to automate, increased efficiency, & reduced costs.
At the @QUTDataScience Data Science for Social Good showcase, @oforbes22 sharing ways to visualize spatial uncertainty for the Cancer Atlas map, using glyphs or hue & whiteness, in a project with @CCQld
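A minimal sketch of the hue-&-whiteness idea, my own illustration rather than the showcased project's code: encode the estimate as hue via a colormap and fade toward white as uncertainty grows, similar in spirit to value-suppressing uncertainty palettes. Assumes matplotlib is available; the function name is hypothetical.

```python
# Blend a colormap color toward white in proportion to uncertainty.
from matplotlib import cm

def uncertain_color(value, uncertainty, cmap=cm.viridis):
    """RGB for `value` in [0, 1], blended toward white in proportion
    to `uncertainty` in [0, 1] (fully uncertain -> pure white)."""
    r, g, b, _ = cmap(float(value))
    u = float(uncertainty)
    return tuple((1.0 - u) * c + u for c in (r, g, b))

confident = uncertain_color(0.8, 0.0)  # keeps its full hue
vague = uncertain_color(0.8, 1.0)      # washed out to pure white
```

Washing out uncertain regions keeps the map honest: high-uncertainty areas literally fade from view instead of displaying a confident-looking color.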
This has been the inaugural year for @QUTDataScience Data Science for Social Good, with grad students & recent grads partnering with 2 non-profits: @CCQld Cancer Atlas & @fareshare_aus Qld food charity