I’m at an interesting event today about #AI and the US military at the National Press Club
Currently listening to the co-founder of #Deepmind talking about reliability and error in AI… now the discussion is moving to the implementation of ethical principles in #AI, with Dr Jane Pinelis from the JAIC.
Fascinating discussion - someone is asking how the DOD can “measure trust” - this is interpreted as measuring ‘reliance’ - how much they can rely on the technology. They also consider how human behaviour interacts, i.e. how the military use the technology and user behaviour with it.
The Principal of WestExec Advisors is arguing that the US needs to involve private sector individuals in public service more - suggests, for example, that someone might work at Google most of the year but could spend a couple of weeks a year working with and helping the military.
Quite interesting to hear the #deepmind guy stressing that the US should not rush ahead with #AI innovations where there are still open questions and uncertainty over safety.
Jeff Schneider talking about how the DOD has kept academia at arm’s length in the past because of security concerns, but that they are reaching out more - and he encourages that developing relationship.
Entire event mostly about ‘how do we get more commercial AI technology into the military’ - interesting comment now from Chris Brose that acquisition systems haven’t changed greatly since the time when the military created most of its technologies itself.
Sorry that was Mike Brown!
Now Angel from Microsoft is saying that often people at the DOD don’t have a strong idea of the kinds of technologies they want, or don’t understand the technology or where it’s going - this is something I’ve heard a lot in my research. It means companies like SCL (and spin-offs) are able to
spin what they do… black-boxed tech companies with flashy suits, all selling a product with impressive slides. Those in government feel they have to bring in contractors because they’re told they’re ‘left behind’ at conferences like this
by people looking to capitalise - and very rarely do they have a proper grasp of whether something really is ‘innovation’, or whether that innovation is even the best way.
Michele Flournoy (WestExec) now talking about how the Ukraine-Russia situation will be remembered as a real strategic mistake by Putin. But a world-changing one.
Flournoy says one of the outcomes will be an EU/US transatlantic effort to maintain a competitive edge in tech, formalising a collaborative framework - but this is just developing (essentially we are talking about a renewed arms race in AI technology - EB).
Says that China is looking into how it would insulate itself, to anticipate a situation where sanctions similar to those being used against Russia might be used against them. Flournoy argues this can drive such countries to lessen their use of the dollar and use other currencies (crypto? - EB).
On new technologies in Ukraine, says that after the invasion of Crimea a lot of countries, including the US and UK, brought in special operations forces to support Ukraine with great success - says they are very ‘agile’ with new technologies.
She thinks it’s likely that Ukraine could decisively undermine Putin’s assault, but that Putin could try to flip the table with some big decisive action (e.g. tactical nuclear weapons) that shakes things up, tries to reframe the conversation and regain dominance.
This next panel should be interesting with Chris Lynch
Flournoy is talking about how the military are trying to build in technical and scientific development - training and advancement on that basis for those who enlist with a STEM background - as these skills were left undeveloped and unnurtured in the past. This is a major
issue and leads to a lot of this ‘buying in’ of skill because the military lacks it (having failed to encourage and build it in-house). This leads to a lot of those highly intelligent, skilled people moving out - brain drain and the revolving door. She argues, though, that they also need to
bring in ‘the absolute best’ cutting edge via outside private sector contracting. (i.e. please still hire our experts, thanks!)
She’s now arguing that those overseeing acquisition need greater ‘risk tolerance’ 🙄
Me watching people who research media systems talk about what ‘propaganda’ is.
People need to understand propaganda isn’t media bias. These are separate, different problems. There’s maybe a relationship between them - structural issues with the media can let propaganda sneak through - but propaganda is a wholly different thing to study… If you understand
and research the media, or social media, or even advertising, commercial PR, or the political economy of these things, you may have parts of the relevant knowledge, but this isn’t the study of propaganda.
This is a fascinating insight into a rather delusional mindset “Russia cannot afford to lose, so we need a kind of a victory”: Sergey Karaganov on what Putin wants - New Statesman newstatesman.com/world/europe/u…
Interesting sociological experience at a gun show. The conservative 2A person who took me said they were concerned by how political & hostile it was, rather than focused on 2A/gun brands. Lots of 3%ers merch, some QAnon.
Saw disturbing patches proudly proclaiming the wearer ‘Islamophobic’, next to traditional Nazi insignia too. Oddly racially diverse crowd of shoppers, though.
Most interesting stall - very few visiting it. The person I was with was more terrified than I was. Very clearly this is where you buy your real Nazi helmet, if that’s what you’re into. In any other context I’d be less disturbed by historical stuff, but here this was chilling.
I am incredibly worried by how the dominant methodological paradigm is skewing focus, determining what research is enabled on influence operations and the recommendations that come from it for policymakers. No-one is funding the kind of work that's needed.
Policy decisions will be based on a partial understanding of the problems, and if we continue this way we will face escalating abuse of the information environment driving conflicts, the climate crisis and more - we will reach 2024 ill-prepared and more authoritarian.
I cannot stress strongly enough - many of the policy recommendations coming from the present selective focus of research, which leaves out real understanding of the actors and how they use media for campaigns, will lead to an increasingly surveillant and authoritarian response.
This is an incredible graph showing vividly the decline of #transparency in the UK which few have been taking seriously enough. Congrats to @JennaCorderoy on the illuminating report behind the must-read article below. Jenna and @openDemocracy -amazing work researching this.
This is a subject I care a lot about, and I have also been actively pushing back on the transparency shutdown I’ve personally, repeatedly encountered from the UK Government. I recently submitted to the UK Inquiry into the Cabinet Office’s implementation of FOIA: emma-briant.co.uk/reports/
Stopping the shutdown of UK transparency is vital and we need to see a strong commitment from the Labour Party (especially since Tony Blair is being seen as a model to follow - Blair attacked his own introduction of FOIA because it aided journalism). bbc.co.uk/blogs/opensecr…
What I will never understand is why academics who knew nothing about Cambridge Analytica or what they did fuelled this kind of misinformation over the scandal which perpetuated the unhelpful binary about ‘snake oil’ vs ‘mind control’.
Mostly it seemed to be fuelled by their having spoken to Republican politicians and political consultants, who are hardly neutral actors… or by studies that didn’t actually examine, in an informed way, what CA actually did.
We are STILL now facing the unhelpful binary they helped to establish as ‘Disinformation Studies’ becomes a thing… turning it into a debate about snake oil vs mind control removed any ability to have a nuanced discussion about data, ethics, effect and the influence industry.