This @TheEconomist article by @shashj gets it right: "We’re reaching the apex of the armour versus gun race—and armour has lost that race." Armor alone is not enough in an era of precision-guided anti-tank munitions
This trend has been building for a long time. The steady proliferation of ATGMs is a real threat to armor. Drones complicate the equation b/c they can attack from above, where armor is thinnest.
Tanks won't go away, but survivability will depend on more than just armor.
Concealment, stealth, signature management, and active protection systems are becoming increasingly important elements of survivability.
Survivability isn't just about the ability to out-see and out-shoot other tanks. Drones are making the battlefield more transparent.
Even cheap unarmed drones can be a major threat if they can help ID armored units' locations, which can then be relayed to artillery. Russia has used these tactics to great effect in Ukraine, pairing drones with anti-armor artillery.
Bottom line is the battlefield is becoming more transparent and more lethal for armored formations. Tactics will have to adapt.
It's worth noting that modern-day ships and aircraft don't rely on armor for survivability. They depend on stealth and active countermeasures.
The other trend is that drones have lowered the barriers to entry for less capable actors to access airpower. Tanks have always been vulnerable to attack from above without air defenses. What's different today is that one doesn't need a fighter jet. A cheap drone can do it.
The US military has consistently been overly dismissive of the significance of low-cost drones because they aren't remotely comparable to a fighter jet. But they don't have to be. They are *far* more accessible and are putting airpower in the hands of many more actors.
Tanks won't be going away, but increasingly tank-on-tank direct fire engagements will be mopping up, like infantry are today, while the decisive ground engagements will be fought with robotic air and ground sensors cueing long-range precision fires.
Large AI models like ChatGPT and GPT-4 are inherently dual use.
@OpenAI's GPT-4 system card walks through several possible misuse risks, including for hacking, disinformation, and proliferation of unconventional weapons (e.g., chem/bio). cdn.openai.com/papers/gpt-4-s…
OpenAI assesses that GPT-4's cyber and chem/bio capabilities are limited today, but AI progress is discontinuous and large models frequently show emergent capabilities.
Dangerous capabilities are likely coming and we may not have much advance warning.
China is building a new model of tech-enabled authoritarianism at home.
The Chinese Communist Party has deployed 500 million surveillance cameras to monitor Chinese citizens. They increasingly use AI tools like facial and gait recognition.
China is exporting its model of digital authoritarianism abroad. At least 80 countries use Chinese surveillance and policing technology.
(Map data courtesy of @SheenaGreitens. Map by @CNASdc)