There is a new AI proposal from @aipolicyus. It should SLAM the Overton window shut.
It's the most authoritarian piece of tech legislation I've read in my entire policy career (and I've read some doozies).
Everything in the bill is aimed at creating a democratically unaccountable government jobs program for doomers who want to regulate math.
I mean, just check out this section, which in a mere six paragraphs attempts to route around any potential checks from Congress or the courts.
@aipolicyus The amount of bureaucracy this bill would unleash is staggering. The bill attempts to streamline some of this by providing a "Fast Track," but the main takeaway is how broad the range of software likely to be subject to regulation is:
The proposal also allows the Administrator to require any applicant (including Fast Track applicants and open source applicants) to adopt "safety precautions," which is entirely open-ended. Not through a rule-making process or any sort of due-process-protecting mechanism, but simply as a condition of granting a permit!
Over and over, the legislation has this one-way ratchet clause: the Administrator has the freedom to make rules stricter without any evidence, but has to prove a negative to relax any rules.
There's a whole section on open source criteria. Again, if a project doesn't get a gov OK, it CANNOT BE CONTINUED. Except for Fast Track applications, I think an application could just sit in process for a long time without approval, preventing court review. This is how to kill open source competitors.
The review process is somewhat similar to the SEC or FTC's Administrative Law Judge process, where the Administrator can overturn what the more independent ALJs decide. Only after all this process can a case be appealed - and then, for some reason, the party seeking the permit only has 20 days to do so. Why?!
Oh, and by the way, if it wasn't clear yet, you can't do ANYTHING until the government says you can.
And if you are operating under a permit and your model gets too good, you have to stop working and stop using it until the government signs off.
The bill creates a registry for all high-performance AI hardware. If you "buy, sell, gift, receive, trade, or transport" even one covered chip without completing the required form on time, you have committed a CRIME. The Administration is directed to collect all that competitively sensitive information and compile it into reports.
More wild shit: The Frontier Artificial Intelligence Systems Administration (which I've called "Administration," as in the draft) can straight up compel testimony and conduct raids for any investigation or proceeding, including speculative "proactive" investigations. This really is math cops.
I'm going to skip the civil liability section because it's so bonkers I can't handle looking at it any more. This alone would bury the AI industry in an avalanche of lawsuits. (At least the private right of action is limited to alleging >$100 million in "tangible" damages.)
On to the criminal liability section - THERE IS A CRIMINAL LIABILITY SECTION. FOR DOING MATH. Or for attempting to do math, or for not telling the gov that you're doing math.
Also officials who don't do their jobs can be criminally prosecuted? By whom? I have never seen that before.
Section 16 is "EMERGENCY POWERS". I'm sure this one is measured .....
oh. no. The administrator can, ON HIS OWN AUTHORITY, shut down the frontier AI industry for 6 months.
Oh, and if the President initiates, the Administrator can literally seize and destroy all the hardware and software. It puts the future in the hands of one dude, who may have formed his opinions on AI from watching the latest Mission Impossible.
Oh look the Administrator can conscript troops:
Other agencies are required to consult with the Administration if they're doing AI enforcement stuff. (And b/c the Administrator has expansive legal authorities beyond anything else in fed. law enforcement except maybe anti-terrorism, I suspect all the cases will end up there.)
And out of nowhere the bill also amends the antitrust laws to give the Administration a near veto on AI mergers. Remarkable.
Almost to the end, any more surprises? Well, funding can come from anywhere, including the fines imposed AND DONATIONS, so that should work out well. Vitalik probably still has some shitcoins lying around.
Finally, the end. No boilerplate severability clause for @aipolicyus, let's tell courts how to do their jobs.
Gotta love that the last eight words of this bill, which is a giant middle finger to the Constitution, are "to the maximum extent permitted by the Constitution."
Seriously, this bill is so authoritarian that it ought to get them laughed out of every congressional office. They might as well have proposed a Constitutional Amendment that says, "The new AI Administrator can do whatever they want notwithstanding the rest of this document."
Anyhow, if you want to read the entire fantasy yourself, check it out here. One note: there are several references to Section 11 as the emergency powers portion but obviously they meant Section 16. AI could have caught that one for them. assets.caip.org/caip/RAAIA%20%…
The proposed moratorium to slow the avalanche of state AI bills (1000+ in 2025) has really spun up folks, but they aren't making great arguments.
Take a new letter by @demandprogress (irony!) and other progressive orgs. It gets basic facts wrong and misrepresents research. 🧵
1/ This characterization of the moratorium is wrong: it isn't a total immunity because states can still enforce any general purpose law against AI system providers, including civil rights laws and consumer protection laws. In fact, the moratorium specifically says that.
2/ False. While it's not quite clear what "unaccountable to lawmakers and the public" means, it is 100% clear that traditional tort liability as well as general consumer protections and other laws would continue to apply. Deliberately designing an algorithm to cause foreseeable harm likely triggers civil and potentially criminal liability under most states' laws.
The flood of state AI regulatory proposals threatens to drown the U.S. AI industry. A late-night @HouseCommerce markup is about to discuss a moratorium on state AI regulation. We submitted a letter from twelve state-based organizations supporting this important provision. A 🧵
2/ Problem: Over 1,000 AI bills proposed in the last 4 months, most in state legislatures. This regulatory tidal wave risks drowning innovation in confusion and conflicting rules.
3/ Patchwork Alert: NY’s RAISE Act alone could force AI labs into costly, confusing inspections; imagine this duplicated across multiple states, each using different rules. Nightmare fuel for startups, boon for lawyers.
1/ Big shift in AI policy: This week Trump repealed Biden’s AI Executive Order and introduced his own Removing Barriers to American Leadership in Artificial Intelligence to shift direction. BUT which Biden-era AI actions should Trump focus on? 🧵
2/ Trump’s new executive order underscores a commitment to cutting red tape and fostering innovation. But Biden’s AI policy isn’t completely gone—it lingers in ongoing agency initiatives. Sect 5 of the EO attempts to clean up these leftovers:
3/ Over at @abundanceinst, we've been tracking all public proceedings that Biden's EO triggered. Below is a breakdown of some of the most important of those proceedings. We commented on many of them, and they now deserve the most scrutiny from the Trump admin.
This @FT op ed by Marietje Schaake pairs well with my op ed with @ckoopman. Keep Congress AND tech CEOs away from AI regulation. 😏
Not joking. A 🧵
Schaake is correct that CEOs have an interest in shaping regulation to benefit their business model. But legislation isn't the only way regulatory capture happens. All prescriptive regulation inherently favors incumbents b/c it is written for the present. 2/
Future, and especially disruptive, business models and technologies won't fit in those regulatory boxes. Such businesses face regulatory uncertainty PLUS established incumbents who speak the regulators' language. The FCC is a great example of this happening over and over. 3/