So, the “fund Hawaii, not Ukraine” talking point has many of the characteristics of a coordinated campaign that turned “organic” after the narrative was seeded. These are a few examples.
These are the very first two tweets that used the phrase “Hawaii not Ukraine.” They were posted at 4:51 and 4:54 pm on 8/9, in response to a @DOTHawaii tweet.
Then this quote-tweet was posted an hour later, at 5:57 pm. Look at the engagement this tweet got compared to the one sent an hour earlier.
Then this tweet — the 4th tweet to use that specific phrase (“Hawaii, not Ukraine”) — was posted an hour later, at 6:52 pm. This was a breakthrough tweet that ranked at the top of the search results even days later.
After that ^ tweet was posted, the talking point started being picked up and tweeted more frequently. These two tweets, again using the same phrase (“Hawaii not Ukraine”), were sent just minutes after that last tweet, and from there it continued to pick up over the next two days.
This is a common format for influence/disinfo campaigns, where a talking point gets seeded by a few key accounts (amplifiers), then picked up by their networks. A big indicator of coordination here is the timing — the first 4 tweets were each sent almost exactly an hour apart.
The first account that tweeted the phrase “Hawaii not Ukraine” is interesting. It’s a small account and was created in July.
The account’s first tweet on 7/19 was a 34-second Space that was nothing but background music. The account has tweeted/RT’d 136 times since then.
The account’s retweets are interesting, too. They’ve RT’d literally every single major right-wing influencer on Twitter, in what is almost certainly a purposeful/strategic process. To me, this suggests that the account is making contact for the purpose of signaling the network.
I’ll write this up with more info, but this was an interesting case study to break down in real time.
It’s also a great reminder/warning that crises & natural disasters are major attack surfaces (for malign state & non-state actors, extremists, etc) and we’re unprepared for them.
Given current events, it bears repeating that when the Trump admin spreads disinformation, they’re not doing it because they expect you to accept their lies as truth. They do it to erode the notion of truth and destroy our ability to distinguish between truth & falsehood.
The act of lying is bad enough. But selling the idea that truth doesn't matter — or doesn't even exist — is far more corrosive. Democracy rests upon a shared understanding of basic facts. We can't debate issues or hold leaders accountable w/o these agreed-upon facts.
If the Trump administration can cast doubt on the very existence of an objective truth, they can also undermine the external mechanisms that we rely on to hold government officials accountable & prevent abuses of power.
You’ve probably seen Nick Shirley’s video accusing Somali-run daycares in Minnesota of fraud. Hopefully you’ve also seen some of the follow-ups showing that security footage & operating hours disprove his central claim of “no children.”
X’s new “About this account” feature just accidentally revealed a vast network of covert foreign influence accounts posing as Americans but operating from overseas — the most sweeping public exposure of covert influence on a major platform since 2016. Story is linked below.
Some of these accounts have hundreds of thousands of followers. They present themselves as American patriots, veterans, moms, truck drivers, or lifelong Republicans. Many are explicitly MAGA. But their operators are posting from overseas while shaping U.S. political narratives.
It’s not just MAGA accounts, but mostly it is. Several large anti-Trump accounts were also revealed as foreign-run, as were public health networks. The common denominator is deception: pretending to be American participants in US politics while pushing highly divisive content.
I wrote about a secret tactic shaping what you see online — one almost no one’s talking about. It’s called Moderation Sabotage, and it’s how political digital operatives overwhelm social media defenses so lies go viral before truth can catch up. Link is posted below.🧵
Imagine flooding the system so completely that moderators can’t respond in time. That’s the playbook: swamp the filters, delay enforcement, and let false or incendiary content live long enough to trend.
By the time platforms react, the damage is done.
This isn’t random chaos. It’s deliberate. Trump’s digital allies — the same architects behind Stop the Steal — have refined Moderation Sabotage into an election-year weapon. Rather than hacking the code, they’re hacking the people who keep the code honest.
NEW: AI campaigns are learning to run themselves — and using our data to do it. Without stricter safeguards, we may soon see AI controlling the very governing bodies that could enforce those safeguards in the first place.
(Link in next tweet).
I took 2 months off due to health problems, and when I returned, I expected to see the normal disinformation playbook in action. Indeed, that was waiting for me. But so was something else: AI is now running for office & pushing humans out of the process.
We’ve already seen AI playing a big role in politics, including several attempts to get an AI system elected to office in order to act as the decision-maker, while humans would simply act as the body for AI’s policies and initiatives. weaponizedspaces.substack.com/p/ai-political…
The “controversy” over Sydney Sweeney is absurd and largely fake, but there’s one thing worth paying attention to — the tried and tested formula used by the right-wing outrage machine to manufacture liberal fury and then bait the left into making it a reality.
Here’s how it works:
First, invent the outrage. This usually involves picking a neutral or mildly provocative event and finding something about it to frame as being offensive to the left. In this case, the slogan (“Sydney Sweeney has great jeans”).
Second, flood the zone. Carry out a social media blitz and manufacture the appearance of outrage by gaming the algorithm with repetitive content, which will then get pushed into trending feeds and recommended videos — creating the perception that people actually care about it.
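One simple way to surface the kind of repetitive-content blitz described above is to normalize post text and count near-identical copies. This is an illustrative sketch only — the sample posts and the `normalize` helper are invented, and real detection pipelines are far more sophisticated.

```python
from collections import Counter
import re

# Invented sample posts for illustration: three trivially altered
# copies of one message, plus one unrelated post.
posts = [
    "Sydney Sweeney has great jeans!!!",
    "sydney sweeney has GREAT jeans",
    "Sydney Sweeney has great jeans",
    "Totally unrelated post about the weather",
]

def normalize(text: str) -> str:
    # Lowercase and strip punctuation so trivially altered copies
    # collapse to the same key.
    return re.sub(r"[^a-z0-9 ]+", "", text.lower()).strip()

# Count how many posts collapse to each normalized form; flag any
# message repeated 3+ times as a possible amplification candidate.
counts = Counter(normalize(p) for p in posts)
repeats = {k: v for k, v in counts.items() if v >= 3}
print(repeats)  # {'sydney sweeney has great jeans': 3}
```

In practice the repetition that games recommendation feeds is spread across many accounts and uses paraphrase rather than copy-paste, so exact-match counting like this is only a first-pass filter.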