We're seeing a lot of images of cluster munitions being used in Ukraine, so there are a few things I want to highlight to assist with reporting of these incidents. There are two types of rockets involved, fired by BM-27 and BM-30 multiple rocket launchers.
These things are big. This is a reconstruction of one from when they first started being used in Syria, and the diagram on the right gives you a sense of the position of the cluster munitions at the front of the rocket, with the rocket motor behind them.
Generally, when these rockets deploy their submunitions in the air, the rocket motor and cluster munition section separate and fall to the ground, just like in this video from Ukraine showing the cluster section impacting the ground.
What we're seeing across Ukraine are the remains of these rockets embedded in the ground, often being misreported as unexploded rockets. This is evidence of a successful deployment of cluster munitions, not an unexploded weapon.
They can land several hundred meters away from where the cluster munitions are released, and the direction of impact can also be established by which way the munition remnants are pointing, like in this example.
As the original images in this thread show, it's possible to identify the type of rocket by its width, the configuration of its tail fins, and the rows of holes inside the cluster munition section. If you know which details to look for, they are very distinct.
However, it should also be noted that the tail fin section can become detached, which can lead to misidentification. This picture shows both the rocket motor and the detached tail fin section nearby.
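For anyone cataloguing these remnants, the width check above can be kept as a simple reference. Here's a minimal sketch in Python, assuming the publicly documented calibres of the two launchers (220mm for the BM-27 Uragan, 300mm for the BM-30 Smerch); the other notes are illustrative placeholders for the kind of details described in this thread, not a definitive identification guide.

```python
# Minimal reference sketch for matching rocket remnants to a launcher type.
# Calibres are publicly documented; the tail fin and cargo section notes are
# placeholders standing in for reference imagery, not an authoritative typology.

REMNANT_GUIDE = {
    "BM-27 Uragan": {
        "calibre_mm": 220,  # rocket body width
        "tail_fins": "compare against reference imagery",
        "cargo_section": "rows of submunition holes visible in the carrier section",
    },
    "BM-30 Smerch": {
        "calibre_mm": 300,
        "tail_fins": "compare against reference imagery",
        "cargo_section": "rows of submunition holes visible in the carrier section",
    },
}

def match_by_calibre(measured_width_mm: float, tolerance_mm: float = 15.0) -> list[str]:
    """Return launcher types whose rocket calibre matches a measured remnant width."""
    return [
        name
        for name, details in REMNANT_GUIDE.items()
        if abs(details["calibre_mm"] - measured_width_mm) <= tolerance_mm
    ]

if __name__ == "__main__":
    print(match_by_calibre(222))  # -> ['BM-27 Uragan']
```

In practice the width alone only narrows it down; the tail fin configuration and the submunition holes are what confirm the match against reference imagery.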
It's been brought to my attention that there are videos published on social media claiming I've made various statements about the US election, related to election integrity. These are part of a Russian disinformation campaign and the quotes are fabricated, but it's nice to know the Russians hold the value of my opinions in such high regard.
I've previously discussed other videos in this campaign in the thread below:
🧵 1/7: The European Court of Human Rights has ruled in favor of Russian NGOs and media groups (including @Bellingcat), declaring Russia's "foreign agent" legislation a violation of fundamental human rights. The court found that the law imposes undue restrictions on freedom of expression & association.
2/7: The law requires NGOs & individuals receiving foreign funds to register as “foreign agents,” facing stigma, harsh reporting requirements, and severe penalties. This label implies foreign control, without proof, and misleads the public.
3/7: The Court noted that the "foreign agent" label, linked to spies & traitors, damages the reputation of those designated and leads to a chilling effect on civil society and public discourse.
It's currently 9:11am, this post has 3 views, and no retweets or likes on an account with 75 followers. Let's see how long it takes for it to get several hundred retweets, and a few tens of thousands of views.
In the last 15 minutes, that tweet gained 15.7k views and 187 likes, with no retweets. Two other tweets with similarly fake stories, posted around the same time from similar profiles, have also suddenly gained a couple of hundred likes and around the same number of views. This is, in real time, how a Russian disinformation campaign is using Twitter to promote its fake stories.
The thing is, nearly all of this engagement is entirely inauthentic, apart from about 10 views, and none of the likes are genuine. This doesn't help them reach genuine audiences; it just boosts their stats, so when they report back to their paymasters they can tell them how many views, likes and retweets they got, even though they're all fake. It's effectively the people running these campaigns scamming their paymasters into thinking it's working, when it's not at all.
A new fake Bellingcat story, from a fake video claiming to be from Fox News. What's interesting about this one is that I viewed the tweet 10 minutes ago and it had 5 views, then it suddenly jumped to 12.5k, and then 16.2k views, in less than 5 minutes, with zero retweets or likes.
To me this suggests there's a bot network being used to boost the view counts of tweets in this disinformation campaign.
In 90 seconds this tweet just gained 154 retweets, another sign of bot activity.
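The pattern described in the last few posts, views or retweets jumping by thousands within minutes on an account with almost no followers, can be spotted by simply polling the metrics over time. A minimal sketch in Python: fetch_metrics is a hypothetical callable you would supply yourself (wrapping whatever access to tweet metrics you have), and the thresholds are illustrative, not calibrated against real baseline engagement.

```python
import time
from typing import Callable, Dict

def watch_for_spikes(
    tweet_id: str,
    fetch_metrics: Callable[[str], Dict[str, int]],  # hypothetical: returns {"views": ..., "likes": ..., "retweets": ...}
    interval_s: int = 90,
    samples: int = 10,
    view_jump_threshold: int = 5_000,
    retweet_jump_threshold: int = 100,
) -> None:
    """Poll a tweet's metrics and flag sudden jumps consistent with bot amplification."""
    previous = fetch_metrics(tweet_id)
    for _ in range(samples):
        time.sleep(interval_s)
        current = fetch_metrics(tweet_id)
        view_jump = current["views"] - previous["views"]
        retweet_jump = current["retweets"] - previous["retweets"]
        if view_jump >= view_jump_threshold or retweet_jump >= retweet_jump_threshold:
            print(
                f"Possible inauthentic spike on {tweet_id}: "
                f"+{view_jump} views, +{retweet_jump} retweets in {interval_s}s"
            )
        previous = current
```

The 90-second interval mirrors the jump observed above; in practice you'd tune the thresholds against normal engagement for accounts of a similar size.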
It's clear this is a coordinated attack from pro-Orban media, one they really don't want noticed outside of Hungary, but what they don't seem to realise is that I'm now going to use what they did in every presentation I give on disinformation to audiences across the world.
What's notable is that the accusations made against Bellingcat were all taken (uncredited) from an article published by MintPress claiming we have loads of intelligence agents working for us, which even the original MintPress article fails to prove.
Which to me just means I get to add a couple more slides to the presentation I'll be doing about this, to audiences made up of exactly the sort of people they didn't want to find out about this.
State actors see alternative media ecosystems as a vehicle for promoting their agendas, and take advantage of that by not just covertly funding them, but also giving them access to their officials and platforming them at places like the UN.
A recent example of that is Jackson Hinkle going to Eastern Ukraine, then getting invited to the UN by Russia to speak at a press conference, and that footage being used by state media as evidence of "experts" rejecting the "mainstream narratives" on Ukraine.
A lack of transparency around the funding of the individuals and websites that make up these alternative media ecosystems allows state actors to get away with their covert influence, a clear example of which we've seen over the last 24 hours.