In today's #vatniksoup, I'll talk briefly about the Community Notes system and why it doesn't work. I've previously stated that the Community Notes mechanism amounts to "mob rule" and can easily be gamed by big accounts and troll farms.
1/15
Community Notes is a community-driven content moderation program intended to provide informative context based on a crowd-sourced voting system. As of Nov 2023, the system had over 130,000 contributors.
2/15
The idea of a crowd-sourced system as a moderation tool did not come from Elon - it was already announced back in 2020 under the name Birdwatch. Musk later rebranded the system as Community Notes and marketed it as something new.
3/15
Vitalik Buterin (@VitalikButerin) has written a very thorough (and technical) analysis of the tool and the Community Notes algorithm as a whole.
I disagree with him on some points, but I strongly recommend that everyone read it (there's a toy sketch of the core scoring idea below):
4/15
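For those who haven't read it, below is a minimal Python sketch of the "bridging" idea at the heart of the scoring, as I understand it from the open-source documentation and Vitalik's write-up. The model size, learning rate, regularization and the 0.4 threshold are illustrative assumptions, not the production pipeline.

# Toy sketch of the bridging-based note scoring behind Community Notes,
# as described in the public docs and Vitalik's analysis. All values here
# (dimensions, learning rate, regularization, threshold) are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# ratings[(user, note)] = 1.0 ("helpful") or 0.0 ("not helpful")
ratings = {
    (0, 0): 1.0, (1, 0): 1.0, (2, 0): 0.0,                # note 0: one-sided support
    (0, 1): 1.0, (1, 1): 0.0, (2, 1): 1.0, (3, 1): 1.0,   # note 1: cross-camp support
}
n_users, n_notes, dim = 4, 2, 1

# Model: rating ~ mu + user_bias + note_bias + user_vec . note_vec
mu = 0.0
user_bias, note_bias = np.zeros(n_users), np.zeros(n_notes)
user_vec = rng.normal(0, 0.1, (n_users, dim))
note_vec = rng.normal(0, 0.1, (n_notes, dim))

lr, reg = 0.05, 0.03   # SGD step size and L2 regularization (illustrative)
for _ in range(2000):
    for (u, n), r in ratings.items():
        pred = mu + user_bias[u] + note_bias[n] + user_vec[u] @ note_vec[n]
        err = r - pred
        mu += lr * err
        user_bias[u] += lr * (err - reg * user_bias[u])
        note_bias[n] += lr * (err - reg * note_bias[n])
        u_old = user_vec[u].copy()
        user_vec[u] += lr * (err * note_vec[n] - reg * user_vec[u])
        note_vec[n] += lr * (err * u_old - reg * note_vec[n])

# A note is only shown if its *intercept* (helpfulness not explained by the
# user/note factor term, i.e. not by "people like me like notes like this")
# clears a threshold - roughly 0.4 in the public docs.
for n in range(n_notes):
    status = "HELPFUL" if note_bias[n] > 0.40 else "needs more ratings"
    print(f"note {n}: intercept {note_bias[n]:+.2f} -> {status}")

The point of the design: a note isn't shown just because it collects a lot of "helpful" votes - the model tries to reward notes that raters from opposing "camps" agree on, which is exactly the property Vitalik's post digs into.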
Also, focusing the analysis only on the algorithm and the technical aspects oversimplifies the problem, as it leaves out the most important variable: the human factor.
People are prone to bias, and disinformation tends to spread far more aggressively than the truth.
5/15
Twitter's former head of Trust and Safety, Yoel Roth, has stated that the system was never intended to replace the curation team, but to complement it. All of this, of course, changed after Elon sacked everyone from Twitter's Trust and Safety team in order to save money.
6/15
These sackings have resulted in long response times to reports of hate speech - X's response to hateful direct messages has slowed down by 70%.
As of today, the company doesn't have any full-time staff solely dedicated to hateful conduct issues globally.
7/15
Some Community Notes contributors (who are also NAFO activists) have claimed that the system is plagued by coordinated manipulation, infighting, and a lack of oversight from the platform. Many contributors also engage in conspiracy-fueled discussions.
8/15
The Notes system also has a huge problem with scalability. During events like the 7 Oct 2023 Hamas terrorist attack, the volume of disinformation grows so large that it's simply impossible for the small contributor community to keep up and fact-check it all.
9/15
An analysis by NewsGuard showed that the most popular disinformation posts related to the Israel-Hamas war (not so surprisingly, originating from serial liars like @jacksonhinklle, @drloupis and @ShaykhSulaiman) failed to receive Community Notes 68% of the time.
10/15
These big accounts also have the ability to fight against the Notes they've received by mobilizing people who support their views. In the most tragicomic instance, @elonmusk claimed, without any evidence, that a Community Note on his post was "gamed by state actors".
11/15
Other than the humiliation and ridicule, getting Community Noted doesn't really have any major downsides. Noted posts don't earn you any income, and advertisers can decide whether they want to show ads on accounts like @dom_lucre's, but most of these...
12/15
...so-called superspreader accounts make most of their income through other means, namely through X's subscription system. Also, many of them, including @stillgray and (allegedly) @jacksonhinklle, are employed by state actors like Russia and the CCP.
13/15
With accounts that post tens or hundreds of posts a day, the Notes are also inefficient - while the Community is trying to put a note on a post that's clearly disinformation, there are already 10 or 20 new posts replacing it in the algorithm (a quick back-of-envelope sketch below).
14/15
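To make the throughput point concrete, here's a quick back-of-envelope calculation. Both numbers are assumptions picked for illustration, not measurements of any real account or of actual note latency.

# Rough illustration of the coverage problem described above.
# Both inputs are assumptions chosen for the example.
posts_per_day = 100          # a prolific superspreader account
note_latency_hours = 7       # assumed time for a note to gather enough ratings

posts_per_hour = posts_per_day / 24
newer_posts = posts_per_hour * note_latency_hours
print(f"~{newer_posts:.0f} newer posts are already live by the time one note shows up")
# -> ~29 newer posts under these assumptions

Even with generous assumptions, the note always lags well behind the content it's supposed to correct.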
To conclude, Community Notes is a non-functional and slow mechanism that's desperately trying to replace the Trust and Safety team. It works on a "mob rule" basis, and big enough accounts (including the owner of the platform) can game the system.
In today’s Vatnik Soup, I’ll introduce an American lawyer and politician, Mike Lee (@BasedMikeLee). He’s best known for opposing aid to Ukraine, undermining NATO by calling for the US to withdraw from the alliance, and for fighting with a bunch of braindead dogs online.
1/21
Like many of the most vile vatniks out there, “Based Mike” is a lawyer by profession. He hails from the holy land of Mormons, Utah, where he faces little political competition, allowing him to make the most outrageous claims online without risking his Senate seat.
2/21
Before becoming a senator, Mike fought to let a nuclear waste company dump Italian radioactive waste in Utah, arguing it was fine if they just diluted it. The state said no, the public revolted, and the courts told poor Mikey to sit down.
In today’s Vatnik Soup, I’ll introduce an American national security policy professional and the current under secretary of defense for policy, Elbridge Colby (@ElbridgeColby). He’s best known for fighting with cartoon dogs online and for halting military aid to Ukraine.
1/21
Elbridge "Cheese" Colby earned his bachelor’s degree from Yale and a Juris Doctor from Harvard Law School. Before entering government, he worked at top think tanks and in the intelligence community, focusing on nuclear policy and strategic planning.
2/21
Cheese quickly became a key voice for a “China First” strategy, arguing the US must prioritize military buildup in Asia over commitments in Europe or the Middle East. He sees (or saw, rather) Taiwan as the core test of US credibility.
In today’s Vatnik Soup, I’m going to talk about… Vatnik Soup! As some of you know, we also have a website where you can find every soup ever published. The site also has other useful resources, making it the most comprehensive resource on Russian disinformation & vatniks.
1/15
Unfortunately, Elon has flagged the website as malware, as he might not be very happy about the soups I wrote about him - so far, they have garnered over 60 million views on X/Twitter.
The “freedom of speech” spokesperson doesn’t seem too keen on free speech, after all.
2/15
The heart & soul of the website is of course the soups page. There you can find all 360+ soups, which can be sorted chronologically, by popularity, etc. You can also search for soups by title or even by text within the soup:
In today’s Wumao Soup, I’ll introduce how and where the Chinese Communist Party’s (CCP) online propaganda and influence operations work. Due to China’s massive population and advances in AI, CCP-aligned online content has become increasingly visible.
1/20
Like Russia’s troll farms, China has its own troll army: the “50 Cent Party” or “Wumao” refers to state-linked online commentators who are reportedly paid ¥0.50 per post to steer discussions away from criticism and amplify CCP narratives on social media.
2/20
Back in 2017, a research paper estimated that the Wumao produced almost 500 million fabricated comments annually to distract readers and shift topics. In that sense, Wumao operates very similarly to the Russian “Firehose of Falsehood” model:
In today’s Vatnik Soup and the “Degenerate Russia” series, I’ll show you the brutal reality of Russian war crimes, in particular the horrific tortures and sexual abuses of children, women and men.
Buckle up, this one is not for the faint-hearted.
1/24
For over a decade now, as part of their “firehose of falsehood” propaganda strategy, Russia has been spreading false narratives targeted at right-wing/conservative audiences, portraying Russia as a bastion of Christian, traditional family values.
In previous “Degenerate Russia” installments, we discussed Russia’s insanely high divorce rates, rampant domestic violence, high murder rates, thriving neo-Nazi culture, the corruption of the Orthodox Church, and its massive demographic problem: