In today's #vatniksoup, I'll talk briefly about the Community Notes system and why it doesn't work. I've previously stated that the Community Notes mechanism amounts to "mob rule" and can easily be gamed by big accounts and troll farms.
1/15
Community Notes is a community-driven content moderation program, intended to provide informative context based on a crowd-sourced voting system. As of Nov 2023, this system had over 130 000 contributors.
2/15
The idea of a crowd-sourced moderation tool did not come from Elon - it was announced back in 2020, when it was called Birdwatch. Musk later rebranded the system as Community Notes and sold it as something new.
3/15
Vitalik Buterin (@VitalikButerin) has written a very extensive (and technical) analysis of the tool and the Community Notes algorithm as a whole.
I disagree with him on some points, but I strongly recommend that everyone read it:
4/15
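For readers who want the gist of what that algorithm actually does, here is a heavily simplified Python sketch based on my reading of the open-source Community Notes documentation (the names and the toy threshold are illustrative assumptions, not the production code):

```python
# Simplified sketch of the Community Notes "bridging" scoring idea:
# each rating a user gives a note is modeled as
#   predicted_rating = mu + user_intercept + note_intercept + user_factor * note_factor
# The factor term soaks up agreement explained by shared ideology/viewpoint,
# so a note only earns "Helpful" status if its *intercept* is high, i.e. it is
# rated helpful by users who normally disagree with each other.

def predicted_rating(mu: float,
                     user_intercept: float, note_intercept: float,
                     user_factor: float, note_factor: float) -> float:
    return mu + user_intercept + note_intercept + user_factor * note_factor

HELPFUL_THRESHOLD = 0.40  # illustrative; the real rules are more involved

def note_status(note_intercept: float) -> str:
    # A note rated helpful only by one "camp" ends up with a large |note_factor|
    # but a small intercept, so it never crosses the threshold.
    return "Helpful" if note_intercept >= HELPFUL_THRESHOLD else "Needs more ratings"
```

The parameters are learned by matrix factorization over the whole rating matrix; the point of this thread is that even a clever formula like this can't outrun human behavior.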
Also, focusing the analysis only on the algorithm and the technical aspects oversimplifies the issue, as it leaves out the most important variable: the human factor.
People are prone to bias, and disinformation tends to spread much more aggressively than the truth.
5/15
Twitter's former head of Trust and Safety, Yoel Roth, has stated that the system was never intended to replace the curation team, but to complement it. But all this, of course, changed after Elon sacked everyone from Twitter's Trust and Safety team in order to save money.
6/15
These sackings have resulted in long response times to reports of hate speech - X's handling of hateful direct messages has slowed down by 70%.
As of today, the company doesn't have any full-time staff singularly dedicated to hateful conduct issues globally.
7/15
Some Community Notes contributors (who are also NAFO activists) have claimed that the system is riddled with coordinated manipulation and infighting, and suffers from a lack of oversight from the platform. Many contributors also engage in conspiracy-fueled discussions.
8/15
The Notes system also has a huge problem with scalability. During events like the 7 Oct 2023 Hamas terrorist attack, the amount of disinformation grows so large that it's simply impossible for the small community to keep up and fact-check the content.
9/15
Analysis by NewsGuard showed that the most popular disinformation posts related to the Israel-Hamas war (not so surprisingly originating from serial liars like @jacksonhinklle, @drloupis and @ShaykhSulaiman) failed to receive Community Notes 68% of the time.
10/15
These big accounts also have the ability to fight against the Notes they've received by mobilizing people who support their views. In the most tragicomic instance, @elonmusk claimed, without any evidence, that a Community Note on his post was "gamed by state actors".
11/15
Other than the humiliation and ridicule, getting Community Noted doesn't really have any major downsides. Noted posts don't earn you any income, and advertisers can decide whether they want to show ads on accounts like @dom_lucre's, but most of these...
12/15
...so-called superspreader accounts make most of their income through other means, namely X's subscription system. Also, many of them, including @stillgray and (allegedly) @jacksonhinklle, are employed by state actors like Russia and the CCP.
13/15
With accounts that post tens or hundreds of posts a day, the Notes are also inefficient - while the Community is trying to put a note on a post that's clearly disinformation, there are already 10 or 20 new ones to replace it in the algorithm.
14/15
To conclude, Community Notes is a non-functional and slow mechanism that's desperately trying to replace the Trust and Safety team. It works on a "mob rule" basis, and big enough accounts (including the owner of the platform) can game the system.
In this first (and maybe last?) Basiji Soup, we’ll look at… the Islamic Republic of Iran, its disinformation operations, its hypocrisy, how it sells its atrocities as virtue and its repression as morality, how it serves the Kremlin, and the current protests against it.
1/20
Basijis are members of the most fanatical part of the Islamic Revolutionary Guard Corps (IRGC). In a broader sense: Iranian regime loyalists & propagandists. They may be fewer than vatniks or wumaos, but the goal is the same: destabilize the West to protect a brutal regime.
2/20
The regime oppressing Iran is a “theocratic” authoritarian state around a “Supreme Leader” hiding behind religion to justify its crimes: censorship, repression, executions, torture and terror — similar to Russia and its “holy war” against Ukraine.
In today’s Vatnik Soup, we introduce our first Czech vatnik, Tomio Okamura. He’s best known for building a political career on xenophobia while being of mixed origins himself, and for pushing Kremlin narratives in Czechia, a country otherwise very supportive of Ukraine.
1/19
Okamura was born in Tokyo in 1972 to a Japanese-Korean father and Czech mother. He spent part of his childhood in Japan, and part in a Czechoslovak foster home where he was heavily bullied. His mixed origins made it difficult for him to fit in either country.
2/19
Nonetheless, after working odd jobs in Japan, Tomio returned to Czechia and became a successful entrepreneur in Japanese tourism. He then rose in politics: Senator in 2012, MP in 2013, and founder of two parties, Dawn of Direct Democracy and SPD (Freedom and Direct Democracy).
In today’s Vatnik Soup, we’ll introduce an American billionaire, real estate developer, and wannabe diplomat, Steve Witkoff. He’s best known for trying to sell Ukraine to Putin and for helping Trump sell this treason and encouragement of genocidal war as “peace”.
1/20
Steve studied law and political science at Hofstra University in New York. After law school, he worked as a real estate attorney, which led him into property acquisitions and development. He first met Trump in the 1980s when Trump was a client of his real estate law firm.
2/20
In 1997, Witkoff founded the Witkoff Group, a New York–based real estate development and investment firm. The firm has owned and developed dozens of properties in New York and other major US cities, making Witkoff quite wealthy, with some interesting business connections.
In today’s Vatnik Soup, our first on a non-human vatnik, we’ll talk about… Grok @grok. It’s best known for turning into Mecha-Hitler and Mecha-Putler and for defending its vatnik master, Elon Musk, at all costs, up to being willing to sacrifice the rest of mankind for him.
1/24
Let’s start with an introduction to how Large Language Models (LLMs) work, and the new “arguing with your toaster” phenomenon. LLMs like Grok are Artificial Intelligence (AI), but not in the way we had imagined it: a new form of intelligence that would somehow think like us.
2/24
Instead, LLMs are basically “guessing engines” and search engines, trained on a massive dataset to give you the output you expect: they imitate intelligence rather than being actual intelligence. They’re chatbots generating responses while pretending to be a helpful AI.
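To make the “guessing engine” point concrete, here's a tiny, purely illustrative Python sketch (not how Grok is actually built; real LLMs use transformer networks over long contexts, not single-word lookups). It "trains" by counting which word follows which, then generates text one guess at a time:

```python
from collections import Counter, defaultdict

# Toy "training data"; a real model ingests trillions of tokens.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# "Training": count which word tends to follow which.
next_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_counts[prev][nxt] += 1

def generate(prompt: str, steps: int = 6) -> str:
    """Repeatedly guess the most likely next word - no understanding involved."""
    words = prompt.split()
    for _ in range(steps):
        candidates = next_counts.get(words[-1])
        if not candidates:
            break
        words.append(candidates.most_common(1)[0][0])
    return " ".join(words)

print(generate("the dog"))  # fluent-looking but mindless continuation
```

A real LLM replaces the counting table with billions of learned weights and looks at the whole preceding context, but the core loop is the same: predict the next token, append it, repeat.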
Robert Amsterdam is also a registered (and well-paid!) agent of Maduro’s Venezuela, a socialist regime and ally of Russia that Tucker Carlson recently defended, for some reason, shocking many of his right-wing supporters.
In today’s Vatnik Soup, we’ll explain the context of the upcoming Budapest Blunder, and how it follows the infamous Alaska Fiasco from two months ago and Trump’s absurd delaying of serious aid to Ukraine and effective sanctions on Russia for the past nine months.
1/20
Two months ago, Trump embarrassed the United States by rolling out the red carpet for war criminal dictator Putin and overall acting like a pathetic servant eager to meet his master. Of course, the Alaska Fiasco didn’t bring peace any closer.
Worse, the main outcome of the humiliation was to delay serious sanctions, which the US Congress, in rare bipartisan unity against Russia, was on the verge of passing. Two weeks at a time, Trump Always Chickens Out, postponing any real pressure on Putin for nine months now.