In today's #vatnik soup I'll explain why Elon Musk's (@elonmusk) "balancing act" of purging just one side will increase the spread of fake news and disinformation.
First of all, I'd like to say that the biased, systematic blacklisting of content that Twitter conducted was WRONG.
1/14
Censorship is almost always bad, and there are much better ways to fight dis- and misinformation (labels, semi-objective external fact-checkers, etc.).
But based on recent events, it seems that Musk is just swinging this same system in the opposite direction.
2/14
In his Nov 18, 2022 tweet he said that "Negative/hate tweets will be max deboosted & demonetized", yet there are no definitions or clear rules about what is considered hate speech.
Elon also promised to reinstate accounts that were previously suspended.
Another neo-Nazi, Daily Stormer editor Andrew Anglin, also regained access to his Twitter profile.
5/14
The leaks from the Twitter Files have been advertised as a conspiracy among Twitter's ex-executives, in which they secretly decided what content gets seen, but Twitter has had a FAQ about these issues since '18: blog.twitter.com/official/en_us…
6/14
Elon has also participated in the discussion about the Twitter Files. He has attacked the NYT (@nytimes), calling the outlet "an unregistered lobbying firm for far left politicians". He also said that Alex Stamos (@alexstamos) operates a "propaganda platform".
7/14
In 2018, Science published a paper by Vosoughi, Roy, & Aral, titled "The spread of true and false news online". The paper concluded that "false news stories are 70 percent more likely to be retweeted than true stories are" and that ...
8/14
... "It also takes true stories about six times as long to reach 1,500 people as it does for false stories to reach the same number of people." Allowing fake news to spread freely will slowly drown the platform, and more and more factual news will stay hidden. doi.org/10.1126/scienc…
9/14
I also consider Twitter's latest design choices to be part of the "dark pattern design" family. These design choices "trick" users into specific actions on the platform or "nudge" their thinking in a specific direction.
10/14
Manipulation of the information flow, meaning which tweets Twitter shows us, can manipulate our thinking and "nudge" our worldview slowly in a specific direction.
How does Twitter shape our worldview if most of the content is actually disinformation?
11/14
I have discussed this type of dark pattern design in my 2022 publication, "Facebook’s Dark Pattern Design, Public Relations and Internal Work Culture": doi.org/10.34624/jdmi.…
12/14
Musk has also criticized the "woke culture" taking hold in Western society. We have to remember that this culture war has been fueled by Russian disinformation and propaganda over the years, with topics ranging from LGBT+ rights to movements like BLM.
13/14
To conclude: with the recent changes, we can pretty much expect Twitter to be the same as before, but with the pendulum swinging from the left to the right.
And it will also contain MUCH more disinformation than before.
In today’s Vatnik Soup, I’ll explain the Alaska Fiasco and how it marks the peak of Trump’s two-year betrayal of Ukraine. What was sold as “peace talks” turned into a spectacle of weakness, humiliation, empty promises, and photo-ops that handed Putin exactly what he wanted.
1/24
Let’s start with the obvious: Trump desperately wants the gold medal of the Nobel Peace Prize, mainly because Obama got one. That’s why he’s now LARPing as a “peacemaker” in every conflict: Israel-Gaza, Azerbaijan-Armenia, India-Pakistan, and of course Ukraine-Russia.
2/24
Another theory is that Putin holds kompromat — compromising material such as videos or documents — that would put Trump in an extremely bad light. Some have suggested it could be tied to the Epstein files or Russia’s interference in the 2016 US presidential election.
In today’s Vatnik Soup, I’ll talk about engagement farming: a cynical social media tactic to rack up likes, shares, and comments. From rage farming to AI-powered outrage factories, engagement farming is reshaping online discourse and turning division into profit.
1/23
Engagement farming is a social media tactic aimed at getting maximum likes, shares, and comments, with truth being optional. It thrives on provocative texts, images, or videos designed to spark strong reactions, boost reach, and turn online outrage into clicks and cash.
2/23
One subset of engagement farming is rage farming: a tactic built to provoke strong negative emotions through outrageous or inflammatory claims. By triggering anger or moral outrage, these posts often generate 100s or even 1,000s of heated comments, amplifying their reach.
In today’s Vatnik Soup, I’ll cover the autocratic concept of “Good Tsar, Bad Boyars”: the idea that the leader is wise and just, but constantly sabotaged by corrupt advisors. This narrative shields the ruler from blame, and it’s used by both Putin and Trump today.
1/20
The phrase “Good Tsar, Bad Boyars” (Царь хороший, бояре плохие), also known as Naïve Monarchism, refers to a long-standing idea in Russian political culture: the ruler is good and benevolent, but his advisors are corrupt, incompetent and responsible for all failures.
2/20
In this view, any positive action taken by the government is seen as an accomplishment of the benevolent leader, whereas any negative one is blamed on lower-level bureaucrats or “boyars” acting without the leader’s approval.
In today’s Vatnik Soup, I’ll introduce a Russian politician and First Deputy Chief of Staff of the Presidential Administration of Russia, Sergey Kiriyenko. He’s best known for running both domestic and foreign disinformation and propaganda operations for the Kremlin.
1/20
On paper, and in photos, Kiriyenko is just as boring as most of the Kremlin’s “political technologists”: from 2005 to 2016 he headed the Rosatom nuclear energy company, but later played a leading role in the governance of Russia-occupied territories in Ukraine.
2/20
What is a political technologist? In Russia, they’re spin doctors & propaganda architects who shape opinion, control narratives, and manage elections — often by faking opposition, staging events, and spreading disinfo to maintain Putin’s power and the illusion of democracy.
Let me show you how a Pakistani (or Indian, they're usually the same) AI slop farm/scam operates. The account @designbonsay is a prime example: a relatively attractive, AI-generated profile picture and a ChatGPT-style profile description are the first red flags.
1/5
The profile's posts are just generic engagement farming, usually using AI-generated photos of celebrities or relatively attractive women.
These posts are often emotionally loaded and ask the user to interact with them ("like and share if you agree!").
2/5
Then there's the monetization part. This particular account sells "pencil art", which is again just AI-generated slop.
In today’s Vatnik Soup, I’ll introduce an American lawyer and politician, Mike Lee (@BasedMikeLee). He’s best known for opposing aid to Ukraine, undermining NATO by calling for the US to withdraw from the alliance, and for fighting with a bunch of braindead dogs online.
1/21
Like many of the most vile vatniks out there, “Based Mike” is a lawyer by profession. He hails from the holy land of Mormons, Utah, where he faces little political competition, allowing him to make the most outrageous claims online without risking his Senate seat.
2/21
Before becoming a senator, Mike fought to let a nuclear waste company dump Italian radioactive waste in Utah, arguing it was fine if they just diluted it. The state said no, the public revolted, and the courts told poor Mikey to sit down.