Pekka Kallioniemi
Apr 8, 2024 · 19 tweets · 9 min read
In today's #vatniksoup, I'll talk about the state of generative AI (GAI) today and how it will affect propaganda and disinformation in the near future. The development of GAI is so fast right now that it's very difficult to keep track of what's going on, but here's the latest.
1/18
Generative AI is a technology capable of generating text, images, videos, or other data using generative models, often in response to prompts. Some examples of GAI are platforms like Midjourney, DALL-E, ChatGPT, and all those chatbots you find on every service desk these days. A minimal sketch below shows what prompt-driven generation looks like in code.
2/18
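To make the "prompts in, content out" idea concrete, here's a minimal sketch of text generation with a small open model via the Hugging Face transformers library. The model name and prompt are illustrative placeholders only, not anything used by the platforms above, which run far larger proprietary models behind an API.

```python
# Minimal sketch of prompt-driven text generation with an off-the-shelf
# open model via the Hugging Face "transformers" library. The model ("gpt2")
# and the prompt are illustrative placeholders only.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # downloads the model on first run

prompt = "Breaking news:"
outputs = generator(prompt, max_new_tokens=40, num_return_sequences=1)

print(outputs[0]["generated_text"])
```

The point is the amount of effort involved: a few lines of code and a prompt are enough to produce fluent-looking text.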
In the last two years, GAI platforms, especially those that generate images, have become much more powerful. Below, for example, is a comparison of content generated with different versions of Midjourney: v1 was published in Feb 2022, and v6 came out in Dec 2023.

3/18
In the context of the Russo-Ukrainian War, there have been multiple examples of GAI content produced as anti-Ukrainian propaganda.

A few weeks after the full-scale war started, a crude, badly made Zelenskyy deepfake surfaced on social media.

4/18
There was also a big push to make Zelenskyy appear to be a cocaine addict. This was done by cutting and pasting parts of his interviews, but a video was also published showing "white powder" on his desk. The video, of course, was edited and fake.

5/18
In Nov 2023, a deepfake video depicting then Commander-in-Chief Valerii Zaluzhnyi made the rounds on Telegram. In the fake video, he declared Zelenskyy an "enemy of the state" and rallied people to "deploy their weapons and enter Kyiv".

6/18
Another deepfake video was published in Feb 2024. In it, deepfake audio of the governor of Texas, Greg Abbott, claims that Joe Biden should "deal with real internal problems" and not "play with Putin, from whom he needs to learn how to work for national interests".

7/18
Another, more recent example was published in Mar 2024, shortly after the Crocus City Hall terrorist attack. This deepfake video was stitched together from two interviews with Danilov, with deepfake audio dubbed over the footage.

8/18
At the beginning of the war, when only a trickle of information and media was coming from the front lines, people shared a lot of video game footage and videos from previous conflicts. The same phenomenon was evident after the 7 Oct 2023 terrorist attack.

9/18
Deepfake content is also used to sway opinions at the polls. In the 2023 Slovak parliamentary election, deepfake audio was used to fabricate a phone call about "rigging the elections" and raising the price of beer. The latter may even have swung the election!

10/18
GAI is also used to generate text. One such example is the recently launched fake news blog London Crier. According to @eliothiggins, most of the content on the site is stolen from other sites and then re-written by AI, with occasional hand-written fake news thrown into the mix. The basic scrape-and-rewrite loop is sketched below.

11/18

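To give a sense of how little automation such a site needs, here's a rough sketch of the scrape-and-rewrite loop @eliothiggins describes, assuming a generic LLM API (here the OpenAI Python client). The URL, model name, and prompt are hypothetical placeholders, not details from the actual London Crier operation.

```python
# Rough sketch of a scrape-and-rewrite pipeline of the kind described above.
# The URL, model name, and prompt are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def scrape_article(url: str) -> str:
    """Fetch a page and return its paragraph text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return "\n".join(p.get_text(strip=True) for p in soup.find_all("p"))

def rewrite(text: str) -> str:
    """Ask a language model to paraphrase the scraped article."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user",
                   "content": "Rewrite this news article in your own words:\n\n" + text}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    article = scrape_article("https://example.com/some-news-story")  # placeholder URL
    print(rewrite(article))
```

A loop like this can churn out "original" articles at near-zero cost; only the occasional hand-written fake story mentioned above needs a human in the loop.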
Most of the examples above are crude and not very convincing. Next, let's look at what can be done today with state-of-the-art GAI technologies. For example, here's a GAI video produced with Sora, a GAI video platform created by OpenAI.

12/18
Here's another example, made by TikTok user @tech.bible. The AI used just 30 seconds of webcam and audio footage to create this completely fake video. It's already pretty convincing, and this is just the start.

13/18
Generative AI can also be used to create music. Here's "Diapers and Deception (Fuck Vladimir Putin)", an electronic deep house track with vocals that took me around 15 seconds to make. As you can hear, it's a banger and a certified summer hit.

14/18
Below you see photorealistic examples made with the latest version of Midjourney (v6). I, for one, have a hard time telling that these images were generated by AI. Many platforms are also figuring out how to overcome the challenging parts, like generating human hands.

15/18


Deepfakes, robocalls, and other techniques are already being used for nefarious purposes. For example, fake robocalls impersonating Joe Biden were used to sway voters in New Hampshire, and a finance worker transferred 25 million USD after a video call with a deepfaked "boss".

16/18
To conclude, GAI is developing FAST, and it will change our attitude towards information and disinformation. In the near future, it will be extremely difficult to tell the difference between real and fake content. People will also have a hard time believing anything anymore.
17/18
Finally, I recommend everyone follow @Shayan86 and @O_Rob1nson, as they do the best fact-checking and debunking when it comes to fake content.

18/18
