What we know is that their aims were to label Sikh political interests as extremist, stoke cultural tensions within India & overseas, and promote the Indian Government.
So WHO are the #RealSikhs?
WHAT was their aim?
And HOW did they attempt to distort perceptions?
First, WHO were these accounts trying to be and what did they look like?
The accounts were well-curated. They had relatively high follower numbers & were very active.
The personas were replicated across Twitter, Facebook and Instagram.
Let's take a look at some of the accounts. The first pattern identified in this network is the profile images.
Most of the accounts in this network used the images of Bollywood actresses & celebrities.
Here is @jimmykaur3. It's actually an image of an actress.
Here is @Amarjot42854873. Again, fake. The person in the photo does exist, but it's not Amarjot Kaur.
Next up @TanvirSandhu16. Again, also fake. This is actually an image of Isha Rikhi.
Let's take a look at @nupurkaur1. This is actually a photo of the sister of Mubashra Aslam.
Are we seeing a pattern here?
Here is @kaurgunjann with more than 7000 followers.
This is actually actress Mandy Takhar.
I call these accounts sock puppets. They are fictitious online identities attempting to be a specific persona.
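The thread doesn't say how the image matches were made (reverse image search works fine by hand), but if you want to flag reused profile photos at scale, perceptual hashing is one way to do it. A minimal sketch in Python, assuming you've already saved the avatars and the candidate source photos locally; the folder names and distance threshold are illustrative, not from this investigation:

```python
# Sketch: flagging reused profile photos with perceptual hashing.
# Assumes the suspect avatars and candidate source images (e.g. press photos
# of the actresses) have already been downloaded to local folders.
# Requires: pip install imagehash pillow
from pathlib import Path

import imagehash
from PIL import Image

def hash_folder(folder: str) -> dict[str, imagehash.ImageHash]:
    """Return a perceptual hash for every image in a folder."""
    hashes = {}
    for path in Path(folder).glob("*"):
        if path.suffix.lower() in {".jpg", ".jpeg", ".png"}:
            hashes[path.name] = imagehash.phash(Image.open(path))
    return hashes

avatars = hash_folder("avatars")            # profile pictures from the network
references = hash_folder("reference_pics")  # known celebrity/influencer photos

# A small Hamming distance between hashes suggests the same underlying image,
# even after resizing or light recompression.
THRESHOLD = 8
for avatar_name, avatar_hash in avatars.items():
    for ref_name, ref_hash in references.items():
        distance = avatar_hash - ref_hash
        if distance <= THRESHOLD:
            print(f"{avatar_name} likely reuses {ref_name} (distance {distance})")
```

Anything flagged this way still needs manual confirmation; cropped or heavily filtered images can slip past a simple hash comparison.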
And these ones have a story that doesn't just exist on Twitter - let's take a look.
These fake accounts carry their personas over to different platforms.
We found 22 accounts that were the same personas on Twitter & Facebook. They used the same image, name and cover photo, and posted the same content.
Here is an example of 'Sanpreet' on both platforms.
And here is 'Gunjan' - also copied across to Facebook.
Many of the personas were present on Instagram as well.
Note: the accounts gained significantly less traction on Instagram and Facebook than they did on Twitter.
So WHAT was their aim?
We can glean some obvious details from the content the accounts post. Here is some of the Facebook activity we looked at. It shows a strong focus on countering Sikh independence. Note the prominence of hashtags such as #PakistanBehindKhalistan.
That same hashtag, as seen on Twitter, shows extremely similar content, again targeting Sikh independence.
You can see much of the content had few interactions.
However, some of the content gained significant traction. For example, this tweet about independence groups overseas received more than 3,000 retweets and 16,000 likes.
Not bad for a sock puppet with a set list of talking points to promote certain narratives.
The coordinated fake network also amplified concerning narratives that attempted to define what a 'real Sikh' and a 'fake Sikh' are.
The fake network also attempted to push specific narratives about the farmers' protests, with messaging claiming that ‘Khalistani terrorists’ had hijacked the protests.
There was also a common theme throughout the network of fake personas of tweeting or retweeting content about the Indian Armed Forces and the Indian Army.
This content stood out because, unlike most of the other content, it was not related to Sikh independence.
While most of the content appears to push specific talking points or narratives, some tweets (example below) went further, calling for 'nationalists' to 'counter & expose' groups this network labelled as extremists.
As well as gaining significant traction on Twitter, the fake personas' tweets and images were also linked, embedded, or reposted on news sites and blogs, indicating platform breakout.
Below are two examples of this.
Often with influence operations, looking at a network through visualised data helps with a lot of things, namely identifying how it operates, its size and spread, and finding more accounts.
I've set out that data using @Gephi to show the core network of primarily fake accounts and the wider network of amplifiers in the outer ring.
Doing this helps to visualise the interactions (likes, retweets) between accounts.
For example, here I have highlighted the interactions of @SimranK60419840 to show its personal spread within the network.
Though the captured activity was only a sample from these accounts, it did show consistent amplification from smaller fake accounts (on the right, in the inner circle).
Using those accounts on the right, we could find more accounts that were being retweeted within the fake network.
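If you want to build this kind of graph yourself, here's a rough sketch of how the interaction data could be shaped for Gephi. It assumes a simple CSV export of captured tweets; the column names are hypothetical, not the actual dataset behind these charts:

```python
# Sketch: building a retweet-interaction graph and exporting it for Gephi.
# Assumes a CSV of captured tweets with hypothetical columns
# "retweeter" and "original_author" (empty when the tweet is original).
# Requires: pip install networkx
import csv
from collections import Counter

import networkx as nx

edges = Counter()
with open("captured_tweets.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row["original_author"]:  # keep only retweets/quote tweets
            edges[(row["retweeter"], row["original_author"])] += 1

G = nx.DiGraph()
for (src, dst), weight in edges.items():
    G.add_edge(src, dst, weight=weight)

# Weighted in-degree gives a quick sense of who sits in the core
# (heavily retweeted) versus the amplifier ring.
top = sorted(G.in_degree(weight="weight"), key=lambda x: x[1], reverse=True)
for account, degree in top[:10]:
    print(account, degree)

# Gephi reads GEXF files directly.
nx.write_gexf(G, "fake_network.gexf")
```

Gephi opens the GEXF file directly, and a layout like ForceAtlas2 pulls the heavily retweeted core accounts into the middle with the amplifiers around the outside.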
I have received a number of messages asking about the location of this IL-76 (RA-76502) from Russian Aviacon Zitotrans (usually responsible for arms cargo).
It is not landing in Sudan (alleged in the comments), but is landing in Faya-Largeau in northern Chad. More in thread 👇
I identified an original and clearer version on TikTok (not hard if you want to find it). In it, we can see the registration number of the plane much more clearly, as well as the dune and cliffs in the background.
The registration, RA-76502, belongs to an aircraft owned by JSC Aviacon Zitotrans, which has been named by the US as "a Russian cargo airline that has handled cargo shipments for sanctioned Russian Federation defense entities".
This festive season I’m sharing a video every day for the next 24 days showing useful OSINT tools & techniques. Creating this OSINT Advent Series has been a lot of fun and I hope it’s helpful for the ever-growing OSINT community! 🎄👇
1. Searching Facebook with WhoPostedWhat.
2. Using AI to identify a car model in an image with Carnet.
Today is #WorldMentalHealthDay. As digital investigators, we're often not experiencing what we see online in real life, but it can still affect many of us.
So to keep doing the important documentation and investigative work online, here are a few practical steps you can take.👇
1. When sharing graphic content with colleagues and friends, remove the previews and give a little graphic warning indicator. There's always a little 'X' in the corner to remove the preview.
2. Consider changing the settings on your social media platforms so you're not absorbing horrific content while doomscrolling. You don't have to filter it out, but at least stop the autoplay.
I was recently on a flight across Australia when I spotted this massive figure on the ground. It led me on a digital journey to find out what it was, how it got there and who made it.
I'm going to explain a bit more about what it is in this thread, and how I found out. 👇🧵
Without internet on the plane, I took a screenshot of the location on my phone (yes, it was on flight mode) and later used flight tracking to pinpoint exactly where I saw it.
Using @flightradar24, I traced the path my plane took and found the spot!
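The thread doesn't detail the matching step, but the same idea can be scripted: interpolate your position along the flight track at the moment the photo was taken. A rough sketch, assuming a hypothetical CSV export of the track with timestamps and coordinates:

```python
# Sketch: estimating where along a flight path a photo was taken.
# Assumes a hypothetical CSV export of the track with columns
# "utc" (ISO timestamp), "lat", "lon", and a known photo time.
import csv
from datetime import datetime, timezone

def load_track(path: str):
    """Read the track as a sorted list of (time, lat, lon) tuples."""
    track = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            track.append((
                datetime.fromisoformat(row["utc"]).replace(tzinfo=timezone.utc),
                float(row["lat"]),
                float(row["lon"]),
            ))
    return sorted(track)

def position_at(track, when):
    """Linearly interpolate lat/lon between the two track points around `when`."""
    for (t0, lat0, lon0), (t1, lat1, lon1) in zip(track, track[1:]):
        if t0 <= when <= t1:
            frac = (when - t0).total_seconds() / (t1 - t0).total_seconds()
            return lat0 + frac * (lat1 - lat0), lon0 + frac * (lon1 - lon0)
    raise ValueError("timestamp outside the recorded track")

track = load_track("flight_track.csv")
photo_time = datetime(2022, 6, 18, 3, 41, tzinfo=timezone.utc)  # illustrative only
print(position_at(track, photo_time))  # approximate lat/lon to check in Google Earth
```

The result is only approximate, but it narrows the search area enough to start scanning in Google Earth.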
Next stop: @googleearth 🗺️
I zoomed into the area and there it was — this huge humanoid figure etched in the Australian outback. Measuring 2.7km tall and covering an area of more than 1.7 square kilometres, it's a sight to behold.
Despite our reporting last week on the fake network of pro-Trump MAGA accounts, there appear to be many more accounts actively posting the exact same content. This one, @brenda_otto_, with 18k+ followers, is stealing photos from an Australian Instagram model in Queensland.
Here's another fake MAGA account, @Tracy_Miller044, stealing images from a fashion blogger (a popular one too).
@Tracy_Miller044 - if you want to reach out and talk about your work, why you're running these campaigns etc, I'm open.
Here is @Sarah_Hickey__
Sarah is stealing the photos of a Czech Instagram influencer to create a persona claiming to be a MAGA Republican conservative, with 40k+ followers and a blue tick (which means it's apparently not misleading).
A network of fake accounts is posing as young American women, posting pro-Trump content and disinformation, but they’re hiding behind, and manipulating, the images of European fashion influencers.
Our latest investigation at @Cen4infoRes. Details in this thread 🧵👇
Our full analysis can be seen at @Cen4infoRes here: . We also collaborated with @CNN to dig out the human stories behind those who had their photos stolen.
One of the accounts is Eva. She lives in the US, and likes hanging out at the beach and posting to her 5000+ followers on X. Eva posts strong opinions against LGBTQ people and the US Democratic party and is a loyal supporter of former US president Donald Trump.