Here’s a viral tweet with a video claiming to expose Ukrainian disinfo that had mislabeled a military vehicle as Russian. The video also spread widely on Telegram. But it’s nearly impossible to find an example of the supposed Ukrainian fake being shared anywhere.
Here’s another example from a Russian government account. And another where an official with the pro-Russian separatist Donetsk People’s Republic claims to show “How Ukrainian fakes are made.” Good luck finding these supposed “Ukrainian fakes.”
Researchers at @clemsonhub and ProPublica identified more than a dozen videos that purport to debunk apparently nonexistent Ukrainian fakes. They don’t show up in reverse image searches, and the Russian debunking videos fail to cite a single example of them circulating.
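Reverse image search rests partly on perceptual hashing: visually similar frames produce nearly identical fingerprints, so a genuinely circulating "Ukrainian fake" would still match its source footage. A minimal sketch of an average hash in pure Python (the 8x8 grayscale input, function names, and toy data are illustrative assumptions, not any search engine's actual method):

```python
# Average-hash sketch: fingerprint a tiny grayscale image so near-duplicates
# (e.g., the same footage with a new caption overlaid) hash to nearly the same bits.

def average_hash(pixels):
    """pixels: 8x8 list of grayscale values (0-255). Returns a 64-bit int."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)  # 1 = brighter than average
    return bits

def hamming(a, b):
    """Count of differing bits between two hashes; small = likely same image."""
    return bin(a ^ b).count("1")

# Toy example: an "original" frame and a copy with one corner slightly altered.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
relabeled = [row[:] for row in original]
relabeled[0][0] = min(255, relabeled[0][0] + 40)  # e.g., a small caption overlay

print(hamming(average_hash(original), average_hash(relabeled)))  # → 0: fingerprints still match
```

Real systems use more robust variants (pHash, embeddings), but the principle is the same: a re-shared fake leaves a matching fingerprint trail, and these videos leave none.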
Metadata from two videos reveals how they were made. Simply put, whoever created the videos duplicated the original footage to create the alleged Ukrainian fake. They added different captions or visual elements to fabricate the Ukrainian version. It’s a fake fake.
“If these videos were what they purport to be, they would be a combination of two separate video files, a ‘Ukrainian fake’ and the original footage…. Whoever crafted the debunking video created the fake and debunked it at the same time,” said Clemson's @DarrenLinvill.
Why make fake debunks? The goal is to inject doubt among Russian-language audiences as they encounter images of wrecked Russian vehicles and of the destruction in Ukraine. “It’s sufficient to make people uncertain as to what they should trust,” said @plwarre of Clemson.
It’s unclear who is creating the videos, or whether they come from a single source or many. They have well over 1 million views on Telegram, thousands of engagements on Twitter, and have spread to Facebook and TikTok. The videos have Russian captions, but the visuals transcend language.
On March 1 Russia's state-controlled Channel One aired a screenshot from one of the videos. It was shown as a warning to “inexperienced viewers” who might be fooled by false images of Ukrainian forces destroying Russian military vehicles, the BBC reported: bbc.com/news/world-eur…
Joan Donovan @BostonJoan of Harvard's Shorenstein Center called the fake debunking videos “low-grade information warfare.” She said they're particularly effective domestic propaganda when the videos get cited by state TV.
A few days after the Channel One segment, footage from one of the fake debunks appeared in a video shared by the Russian government account on Twitter. “Western and Ukrainian media are creating thousands of fake news on Russia every day,” the video said.
Of course, old footage of military vehicles and explosions *is* spreading with false/misleading info. And journalists try hard to debunk it.
But now we have to watch for fact-checking-as-disinfo. Tip: Credible fact-checks cite where a falsehood has appeared. Check the checkers.
Our new ProPublica/Wash Post investigation reveals public Facebook groups swelled with at least 650,000 posts attacking the legitimacy of Joe Biden’s victory between Election Day and Jan. 6, with many calling for executions or other political violence. 🧵 propublica.org/article/facebo…
Through interviews, data & internal docs, we reveal how FB/Meta relaxed its oversight of groups after Election Day, then belatedly rushed to police them once the Capitol attack was underway.
But by then, groups had become a hotbed of election misinfo and threats.
The seeds of groups' dysfunction were planted when Zuckerberg made them a priority in 2017. Groups became more central to FB’s bottom line, but enforcement efforts were weak, inconsistent and heavily reliant on the work of unpaid group admins.
The Ozy Media meltdown has a lot of lessons. As the reporter who caught them buying junk traffic in 2017, I'm biased. But I see it as a perfect, cautionary illustration of Paid versus Earned media. So: a 🧵 on why Ozy failed as a paid media company — and why it matters.
Definitions:
PAID media is advertising. Spend money to get an audience. In today's world that could be display ads, FB ads ppl click on, YouTube ads for your video.
EARNED media is when people choose to watch/read/listen to your content or the media chooses to cover you.
Ozy's strategy was to buy audience, influence and the trappings of success. It appeared to work until money from investors started to dry up. Ozy's problem is that all its paid media never turned into earned media, aka real audience success. So they had to keep buying.
NEW: FB Marketplace has 1 billion users and is one of the company’s most promising sources of $$. But growth comes at a cost: our investigation reveals how FB fails to protect buyers and sellers from scam listings, fake accounts & violent crime. Thread... propublica.org/article/facebo…
Internal documents, interviews with Marketplace workers, and law enforcement records show how the product has become a favorite of cybercriminals who come from around the world to find victims. There’s a staggering array of scams being perpetrated on Marketplace:
Facebook says Marketplace “lets you see what real people in your own community are selling,” and that viewing a profile is a great way to see who you’re dealing with. But workers say hacked and fake accounts are a huge issue, and are used by fraudsters to rip off people at scale.
The WSJ's reporting on internal documents showing the harm of Facebook's VIP profile program and the negative effect of Instagram on teens reveals a core truth about FB: people inside the company document and articulate the problems, but they really struggle to effect change.
What we can read of these reports shows the quality of work done internally to quantify harms and issues. Yes, insiders are the only ones with access to the data to do this work. But ppl at FB take these challenges on because they care and want to see the company do better.
@RMac18 and I saw this time and again last year in internal threads and reports. Lots of ppl at FB want to fix this stuff. So they put in the work to make the case internally. The problem is they end up hitting a wall when fixes conflict with growth/revenue/public image
Exclusive: An internal report reveals how Facebook failed to prevent the "Stop the Steal" movement from using the platform to "spread conspiracy, and help incite the Capitol insurrection." This new evidence contradicts public statements from Zuck/Sandberg: buzzfeednews.com/article/craigs…
The report shows FB didn't know the "Stop the Steal" movement was building for months before Nov 3. On election day it exploded in a viral FB group that “normalized delegitimization and hate in a way that resulted in offline harm and harm to the norms underpinning democracy.”
The report (“Stop the Steal and Patriot Party: The Growth And Mitigation Of An Adversarial Harmful Movement”) provides yet another case study of how relatively small but coordinated groups of people can wreak havoc and spread misinformation on the world’s dominant social network.
BREAKING: David Brooks has resigned from his position at the Aspen Institute following our reporting — and new revelations — about conflicts of interest between the star NYT columnist and funders of a program he led for the think tank: buzzfeednews.com/article/craigs…
Something new we discovered: On March 15 of last year Brooks appeared on Meet The Press and said: “I think people should get on Nextdoor, this sort of ‘Facebook for neighbors.’”
Left unsaid: Nextdoor, a social network for neighborhoods, had donated $25,000 to Weave, his project.
A day before his appearance on the nationally televised NBC program, Brooks also tweeted to his nearly 250,000 followers, “If you know someone who lives alone, ask them to join NextDoor.”