Ah, wonderful, Richard Spencer is back in the news.

The veracity of an audio recording is at the core of this story—& since it’s increasingly likely that recorded audio/video will play a role in the 2020 US election, I’m using the moment to share a thread about deepfakes.
1/ Some background: I cover deepfakes as part of my automation & emerging tech focus on Forbes. I also curate new media art shows, so I have a front row seat for the most dynamic approaches to new technologies. forbes.com/sites/jessedam…
2/ Deepfakes are a category of synthetic media produced using a form of Deep Learning called Generative Adversarial Networks (GANs) to generate fake videos or edited versions of real ones with face/body substitution. cnn.com/videos/busines…
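For the technically curious, here is a minimal sketch of that adversarial idea in PyTorch: a generator learns to fool a discriminator, trained on toy 2-D points rather than face frames. Real deepfake pipelines stack autoencoders, face alignment, & blending on top of this core, so treat the model sizes & data below as illustrative assumptions, not anyone's actual implementation.

import torch
import torch.nn as nn

latent_dim = 16

# Generator: maps random noise to a fake "sample" (here a 2-D point;
# in a real deepfake it would be the pixels of a face frame).
generator = nn.Sequential(
    nn.Linear(latent_dim, 64), nn.ReLU(),
    nn.Linear(64, 2),
)

# Discriminator: scores a sample as real (1) vs. generated (0).
discriminator = nn.Sequential(
    nn.Linear(2, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

def real_batch(n=128):
    # Stand-in for "real data": points on the unit circle.
    angles = torch.rand(n, 1) * 6.2832
    return torch.cat([angles.cos(), angles.sin()], dim=1)

for step in range(2000):
    real = real_batch()
    fake = generator(torch.randn(real.size(0), latent_dim))
    ones = torch.ones(real.size(0), 1)
    zeros = torch.zeros(real.size(0), 1)

    # 1) Train the discriminator to tell real from fake.
    d_loss = bce(discriminator(real), ones) + bce(discriminator(fake.detach()), zeros)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the generator to fool the (just-updated) discriminator.
    g_loss = bce(discriminator(fake), ones)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

The two networks push each other: as the discriminator gets better at spotting fakes, the generator is forced to produce samples closer to the real distribution.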
3/ If you missed the recent waves of news & viral deepfakes, you might also remember them from when they first popped up in the media in 2017—when creators subbed celebrity faces into porn scenes. vice.com/en_us/article/…
4/ If you want a more thorough primer on the deepfake landscape, I recommend the report @Deeptracelabs recently released. It’ll help situate you with notable trends & case studies. deeptracelabs.com/resources/
5/ I’ve tweeted about creative deepfakes before, & one recently went little-league viral. The majority of responses were something to the effect of “🤯” or “I’ll never trust video again” or “beware of this being weaponized in politics.”
6/ I won’t bury the lede: this usage is definitely cause for concern & consideration—but TL;DR: deepfakes are not good enough yet, or easy enough to create, to pose much of a threat when it comes to a whole-cloth fabricated video.
7/ In fact, before I go any further, here are the tech implementations you should be VERY concerned about in the 2020 US election cycle:
8/
-Unprotected voting machines & software
-Bots & strawman accounts on social media & fake outlets
-Easily produced content like memes & “shallowfakes” (videos doctored with traditional editing techniques) niemanlab.org/2019/05/what-d…
9/ I won’t bullshit you: the aforementioned—deployed in concert as asymmetrical information cyberwarfare waged against technologically illiterate masses—pose *grave* threats to a sovereign democracy.
10/ These tactics are far cheaper & far less time-consuming to create than deepfakes—& they are very, very effective. Fantastic value on the dollar.
11/ There’s an entire ecosystem in place to firehose this disinformation media across the Internet—especially since @facebook (sorry: FACEBOOK) still won’t commit to blocking political ads across its platforms.
11a/ Hey, @Facebook? We get that this is a complicated call & that we're asking a lot, but really truly please for the love of humanity will you BLOCK POLITICAL ADS PLEASE for a hot sec?
12/ If you need convincing, watch THE GREAT HACK.

It’s imperative we have a basic understanding of how easily we can be targeted.

It’s going to be a whole lot worse in 2020 than 2016.
netflix.com/title/80117542
13/ Okay, on to deepfakes.

One way to think about these early days of deepfakes as disinformation media is to assume that threats will come from the very top & very bottom.
14/ As far as I’ve gathered anecdotally, people’s main concern seems to be that full audio-video deepfakes will be weaponized as propaganda & turn everything upside down.

That theoretically could be a concern for 2024, but definitely not 2020. Why?
15/ In the time it takes to make 1 passable deepfake—even if it comes out perfectly on try #1 (unlikely)—you could’ve produced thousands of impressions from bots, dank memes, shitposts, & fake blog posts. And those are sure bets. mediamatters.org/facebook/how-f…
16/ Full deepfakes are hard because faked video & faked audio are each difficult to create in isolation. The likelihood of being caught as fake skyrockets when you try to produce content that does BOTH well.
17/ While deepfaking audio is easier than video, that doesn’t make audio easy.

Most of the best deepfakes—e.g. @jimrossmeskimen + @Sham00k’s recent video or Jordan Peele’s Obama—rely on voice acting for believable audio.
18/ It still takes mountains of voice data to recreate a voice—& even then there’s plenty of room for the result to sound weird & uncanny. Peep @willknight’s coverage of @modulate_ai for more:
technologyreview.com/s/613033/this-…
18a/ Which is why it’s *exceedingly* unlikely that the Spencer audio was deepfaked, & the public should keep that in mind if he tries to claim otherwise.
19/ The irony here is that audio-only deepfakes are one of my *greatest* concerns re: political propaganda in 2020…but more on that in a minute.
20/ Deepfaked video is getting eerily good & improves by the day. Mercifully, for now, adversarial nets and detection software can spot doctored video easily & with a high degree of accuracy. Emphasis: for now.
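To make "detection software" a little less abstract, here is a toy sketch of the general shape such tools take: score individual face crops as real vs. doctored, then aggregate across the clip. Everything below (model size, input shape, the untrained stand-in weights) is an assumption for illustration; production detectors rely on much larger models, curated datasets, & forensic features.

import torch
import torch.nn as nn

class FrameDetector(nn.Module):
    """Tiny stand-in for a per-frame real-vs-fake classifier."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)  # logit > 0 means "looks doctored"

    def forward(self, x):
        # x: (batch, 3, H, W) face crops sampled from the video
        return self.head(self.features(x).flatten(1))

detector = FrameDetector()               # in practice: load trained weights
frames = torch.rand(8, 3, 224, 224)      # stand-in for 8 sampled face crops
video_score = torch.sigmoid(detector(frames)).mean()
print(f"estimated probability the clip is doctored: {video_score.item():.2f}")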
21/ Even beyond detection tools, I’ve yet to see a deepfake that truly duped me if I watched it enough times—even @ctrl_shift_face’s Bill Hader video. Btw that video is incredible & you should watch it:
22/ Right now, you’re probably like:

But what if a faked video spreads for days before it’s debunked?!

Hasn’t the Internet proven by now that people don’t use fact-checking sources?!
23/ All super valid; in 2024 we might witness these impacts at scale…& that will be…a doozy.

But right now, beyond being hard to make in general, full deepfakes are nearly impossible to make well.
24/ So the worst we’d see on this front in 2020 is a temporary “wowie” moment.

The last thing a troll wants to be is duped. As soon as it becomes clear that a video was faked, that’ll spell the end of that video in the buzz cycle.
25/ Looking to 2020, here are my principal concerns with regard to deepfakes:

1) Their existence being used to cast doubt on real, recorded events (what could happen w/ Spencer, Trump, etc.)
2) High-spend counterevidence ops
3) Audio-only deepfakes (+how they dovetail with the above)
26/ The scariest thing about deepfakes right now, bar none, is how the conversation around them is already being weaponized.
27/ To be honest, if the metric is to truly fool the eye, the vast majority of the ~15k known deepfakes suck ass—but you’d never know that from the hyper-alarmist discussions coming out of Washington.
cnn.com/interactive/20…
28/ This—BY THE WAY—while the conversations around election security & American data sovereignty have felt a bit...underwhelming.

One might even say it’s eerie how little is being done to rectify the security of our elections.

…One *might* say that that isn’t an accident.
29/ Remember: the innovation with deepfakes is not the ability to produce edited video. We’ve had pretty excellent VFX capabilities for decades.
30/ The innovation here is *who* can now create them—the idea being that deepfakes will eventually make it possible for anyone to produce them on unprecedented timeframes.

(We’re years away from that point—but not many)
31/ I’m not saying the alarmist rhetoric is a coordinated effort in Washington, but it sure does lay alibi-fertile ground if, say, a Trump-Zelensky audio recording were leaked to the public.
32/ Trump is a master of moving goal posts with obfuscation & hashtag zingers.

Witch hunt. LameStream media. No collusion/conspiracy. (Reminder: the Mueller Report absolutely asserts both).
33/ A legitimate recording that should seal the deal could *easily* be appropriated as ammo by Trump, Spencer, &/or the @GOP.

With the same assuredness Trump claimed Barack Obama wasn’t born in the US, for instance, he could decry the recording as a Deep State deepfake.
35/ Even the word “deepfake” sounds threatening.

Imagine you’re the type of person who believes in a deep state—this is a far more believable lie than a secret Clinton-backed child trafficking ring in a pizza restaurant.
36/ Moving forward—particularly while the literacy around deepfakes is low—this excuse will invariably be used to rile up the base, muck up legitimate cases, waste time/money, possibly even produce false rulings.
37/ That’s to say nothing of how a public display like this would further divide the country, deteriorating public trust in any semblance of a shared reality.
38/ Speaking of which, this is where the barrier to entry in producing deepfakes is actually great cause for concern in the 2020 election cycle.

Guess who’ll have the motive, resources, & community to produce high-end faked video content?
39/ Oh yeah, baby. Criminals & politicians.

(Particularly those who might happen to be, you know, both.)
40/ So, say a Trump associate is spotted somewhere in a recording that looks bad for Trump. Somewhere really inconvenient, like, ohhhh, just riffing here…Ukraine.

Speaking purely hypothetically here, folks.
41/ In this hypothetical, we’d definitely expect team Trump to lead with accusations of deepfakery. But if they’re especially concerned, the next thing you should expect is a faked video with said parties *somewhere else*.
42/ Furthermore, you better believe that these inconveniently spotted associates will be faked into videos with other team Trump loyalists, because isn’t corroboration just a fantastic resource when your whole team is compromised?
43/ I know. It’s really fucking scary because of how believable it sounds. Part of the reason I felt so compelled to write this thread is so that, in the event it does happen, there’s a reference point somewhere in the rearview.
44/ Finally: audio.

Deepfakes are generally easy to debunk because they leave open so many opportunities for debunking—every muscle movement in the face, every tic, etc. Both our eyes & our ears require convincing.
45/ But if a deepfake is being produced to sow chaos, why not find a way to do so at the lowest cost to you—while leaving fewer opportunities for debunking?

This is what concerns me about audio-only deepfakes in 2020.
46/ As I mentioned, audio is hard to fake, but it’s far easier than video+audio, and is ultimately much easier to make well *enough*.

We’re more likely to gloss over discrepancies in audio because we’re used to shoddy audio recordings.
47/ The scenario that alarms me most about audio deepfakes can be summed up easily in this story from @WSJ: wsj.com/articles/fraud…
48/ When you combine the ability to spoof phone numbers with a newfound ability to produce faked voices in real time, you’ve entered a scenario where it only takes one unsuspecting employee to permit an instance of devastating social hacking.
49/ Imagine, for instance, if [your fave candidate]’s manager receives a call that appears to be from the boss. [Candidate] asks them to open an email on their computer because [candidate] can’t get the attachment to load.
50/ One little click later, the campaign’s proprietary data is compromised.
51/ Of course, there’s also the asynchronous stuff.
52/ Anybody with oodles of voice data on the Internet (read: every candidate) might wake up to a recording of themselves saying things they don’t remember. Maybe something that sounds like it was recorded by a bystander at a stump speech.
53/ Even if this is ultimately dismissed as fake, you can imagine how, if produced at scale & targeted at candidates on important days on the campaign trail, it could erode enthusiasm, distract them, or even weaken public support.
54/ Coordination with more established fake media will just exacerbate & prolong the impact of the disinformation—not that any of those outlets are waiting around for source material to whip up cockamamie bullshit.
55/ This means that candidates & other people in positions of power *unequivocally* need to institute strict cybersecurity policies—and in this case I’m not referring to software (though yeah, duh, that too).
56/ Social engineering is actually still one of the most efficient ways to gain entry into a system.

That being the case, there are very simple verification procedures that humans can use to ensure security. digitalguardian.com/blog/social-en…
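To give a concrete flavor of what I mean by "very simple verification procedures," here is a hypothetical checklist encoded as Python purely for illustration. The procedure is the policy, not the software; the field & action names below are made up, not drawn from any real tool.

from dataclasses import dataclass

# Hypothetical policy sketch: never act on voice alone for high-risk requests.
HIGH_RISK_ACTIONS = {"wire transfer", "credential reset", "open attachment", "share voter data"}

@dataclass
class VoiceRequest:
    claimed_caller: str       # who the voice says they are
    asks_for: str             # e.g. "open attachment"
    callback_verified: bool   # did you hang up & call back a number already on file?
    second_channel_ok: bool   # confirmed via an independent channel (in person, signed email)?

def should_comply(req: VoiceRequest) -> bool:
    if req.asks_for in HIGH_RISK_ACTIONS:
        # Caller ID & a familiar-sounding voice prove nothing; require both checks.
        return req.callback_verified and req.second_channel_ok
    return True

# e.g. the scenario from tweet 49:
print(should_comply(VoiceRequest("the candidate", "open attachment", False, False)))  # False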
57/ Okay, that's about all I can do for now.

That said, I’m planning a bunch more of these threads because I’m concerned about technology & media’s role in politics.

If you feel the same, please share resources, tag friends, etc. My DMs are open.

/END