Grateful to @_KarenHao for sensitively reporting my discovery of this deepfake app that makes non-consensual faceswapping into pornographic footage disturbingly accessible.
We knew this was coming, but it didn't make discovering it any less shocking.
This app mirrors the Telegram bot I discovered last year that had over 100k users. 67% stated they were using it to 'strip' images of women they knew in real life.
The more accessible the tool, the more its harm scales and affects private individuals.
Today, intimate image abuse targeting women remains the most common malicious use of deepfakes by a wide margin.
Yet it is too often neglected in favour of other malicious uses that do need addressing, but that represent a tiny fraction of present-day harm by comparison.
1) The impersonator's excellent performance mirroring Cruise's speech, accent, mannerisms, gait, and appearance.
2) The expertise of a professional 'deepfaker', who likely spent many hours generating the faceswap and applying post-production edits.
3) The viewing context of TikTok, where users largely scroll through fast-moving video feeds on small phone screens (hiding some visual flaws in the videos).
Given these limiting factors, highly realistic deepfakes aren't likely to become widely accessible anytime soon.