Susan makes an important point, comparing the current white supremacist terror threat to the online battle against ISIS.

I'll outline some of the ways that this challenge differs from what tech faced with ISIS several years ago.
When I joined Facebook in 2015, the most pressing content issue was the successful use of social media by the Islamic State. As many analysts have pointed out, ISIS was able to develop a digital propaganda strategy far more advanced than those of its predecessor groups.
The first step of our work was to understand the issue, and a key move was to hire @brianfishman from the Combating Terrorism Center at West Point.

Brian recently wrote a piece on terrorist use of the internet. It is a must-read:
tnsr.org/2019/04/crossr…
Brian outlines seven internet-terrorist functions:
- Content Hosting
- Audience Development
- Brand Control
- Secure Communication
- Community Maintenance
- Financing
- Information Collection and Communication

Which of these apply to white supremacist terrorism (WST)?
An obvious difference between WSTs and ISIS is that the latter is an organized group with self-identified adherents, a leadership structure and a desire to control territory. Functions like "Financing" and "Information Collection" are less relevant to loosely affiliated WSTs.
The other functions, however, are all relevant to white supremacists. One major advantage for white supremacist terrorists is how much more easily they can find a home for these functions than Islamist terrorists could.
As @susanthesquark pointed out, 4chan, 8chan and other well-known websites are happy to accommodate WST-supporting communities, with file-sharing sites hosting the large blobs, such as PDFs and the Christchurch shooting video.
This is a significant difference from ISIS, as there were severe penalties for intentionally providing technical support to that group. For individuals in many countries, that could mean prosecution. For supporters beyond the reach of extradition, death: en.wikipedia.org/wiki/Junaid_Hu…
The open existence of these sites means that there are tens of thousands of the white supremacist equivalent of what were called "jihobbyists" in the Islamist context: supporters of anti-Semitic and white supremacist ideology who are willing to spend their time and energy spreading the message online.
This makes the content moderation problem much harder at scale for the "legit" platforms with rules against hate speech. As @Klonick described in the New Yorker, 1.5M uploads in 24 hours is well beyond the worst day of ISIS propaganda. newyorker.com/news/news-desk…
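For readers wondering how platforms fight re-uploads at that volume: the first line of defense is matching each new upload against fingerprints of media that has already been identified as violating. Below is a minimal sketch; real systems use perceptual hashes that survive re-encoding and cropping, so the plain cryptographic hash here is just an illustrative stand-in, and all names are hypothetical.

```python
# Minimal sketch of hash-based re-upload blocking. A real pipeline uses
# perceptual hashing robust to re-encoding and cropping; the SHA-256
# stand-in below only catches byte-identical copies.
import hashlib

# Hypothetical shared set of fingerprints of known violating media,
# e.g. seeded from an industry hash-sharing database.
known_violating_hashes = set()

def fingerprint(media: bytes) -> str:
    """Fingerprint an upload. Illustrative stand-in for a perceptual hash."""
    return hashlib.sha256(media).hexdigest()

def should_block(media: bytes) -> bool:
    """Block the upload if it matches already-identified violating media."""
    return fingerprint(media) in known_violating_hashes

def register_violating_media(media: bytes) -> None:
    """Once moderators confirm a violating video, every subsequent
    identical re-upload is stopped automatically."""
    known_violating_hashes.add(fingerprint(media))
```

The point of the sketch: each human moderation decision gets amplified into automatic blocking of repeats, which is the only way 1.5M uploads in a day is even tractable.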
Another difference is the lack of deterrence for individuals participating in these forums. Near the end of ISIS's online heyday, it became a bit of a joke that their Telegram channels were mostly populated by western IC/LE analysts and employees of intel or tech firms.
Furthermore, as @josephfcox and @jason_koebler wrote, casting a wider net over WST-aligned content would likely capture politicians and "legit" political commentators in the US, UK and AU. There was almost no political push-back over false positives from anti-ISIS moderation.
So what now for the tech platforms? While they work on improving the quality of moderation using their current standards, I would also propose a much more extreme step: a blockade of the small number of sites responsible for the majority of WST community and recruiting.
Thanks to the fight against ISIS, the big platforms already have a coordinating body in gifct.org. That organization could adopt a unified rule defining which platforms intentionally host WST communities and ban all links to them across its members' products.
This would, effectively, be a private version of the material support rules. A first pass would likely catch *chan, G*b and Vo*t, and would greatly reduce the load on moderators who have had to judge whether any specific discussion violates policy enough to be banned.
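To sketch the mechanics of such a blockade (all names and list entries below are hypothetical, since no such shared denylist exists today): each member platform would check every outbound link at post time against the shared domain list, a mechanical comparison rather than a per-discussion judgment call.

```python
# Sketch of a shared link blockade on a member platform. The denylist
# format and all domains are placeholders, not a real GIFCT API.
from urllib.parse import urlparse

# Hypothetical shared denylist distributed to member platforms.
DENYLISTED_DOMAINS = {
    "example-chan.net",   # placeholder entries only
    "example-forum.org",
}

def is_blocked_link(url: str) -> bool:
    """True if the URL points at a denylisted domain or any subdomain of one."""
    host = (urlparse(url).hostname or "").lower()
    return any(
        host == domain or host.endswith("." + domain)
        for domain in DENYLISTED_DOMAINS
    )

# A posting pipeline would reject the post before it ever reaches a
# human moderator:
links = ["https://boards.example-chan.net/thread/123", "https://news.example.com/a"]
blocked = [u for u in links if is_blocked_link(u)]
```

Keying on the destination domain rather than the content of the post is what removes the per-discussion judgment load on moderators described above.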
While the platforms have their own issues disrupting the on-platform radicalization cycle, they can take a big step by creating a moat that makes it less likely that an individual starts with slightly alt-right videos and ends up cheering on a shooter on a *chan.
The creation of a private ban list might also serve to motivate those banned sites or the US companies that provide them infrastructure services to reconsider their current path. It won't solve the problem, but it would be a start.

FIN