1/ Apropos the 60 Minutes interview with the Facebook whistleblower, there are two things I want to share. Media Matters did much research leading up to and around FB's role in the attack. But two big ones jump out…
2/ In the days before Jan 6, within the far-right FB groups we track (many of them closed), we started seeing a sudden shift: people urging others to bring guns to the rally.
3/ This was notable for a couple of reasons. For starters, it was atypical. Usually people remind others *not* to bring guns because of DC gun laws. Second, it seemed…almost calculated, coordinated or organized.
FB was advised. But nothing. No additional investigation or follow-through.
4/ The second thing was how Facebook helped build the infrastructure that spread disinfo, extremism and the core of the Stop the Steal organizing.
5/ Between the spring and September 2020, QAnon groups/pages had growth rates of about 24%. For comparison, right-leaning pages had about a 2% growth rate, news pages about 1% and left-leaning pages about 0.7%.
6/ The overwhelming bulk of the metastasis of the QAnon networks on FB was the result of Facebook's recommendation engine. It was in overdrive, helping connect all these individuals. This QAnon infrastructure was massive, eclipsing basically anything else.
7/ FB was forced to remove most of the QAnon pages/groups in October. But the damage was done. Much of the disinfo had already fully spread, and much of the infrastructure simply adapted, camouflaged itself or was used to help pump up other things.
8/ So to sum up, the two big things we saw were: FB knowingly and actively helped organize and build extremist networks, and then allowed those same extremists to operationalize violence carried out off platform.
1/ I'm not sure this is a good thing for a few reasons...
a) For starters, FB isn't good at defining "political." For example, The Daily Wire has spent over $10M on ads this year alone. But FB's political ad library says they have spent $800K on political/social ads since 2018. C'mon!
2/ a, cont.) My point there is that FB will narrowly define this in ways that are not helpful overall and, if the past is any indicator, in ways that will almost certainly unfairly advantage right-wing content.
3/ b) This will calcify and intensify an existing problem on the platform.
Asymmetry on FB is intense. For example, last weekend right-leaning content had 50.96% of all engagement and left-leaning had 12.4%. News/non-aligned content (which was the majority in terms of volume) had only 36.64%.
1/ Tom Cotton working double time to get publicity RE his take on Afghanistan.
Cotton is on the Armed Services Committee. There have been 2 (public) hearings solely about Afghanistan since January 2020. Cotton attended neither of them. No one in the media has mentioned that.
2/ May 20, 2021: Hearing - The transition of all United States and Coalition forces from Afghanistan and its Implications
Cotton did not attend. Gosh. Ya think maybe someone who has so much to say now shoulda shown up for that one?
3/ February 11, 2020: Hearing - United States Strategy in Afghanistan
2/ Recognize that it was Facebook -- not Trump -- that appealed the ban. And they structured the appeal in such a way that it's basically engineered to all but ensure that the oversight group restores the account.
They asked that the ban be evaluated based on only 2 posts. Two.
3/ We did an analysis of every one of Trump's 6,018 Facebook posts from 2020 and found that 24% of them contained misinformation about public health or elections, or other extreme rhetoric.
But see, Facebook was working to cook the books here, so why make that part of the decision?
1/ Facebook announced today they deleted 1.3 billion fake accounts between Oct - Dec 2020. They're highlighting this as an example of them fighting disinformation effectively.
Actually, it shows the opposite, and there's a big question about consumer fraud that needs to be asked.
2/ RE Fraud - 1.3 billion is *a lot* of accounts. Think of all the advertisers on the platform (political orgs, civic groups, corporations).
Facebook took money from advertisers, then showed those ads to fake accounts it let proliferate on the platform unchecked. It's fraud.
3/ RE Fraud (cont...) - Every FB advertiser should demand a refund for any money FB charged them to serve ads to these fake accounts.
The only way Facebook will start addressing this on an ongoing basis, and not once every few years, is if it costs them money. A refund is the bare minimum.
1/ RE Tucker Carlson. So, I went through every single one of his ads for 2021 so far. Here's the thing: He has lost so many advertisers already that he doesn't really have any left to target.
His top paid advertiser is MyPillow and his 2nd biggest "paid" sponsor is Fox News.
2/ Yes. Tucker Carlson has lost so many advertisers that, in addition to the show running fewer paid ads because there aren't enough advertisers, Fox News is now buying up ads on Tucker's show...so many that they're his second-largest advertiser.
3/ If you want to have an effect, then what really needs to happen at this point is advertisers need to drop Fox News entirely.
Here are the biggest companies sponsoring Fox News (and by extension, Tucker) right now: @GSKUS, General Motors (@GM), @ProcterGamble and @KraftBrand