NEW from me: YouTube has allowed baseless conspiracy theories about interference with voting machines to rack up around 3 million views (& counting). YouTube has also made money from the conspiracy theories via ads running on some of the videos. mediamatters.org/google/youtube…
Many of the videos pushing the two conspiracy theories -- "hammer and scorecard" and "Dominion" -- are clips from Fox News. Before the election, YouTube listed Fox News as an "authoritative" source "for election-related news and information queries."
Besides having ads, some of the videos pushing the conspiracy theories even sold merchandise below the video player -- from which YouTube may also financially benefit. The image below captures, in a single screenshot, the problem with YouTube's business model.
Additionally, the videos have earned at least 200,000 combined Facebook engagements. This continues the YouTube-to-Facebook pipeline that I and others such as @daveyalba have warned about.
New from me: YouTube has consistently allowed The Next News Network, a conspiracy theory channel with 1.7 million subscribers, to monetize misinformation, including a new video with 2 million+ views pushing a false bin Laden body double conspiracy theory. mediamatters.org/google/youtube…
In recent months, The Next News Network has repeatedly spread misinformation & falsehoods -- about masks, wildfires on the West Coast, the first presidential debate -- & it keeps making money off of those videos because YouTube allows ads to run on them.
This channel's history of spreading misinformation is well-known. @JessReports last year noted the channel's history of falsehoods & that it was making money off of its videos. huffpost.com/entry/youtube-…
A development regarding QAnon's spread that maybe hasn't gotten enough attention is the apparently increasing amount of QAnon content promoted by local Republican Party chapters around the country.
For example, Florida and Georgia county Republican Party chapters have posted QAnon content on Facebook in the past year. mediamatters.org/qanon-conspira…
In May, an official with a Texas county Republican Party chapter ran a Facebook ad with QAnon hashtags.
Key point from @kevinroose about QAnon that I'm not sure has gotten enough attention: besides QAnon's extremism, its supporters repeatedly play major roles in misinformation campaigns on social media. Just to give a few recent examples:
Jordan Sather, a QAnon supporter with a major following who helped organize a Washington, D.C., QAnon rally last year & used his account to spread misinformation, has been suspended from Twitter. (h/t @prag_com)
Recently in particular, Sather has been a spreader of coronavirus misinformation, such as falsely suggesting that drinking bleach could help with the virus & that there was a preexisting patent on the virus. bbc.com/news/53061563
It appears the next part of the coronavirus conspiracy theory video "Plandemic" is coming tomorrow. The question will be whether social media platforms are ready this time. The 1st installment earned at least 9 million YouTube views & 16 million FB engagements. mediamatters.org/coronavirus-co…
Zach Vorhies, a QAnon supporter who's also pushed Pizzagate & who originally helped promote Judy Mikovits, has announced he will be in part 2 of "Plandemic."