#Teachers: You can use this false tweet that went viral last week (apparently as a joke, though it tricked lots of people) to challenge your students' digital verification skills. How? Lemme show you.
The tweet used a video clip of Malaysian military personnel putting up a razor-wire barrier near Kuala Lumpur to falsely claim that British soldiers were preparing for an "anti 5G and Lockdown and pro-Brexit protest from patriots" in London.
But as often happens, many people took it seriously and amplified the claim.
The tweet has since been deleted, but you can find an archived version of it here: archive.is/wip/5sM7f
And here's the same video clip used in the tweet on YouTube:
.@Facebook: If you are truly cracking down on "harmful misinformation" about COVID-19 miracle cures, why isn't there a clear option for reporting it? Which options in your reporting menus should people select to report these kinds of posts?
Let's say a post is promoting a potentially lethal "cure" for COVID-19. It's not nudity, violence or harassment; not really someone threatening self-harm; it's not spam, an unauthorized sale, hate speech or terrorism; & it's not incorrect voting info. It must be "Something Else"
When we get to "Something Else," we're given 16 additional options, but again, none of them have anything to do with reporting dangerous medical misinformation:
#Teachers: If you're trying to help students make sense of @ABC's false use of gun range footage in Sunday's report on violence in Syria, here are some pointers:
1. Don't be cynical: Despite the online rantings of bad-faith partisans, there is no evidence that this was intentional or ideologically motivated. The conflict in Syria is highly newsworthy & the motivation here was almost certainly to get a sensational, great-for-TV clip.
2. Wouldn't make sense for ABC News: While using visuals like videos & photos in false contexts is an extremely common strategy employed by misinformation purveyors, it's almost always exposed. This strategy makes sense for opportunists looking to get quick clicks, or for...
I searched "should I get my child vaccinated?" on YouTube (signed out, in a privacy browser) and the "Up Next" suggestion algorithm queued up an anti-vaccination video after just one click. From there things didn't get better. Here's my "rabbit hole" path:
Teachers: This is a great #newsliteracy learning experience for your students. Pick a trending or controversial topic, do a neutral, good-faith search about it, and see where YouTube's algorithm takes you, documenting and reflecting as you go.
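For classes comfortable with a little code, "documenting as you go" can be as simple as a small log structure that students fill in at each click. This is just one possible sketch (every name here is hypothetical, not part of any YouTube tool), in Python:

```python
from dataclasses import dataclass, field

@dataclass
class Step:
    title: str  # video title as shown in the "Up Next" panel
    note: str   # the student's reflection: why might this have been suggested?

@dataclass
class RabbitHoleLog:
    query: str  # the original good-faith search
    steps: list = field(default_factory=list)

    def add(self, title: str, note: str) -> None:
        # Record one hop down the suggestion chain
        self.steps.append(Step(title, note))

    def summary(self) -> str:
        # Print the whole path, numbered from the first suggestion
        lines = [f'Search: "{self.query}"']
        lines += [f"{i}. {s.title} | {s.note}"
                  for i, s in enumerate(self.steps, 1)]
        return "\n".join(lines)

# Example session (titles are placeholders, not real suggestions)
log = RabbitHoleLog("should I get my child vaccinated?")
log.add("Example video A", "closely matched my search terms")
log.add("Example video B", "more emotional framing than the last one")
print(log.summary())
```

The numbered summary gives students a concrete artifact to compare and reflect on afterward.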
This opens up all sorts of questions you can engage: Why does YouTube have a suggestion algorithm? (To engage you so you stay and consume more ads.) How does it make selections? What makes people watch suggested videos? (Fear & outrage work pretty well.) Can algorithms have bias?
#Teachers: Here's a thread laying out how to create a digital forensics learning pathway using this false quote meme featuring Denzel Washington that recirculated last week:
Before you get students started on their digital forensics work, you might point out the misinformation pattern it fits. This rumor has gone viral three separate times: first during the 2016 campaign, then again after Washington was nominated for an Oscar in February 2018, and...
...then again last week when Kanye West's tweeted statement of support for President Trump sparked a broader conversation about the political loyalties of the African American community. So one lesson here is that viral rumors recirculate when new contexts for them emerge.
Thread: #Teachers: Here's a quick demo of how to use an example of misinformation as a digital forensics learning pathway with students. Let's start with this piece of inflammatory clickbait, published under the pretense of satire by Daily World Update: archive.is/PtWFV
Using a reverse image search, drill down on that lead image of the people in the trucks, and you'll find versions of the image that include the truck's license plate. Like this one:
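Students who like to script things can even build the reverse-image-search lookup URLs programmatically. A minimal Python sketch, assuming the image is publicly hosted; the query-string formats for TinEye and Google below are assumptions based on their public search pages and may change without notice:

```python
from urllib.parse import quote

def reverse_image_search_urls(image_url: str) -> dict:
    """Build reverse-image-search lookup URLs for a publicly hosted image.

    The query-string formats here are assumptions, not documented APIs.
    """
    encoded = quote(image_url, safe="")  # percent-encode the full URL
    return {
        # TinEye's search page accepts an image URL via `url` (assumed format)
        "tineye": f"https://tineye.com/search?url={encoded}",
        # Google's legacy search-by-image endpoint (assumed format)
        "google": f"https://www.google.com/searchbyimage?image_url={encoded}",
    }

# Example with a placeholder image URL
for name, url in reverse_image_search_urls("https://example.com/trucks.jpg").items():
    print(name, url)
```

Opening each printed URL in a browser runs the same search a student would do by hand.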