Breaking: More than 200 @Facebook content moderators (and supporting employees) from across the US and Europe have published a demand for better protections - and full employment rights - during the pandemic.
This is the biggest joint international effort by Facebook content moderators yet. We are proud to have worked with them to do it. Many more moderators at other sites wanted to sign, but were too intimidated by Facebook - these people are risking their livelihoods to speak out.
They've taken that risk because Facebook has risked their lives by forcing content moderators back into the office. Directly employed FB staff are allowed to work from home - many moderators have to come in, even with live Covid cases on the floor, as @samfbiddle reported: theintercept.com/2020/10/20/fac…
Moderators who live with relatives at high risk of Covid are still being made to come in. @daithaigilbert reports they are terrified: vice.com/en/article/z3v…
Facebook has done this because its experiment in automating content moderation during the pandemic failed. It NEEDS these workers. From @markscott and @LauKaya: politico.eu/article/facebo…
So as the moderators say, Facebook should hire them! "If our work is so core to Facebook’s business that you will ask us to risk our lives in the name of Facebook’s community—and profit—are we not, in fact, the heart of your company?" The letter demands full employment rights.
Facebook can well afford this. It is a $780bn company. Zuck's personal wealth passed $100bn during the pandemic. TikTok is bringing moderators in house. It's time for Facebook to treat these workers with the dignity and respect they deserve.
We're now asking everyone on Facebook to join these workers. They're risking their lives and jobs so you have social media at home. Stand with them: foxglove.org.uk/solidarity-wit…
NEW: last night, @Ofqual pulled from its website its guidance about the algorithmic grading appeals process. We’re posting it and a letter we've sent this AM because we think it opens up a new ground of challenge. (tl;dr – it basically works like this) foxglove.org.uk/news/grading-a…
First, the position was that the algorithm was sacrosanct and more reliable than mock exams or teachers’ assessed grades.
Ofqual then changed tack, saying mock exams could be more reliable than the algorithm.
THREAD: hundreds of you emailed in anguish about the unfair grading algorithm. This case affects you all, and we are asking for crowdfunding support (gofundme.com/f/fairgrades20…), so we wanted to post our letter as soon as possible. It’s at foxglove.org.uk/news/grading-a…. Key points:
Ofqual just hasn’t got the power to mark students based on an algorithm which actively ignores individual student performance in cohorts of more than 15. In law-speak, this is called acting ‘ultra vires’.
The algorithm is irrational – it treats teachers’ rankings as sacrosanct, yet teachers’ assessed grades count for either everything (cohorts under 5) or nothing (cohorts over 15).
Thousands of A-level students get their “results” tomorrow. Covid-19 meant exams were cancelled. So, instead, their grades will in many cases have been generated by a hastily-built government algorithm.
We’ve got concerns about this algorithm.
Firstly, as is so often the case, there’s a lack of transparency about what this algorithm does and how it works.
This is unacceptable. Students have a right to understand how decisions which affect them are made. And algorithmic decision-making needs to be open to scrutiny.
Secondly, it appears the algorithm grades schools rather than students: “Where a subject has more than 15 entries in a school, teachers’ predicted grades will not be used”.
So an individual student’s life chances hang on an estimate based on their school’s historic performance.
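The cohort-size rule described above can be sketched in a few lines. This is a minimal illustration of the thresholds mentioned in the thread (teacher grades used in full below 5 entries, discarded above 15), not Ofqual's actual model: the function name, arguments, and the handling of the in-between band are all assumptions.

```python
# Hypothetical sketch of the cohort-size thresholds described in the
# thread. Only the 5 and 15 cut-offs come from the source; everything
# else (names, the mid-band behaviour) is illustrative.

def assign_grade(cohort_size, teacher_grade, standardised_grade):
    """Choose between the teacher's assessed grade and the algorithm's
    standardised grade (derived from the school's historic results)."""
    if cohort_size < 5:
        # Small cohorts: the teacher's assessed grade is used in full.
        return teacher_grade
    if cohort_size > 15:
        # Large cohorts: teacher-assessed grades are not used at all;
        # only the teacher's ranking and the school's past performance
        # feed into the result.
        return standardised_grade
    # In between, the thread implies some mix of the two; the exact
    # weighting isn't given here, so we fall back to the standardised
    # grade as a placeholder.
    return standardised_grade
```

The point the sketch makes concrete is the cliff-edge: a student in a 16-entry cohort gets a grade driven entirely by their school's history, while an otherwise identical student in a 4-entry cohort keeps their teacher's assessment.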
HUGE news. From this Friday, the Home Office's racist visa algorithm is no more! 💃🎉 Thanks to our lawsuit (with @JCWI_UK) against this shadowy, computer-driven system for sifting visa applications, the Home Office have agreed to “discontinue the use of the Streaming Tool”.
This matters because racism and bias in the visa system inflicts untold misery and tears families apart, and excludes talented people from contributing to the UK. The visa algorithm was a key tool in this system, delivering computer-aided “speedy boarding for white people”.
There are real questions about whether companies like Palantir have earned the public trust required to work with the NHS. Just ask @ConMijente, who know all about Palantir's support for ICE's deportation machine.