We've put together a 🧵THREAD🧵 about the proposed Online Safety Bill. Our concerns range from privacy and safety, to enforcement and moral policing.

We have included how to learn more and how to fight back at the end of this thread (1/27)
In Dec 2020, the E-Safety Commissioner released an exposure draft of their proposed Online Safety Bill, a piece of legislation that attempts to improve and promote online safety for Australians.
ICYMI, the E-Safety Commissioner has various functions, primarily to:
- promote online safety for Australians
- administer the online content scheme
- advise the Minister about online safety for Australians
- The full list of functions is here: esafety.gov.au/about-us/who-w…
Julie Inman Grant is the current Australian E-Safety Commissioner and was appointed to the role in 2017.
The E-Safety Commissioner is a single UNELECTED individual who wields a great deal of power to make decisions about online content, with little oversight, transparency or consequence.
Grant is US-born and raised, and previously worked for Twitter, Adobe and even Microsoft, where she was the company's first lobbyist in Washington DC in 1995. She also currently sits on the board of WeProtect Global Alliance.
WeProtect operates in all of the Five Eyes countries and has been involved in recent anti-porn legislation proposed in Canada and the UK. WeProtect is a UK government-led body founded during Conservative PM David Cameron's term. en.wikipedia.org/wiki/Five_Eyes
WeProtect links to Nicholas Kristof's NYT article as a resource on their website, which was used to platform and promote an anti-LGBTI, anti-sex and anti-abortion group that recently took aim at online content platforms.
Fun fact! The E-Safety site lists sexually explicit content on its illegal and harmful content page, right there with content advocating acts of terrorism esafety.gov.au/report/illegal…
✨Another super fun fact✨: the statistics the E-Safety Commissioner has been referencing publicly regarding CSAM (child sexual abuse material) come with little to no transparency about research methods or terminology.
The data the E-Safety Commissioner is using to make decisions is extremely flawed. In 2019, according to the FBI, there were only 12 cases of domestic sex trafficking involving a minor in the US. policeprostitutionandpolitics.com/end_demand_sta…
Maybe the E-Safety Commissioner should listen to @yourewrongabout, who did two whole episodes on the problems with this type of data. 🙃
- podcasts.apple.com/us/podcast/hum…
- podcasts.apple.com/us/podcast/way…
According to the Australian Criminal Intelligence Commission, there has been little systemic research into CSAM in Australia. aic.gov.au/sites/default/…
The burning question we have is: why hasn't the E-Safety Commissioner's office commissioned research into the prevalence of CSAM in the APAC region before determining it needed a legislative hammer to deal with it?
The E-Safety Commissioner presented a keynote at #AusCert2020 stating that the classifier for CSAM is that the individual depicted is under 18, despite the age of consent being 16 in Australia.
The laws about what age a young person can validly consent to sex are different in each state and territory.
Victoria (like many states) has what is often referred to as a Romeo & Juliet law.

This allows for consensual sex between two young people of similar age, while criminalising sex between an adult and a young person.
The age of consent laws are said to "strive to find a balance between recognising the developing sexuality of young people and protecting children from exploitation and abuse by older people."
We are concerned that cases of young people exploring their sexuality will be conflated with cases of serious CSAM. According to the 2019 WeProtect Global Threat Assessment, queer youth are more likely to explore their sexual orientation online.
We have many concerns, but we'd like the E-Safety Commissioner to provide the following information covering the period since her appointment in 2017:
- How many reports were determined not to be actionable or were abandoned?
- How many were fake or malicious reports?
- How many reports were cases of minors consensually sending content to another minor?
- How many of these reports went on to become a take-down request? How many of these were repeat or follow-up reports?
- How many reports were appealed by either the platform or the person reported?
During her keynote at #auscert2020, the E-Safety Commissioner mentioned the use of AI as the preferred way to stop CSAM online. Can the Commissioner confirm whether they have worked with, or continue to work with, Clearview AI or Thorn?
Canadian privacy authorities ruled that Clearview AI broke the law by collecting images of the general public without obtaining consent. techcrunch.com/2021/02/03/cle…
Thorn has failed on multiple occasions to respond to privacy concerns and other queries raised by the sex working community and digital rights orgs (observer.com/2019/11/sex-wo…)

The CEO of Thorn sits on the WeProtect board, alongside the E-Safety Commissioner.
Regulating technology companies alone will not end CSAM. Sex workers and marginalised communities are having their voices ignored and infantilised, and their concerns suppressed by government and not-for-profit organisations that have financial stakes in their oppression.
This is a larger problem that needs to be addressed. We need more education for children and parents about sex, consent and internet safety, and an expansion of our public education to cover queer sex education.
We've already seen the government use fear to push genuinely bad legislation under the guise of protecting Australian citizens, such as the metadata retention scheme and the #AAbill.
Please read the bill.
Please raise awareness.
Submit your concerns before the 15th of Feb AEST: communications.gov.au/have-your-say/…
If you're based in Australia, please contact your local Member: aph.gov.au/senators_and_m…
For more information about the dangers of the bill, how to make a submission and what you can do to help, please follow @scarletalliance and read their easy-to-digest guide:
scarletalliance.org.au/library/online…
