Yesterday, the Ninth Circuit filed its order in Diep v. Apple. They had me in the first half...
Strong #Section230 ruling regarding Apple's content moderation efforts. Until the Court got to the UCL claims...creating yet another bizarre 230 loophole. sigh. 🧵
Hadona Diep is a cybersecurity professional.
She downloaded an app called "Toast Plus" from Apple's App Store, thinking it was the "Toast Wallet" for storing cryptocurrency.
It was not the Toast Wallet.
Long after transferring a reasonable sum of crypto to Toast Plus, Diep discovered that her crypto was missing and her account was deleted.
Among other claims, Diep sued Apple under state consumer protection law + negligence for failing to "vet" and remove Toast Plus.
Additionally, Diep claimed that Apple's disclaimer (see screenshot) fraudulently misleads Apple's customers. The disclaimer describes the app store as a "safe and trusted place" among other things.
The CA District Court rightly dismissed all of the claims against Apple "because plaintiffs' claims are premised on Apple's role as a publisher of the Toast Plus app."
In other words, a clear-as-day Section 230 win -- Apple is not liable for rogue third-party apps.
Diep appealed, bringing the case to where it is today.
Yesterday, the Ninth Circuit held that the claims pertaining to Apple's failure to remove Toast Plus are barred entirely under Section 230.
Unfortunately, the Court didn't stop there.
Regarding the unfair competition law claims, the Court said that Apple's disclaimers are Apple's own speech (which is true), and therefore the disclaimer could itself be fraudulent / misleading.
But to assess the disclaimer, we have to assess Apple's moderation.
In other words, the only way to reach a conclusion about the disclaimer is to consider Apple's approach to reviewing third-party apps.
Which takes us right back to the beginning of this case -- Apple's failure to remove the Toast Plus app....which is barred by 230.
Therein lies the loophole.
It's like the "negligent design" hack we've seen frequently with new social media litigation. If you frame your claims as something other than the underlying content, you might just escape 230.
That's exactly what happened here.
So, now we're back to re-litigating Apple's content moderation practices that are supposed to be shielded by 230 / 1A. But there are worse implications here:
Several states have enacted mandatory transparency laws for social media companies.
For example, New York has a hate speech law that requires social media companies to adopt a hate speech content policy (this is the subject of Volokh v. James).
Other states like California, Texas, and Florida have laws that mandate thorough content and enforcement policies.
So, you might be thinking...well Apple could just have a policy that says "we can do whatever we want" and perhaps they can.
But that would suck. It would also completely cut against this growing transparency effort. Everyone loses here.
But at the same time, in some states (as discussed), Apple may not even have that choice. They may be forced to adopt certain policies and disclaimers that could then be used against them whenever someone disagrees with their content decisions, as we saw here in Diep.
This is a troubling trend that's been flying under the radar for some time. The NetChoice & CCIA Supreme Court cases against Texas/Florida could help here if the Court decides against mandatory transparency (though that seems unlikely).
But even if the states were no longer allowed to force transparency...do we really want a bunch of social media services to adopt this "we can do whatever we want" approach to community guidelines?
It's fully within their rights to do so. But is it a good policy result?
Before the Ninth Circuit's order, we (@ProgressChamber), along with @EFF, @NetChoice, the Information Industry Association, and @actonline, filed an amicus brief in support of Apple, warning the Ninth Circuit about the consequences its decision could have on the app store market more broadly.
Of major concern to us is the impact on smaller developers. The beauty of app marketplaces is that they lower the barriers for developers / consumers.
But if Diep signals that app stores will be on the hook for all apps they fail to vet, you can bet those barriers will rise.
CSM argues that AB 3172 is "only" a statutory damages bill.
But they accidentally said the quiet part out loud: the goal is effectively a prior restraint, forcing online publishers to restrain their protected editorial decisions, if those decisions could "harm" a younger user.
In other words, by levying millions of dollars worth of damages for editorial decisions that could be considered harmful to a child, AB 3172 effectively chills private speech.
That's what it means to "be more careful" when we're talking about private publishers.
Just finished listening to the Murthy oral arguments.
The Court appears poised to decide for the Biden Admin on standing grounds, based on lack of traceability and the absence of any threat of imminent future harm.
Notably, Justice Kagan's reference to platforms as "speech compilers" and recognition of Facebook's policy enforcement as reflective of its viewpoints perhaps foreshadows the Court's direction in the soon-to-be-decided NetChoice & CCIA cases...
I thought it was also interesting that the Justices (except Alito and maybe Thomas) weren't sold on this being a unique social media issue, emphasizing that the government employs the same sort of persuasion tactics against traditional media all the time.
My name is Jess Miers and I serve as Senior Counsel for Chamber of Progress, where we champion technological innovation to benefit all Americans, including the vibrant community here in Nashville, the heart of our nation’s music industry.
We stand before you to express our deep concerns regarding Senate Bill 2096.
Okay now that I've had some time to process, here is where I'm at after today's oral arguments...
I can see this being a 9-0 decision to affirm the preliminary injunctions. The line of questioning and discussion from the Court was strikingly similar to Taamneh / Gonzalez.
While the Justices wrestled with some of the particulars of the Texas and Florida laws as they apply to the different types of services offered by the major platforms, one theme emerged throughout:
these regulations, rife with content, speaker, and viewpoint discrimination, are fundamentally at odds with constitutional principles.
We are now at the NetChoice v. Paxton oral arguments. I am starting a new thread for live tweeting here.
NetChoice is up first, arguing that Texas cannot just convert social media companies into common carriers just because it says so.
First discussion is about whether the social media companies can simply leave Texas. Keep in mind, though, that Texas prohibits even this counter-move. So, NetChoice explains, the social media companies would effectively have to remove content instead.
Yesterday, the District Court of Ohio granted @NetChoice's request for a preliminary injunction, enjoining Ohio's parental consent law.
This is the 4th order in the NetChoice cases declaring social media parental consent laws likely unconstitutional. 🧵netchoice.org/wp-content/upl…
Recall: NetChoice won a temporary restraining order (TRO) against the state last month due to the act's looming effective date. You can read my discussion of the TRO below.