Alex’s vision of our exciting tokengated-commerce future is pretty compelling:
You’re at a party that requires a membership card to attend.
You feel unmistakable joy that a machine recognizes you by your membership card.
Because your membership card is on the blockchain.
Packy depicts tokengated commerce as a MetaMask-headed Trojan horse preparing to invade the online commerce space with social-context soldiers.
I wonder: does this smart-looking visual stand for actual use cases, or is it covering for yet another Web3 #HollowAbstraction?
Let’s turn away from Alex’s soaring abstract rhetoric, and instead examine his specific use case.
Namely: Increasing a band’s t-shirt sales revenue by preventing anyone from buying unless they’re fans of that band and/or another band.
I clipped this from
Shopify launched tokengating features on Jun 22.
We know the animated video looks "lit", but what do we know about the traction with merchants and end users?
How many bands are using it? How much additional t-shirt sales revenue are they making?
There are a total of six merchants in the Gated Merch section of Shopify’s mobile app.
All of these merchants are NFT projects.
No bands selling t-shirts.
No empirical validation that tokengated commerce can make non-crypto merchants care about crypto.
.@InvsbleFriends is the only merchant featured by Shopify with currently available tokengated products.
Their NFT-holders get 20% off their t-shirts.
They're using blockchain tech to do the equivalent of sending a discount link to their customer mailing list.
I also noticed the Gated Merch flow in the Shop app is buggy.
Maybe the Shopify Crypto team isn't prioritizing maintenance work on an experimental feature that doesn’t seem to be getting much usage.
Alright, we've made an empirical observation: Shopify's tokengated commerce is off to a lukewarm start, with little interest from bands or any other non-crypto merchants.
IMO, this outcome was inevitable as a matter of pure logic.
Folks just didn't think critically about use cases.
Alex’s use case was supposed to be allowing his band, The Fundamentals, to offer their merch to a different band’s fans.
How is that use case accomplished today, without blockchain?
The answer is: it's rarely attempted. Bands don’t see much value in this type of "collab".
If we insist that the use-case example be *realistic*, that's when cracks appear in the abstract pitch.
So let’s keep adding realistic details.
Let’s say The Fundamentals is doing a collab on tokengated commerce with The Mighty Mighty Bosstones, a more popular ska band.
Let’s say a Bosstones fan named Marty buys a $100 NFT that lets him attend a show, and another $100 NFT to drag his wife along.
Afterwards, Marty lists his wife’s NFT on OpenSea for $5, since she has no interest in holding it.
Now Alex’s example is showing its cracks…
Say Biff, who isn’t a fan of either band, nevertheless wants to buy a Fundamentals t-shirt. Tokengating ought to block him, right?
But Biff can buy a $5 Bosstones NFT and then buy Fundamentals merch.
If Bosstones NFTs are common, Biff could pay just $0.01 to access gated merch.
The only merchants for whom this remotely makes sense are crypto clubs like BAYC, which limit group membership via rare NFTs. Demand for group membership theoretically keeps the NFT floor price high.
This explains why the only merchants using Shopify Gated Merch are NFT communities.
Also, we don’t need a blockchain to tell The Fundamentals who Bosstones fans are. Existing Web2 technology can easily implement that kind of data integration.
An OAuth API integration could securely pull membership info. So could a simple CSV dump with encrypted member emails.
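To make that concrete, here's a minimal sketch of the CSV-dump version, assuming the partner band shares a one-column export of hashed (rather than literally encrypted) fan emails. The file name, column layout, and function names are all hypothetical:

```python
# Minimal sketch of the no-blockchain collab: gate merch on membership in a
# partner band's fan list, shared as a CSV of hashed emails (hypothetical file).
import csv
import hashlib


def hash_email(email: str) -> str:
    """Normalize an email and hash it so raw addresses never change hands."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()


def load_hashed_fans(path: str) -> set[str]:
    """Load a one-column CSV of pre-hashed fan emails exported by the partner band."""
    with open(path, newline="") as f:
        return {row[0] for row in csv.reader(f) if row}


def eligible_for_collab_merch(customer_email: str, partner_fan_hashes: set[str]) -> bool:
    """True if this checkout email appears on the partner band's fan list."""
    return hash_email(customer_email) in partner_fan_hashes


# Usage: The Fundamentals gate their collab merch on Bosstones fandom.
bosstones_fans = load_hashed_fans("bosstones_fans_hashed.csv")
if eligible_for_collab_merch("marty@example.com", bosstones_fans):
    print("Unlock collab merch / apply discount code")
```

Hashing normalized emails lets the two bands check fan overlap without ever exchanging raw addresses; no wallet, token, or floor price required.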
Alas, people will never stop thinking they've discovered a Web3 use case. Their abstractions feel rich with beautiful potential, until they’re tried and inevitably fail.
Only those who focus on specific use cases are able to recognize when a pitch is a #HollowAbstraction mirage.
• • •
Eliezer Yudkowsky can warn humankind that *If Anyone Builds It, Everyone Dies* and hit the New York Times bestseller list, but he won't get upvoted to the top of LessWrong.
That’s intentional. The rationalist community thinks aggregating community support for important claims is “political fighting”.
Unfortunately, it's unrealistic to expect some other community to strongly rally behind @ESYudkowsky's message while LessWrong “stays out of the fray” and purposely prevents mutual knowledge of support from being displayed.
Our refusal to aggregate the rationalist community's beliefs into signals and actions is why we live in a world where rationalists with double-digit P(Doom)s join AI race companies instead of AI pause movements.
We let our community become a circular firing squad. What did we expect?
Please watch my new interview with Holly Elmore (@ilex_ulmus), Executive Director of @PauseAIUS, on “the circular firing squad” a.k.a. “the crab bucket”:
◻️ On the “If Anyone Builds It, Everyone Dies” launch
◻️ What's Your P(Doom)™
◻️ Liron's Review of IABIED
◻️ Encouraging early joiners to a movement
◻️ MIRI's communication issues
◻️ Government officials' Review of IABIED
◻️ Emmett Shear's Review of IABIED
◻️ Michael Nielsen's Review of IABIED
◻️ New York Times's Review of IABIED
◻️ Will MacAskill's Review of IABIED
◻️ Clara Collier's Review of IABIED
◻️ Vox's Review of IABIED
◻️ The circular firing squad
◻️ Why our kind can't cooperate
◻️ LessWrong's lukewarm show of support
◻️ The “missing mood” of support
◻️ Liron's “Statement of Support for IABIED”
◻️ LessWrong community's reactions to the Statement
◻️ Liron & Holly's hopes for the community
Search “Doom Debates” in your podcast player or watch on YouTube:
Also featuring a vintage LW comment by @ciphergoth
He spends much time labeling and psychoanalyzing the people who disagree with him, instead of focusing on the substance of why he thinks their object-level claims are wrong and his are right.
en.wikipedia.org/wiki/Bulverism
He accuses AI doomers of being “bootleggers”, which he explains means “self-interested opportunists who stand to financially profit” from claiming AI x-risk is a serious worry:
“If you are paid a salary or receive grants to foster AI panic… you are probably a Bootlegger.”
Thread of @pmarca's logically flimsy AGI survivability claims 🧵
Claim 1:
Marc claims it’s a “category error” to argue that a math-based system will have human-like properties — that rogue AI is a *logically incoherent* concept.
Actually, an AI might overpower humanity, or it might not. Either outcome is logically coherent.
Claim 2:
Marc claims rogue unaligned superintelligent AI is unlikely because AIs can "engage in moral thinking".
But what happens when a superintelligent goal-optimizing AI is run with anything less than perfect morality?
That's when we risk permanently disempowering humanity.