Decentralized cloud storage networks eliminate the need for third-party trust, remove single points of failure, and increase data security and privacy — all while being environmentally sustainable and cost-effective.
Are these networks the future of data storage? 🧵
In a decentralized cloud storage model, data is distributed across a global P2P network of individually operated nodes, rather than on a single server located in a data center.
Data is typically split into shards, replicated, and distributed to various nodes.
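To make the shard-replicate-distribute flow concrete, here's a minimal Python sketch. It's illustrative only: real networks use erasure coding and deterministic placement algorithms rather than fixed-size shards and random node assignment, and the node IDs and shard size here are made up.

```python
import hashlib
import random

def shard_and_replicate(data: bytes, shard_size: int = 4,
                        replicas: int = 3, nodes: int = 10):
    """Split data into fixed-size shards and assign each shard to
    `replicas` distinct nodes (nodes are just integer IDs here)."""
    shards = [data[i:i + shard_size] for i in range(0, len(data), shard_size)]
    placement = []
    for idx, shard in enumerate(shards):
        placement.append({
            "shard": idx,
            # Content hash lets nodes verify shard integrity on retrieval
            "sha256": hashlib.sha256(shard).hexdigest(),
            # random.sample guarantees `replicas` distinct nodes
            "nodes": random.sample(range(nodes), replicas),
        })
    return shards, placement

shards, placement = shard_and_replicate(b"hello decentralized storage")
print(len(shards))  # 27 bytes / 4-byte shards -> 7 shards
```

Reassembling the original data is just concatenating the shards back in index order; the per-shard hashes are what let an untrusted node prove it still holds your data.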
A big value prop is that decentralized storage networks offer storage at prices >70% cheaper than providers like Amazon S3
The networks decouple raw hard drive space from the services & support provided by centralized providers, turning cloud storage into more of a commodity or utility
Here's a summary of the networks (see the report for in-depth analysis):
- Filecoin is geared toward cold storage for enterprises and devs. It's attractive for Web2 entities seeking cheaper alternatives for storing large amounts of archival data
- Arweave is the go-to for permanent storage
- Storj is optimized for hot storage for enterprises, offers fast retrieval times, and has proven effective for large video file sharing. Not fully decentralized though
- Sia is used for hot storage, mainly by devs. Good for users looking for a fully decentralized and private storage solution
While the products and services built on top of decentralized storage protocols are still in their infancy, important elements such as access layers, content delivery networks, and enterprise-scale storage providers are beginning to appear in the market.
Let's dive into the data. Key metrics for decentralized storage protocols are used storage, total network capacity, network utilization, and revenue
In 2022, the four storage networks accumulated over 17 million terabytes (TB) of total storage capacity, up 2% YoY, and 532,500 TB of used storage, up 1280% YoY.
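Those two figures imply the network utilization metric directly. A quick back-of-the-envelope check using the 2022 numbers above (both in TB):

```python
# 2022 figures from the thread, in terabytes
capacity_tb = 17_000_000   # total storage capacity across the four networks
used_tb = 532_500          # used storage

# Utilization = used storage / total capacity
utilization = used_tb / capacity_tb
print(f"Network utilization: {utilization:.1%}")  # -> about 3.1%
```

In other words, supply massively outstrips demand right now, which is the context for the revenue numbers that follow.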
Revenue tracked is demand-side (revenue from real network usage). Revenue generated by Arweave, Storj, and Sia was roughly $1.4M, down 17% YoY.
Filecoin generated $13M in 2022, down 98%. I'm not going to get into why here because it's complex, but it's explained in the report
This started as our year-in-review report on Bittensor. But halfway through writing it, we realized there wasn't a single comprehensive resource that simply explains the thesis, maps the ecosystem's growth, and breaks down Dynamic TAO
So Seth and I set out to create it
Bittensor draws much inspiration from Bitcoin's design and principles.
Bitcoin proved decentralized markets could coordinate global computing resources at massive scale.
In 2021, Bittensor launched and extended this to AI with its text-prompting network.
Then in Oct 2023, the Revolution upgrade transformed Bittensor from a single AI market into an ecosystem of specialized markets through the introduction of subnets.
This spawned 64 AI networks in just over a year, each competing for $1.2B in annual rewards.
BTW, we dislike the term "subnet"; it reminds us of "L2".
So we use the term AI Market instead throughout the deck, as it's a better description of what a subnet actually is
Everyone's bull posting about GPU compute networks on the timeline, so I want to toss in a bit of a reality check on the situation
The main question I looked at is whether these networks in their current state are supply or demand constrained.
I think it's a bit of both 🧵
On the supply side, the main challenge has been sourcing high-end GPUs like the A100s and H100s, which are best in class for ML.
The initial idea behind these networks was that entities with those GPUs could monetize them during periods of idleness.
However, I'm starting to see two issues with this model:
1. These GPUs are consistently in high demand, which means they are rarely idle
2. Depending on these GPUs being available only during their idle times creates unstable supply, undermining the networks' ability to provide reliable services
Illustrating this, ionet currently has 304 A100 GPUs, a big decrease from 860 just two months ago, and 151 H100 GPUs.
In comparison, Akash currently only has 22 A100s. But to combat the network's supply issue, a liquidity mining program has been proposed to incentivize high-end GPUs. Around 88 more A100s have already been committed because of it
Just dropped the most in-depth report on DePIN with @DAnconia_Crypto covering developments in 2023 and where the space is going in 2024.
Report can be accessed for FREE on @MessariCrypto (link at end)
In 2023, there were over 650 DePINs, >$20B in market cap, and >$15M in ARR
Compute marketplaces dominated 2023. However, the challenges for 2024 will be storage networks finding more paying users, compute networks sourcing high-quality GPUs, and retrieval networks increasing their density.
The number of GPU networks has surged over the last year. These teams are gearing up for war as they vie with rivals for access to limited GPU resources
There’s a reason that fully homomorphic encryption (FHE) is referred to as both the holy grail of computing and black magic.
FHE enables computations on encrypted data without decryption, basically meaning sensitive data can be processed without ever being revealed.
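To get a taste of the "compute on ciphertexts" idea, here's a toy Python demo using textbook (unpadded) RSA, which is multiplicatively homomorphic. To be clear, this is not FHE: it supports only one operation (multiplication), and keys this small are wildly insecure. Full FHE schemes support both addition and multiplication on ciphertexts, which is what enables arbitrary computation.

```python
# Textbook RSA is multiplicatively homomorphic:
#   Enc(a) * Enc(b) mod n == Enc(a * b)
# Tiny demo key (insecure, for illustration only)
n, e, d = 3233, 17, 2753   # n = 61 * 53; e*d = 1 mod phi(n)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 6
# The "server" multiplies ciphertexts without ever seeing a or b
product_ct = (encrypt(a) * encrypt(b)) % n
assert decrypt(product_ct) == a * b   # 42, computed on encrypted inputs
```

The point is the workflow: the data owner encrypts, an untrusted party computes on ciphertexts, and only the key holder can decrypt the result. FHE generalizes this from a single multiplication to arbitrary programs.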
The transparency of blockchains poses challenges for enterprise adoption, as most businesses are unwilling to publicize every aspect of their operations onchain.
Using FHE to encrypt and compute on onchain data could be the solution to this.
Won’t ZKPs solve crypto’s privacy problems? Likely not. ZKPs are not a great privacy solution for 2 reasons:
1. For most apps, third parties with stronger machines are needed to generate the proofs, exposing user data. Users would have to trust these entities with their data
The intersection of crypto and AI is proving to be a lot more than just hype
Crypto is playing an active role in addressing challenges & bottlenecks in the AI industry, serving as a:
-Hardware coordination layer for compute resources 🖥️
-Machine intelligence coordination layer 🧠
Problem 1:
Computing power has become a bottleneck in the AI industry, as the computations required double every few months
The cost of training AI models has also been increasing around 3100% per year, emphasizing the need for more efficient & cost-effective training solutions
This trend of rising costs and increasing resource demands for developing and training cutting-edge AI systems is resulting in centralization, where only entities with massive budgets can conduct research and produce models