1/ A thread on blockchain scalability, decentralization, throughput, gas, and modular blockchains
2/ Blockchains are decentralized computers that execute transactions and maintain a global ledger across a network of nodes
Decentralization matters because it removes central points of control and prevents manipulation and arbitrary protocol changes
3/ There are block-producing nodes and non-block-producing nodes
The former extend the chain with new blocks of transactions (ensuring liveness)
Both types of nodes execute and validate transactions and blocks
A common misconception is that the former type of node controls the network
4/ This isn't strictly true
Block producing nodes can engage in censorship and reorgs, but they *cannot* print new coins beyond what the protocol says they can, spend coins from other people's addresses, or make larger blocks than the protocol allows
5/ How?
Non-block producing nodes ignore invalid blocks, so producing such blocks is pointless
These nodes keep the block-producing nodes in check
6/ Ensuring decentralization means keeping the hardware requirements low, so that anyone with a reasonable level of hardware can verify blocks
The hardware requirements of a chain are a social contract (h/t @jadler0)
7/ This is why you can't simply raise the block size and reduce the block time of a blockchain (as it increases the hardware requirements and centralizes the network) vitalik.ca/general/2021/0…
8/ Gas/second for EVM blockchains is a rough measure of the hardware requirements nodes need to stay in sync with the tip of the chain
Gas: units of computational work
A native coin transfer is 21k gas
An ERC20 transfer is ~52k gas (using LINK as a baseline)
Gas usage scales up with complexity
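Quick back-of-the-envelope sketch of where a gas/second figure comes from, assuming illustrative parameters of a 30M gas limit and ~13.3s blocks (roughly Ethereum pre-merge):

```python
# Assumed illustrative parameters, not measured values
GAS_LIMIT = 30_000_000   # max gas per block
BLOCK_TIME_S = 13.3      # average seconds per block

gas_per_second = GAS_LIMIT / BLOCK_TIME_S
print(f"max throughput: {gas_per_second / 1e6:.2f}M gas/second")  # ~2.26M
```

Raising the gas limit or shrinking the block time both push this number up, which is exactly the hardware-requirement lever discussed above.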
9/ Simply increasing the gas limit (block size) and decreasing block times leads to greater throughput, but not scalability
Scalability means increasing throughput without meaningfully increasing hardware requirements (and with them centralization and trust assumptions)
10/ Transactions consume a deterministic amount of gas, which is paid for in the blockchain's native coin
The amount of coin paid per unit of gas is set by a fee market
Fee markets are independent of coin/USD markets; they depend on the demand for gas
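A minimal sketch of the fee arithmetic (simplified, ignoring the EIP-1559 base/priority split); the gas price and ETH/USD figures are purely illustrative:

```python
gas_used = 21_000        # a native coin transfer
gas_price_gwei = 40      # set by the fee market, not by the coin's USD price
GWEI_PER_COIN = 1e9      # 1 coin = 10^9 gwei

fee_in_coin = gas_used * gas_price_gwei / GWEI_PER_COIN
print(f"fee: {fee_in_coin} ETH")  # 0.00084 ETH

# The USD cost is a separate conversion layered on top of the fee market:
eth_usd = 3000  # example market price
print(f"fee: ${fee_in_coin * eth_usd:.2f}")  # $2.52
```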
11/ Let's compare the *max* throughput of EVM-chains
Nuance: EIP-1559 on Ethereum targets 50% network gas usage, with the base fee increasing when usage is above 50% and decreasing when it's below
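A simplified sketch of the EIP-1559 base fee update rule: per the EIP, the base fee moves by at most 1/8 (12.5%) per block toward the 50%-usage target (this omits the EIP's rounding edge cases):

```python
BASE_FEE_MAX_CHANGE_DENOMINATOR = 8  # caps the per-block change at 12.5%

def next_base_fee(base_fee: int, gas_used: int, gas_limit: int) -> int:
    target = gas_limit // 2  # EIP-1559 targets half the gas limit
    delta = base_fee * (gas_used - target) // target // BASE_FEE_MAX_CHANGE_DENOMINATOR
    return base_fee + delta

# A full block pushes the base fee up; an empty block pulls it down
print(next_base_fee(100, gas_used=30_000_000, gas_limit=30_000_000))  # 112
print(next_base_fee(100, gas_used=15_000_000, gas_limit=30_000_000))  # 100
print(next_base_fee(100, gas_used=0, gas_limit=30_000_000))           # 87
```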
12/ Ethereum: ~2.26M max gas/second
Polygon PoS: ~9.4M max gas/second
HECO: ~13.3M max gas/second
BSC: ~31M max gas/second
13/ Again, these metrics are not scalability numbers; this is the max throughput each chain is configured to process
The realized network TPS depends entirely on the complexity of the transactions being processed on the network
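To make that concrete: the same gas/second budget yields very different TPS depending on the workload. Using Ethereum's ~2.26M gas/second figure with rough, assumed per-tx gas costs:

```python
gas_per_second = 2_260_000  # Ethereum's approximate max gas/second

# Per-tx gas costs are rough illustrative figures
workloads = {
    "native transfers": 21_000,
    "ERC20 transfers": 52_000,
    "complex DeFi interactions": 200_000,  # assumed average
}
for name, gas_per_tx in workloads.items():
    print(f"{name}: ~{gas_per_second // gas_per_tx} TPS")
# native transfers: ~107 TPS
# ERC20 transfers: ~43 TPS
# complex DeFi interactions: ~11 TPS
```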
14/ Maximum gas/second determines the highest hardware requirements needed to stay in sync with an EVM-compatible chain during congestion
But why doesn't this metric correlate 1:1 to each network's gas fees?
15/ If the demand/supply ratio is over one, then fees will rise, unless additional supply is added
But again, simply raising the block size and lowering the block time isn't sufficient if you want to maintain decentralization
16/ This is where the modular blockchain thesis comes into play
Rather than having a single monolithic chain handle all the consensus, data availability, and execution needs
These processes can be separated into different networks
17/ An execution layer processes transactions at high speed
This off-chain execution is verified on the consensus layer using a succinct proof
The raw transaction data used during execution is stored on a data availability layer so anyone can rebuild the state
18/ As a result, the blockspace (available gas) of the consensus layer is used more efficiently per transaction, because it's only used to verify succinct proofs covering many transactions
Scalability intuition: you don't need every node to execute every transaction
19/ With this model, the consensus and data availability layers can be highly decentralized by limiting gas throughput (or sharding), while achieving greater scale by outsourcing execution
The execution layer derives its security from the other layers, so it can be more lean
20/ An interesting property of using succinct proofs to validate transactions is that scale lowers cost!
A single proof can validate a greater number of transactions, meaning the gas cost of verifying proofs is amortized over more user transactions, lowering the per-tx cost
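A sketch of that amortization effect; the fixed proof-verification cost and per-tx data cost below are assumed placeholder numbers, not measurements:

```python
PROOF_VERIFICATION_GAS = 500_000  # fixed L1 gas per proof (assumed)
CALLDATA_GAS_PER_TX = 2_000       # per-tx data posting cost (assumed)

def l1_gas_per_tx(batch_size: int) -> float:
    # The fixed proof cost is split across every tx in the batch
    return PROOF_VERIFICATION_GAS / batch_size + CALLDATA_GAS_PER_TX

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} txs/batch -> {l1_gas_per_tx(n):,.0f} gas per tx")
#     10 txs/batch -> 52,000 gas per tx
#    100 txs/batch -> 7,000 gas per tx
#  1,000 txs/batch -> 2,500 gas per tx
# 10,000 txs/batch -> 2,050 gas per tx
```

The per-tx cost asymptotically approaches the data cost alone: the bigger the batch, the cheaper each transaction.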
21/ There are different types of modular chains:
zkRollups:
Validity proofs
On-chain data
Optimistic Rollups:
Fraud proofs
On-chain data
Validiums:
Validity proofs
Off-chain data
Volitions:
Validity proofs
On-chain and off-chain data
22/ With zkRollups, there can be a single high performance prover executing transactions and generating cryptographic proofs, which are then verified by a decentralized network of lower performance nodes
Verifying proofs is much easier than executing TXs and generating proofs
23/ And because of the cryptographic proof, this prover is unable to steal user funds, providing a near-equivalent level of security to the consensus layer
Even with Optimistic Rollups, there only needs to be at least one honest execution node (validator)
24/ Ultimately I think it will become clear over time that the monolithic blockchain model is limited in its ability to scale while maintaining low hardware requirements for end-user verification and decentralization
Modular blockchains are a chain-agnostic model anyone can use
25/ Blockchains will attempt to scale in different ways, with lots of experiments running concurrently, especially within the modular realm; it will be interesting to see the results
A lot of work is being done by some very smart people; it's not something I would sleep on