Sooraj 🚢 · Aug 7
Data availability has been an exciting new topic of discussion in the #CardanoCommunity recently

But it's also an abstract and complex topic

So here's a thread that simplifies "data availability" (DA)

And explains how a DA layer would function on #Ethereum 2.0
Data availability is the guarantee that the block proposer published all transaction data for a block

And that the transaction data is available to other network participants

Why is it important?

To answer that, one must understand how trustlessness works in public blockchains
• Block:

Each block has two major parts:

• The block header:

This contains general information (metadata) about the block

Such as the timestamp, block hash, block number, etc.

• The block body:

This contains the actual transactions processed as part of the block
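
As a rough, chain-agnostic sketch of that two-part structure (the field names here are illustrative, not any real chain's spec):

```python
from dataclasses import dataclass

@dataclass
class BlockHeader:
    block_number: int
    timestamp: int
    parent_hash: bytes   # link to the previous block
    body_hash: bytes     # commitment to the block body below
    state_root: bytes    # claimed state after executing the body

@dataclass
class BlockBody:
    transactions: list[bytes]  # the raw transactions in this block

@dataclass
class Block:
    header: BlockHeader
    body: BlockBody
```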
Then there are three main actors in a public blockchain network

• Full node

• Light node

• Block-producing nodes

In a P2P network,

Block-producing nodes & full nodes interconnect to form the network

With light nodes connected to either of them
Block-producing nodes are responsible for packaging transactions and generating blocks

They have different names in different blockchains, such as

“Stake pools” in Cardano

“Miners/mining pools” in Bitcoin & Ethereum

“Validators” on the Beacon Chain of Ethereum 2.0
In a P2P network,

Full nodes (e.g. Daedalus) download new transactions & blocks for verification

Then broadcast only valid transactions & blocks to other nodes

To verify them,

The full node must hold the complete current state of the network
Light nodes are different from full nodes

in that they download and verify only the block header of every new block

Therefore, light nodes can verify the validity of the block header

But cannot validate the transactions in the block body
When proposing new blocks,

The block producers must publish the entire block including the block body

Which contains the transaction "Data"

Nodes participating in consensus can then download the block's "Data"

And re-execute the transactions to confirm their validity
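
As a toy sketch of what that re-execution loop boils down to (the balance-based state and transaction format here are invented for illustration, not a real client's API):

```python
import hashlib
import json

def apply_transaction(state: dict, tx: dict) -> dict:
    """Toy transfer: tx = {"from": ..., "to": ..., "amount": ...}."""
    sender, receiver, amount = tx["from"], tx["to"], tx["amount"]
    if state.get(sender, 0) < amount:
        raise ValueError("insufficient balance")    # invalid transaction
    new_state = dict(state)
    new_state[sender] -= amount
    new_state[receiver] = new_state.get(receiver, 0) + amount
    return new_state

def state_root(state: dict) -> str:
    # stand-in for a Merkle root over the full state
    return hashlib.sha256(json.dumps(state, sort_keys=True).encode()).hexdigest()

def verify_block(transactions: list, claimed_root: str, state: dict) -> bool:
    """Re-execute every transaction and compare against the proposed root."""
    try:
        for tx in transactions:
            state = apply_transaction(state, tx)
    except ValueError:
        return False                                 # reject the whole block
    return state_root(state) == claimed_root
```

Note the key point: this check is only possible if the full transaction data is available to download in the first place.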
Without these nodes verifying transactions

Block proposers could get away with inserting malicious transactions into blocks

This is how nodes verify that new information is valid

Rather than having to trust those block producers are honest

("don't trust, verify")
So in this way,

DA guarantees that the block proposer published all transaction data for a block

And that the transaction data is available to other network participants

This is how a public blockchain reduces trust assumptions

By enforcing rules on data availability
So what happens,

If a validator publishes a block without the transaction data (block body)?

This situation is known as the “data withholding attack”

If a data withholding attack happens

Full nodes cannot verify the correctness of updates to the blockchain's state
This gives malicious block proposers an opportunity

To manipulate the protocol rules

And advance invalid state transitions on the blockchain network

That's why the visibility of block data to full nodes is critical
And because light clients rely on full nodes to verify the network’s state

DA ensures that full nodes can validate blocks

And prevent the chain from getting corrupted and attacked by malicious actors

In short, DA is critical to the security of a public blockchain
Now that we've covered the basic concept of DA

Let's look at the types of DA

In general,

There are two types of DA

• On-chain

• & Off-chain
• On-chain DA

On-chain DA is a feature of "monolithic blockchains" like Cardano or Ethereum 1.0

Because these #Blockchains manage

• Consensus

• Data availability

• & Transaction execution, on a single layer
Monolithic blockchains deal with the data availability problem

Or in other words

The problem of

"How do we verify that the data for a newly produced block is available?"

By storing state data redundantly across the network
By storing data redundantly across the network

The blockchain protocol ensures

That full nodes have access to the data necessary

1. To reproduce transactions

2. To verify state updates

3. To flag invalid state transitions
But there is a drawback to on-chain DA

It creates a bottleneck on 2 key aspects of a #blockchain

What are they?

• Scalability

• & Decentralization
• Scalability

As nodes must download every block and replay the same transactions

Monolithic blockchains often take longer to process & finalize transactions

And as the size of the state grows, the time to process transactions also grows
• Decentralization

The transaction data of a #blockchain grows over time

The size of an #Ethereum full node is around 842.48 GB

As the protocol requires full nodes to store increasing amounts of data

It would likely reduce the number of people willing to run a full node
• Off-chain DA

Due to the scalability limitations of on-chain DA

Off-chain DA systems move data storage off the blockchain

Here,

Block producers don't publish transaction data on-chain

But instead publish a cryptographic commitment to the data
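
Here's a minimal sketch of the commitment idea, using a plain Merkle root (real systems use more sophisticated schemes, e.g. the KZG commitments discussed later in this thread):

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def merkle_root(chunks: list) -> bytes:
    """Toy Merkle root over data chunks: only this short commitment
    goes on-chain, not the chunks themselves."""
    layer = [h(c) for c in chunks]
    while len(layer) > 1:
        if len(layer) % 2 == 1:
            layer.append(layer[-1])  # duplicate the last node if odd
        layer = [h(layer[i] + layer[i + 1]) for i in range(0, len(layer), 2)]
    return layer[0]

# Note: the commitment pins down *what* the data is, but not that
# anyone can actually fetch it -- that gap is exactly the
# data-withholding risk discussed below.
```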
Scaling solutions like Validium & Plasma adopt this approach

By separating DA from Consensus & Execution

This approach can scale blockchains without increasing node requirements

But it has negative implications for decentralization, security, & trustlessness
For example:

Participants in Validiums must trust block producers

Not to include invalid transactions in proposed blocks

Block producers can act maliciously by advancing invalid state transitions

And by withholding the state data needed to challenge those malicious transactions
Due to the problems associated with off-chain DA

Scaling solutions like Optimistic and ZK rollups don't store their own transaction data

Instead they use, for example, Ethereum Mainnet as a data availability layer
Now we have seen that both on-chain & off-chain DA have limitations

So, is there an optimal solution for DA?

One that allows for both scalability & decentralization?

There are a number of theoretical solutions to the DA problem

So let's see how Ethereum is trying to solve the DA problem
Danksharding (DS) is the proposed method for increasing data throughput on Ethereum

DS takes the familiar sharding principle of splitting network activity across shards

But instead of using the shards to scale transaction execution

It uses them to increase space for “blobs” of data
Shards would store data

& attest to the availability of ~250 kB sized blobs of data for L2 protocols such as rollups

Here,

A particular technique is used to verify the availability of large volumes of data

Without requiring any single node to download all of that data itself
This technique is known as Data availability sampling (DAS)

With DAS

A node can download just a small random portion of the data published by the rollup on L1

And still get guarantees nearly equivalent to downloading and checking the entire block
So how does it work?

1. "Erasure code" the data first

2. Data is then interpolated as a "polynomial"

3. Then the data is evaluated at a number of random indices on the blob

Let's go further into those steps👇
1. Erasure coding the data

Erasure coding (EC) is a method of data protection

In which data is broken into fragments, expanded & encoded with redundant data pieces

And stored across a set of different locations
2. Converting the data into a "polynomial"

Once erasure coding of the data is completed

The data is extended by interpolating it as a polynomial

Meaning,

The coded data is converted into a mathematical representation

Which can be reconstructed from any sufficiently large subset of its evaluation points
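
Here's a toy illustration of steps 1–2 over a small prime field (real Danksharding-style systems use much larger, KZG-friendly fields and far bigger blobs; the 2x extension factor matches the 50% threshold described next):

```python
P = 257  # small prime field, for illustration only

def lagrange_eval(points, x):
    """Evaluate the unique polynomial through `points` at x (mod P)."""
    total = 0
    for xi, yi in points:
        num, den = 1, 1
        for xj, _ in points:
            if xj != xi:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return total

data = [10, 20, 30, 40]                       # original chunks, read as p(0)..p(3)
points = list(enumerate(data))
extended = [lagrange_eval(points, x) for x in range(8)]  # 2x extension

# Any 4 of the 8 extended evaluations reconstruct the polynomial,
# and therefore the original data:
subset = [(x, extended[x]) for x in (4, 5, 6, 7)]
recovered = [lagrange_eval(subset, x) for x in range(4)]
assert recovered == data
```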
3. Evaluation of data at random indices

If at least 50% of the erasure-coded data is available

The entire block can be reconstructed from it

So an attacker would have to hide >50% of the block

To successfully trick DAS nodes into thinking the data was made available when it wasn’t
After many successful random samples, the probability that <50% of the data is actually available becomes vanishingly small

This process is combined with KZG commitments (a.k.a. Kate commitments)

To prove that the original data was erasure coded properly
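
The back-of-the-envelope math is easy to check yourself (illustrative sample counts, not the exact Danksharding parameters):

```python
# If a block is withheld, less than 50% of the extended data can be
# made available (otherwise the block could simply be reconstructed).
# Each random sample then succeeds with probability below 1/2, so an
# attacker survives k independent samples with probability below (1/2)^k.

for k in (10, 20, 30):
    print(f"{k} samples: attacker survives with p < {0.5 ** k:.2e}")

# 10 samples: attacker survives with p < 9.77e-04
# 20 samples: attacker survives with p < 9.54e-07
# 30 samples: attacker survives with p < 9.31e-10
```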
To prevent this design from forcing high system requirements on validators

The concept of proposer/builder separation (PBS) was introduced

1. Block builders bid on the right to choose the contents of the slot

2. Proposers need only select the valid header with the highest bid
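
A toy sketch of that two-step flow (the bid format and header check are invented for illustration):

```python
def header_is_plausible(header: dict) -> bool:
    # stand-in for the proposer's cheap, header-only checks
    return "body_commitment" in header and "state_root" in header

def select_winning_header(bids: list) -> dict:
    """Each bid is (amount, header). The proposer never touches block
    bodies; it just picks the highest-paying valid header."""
    valid = [(amount, hdr) for amount, hdr in bids if header_is_plausible(hdr)]
    _, winner = max(valid, key=lambda b: b[0])
    return winner
```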
In this case

Only the block builder needs to process the entire block

All other validators and users can verify the blocks very efficiently through DAS

Credits: @Delphi_Digital
Unfortunately, PBS enables builders to censor transactions

A Censorship Resistance List (crList) helps put a check on this power

Proposers specify a list of all eligible transactions in the mempool

Then the builder will be forced to include them unless the block is full
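
A minimal sketch of that rule (hypothetical names; the actual crList design is still a research proposal):

```python
def builder_block_is_acceptable(block_txs: set, crlist: set, block_is_full: bool) -> bool:
    """The builder may omit crList transactions only if the block is full."""
    return block_is_full or crlist.issubset(block_txs)

# Omitting a crList tx from a non-full block gets the block rejected:
assert not builder_block_is_acceptable({"tx1"}, {"tx1", "tx2"}, block_is_full=False)
assert builder_block_is_acceptable({"tx1"}, {"tx1", "tx2"}, block_is_full=True)
```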
TL;DR -

• Ethereum has embraced a rollup-centric roadmap

• And "Danksharding" is a step in that direction

• Creating efficient data availability for rollups building on top of #Ethereum

• Is this model the right path for DA? Only time will tell
That's it for now!

Give me a follow @Soorajksaju2

If you never wanna miss a weekly update from the Cryptoverse & #Cardano

And want deep dives into crypto metrics & L1 assessments

Then subscribe to the weekly Just The Metrics newsletter👇
bit.ly/JustTheMetrics…
