- assuming that resource constrained IoT devices would provide enough PoW security (hash power)
- using a slow and gameable random walk to "find" the heaviest branch
- using ternary logic and a lot of other things
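For context, the "random walk" criticized above is the weighted (MCMC-style) tip selection from the original whitepaper. A minimal, simplified sketch of the idea — the function names and data layout here are my own, not IOTA's actual implementation; `alpha` is the whitepaper's bias parameter:

```python
import math
import random

def weighted_random_walk(approvers, cumulative_weight, start, alpha=0.5):
    """Walk from `start` toward the tips, stepping to an approver with
    probability proportional to exp(alpha * cumulative_weight).
    High alpha makes the walk predictable (and thus gameable);
    low alpha makes it slow to converge on the heaviest branch."""
    current = start
    while approvers.get(current):            # stop once we reach a tip
        children = approvers[current]
        weights = [math.exp(alpha * cumulative_weight[c]) for c in children]
        current = random.choices(children, weights=weights, k=1)[0]
    return current
```

The tension is visible in the `alpha` parameter: there is no value that is simultaneously fast, fair, and hard to game, which is essentially the criticism being made.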
but I strongly disagree ...
@JorgeStolfi @pobserver2 @WealthSeeker28 @Mat_Yarger ... with the statement that it started with a totally broken idea. The idea of translating Nakamoto's longest-chain-wins consensus into a heaviest-branch-wins consensus in a DAG is absolutely brilliant - and imho even necessary if you want to deliver on the original vision ...
Nakamoto consensus is the most secure and robust consensus mechanism that we know of, and in theory it would work well with millions of nodes and millions of ...
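A toy sketch of what "heaviest branch wins" means in a DAG: each message carries one unit of weight, a message's cumulative weight is itself plus everything that directly or indirectly approves it, and the fork-choice deterministically follows the heaviest approver - the DAG analogue of "longest chain wins". This is an illustrative, naive computation only; all names are mine:

```python
def cumulative_weights(parents):
    """`parents` maps each message to the messages it approves.
    Cumulative weight of m = 1 + number of messages that directly
    or indirectly approve m."""
    weight = {m: 1 for m in parents}
    for m in parents:
        # propagate m's unit of weight to everything it approves
        seen, stack = set(), list(parents[m])
        while stack:
            p = stack.pop()
            if p in seen:
                continue
            seen.add(p)
            weight[p] += 1
            stack.extend(parents.get(p, []))
    return weight

def heaviest_child(children, weight):
    """'Heaviest branch wins': follow the approver with the
    largest cumulative weight."""
    return max(children, key=lambda c: weight[c])
```

With a fork where two messages approve genesis and a third approves one of them, the branch with more accumulated approvals wins, exactly as the longer chain would in Bitcoin.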
1. The competitive mining process combined with the dynamics of "economies of scale" leads to the formation of mining pools and a centralization of power.
And most importantly, it removes the need for absolute consensus on the weight of the messages that introduce new transactions into the ledger.
@JorgeStolfi @pobserver2 @WealthSeeker28 @Mat_Yarger This opens up a whole new dimension of sybil protection mechanisms: you can use things like subjectively perceived trust instead of a "limited resource" like PoW or PoS. I usually tend to say that contemporary DLT projects waste trust like Bitcoin wastes energy. There are ...
@JorgeStolfi @pobserver2 @WealthSeeker28 @Mat_Yarger ... already projects like Stellar that use such a different sybil protection mechanism, but they have other problems, like the limited size of their quorum slices and the message overhead that arises from a lot of people trusting the same nodes / actors.
With IOTA you can have ...
@JorgeStolfi @pobserver2 @WealthSeeker28 @Mat_Yarger ... quorum slices of millions of nodes, and a single validator that is trusted by a lot of actors would not have to answer more requests, because its statements are replicated as part of the DAG using gossip and Nakamoto's principles.
It really hurts my soul to see how other ...
@JorgeStolfi @pobserver2 @WealthSeeker28 @Mat_Yarger ... projects are switching to classical consensus mechanisms, with their limited validator set size and lower byzantine thresholds, to overcome the limitations of Nakamoto consensus, and it sometimes feels like IOTA is the only project that is left in the whole space that still ...
I think that our new approach, which solves the algorithmic challenges and attack vectors of the first and broken iteration of the technology, will greatly influence the DLT space in the coming months and years.
@JorgeStolfi @iotafi @pobserver2 @WealthSeeker28 @Mat_Yarger IOTA is not your typical crypto project where you take some well established principles (i.e. blockchain), extend them with a unique selling point and then do an ICO collecting hundreds of millions of dollars that you use to deliver your ideas which by that point are ...
IOTA started as a research project that tried to go a completely different route than everybody else, trying to fix the only remaining issues of Nakamoto consensus. When it did its ICO, it collected 500k USD, which was used to pay for the ...
There was no premine for the founders or the project. The community decided to donate 5% of all the tokens that they bought (25k USD) to establish the IOTA Foundation to further advance and research the technology. Completely ...
@lKuzon It is true that IOTA nodes can be configured to prune old data that is no longer needed but this doesn't make the system less secure.
In fact, Bitcoin has exactly the same option, which is called "pruning mode" and which was introduced as part of Bitcoin Core 0.11 in 2015.
@lKuzon The reason why Bitcoin, Ethereum, IOTA and pretty much every major cryptocurrency has such a feature in its node software is that it does not impact security - nodes do not even access this old data anymore as part of their consensus mechanism.
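For reference, enabling pruning in Bitcoin Core is a one-line setting in `bitcoin.conf` (the value is a disk-usage target in MiB; 550 is the minimum the client accepts):

```ini
# bitcoin.conf - discard old block data once validated,
# keeping roughly the most recent 550 MiB of blocks
prune=550
```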
@lKuzon It is true that IOTA will produce more data than Bitcoin, so there will most probably be more people with this pruning mode enabled and fewer people maintaining a full history of everything that ever happened.
Roman argues that this will make it harder to find ...
@ThroneOfCrypto @__Javs_ @TheADAApe That's totally fine. I think Cardano serves its purpose very well, but it wouldn't perform very well in the scenario I just described.
The BFT-style aspect of Ouroboros alone would already prevent a lot of the required properties from holding (e.g. dynamic availability).
You didn't mean to simply remove fees from the game altogether, but to allow merchants to e.g. cover the cost for their clients.
Of course this can be done with delegated fees and it is a powerful concept to lower ...
@ThroneOfCrypto @__Javs_ @TheADAApe ... the entry barrier for new users and boost adoption (especially if your gains have enough margin to cover these costs).
Well ... In that case just read my comment as a nice summary of IOTA's vision 😅
The fees are not only rewards for staking (nobody would stake if there were no fees, so you need a completely different anti-sybil mechanism), but they are also a vital part of the spam-protection ...
@ThroneOfCrypto @__Javs_ @TheADAApe ... mechanism. Going feeless is ORDERS OF MAGNITUDE more complex than just "not paying the validators".
Regarding your other points: I can understand where you are coming from, and your concerns would be completely justified if IOTA were like any other DLT. It is ...
@ThroneOfCrypto @__Javs_ @TheADAApe ... however very important to understand the vision and purpose of IOTA. IOTA is not just trying to build a "better DLT" but a protocol that can serve as a foundation for the upcoming "Internet of Things".
The vision of this "4th industrial revolution" is that machines will ...
If you are fine with this number of validators, then you don't even need to be worried about any of the things I wrote but as soon as the amount of ...
I really hope that DeFi and its inevitable future of having to deal with these kinds of "problems" will finally make people understand why it is a bad idea to build decentralized systems that try to establish a shared perception of time among all nodes.
It doesn't just severely limit scalability, but it also creates these really tricky problems where even if you build a system that is "secure" and where the participants are incentivized to behave "honestly", it ultimately doesn't matter because at the end of the day every ...
... block producer runs an algorithm and there will always be people that will study these algorithms and try to game them to their advantage.
As soon as you give "somebody" (no matter how honest) control over how others perceive time you are in for some serious problems.