Excellent article explaining the disadvantages of monolithic & micro-monolithic (microlith) architectures. The same historical problems outlined for web 2.0 are what we are currently witnessing with the existing web 3.0 blockchain & smart contract paradigm oreilly.com/radar/the-evol…
The best practices of the past decade in distributed systems design need to be adopted by web3. Shifting to a scalable microservices architecture using reactive design principles will align us with where web 2.0 is today, initiating the real-world adoption we are all hoping for
If you look at the previous decade, the successful web 2.0 companies like Netflix, Twitter, Facebook, Amazon, and Uber were all enabled by highly scalable microservices architectures. @Conste11ation aims to bring that same paradigm shift to web 3.0: divante.com/blog/10-compan…
Just as @ApacheSpark provides an interface for programming clusters with implicit data parallelism and fault tolerance, #HGTP will do the same for decentrally validated big data applications through the introduction of state channel networks that embody custom-defined complex
data types and processing pipelines. @Conste11ation is inverting the existing paradigm of how data is verified in blockchain by enabling the state of each state channel ledger to be safely captured & stored as snapshots, a format that can be verified without reliance on a
homogeneous global execution context (i.e. Ethereum Virtual Machine sandbox). This removes dependence on a synchronous, single monolithic database, with a single schema for all services. State channels are inherently data oracles by design which form intra & inter-networks of
heterogeneous microservices which maintain their own ledger states & safely interface with each other over the Hypergraph Transfer Protocol #HGTP to converge globally. Verifiable remote code execution on decentrally hosted & untrusted hardware is made possible with this approach.
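To make the snapshot idea above concrete, here is a minimal sketch of how an independently verifiable snapshot could work. Everything here is illustrative: the `snapshot_hash` format, the toy hash-based signatures, and the quorum rule are assumptions for the sketch, not HGTP's actual wire format (a real network would use public-key signatures such as Ed25519).

```python
import hashlib
import json

def snapshot_hash(channel_id: str, height: int, tx_hashes: list) -> str:
    # Deterministically hash a state-channel snapshot (toy stand-in for HGTP's format).
    payload = json.dumps({"channel": channel_id, "height": height, "txs": sorted(tx_hashes)})
    return hashlib.sha256(payload.encode()).hexdigest()

def sign(secret: str, digest: str) -> str:
    # Toy "signature": hash of secret + digest. Real systems use asymmetric keys.
    return hashlib.sha256((secret + digest).encode()).hexdigest()

def verify_snapshot(channel_id, height, tx_hashes, signatures, validator_secrets, quorum):
    # Any observer re-derives the snapshot hash and counts valid validator
    # signatures -- no shared execution sandbox (no EVM) is required.
    digest = snapshot_hash(channel_id, height, tx_hashes)
    valid = sum(1 for v, sig in signatures.items()
                if v in validator_secrets and sign(validator_secrets[v], digest) == sig)
    return valid >= quorum
```

The point of the sketch is the verification model: the snapshot is a self-describing artifact whose integrity is checked by re-hashing, not by replaying code in a global VM.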
By following the innovations of Apache Hadoop Distributed File System (HDFS) & Elasticsearch to data partitioning & locality, Constellation Network is capable of handling nodes entering & leaving the network in real time. This enables dynamic partitioning of the network & the
elastically scalable compute architectures of Web 2.0 to be possible in Web 3.0. @Conste11ation enables a horizontally scalable network topology to emerge by incentivizing node participation by dynamically aligning cryptocurrency reward allocation with actual network utilization.
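Handling nodes that enter and leave in real time is classically done with consistent hashing, the same family of technique behind HDFS-style partitioning and Elasticsearch shard routing. The sketch below is a generic consistent-hash ring, not Constellation's actual partitioning scheme: when a node leaves, only the keys it owned are reassigned.

```python
import bisect
import hashlib

class HashRing:
    # Minimal consistent-hash ring: nodes can join/leave and only a fraction
    # of keys are reassigned (an illustrative sketch, not Constellation's scheme).
    def __init__(self, nodes=(), vnodes=64):
        self.vnodes = vnodes
        self.ring = []  # sorted list of (hash, node) virtual-node entries
        for n in nodes:
            self.add(n)

    @staticmethod
    def _h(key: str) -> int:
        return int(hashlib.sha256(key.encode()).hexdigest(), 16)

    def add(self, node: str):
        # Each node gets several virtual positions to smooth the distribution.
        for i in range(self.vnodes):
            bisect.insort(self.ring, (self._h(f"{node}#{i}"), node))

    def remove(self, node: str):
        self.ring = [(h, n) for h, n in self.ring if n != node]

    def owner(self, key: str) -> str:
        # A key is owned by the first virtual node clockwise from its hash.
        h = self._h(key)
        idx = bisect.bisect(self.ring, (h, "")) % len(self.ring)
        return self.ring[idx][1]
```

The property that matters for elasticity: removing a node leaves every other key's owner unchanged, so the network can rebalance incrementally instead of reshuffling everything.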
#HGTP's Proof of Reputable Observation (PRO) consensus algorithm removes the dependence on financial stake, seen in Proof of Stake (PoS) models, for securing the network. Rather than trusting the nodes with the most financial weight, a reputation-scoring machine learning model weighs
the trust of nodes according to their behaviors and real world relationships. This enables a multi-dimensional & dynamic web of trust to form amongst nodes which naturally aligns with how trust is actually measured in the real world. It enables the network to self-organize in a
way that maximizes trust by incentivizing well-behaved & reliable nodes while efficiently identifying & mitigating malicious and/or unreliable nodes. Using reputation in place of financial or computational weight provides autonomy over the tokenomics which govern
validator rewards. It maximizes decentralization by decoupling security considerations from the utility of the cryptocurrency token, allowing it to serve as a focused mechanism for promoting node behavior that achieves network efficiency & scalability while removing barriers to
network participation, for both computational consumers & computational providers alike. This is accomplished by having a free tier for executing a single transaction on the global network, per global snapshot finality window, with state channel participants who need to exceed
this rate having the option to purchase $DAG to do so. Additionally, state channels can attract individual holders to stake their $DAG to increase the channel's throughput on the global network in return for tokenized rewards. This behavior functions as a Darwinian mechanism
which naturally prioritizes the allocation of throughput according to the true real world value being generated by the state channel. This further optimizes the utilization of the global network's computational resources, resulting in a self-sustaining generative network effect.
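One way to picture the throughput model described above: each state channel gets a free transaction per global snapshot window, and any remaining capacity is divided pro rata by staked $DAG. The allocation rule below is an illustrative assumption for the sketch, not the published spec; `capacity`, `free_quota`, and integer pro-rata division are all simplifications.

```python
def allocate_throughput(capacity: int, staked: dict, free_quota: int = 1) -> dict:
    # Every channel gets the free tier, regardless of stake.
    alloc = {ch: free_quota for ch in staked}
    remaining = capacity - free_quota * len(staked)
    total = sum(staked.values())
    if remaining > 0 and total > 0:
        # Remaining capacity is split in proportion to staked $DAG,
        # so channels generating real demand attract stake and throughput.
        for ch, amount in staked.items():
            alloc[ch] += remaining * amount // total
    return alloc
```

Under a rule like this, a channel with no stake still transacts (the free tier keeps participation barriers low), while staking shifts marginal capacity toward the channels the market values most.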
The $DAG utility token facilitates the interoperation of state data between the micro-services that it interconnects, forming a decentralized Hypergraph. As a natural consequence, $DAG's value economically scales with the economic activity that flows through the network.
As more complex data pipelines & integrations occur across networks, a system of systems emerges. Network effects manifest in proportion to this activity, since there are no technical or economic barriers preventing their generation.
The Hypergraph Transfer Protocol #HGTP enables a modular, resilient, and flexible hierarchical network of decentralized & distributed ledgers to be feasible, emulating how the existing internet has evolved over time with the introduction of TCP/IP: internet-map.net
Lastly, reputation / web of trust is the future of Web 3.0. It is historically substantiated in human networks & systems outside of blockchain, and it can yield a system as fair as Proof of Work while exceeding PoW's capacity for decentralization.
Promoting honest/good behavior in an untrusted decentralized network of nodes through gamification, by solving computationally complex hash problems, leads to consolidation of hardware & financial resources. This cartelization effect caps the number of nodes
participating in consensus, minimizes unique ownership/individual participation, & limits geographic dispersion (i.e. mining pools in China). Of course, parallel asynchronous ledger updates, apart from the gamification strategy, are also critical for network diversification.
Looking back at the DARPA report on blockchain centralization released in June of this year, this evaluation of the current blockchain status quo is further substantiated. Full report can be viewed here: assets-global.website-files.com/5fd11235b3950c…
Also, the recent NIST 800-160 Vol. 2 publication outlines the systems security engineering approach to developing cyber-resilient systems, which @Conste11ation's microservice approach to modularizing & decoupling DLT backend capabilities can facilitate. csrc.nist.gov/publications/d…
@wyatt_noise explains the notion of cell complexes in @Conste11ation's network topology which enables user-defined complex data types & consensus. This is an entirely novel approach to horizontal scalability that can be described as "Database Cellularization", which is 1/10
different from the typical database sharding approach to scaling. $DAG's functional reactive programming (FRP) approach encodes the entire topology of a data transformation pipeline into type classes which are geometric representations of referential data. 2/10
The logic embedded within the typeclass defines how to proceed with the next level of the data dependency graph. Verification occurs by the creation of a non-linear API call graph across networks, and the formation of a graph of cryptographic signatures across the result 3/10
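A loose sketch of the pipeline idea from the tweets above: each stage is a pure transformation, and each result carries a hash chaining back to the input it was derived from, so the whole pipeline run forms a verifiable chain. The `Verified` wrapper and `stage` helper are hypothetical names invented for this sketch; the real system's typeclass encoding is far richer than this.

```python
import hashlib
from dataclasses import dataclass
from typing import Any, Callable

@dataclass(frozen=True)
class Verified:
    # A value plus a hash chaining back to the input it was derived from.
    value: Any
    proof: str

def source(value) -> Verified:
    # Root of the dependency graph: the proof is just a hash of the input.
    return Verified(value, hashlib.sha256(repr(value).encode()).hexdigest())

def stage(name: str, fn: Callable[[Any], Any]) -> Callable[[Verified], Verified]:
    # Wrap a pure transformation so each step extends the proof chain,
    # loosely mirroring a pipeline whose topology is encoded in its types.
    def run(inp: Verified) -> Verified:
        out = fn(inp.value)
        proof = hashlib.sha256(f"{name}:{inp.proof}:{out!r}".encode()).hexdigest()
        return Verified(out, proof)
    return run
```

Because each stage is deterministic, re-running the pipeline on the same input reproduces the same proof chain, which is what lets an untrusted host's results be checked by replay or cross-signing.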
cnb.cx/3y8NJPZ $DAG 1. "Apple (for Maps), Uber, Snapchat, Spotify and Coca-Cola are among the many digital and consumer brands that rely on Foursquare for location services technology, and anonymized and aggregated datasets."
2. "More than 125,000 developers worldwide embedding it in their own software, & with 14 billion-plus human-verified “check-ins,” Foursquare is the underlying location engine that powers a myriad of brands, such as Twitter, Snapchat, Uber, Spotify, Airbnb, Coca-Cola, and JetBlue"
3. "Location based data is probably the most sensitive PII [personally identifiable information] in the ecosystem — where people move, where phones move, and the correlation of that. From the very beginning, we’ve built our systems first to be opt-in,” Little said."