As blockchain technology evolves, new challenges continue to arise. One of the most pressing roadblocks developers face today is the need for more agile and scalable solutions. While traditional blockchain architectures helped the Web3 space gain momentum, they are too inflexible to meet the ever-increasing needs of modern businesses.
Fortunately, modularity offers a promising way to resolve these limitations. By breaking blockchain systems down into smaller, more manageable modules, developers can now create more efficient, scalable, and secure solutions.
In this post, we will explore the potential of modularity to address the data availability problem and other hurdles encountered in the monolithic blockchain landscape.
Data sampling to scale network computation
Network throughput—the amount of data a network accepts and processes—determines blockchain performance, and boosting it is one of the core challenges facing monolithic blockchains. To increase throughput in a monolithic system, you need to create larger blocks, increase block frequency, or improve block propagation so more data can be transferred.
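As a back-of-the-envelope illustration (the figures below are hypothetical, not the parameters of Avail or any specific chain), raw throughput is simply block size divided by block time, which makes the available levers and their costs easy to see:

```python
# Back-of-the-envelope throughput. Hypothetical numbers, not the
# parameters of Avail or any specific chain.
block_size_bytes = 2 * 1024 * 1024   # 2 MiB blocks
block_time_seconds = 20              # one block every 20 seconds

throughput = block_size_bytes / block_time_seconds
print(f"{throughput / 1024:.1f} KiB/s")  # ~102.4 KiB/s

# The levers a monolithic chain can pull:
#   bigger blocks  -> heavier storage/bandwidth load on every full node
#   faster blocks  -> less time to propagate each block before the next
#   better gossip  -> engineering effort, with diminishing returns
```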
Another bottleneck lies in ensuring that network participants receive and execute transactions before accepting them.
To understand this constriction, it’s important to consider the three key participants in blockchain systems — validator nodes, full nodes, and light clients. These three entities interact in a coordinated, sophisticated manner to achieve consensus, verify data, and validate transactions.
Validator nodes participate in the consensus process by proposing and validating blocks. Full nodes store and maintain a complete copy of the blockchain ledger, validate transactions, and enforce the network rules. Light clients, meanwhile, typically retrieve and validate only the block header, confirming its consensus approval. They then interact with full nodes to acquire the underlying data and verify its consistency with the validated header. While light clients improve efficiency and reduce resource requirements, they rely entirely on the full nodes they connect to for data availability and accurate computation. As such, the need for trust in the validator set is a significant challenge for blockchain systems: if malicious validators sign off on a block whose data is never published, light clients cannot detect it on their own, opening the door to a corrupted ledger or censored transactions.
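A minimal sketch helps make that trust relationship concrete. Everything here is illustrative: the Header fields, the two-thirds signature rule, and the fetch_block call are stand-ins, not a real client API.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class Header:
    parent_hash: str
    data_root: str  # commitment to the block's data
    signers: set    # validators who signed this header (illustrative)

def consensus_approved(header: Header, validators: set) -> bool:
    # Illustrative rule: more than 2/3 of the validator set signed.
    return len(header.signers & validators) * 3 > len(validators) * 2

def light_client_accept(header: Header, validators: set, full_node) -> bool:
    # A light client checks only that consensus approved the header...
    if not consensus_approved(header, validators):
        return False
    # ...then trusts a full node to serve the data behind it.
    data = full_node.fetch_block(header.data_root)  # hypothetical RPC
    # If validators signed a header whose data was never published,
    # the light client has no way to notice by itself: this is the
    # data availability problem.
    return hashlib.sha256(data).hexdigest() == header.data_root
```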
Avail's focus on data availability addresses this specific issue in traditional blockchains, where a monolithic stack handles availability and computation.
By combining several key components, including light clients, data availability sampling, erasure coding, and KZG commitments, Avail enables greater scalability and overcomes the issue of trust in data availability:
- Using light clients and data availability sampling over the peer-to-peer network makes data availability verification possible without downloading entire blocks. This not only enhances security but also keeps the network resilient as block sizes increase. Because each light client checks only a small, fixed number of random samples, the amount of data it must process and transmit stays manageable no matter how much data passes through the chain (see the sampling sketch after this list). The network becomes more elastic: data remains readily available even as the amount posted to Avail increases, and light clients (and full nodes) are insulated from the costs that normally come when chains try to increase their throughput.
- Additionally, in contrast to enshrined rollups — which rely on the underlying L1 for data availability and perform costly on-chain verification — Avail serves as an ideal base layer for sovereign rollups by enabling client-side proof verification. This not only reduces costs but also improves trust minimization, offering an appealing option for developers, who can validate data availability using DAS-empowered light clients and verify execution correctness with zero-knowledge proofs (ZKPs).
- Avail also distinguishes itself from competitors by incorporating KZG polynomial commitments, commonly referred to as "Kate commitments," as a foundational element of its design. This enables faster block acceptance and eliminates the fraud proofs and challenge periods typically required in other data availability networks. Additionally, by leveraging erasure coding, a technique that generates redundant data fragments, Avail ensures the original data can be reconstructed even when parts of it become inaccessible (a toy reconstruction appears after this list).
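To make the sampling argument from the first bullet concrete, here is a minimal sketch of the probability math behind data availability sampling. The 50% withholding threshold is the textbook figure for a 2x erasure-code extension; Avail's exact parameters may differ.

```python
def availability_confidence(num_samples: int, withheld_fraction: float = 0.5) -> float:
    """Probability that at least one uniformly random cell query
    lands on withheld data, assuming an adversary must hide at least
    `withheld_fraction` of the extended block to prevent reconstruction."""
    return 1.0 - (1.0 - withheld_fraction) ** num_samples

for k in (8, 16, 32):
    print(f"{k:2d} samples -> confidence {availability_confidence(k):.10f}")
# 8 samples already yield > 99.6% confidence, and each additional
# sample halves the failure probability, independent of block size.
# That is why light clients stay cheap as throughput grows.
```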
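And here is the toy reconstruction promised in the last bullet: a Reed-Solomon-style erasure code over a small prime field, using plain Lagrange interpolation. Production systems (Avail included) use large fields and optimized FFT-based encoders; this sketch only shows why any half of the extended symbols suffice.

```python
P = 65537  # a small prime field for the toy example

def lagrange_eval(points, x):
    """Evaluate the unique polynomial through `points` at `x`, mod P."""
    total = 0
    for xi, yi in points:
        num, den = 1, 1
        for xj, _ in points:
            if xj != xi:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

data = [42, 7, 99, 1234]  # 4 original field elements
n = len(data)

# Encode: treat the data as evaluations at x = 0..n-1 of a
# degree-(n-1) polynomial, then extend to 2n evaluations (2x redundancy).
original_points = list(enumerate(data))
coded = [lagrange_eval(original_points, x) for x in range(2 * n)]

# Lose half the symbols; any n survivors still determine the polynomial.
survivors = [(x, coded[x]) for x in (1, 3, 4, 6)]
recovered = [lagrange_eval(survivors, x) for x in range(n)]
assert recovered == data  # full reconstruction from 50% of the symbols
```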
Modular vs. Monolithic Blockchains
Monolithic blockchain systems are tightly integrated frameworks, but they often face challenges aligning participants through crypto-economic incentives and bootstrapping security. Incentive alignment means participants are rewarded for behavior that benefits the network — for instance, validating transactions or securing it against attack. Modular systems, on the other hand, offer developers increased flexibility and more explicit trade-offs, which enhances the prospects for interoperability and innovation. For example, applications that share a data availability layer can interact more trustlessly. Incremental steps like these push the envelope of blockchain capabilities.
When it comes to bootstrapping security, monolithic and modular infrastructures both require decentralized consensus and distributed networks of validators. However, the distinction lies in the specific tasks assigned to these full nodes and validators.
Within monolithic chains, validators are responsible for executing transactions and reaching consensus on the chain’s current state. Conversely, in modular chains like Avail, validator consensus focuses solely on determining the order and availability of data — disregarding state. As such, validators of modular networks are relieved of the burdensome computation associated with state management.
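A simplified contrast of the two validator roles, in illustrative Python (a conceptual sketch, not Avail's actual node logic):

```python
import hashlib

def apply_transaction(state: dict, tx: dict) -> dict:
    # Placeholder state transition used by the monolithic validator below.
    state = dict(state)
    state[tx["to"]] = state.get(tx["to"], 0) + tx["amount"]
    return state

def monolithic_validate(block: dict, state: dict) -> dict:
    # A monolithic validator re-executes every transaction so that
    # consensus can agree on the resulting post-state.
    for tx in block["txs"]:
        state = apply_transaction(state, tx)  # the expensive part
    return state

def modular_da_validate(block: dict) -> str:
    # A DA-layer validator orders raw data and attests it is available;
    # it never interprets the bytes as transactions. Execution and
    # state are left to the rollups built on top.
    return hashlib.sha256(block["data"]).hexdigest()
```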
When comparing the two, modular blockchains like Avail come out ahead, offering a clear set of advantages. They are more flexible, adaptable, and scalable, and developers gain a more comprehensive range of options to leverage when creating innovative applications.
- With the ability to fix bugs or upgrade the chain easily without requiring a hard fork (since tasks are distributed across separate modules), modular blockchains reduce the chances of network ossification. Furthermore, a modular design helps create a synergistic ecosystem that can handle increased transaction capacity while preserving the benefits of blockchain.
- Another standout feature of a modular design is its ability to seamlessly integrate with other data layers. In the current monolithic landscape, two applications that need to communicate can do so straightforwardly if they exist within the same chain. However, if they live on different chains, you must navigate trust boundaries, and these cross-trust-zone bridges have proven fragile and prone to vulnerabilities.
- A significant advantage emerges from adopting a modular stack and incorporating a shared data availability layer. Rollups built on this shared layer reside within the same security zone, which enables them to interoperate more trustlessly, promoting synchronous composability, atomicity, and streamlined communication between applications. Consequently, the need to traverse trust boundaries is minimized, fostering a more efficient and secure environment for cross-application interactions.
The path to a scalable and efficient blockchain ecosystem runs through thorough exploration, the integration of modularity, and enhanced data availability. As developers continue to push these boundaries, concepts like modularity will find their way to the masses.