Former Bybit Technical Director: Looking at the future of blockchain 3.0 and web3 from the perspective of ICP

Author: 0xkookoo, former Bybit Tech Lead, now Geek web3 consultant

Lead: Former Bybit technical director and current Geek web3 consultant @0xkookoo on ICP and the future of the blockchain world as he sees it.

Introduction

  • BTC proposes electronic cash and opens up the blockchain industry from 0 to 1
  • ETH proposes smart contracts, leading the blockchain industry from 1 to 100
  • ICP proposes Chainkey technology, driving the blockchain industry from 100 to 100,000,000

On January 3, 2009, the first block of BTC was mined, and the blockchain has developed rapidly in the 14 years since. Throughout those 14 years, the subtlety and greatness of BTC, the birth of Ethereum, the passionate crowdfunding of EOS, the fateful battle of PoS vs. PoW, and the interconnection of Polkadot's ten thousand chains, one amazing technology and one remarkable story after another, have won the admiration of countless insiders.

Now, in 2023, what does the landscape of the entire blockchain industry look like? Here are my thoughts, my interpretation of the public-chain landscape:

  • BTC, with the orthodoxy of having introduced electronic cash, stands tall as the cornerstone of the industry
  • ETH, with the programmability of smart contracts and the composability of its L2 ecosystem, is the absolute leader of the industry
  • Cosmos, Polkadot, and others try to dominate through cross-chain interoperability
  • All kinds of "Ethereum killers" emerge endlessly, each leading the way in its own niche

But how will the entire blockchain industry develop in the next 10 years? Here are my thoughts

  • Sovereignty is the only problem that blockchain needs to solve, including asset sovereignty, data sovereignty, speech sovereignty, etc. Otherwise there is no need for blockchain;
  • Immutability is a sufficient condition, but not a necessary one. As long as my sovereignty is guaranteed not to be damaged, tampering does not matter much: if everyone's assets in the world were tampered with and doubled in the same proportion, what difference would it make?
  • Complete decentralization is impossible. No matter how the system is designed, there will always be "gifted" people or vested interests who occupy a greater voice, and there will always be people who actively choose not to participate. [Decentralized multi-point centralization] is the final pattern;
  • Transparency is a must. This social experiment for all of humanity is precisely about letting everyone have a say and the right to protect their own sovereignty. There will always be people who are lazy, people who prefer to trust more professional people, and people who choose to abstain from voting to maximize efficiency, but that is a choice they make on their own initiative: they have the right and voluntarily choose not to exercise it. As long as everything is transparent and there is no black-box operation, I am willing to accept the outcome even if I lose; if I lose, I am simply not as good as others, and survival of the fittest is in line with the market economy;
  • Decentralized control over code execution is the core; otherwise it is just pointlessly going through the motions: a week of public voting, and then the project team deploys a malicious version of the code anyway, and even if it is not malicious, it still makes a fool of everyone. Half of the world today is built on code; if decentralized entities do not include control over code execution, how could anyone, including governments, dare to let the blockchain industry grow big?
  • Infinite scalability at linear cost. As blockchain becomes more and more closely integrated with real life, more and more people get involved and demand keeps growing; infrastructure that cannot support unlimited scale, or that is too expensive to scale, is not acceptable.

Why ICP

Let me first tell a story. In 2009, Alibaba proposed its "de-IOE" strategy, which later became a major milestone in the success of Alibaba's "Double Eleven".

De-IOE

The core of the "de-IOE" strategy was to remove IBM minicomputers, Oracle databases, and EMC storage devices, and to implant the essence of "cloud computing" into Alibaba's IT genes. Specifically:

  • I refers to the IBM p-series minicomputer, the operating system is AIX (IBM's proprietary Unix system);
  • O refers to the Oracle database (RDBMS);
  • E refers to EMC mid-to-high-end SAN storage.

There were three main reasons for de-IOE. The first is the essential one; the latter two are more indirect:

  • **Unable to meet demand:** the traditional IOE system could not adapt to the high-concurrency requirements of Internet companies and could not support large-scale distributed computing architectures;
  • **Cost too high:** maintaining IOE was too expensive, e.g. 500,000 for an IBM minicomputer, hundreds of thousands for Oracle, and so on;
  • **Dependence too strong:** the IOE system created too much dependence; Alibaba was "held hostage" by vendors such as IBM and Oracle and found it hard to configure flexibly according to its own needs.

Then why was the "de-IOE" strategy proposed in 2009 instead of earlier?

  • Before then:
    • Alibaba's business scale and data volume had not yet reached the point where the traditional IOE system struggled to cope, so the need for de-IOE was not urgent;
    • Domestic database products were not yet mature enough in technology and quality to replace IOE;
    • Internet thinking and the concept of cloud computing had not yet become popular in China, and decentralized architecture was not yet a mainstream direction;
    • It takes time for management and technical staff to recognize the problems and the measures that must be taken.
  • In 2009:
    • Alibaba's business was expanding rapidly, the IOE system could hardly support that scale, and cost problems became more prominent;
    • Open-source database products such as MySQL were relatively mature and could serve as substitutes;
    • Internet thinking and cloud computing had spread widely and been applied in China, making it easier to promote the "de-IOE" concept;
    • Wang Jian, a former Microsoft technology guru with a global technical perspective, joined Alibaba in 2008; deeply trusted by Jack Ma, he proposed "de-IOE".

But de-IOE was not simply about swapping old hardware and software for new hardware and software; it was about replacing the old model with a new one, using cloud computing to completely transform the IT infrastructure. In other words, it was driven by a change in the industry, not just a simple technology upgrade.

Three Stages of Enterprise Development

The development of an enterprise can be divided into three stages:

  • Shaping the genes and organizational culture: Start-up, from 0 to 1
  • Growing fast while iterating in small steps: Scale-up, from 1 to 100
  • Expanding without limit and pushing the boundary outward: Scale-out, from 100 to 100,000,000

Now let's analyze the entire blockchain industry as if it were an enterprise.

Start-up / Blockchain 1.0 / BTC

The innovation of Bitcoin is that it solves a problem that has puzzled computer scientists for decades, namely, how to create a digital payment system that can operate without trusting any central authority.

However, BTC does have some limitations in its design and development, and these limitations provide market opportunities for subsequent blockchain projects such as Ethereum (ETH). Here are some major limitations:

**Transaction throughput and speed:** BTC's block time is about 10 minutes, and the size limit of each block caps its transaction processing capacity. This means that when the network is busy, transactions may take longer to confirm and may require higher fees.

**Limited smart-contract functionality:** BTC was designed primarily as a digital currency, and the transaction types and scripting capabilities it supports are relatively limited. This restricts BTC's use for complex financial transactions and decentralized applications (DApps).

**Hard to upgrade and improve:** due to BTC's decentralized and conservative design principles, major upgrades and improvements usually require broad community consensus, which is difficult to achieve in practice, so BTC's progress is relatively slow.

**Energy consumption:** BTC's consensus mechanism is based on Proof of Work (PoW), which means a huge amount of computing resources is spent on competition among miners, resulting in heavy energy consumption. This has been criticized on environmental and sustainability grounds. On this point, you can also keep an eye on EcoPoW, which partially alleviates this limitation.

Scale-up / Blockchain 2.0 / ETH

Ethereum's current Layer 2 scaling can be regarded as a kind of "vertical scaling" that relies on the security and data-availability guarantees of the underlying Layer 1. Although it looks like a two-layer structure, it is ultimately still limited by Layer 1's processing capacity. Even moving to a multi-layer structure, i.e. building Layer 3 and Layer 4, would only increase the complexity of the whole system and buy a little time. What's more, by diminishing marginal returns, each additional layer adds extra overhead that greatly reduces the scaling effect. This vertical layering can be viewed as a single-machine hardware upgrade, where the "single machine" is the entire ETH ecosystem.

And as usage grows, users' demand for low cost and high performance will also grow. As an application on Layer 1, Layer 2 can only reduce cost to a certain extent and is ultimately still subject to Layer 1's base cost and throughput. This is similar to the demand-curve theory in economics: as price falls, aggregate quantity demanded increases. Scaling up can hardly solve the scalability problem at its root.

Ethereum is a towering tree, and everyone relies on its root. Once the root can no longer absorb nutrients fast enough, people's needs will not be met.

**Therefore, only horizontal scaling makes it possible to approach unlimited capacity.**

Some people think that multi-chain and cross-chain can also be regarded as a way of horizontal expansion.

Take Polkadot as an example: it is a heterogeneous kingdom where every country looks different, but every time you want to build something, you have to establish a new kingdom;

Cosmos is an isomorphic kingdom where the meridians and bones of every country look the same, but again, every time you want to build something, a new kingdom must be established;

**However, from an infrastructure point of view, these two models are a bit strange.** Each additional application requires building an additional kingdom? Let's take an example to see how weird that is:

I bought a Mac 3 months ago and developed a Gmail application on it;

now I want to develop a YouTube application, but I have to buy a new Mac to develop it, which is absurd.

Moreover, both approaches face the problem of highly complex cross-chain communication whenever a new chain is added, so they are not my first choice.

Scale-out / Blockchain 3.0 / ICP

If you want to scale-out, you need a whole set of underlying infrastructure that supports rapid horizontal expansion without reinventing the wheel.

A typical example of scale-out is cloud computing: the underlying templates [VPC + subnet + network ACL + security group] are all identical, every machine has a number and a type, and upper-layer core components such as RDS and MQ support unlimited expansion. If you need more resources, you can spin them up quickly with the click of a button.

A leader once shared this with me: if you want to know what infrastructure and components Internet companies need, just go and look at all the services AWS provides; it is the most comprehensive and powerful combination.

In the same way, let's look at ICP at a high level to see why it meets the requirements of scale-out.

First, a few concepts:

**Dfinity Foundation:** a non-profit organization dedicated to promoting the development and application of decentralized computing technology. It is the developer and maintainer of the Internet Computer protocol, aiming to enable the full-scale development of decentralized applications through innovative technology and an open ecosystem.

**Internet Computer (IC):** a high-speed blockchain network developed by the Dfinity Foundation, designed specifically for decentralized applications. It adopts a new consensus algorithm that achieves high-throughput, low-latency transaction processing, while supporting the development and deployment of smart contracts and decentralized applications.

**Internet Computer Protocol (ICP):** the native token of the Internet Computer protocol, a digital currency used to pay network usage fees and reward nodes.

What's ICP

Much of what follows is a bit hardcore, but I have tried to describe it in plain language, and I hope everyone can follow along. If you want to discuss more details with me, you can find my contact information at the top of the article.

Architecture Overview

Looking at the layered structure, from bottom to top:

**P2P layer:** collects and forwards messages from users, from other replicas in the subnet, and from other subnets, and guarantees that messages are delivered to all nodes in the subnet to ensure security, reliability, and resiliency.

**Consensus layer:** its main task is to order the inputs so that all nodes within the same subnet process tasks in the same order. To achieve this, the consensus layer uses a new consensus protocol designed to guarantee safety and liveness and to resist DoS/spam attacks. Once consensus has been reached on the order in which messages are processed within the subnet, these blocks are passed to the message routing layer.

**Message routing layer:** prepares the input queues of each Canister according to the tasks delivered by the consensus layer. After execution, it also receives the output generated by Canisters and forwards it to Canisters on the local or other subnets as needed. In addition, it is responsible for logging and validating responses to user requests.

**Execution layer:** provides the runtime environment for Canisters, reads inputs in order according to the scheduling mechanism, calls the corresponding Canister to complete the task, and returns the updated state and the generated output to the message routing layer. It uses the non-determinism provided by random numbers to ensure the fairness and auditability of computation, because in some cases a Canister's behavior needs to be unpredictable. For example, random numbers are needed in cryptographic operations to increase security, and a Canister's execution results need randomness to prevent attackers from analyzing them to find vulnerabilities or to predict the Canister's behavior.

(4-layer structure of ICP)
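To make the division of labor among these four layers concrete, here is a minimal toy sketch in Python of how a single message might travel through them on one replica. It is purely illustrative: the message fields, the "consensus by deterministic sort" shortcut, and the counter-style canister logic are all assumptions of mine, not the real replica implementation.

```python
# A conceptual toy of how a message travels through the four IC layers.
# This is NOT the real replica code; layer names follow the text above,
# everything else (message fields, ordering rule) is illustrative only.

from collections import defaultdict, deque

class ToyReplica:
    def __init__(self):
        self.gossip_buffer = []                     # P2P layer: collected messages
        self.input_queues = defaultdict(deque)      # Message routing: per-canister queues
        self.canister_state = defaultdict(int)      # Execution: canister state (a counter here)

    # P2P layer: collect messages from users / other replicas / other subnets
    def p2p_receive(self, msg):
        self.gossip_buffer.append(msg)

    # Consensus layer: agree on ONE order for the collected messages.
    # Here "consensus" is faked with a deterministic sort every replica would share.
    def consensus_order(self):
        block = sorted(self.gossip_buffer, key=lambda m: (m["sender"], m["nonce"]))
        self.gossip_buffer.clear()
        return block

    # Message routing layer: place ordered inputs into each canister's input queue
    def route(self, block):
        for msg in block:
            self.input_queues[msg["canister"]].append(msg)

    # Execution layer: run each canister over its queue and return outputs
    def execute(self):
        outputs = []
        for canister_id, queue in self.input_queues.items():
            while queue:
                msg = queue.popleft()
                self.canister_state[canister_id] += msg["amount"]   # toy "Wasm" logic
                outputs.append((canister_id, self.canister_state[canister_id]))
        return outputs

replica = ToyReplica()
replica.p2p_receive({"sender": "alice", "nonce": 2, "canister": "counter", "amount": 1})
replica.p2p_receive({"sender": "bob",   "nonce": 1, "canister": "counter", "amount": 5})
replica.route(replica.consensus_order())
print(replica.execute())   # every honest replica ends with the same state
```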

Key Components

From the perspective of composition:

**Subnet:** supports unlimited expansion; each subnet is a small blockchain in its own right. Subnets communicate with each other via Chain Key technology; because consensus has already been reached within a subnet, the receiver only needs to verify the Chain Key signature.

**Replica:** each Subnet can contain many nodes, and each node is a Replica. IC's consensus mechanism ensures that every Replica in the same Subnet processes the same inputs in the same order, so that the final state of every Replica is identical. This mechanism is called a Replicated State Machine.

**Canister:** a Canister is a smart contract, a computing unit that runs on the ICP network, can store data and code, and can communicate with other Canisters or external users. ICP provides a runtime environment for executing Wasm programs inside the Canister and for communicating with other Canisters and external users via message passing. You can simply think of it as a docker container for running code, into which you inject your own Wasm code image.

**Node:** an independent server. Canisters still need physical machines to run on, and these physical machines are real machines in real data-center rooms.

**Data Center:** the nodes in a data center are virtualized into Replicas through the node software IC-OS, and a number of Replicas are randomly selected from multiple data centers to form a Subnet. This ensures that even if one data center is hacked or hit by a natural disaster, the entire ICP network keeps running normally, a bit like an upgraded version of Alibaba's "two locations, three centers" disaster-recovery and high-availability solution. Data centers can be distributed all over the world; one could even be built on Mars in the future.

**Boundary Nodes:** provide the entrances and exits between the external network and the IC subnets, and verify responses.

**Principal:** the identifier of an external user, derived from a public key and used for access control.

**Network Nervous System (NNS):** an algorithmic DAO governed with staked ICP, used to manage the IC.

**Registry:** a database maintained by the NNS, containing the mapping relationships between entities (such as Replicas, Canisters, and Subnets), somewhat similar to how DNS works today.

**Cycles:** the local token, representing the CPU quota used to pay for the resources a Canister consumes at runtime. If I had to put it in one phrase, I would call them "computation cycles", because cycles are mainly the unit used to pay for computing resources.
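To keep these components straight, here is a small, purely illustrative Python sketch of how they might relate: data centers contribute Nodes, Nodes run Replicas, Replicas form a Subnet, Canisters are assigned to a Subnet, and a registry maps Canisters to Subnets. The class and field names are my own simplifications, not the actual NNS Registry schema.

```python
# A toy model of how the key components relate. All names/fields here are
# illustrative, not the real data model used by the NNS Registry.

from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: str
    data_center: str          # physical location of the machine

@dataclass
class Replica:
    node: Node                # the replica is the node virtualized by the node software

@dataclass
class Subnet:
    subnet_id: str
    replicas: list = field(default_factory=list)   # ideally spread across data centers
    canisters: list = field(default_factory=list)  # canisters hosted on this subnet

# Build one subnet from nodes in three different data centers.
subnet = Subnet("subnet-abc")
for i, dc in enumerate(["Zurich", "Tokyo", "Virginia"]):
    subnet.replicas.append(Replica(Node(f"node-{i}", dc)))
subnet.canisters.append("counter-canister")

# A minimal "registry": which subnet hosts which canister, like DNS for canisters.
registry = {"counter-canister": subnet.subnet_id}
print(registry["counter-canister"])   # routing a call: canister id -> subnet
```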

ICP's Key Innovative Technologies

At the bottom layer, ICP adopts Chain-key technology, which includes:

**Publicly Verifiable Secret Sharing scheme (PVSS):** a secret-sharing scheme whose shares can be verified publicly. In the Internet Computer white paper, the PVSS scheme is used to implement the decentralized key generation (DKG) protocol, ensuring that a node's private key is never revealed during generation.

**Forward-secure public-key encryption scheme:** ensures that even if a private key is leaked, earlier messages cannot be decrypted, thereby improving the security of the system.

**Key resharing protocol:** a threshold-signature-based key-sharing scheme used for key management in the Internet Computer protocol. Its main advantage is that existing keys can be shared with new nodes without creating new keys, reducing the complexity of key management. In addition, the protocol uses threshold signatures to protect the security of key sharing, improving the security and fault tolerance of the system.

**Threshold BLS signatures:** ICP implements a threshold signature scheme. Each Subnet has a publicly verifiable public key, and the corresponding private key is split into multiple shares, with each share held by one Replica in that Subnet. A message is considered valid only if more than a threshold number of Replicas in the same Subnet have signed it. In this way, the messages passed between Subnets and Replicas are cryptographically protected yet quickly verifiable, which ensures both privacy and security. The BLS algorithm is a well-known threshold signature algorithm; it is the only signature scheme that yields a very simple and efficient threshold signature protocol, and its signatures are unique, meaning that for a given public key and message there is only one valid signature.

**Non-interactive Distributed Key Generation (NIDKG):** to deploy the threshold signature scheme securely, Dfinity designed, analyzed, and implemented a new DKG protocol that runs over an asynchronous network and is highly robust (it can succeed even if up to one third of the nodes in the subnet crash or are corrupted), while still delivering acceptable performance. Besides generating new keys, the protocol can also be used to reshare existing keys. This capability is critical for the autonomous evolution of the IC topology as subnet membership changes over time.

**PoUW:** PoUW has one more U than PoW; it stands for Useful. The main goal is to improve performance and have node machines do less useless work. PoUW does not artificially create difficult hash computations; it focuses as much as possible on serving users, with most resources (CPU, memory) used to execute the code in actual Canisters.

**Chain-evolution technology:** a technology for maintaining the blockchain state machine, comprising a series of technical means to ensure the security and reliability of the blockchain. In the Internet Computer protocol, Chain-evolution technology mainly includes two core techniques:

**1. Summary blocks:** the first block of each epoch is a summary block, which contains special data for managing the different threshold signature schemes. A low-threshold scheme is used to generate random numbers, and a high-threshold scheme is used to authenticate the replicated state of the subnet.

**2. Catch-up packages (CUPs):** a technique for quickly synchronizing node state, which allows newly joined nodes to obtain the current state quickly without re-running the consensus protocol.

**My logical derivation of the underlying technology of the entire IC:**

In traditional public-key cryptography, each node has its own public/private key pair, which means that if one node's private key is leaked or attacked, the security of the whole system is threatened. A threshold signature scheme splits a key into multiple shares distributed to different nodes; a signature can only be produced when enough nodes cooperate, so even if some nodes are attacked or their shares leak, the security of the whole system is not greatly affected. In addition, a threshold signature scheme improves the decentralization of the system, because no centralized organization is needed to manage the key; distributing the key across multiple nodes avoids single points of failure and the risk of centralization. Therefore, **IC uses a threshold signature scheme to improve the security and decentralization of the system**, hoping to use threshold signatures to build a general-purpose blockchain with high security, scalability, and fast verification.

**BLS is a well-known threshold signature algorithm and the only signature scheme that yields a very simple and efficient threshold signature protocol.** Another advantage of BLS signatures is that no signature state needs to be stored: as long as the message content is unchanged, the signature is fixed, which means that for a given public key and message there is only one valid signature. All of this ensures extremely high scalability, so ICP chose the BLS scheme.
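As a rough intuition for the t-of-n idea behind threshold signatures, here is a toy Python sketch using plain Shamir secret sharing over a prime field: a key is split so that any t shares can act on its behalf, while fewer reveal nothing. Real IC subnets use threshold BLS over pairing-friendly curves, which this sketch does not attempt to reproduce.

```python
# Toy illustration of the t-of-n threshold idea: a secret key is split into
# shares so that any t shares can act for the key, but fewer reveal nothing.
# This uses plain Shamir secret sharing, NOT actual BLS threshold signatures.

import random

P = 2**127 - 1          # a prime field large enough for a toy
T, N = 3, 5             # threshold t-of-n

def make_shares(secret, t=T, n=N):
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]          # share i = (i, f(i))

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the secret from any t shares
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

sk = random.randrange(P)
shares = make_shares(sk)
assert reconstruct(shares[:T]) == sk          # any 3 shares recover the key
assert reconstruct(shares[2:]) == sk          # a different set of 3 works too
# 2 shares alone give no information about sk; that is guaranteed by the math
# of Shamir sharing rather than demonstrated by an assert.
```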

**Because a threshold signature is used, there must be a dealer to distribute the key shares to the different participants; but whoever distributes the key shares is a single point, which can easily lead to a single point of failure. Therefore, Dfinity designed a distributed key distribution technology, namely NIDKG.** During the initialization of a subnet, all participating Replicas jointly and non-interactively generate a public key A; for the corresponding private key B, each participant computes and holds one of the secret shares derived mathematically.

**For NIDKG to work, you must ensure that every participant in the distribution is not cheating.** So each participant can not only obtain its own secret share, but also publicly verify whether its secret share is correct. This is a very important point in realizing distributed key generation.

What if the subnet key at some historical moment is leaked? How can the immutability of historical data be guaranteed? Dfinity adopts a forward-secure signature scheme, which ensures that even if the subnet key at some historical moment is leaked, an attacker cannot alter the data of historical blocks; this also prevents later corruption attacks from threatening the historical data on the blockchain. If this guarantee is taken further, it can even ensure that information cannot be successfully eavesdropped during transmission, because the timestamps would not match: even if a key were cracked within a short period, past communication content could not be decrypted.
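The core of the forward-security property can be sketched with a simple one-way key ratchet: each epoch's key is derived from the previous one by a hash, and the old key is erased, so leaking today's key says nothing about yesterday's. This is only a conceptual illustration of the property, assuming a hash-based ratchet; it is not the actual forward-secure scheme used by the IC.

```python
# Conceptual sketch of forward security: the key for epoch e is derived by a
# one-way step from the key of epoch e-1, and the old key is erased. If the
# key for epoch 5 leaks, epochs 0..4 stay safe because hashing can't be reversed.

import hashlib

def next_epoch_key(current_key: bytes) -> bytes:
    return hashlib.sha256(b"ratchet" + current_key).digest()

key = hashlib.sha256(b"genesis secret").digest()   # epoch 0 key (illustrative)
history = []
for epoch in range(5):
    history.append(hashlib.sha256(key + f"block-{epoch}".encode()).hexdigest())
    key = next_epoch_key(key)                      # move forward, forget old key

# An attacker who steals `key` now (epoch 5) cannot recompute the epoch 0..4 keys,
# so the authenticators in `history` cannot be forged retroactively.
print(history[0], "...", key.hex()[:16])
```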

With NIDKG in place, if a given secret share is held by a node for a long time, then as nodes are gradually eroded by hackers, the whole network may run into problems. Therefore, keys need to be updated continuously, but key updates cannot require all participating Replicas to gather for interactive communication; they must also be performed non-interactively. However, because public key A has been registered in the NNS and other subnets use this public key A for verification, it is best not to change the subnet's public key. But if the subnet public key stays unchanged, how can the secret shares held by the nodes be rotated? Therefore, **Dfinity designed a key resharing protocol: without creating a new public key, all Replicas holding the current version of the secret shares non-interactively generate a new round of derived secret shares for the holders of the new version.** In this way:

  • it ensures that the new version of the secret shares is certified by all current legitimate secret-share holders;
  • it ensures that the old version of the secret shares is no longer valid;
  • it ensures that even if a new version of a secret share leaks in the future, the old versions will not leak, because the polynomials of the two versions are unrelated and cannot be derived from one another. This is the **forward security** introduced just above.

In addition, it **enables efficient re-randomized distribution:** whenever trusted nodes or access control change, the access policy and controllers can be modified at any time without restarting the system. This greatly simplifies key management in many scenarios. It is useful, for example, when subnet membership changes, since resharing ensures that any new member receives an appropriate secret share while any replica that is no longer a member no longer holds one. Furthermore, if a small number of secret shares are leaked to an attacker in any one epoch, or even in every epoch, these secret shares are of no use to the attacker.
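Building on the Shamir toy above, here is a rough Python sketch of the resharing idea: the current threshold of holders hand out sub-shares of their own shares, and the new holders combine them with Lagrange coefficients, so the same secret (and hence the same public key) ends up shared among a new set of holders without ever being reconstructed. The real protocol is non-interactive and publicly verifiable; this sketch ignores both properties and is for intuition only.

```python
# Toy sketch of key resharing: the current t share-holders re-share the SAME
# secret to a new set of holders without reconstructing it and without changing
# the group public key. Not the real IC protocol, just the underlying idea.

import random

P = 2**127 - 1
T = 3                                    # threshold

def lagrange_at_zero(xs):
    # coefficient for each x in xs when interpolating at 0
    coeffs = []
    for xi in xs:
        num, den = 1, 1
        for xj in xs:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        coeffs.append(num * pow(den, -1, P) % P)
    return coeffs

def subshare(share_value, new_xs):
    # a holder hides its share in a fresh random polynomial and hands out points
    coeffs = [share_value] + [random.randrange(P) for _ in range(T - 1)]
    def g(x):
        return sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return {x: g(x) for x in new_xs}

# old shares (x, f(x)) of some secret, as produced by the Shamir toy above
secret = random.randrange(P)
poly = [secret] + [random.randrange(P) for _ in range(T - 1)]
old = {x: sum(c * pow(x, k, P) for k, c in enumerate(poly)) % P for x in (1, 2, 3)}

new_xs = [11, 12, 13]                    # identities of the new holders
lam = lagrange_at_zero(list(old))
subs = [subshare(old[x], new_xs) for x in old]

new_shares = {}
for j in new_xs:                         # each new holder combines what it received
    new_shares[j] = sum(l * s[j] % P for l, s in zip(lam, subs)) % P

# the new shares still encode the SAME secret:
lam_new = lagrange_at_zero(new_xs)
assert sum(l * new_shares[j] % P for l, j in zip(lam_new, new_xs)) % P == secret
```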

Traditional blockchain protocols need to store all block information starting from the genesis block; as the chain grows, this leads to scalability problems, which is why developing a light client is so troublesome on many public chains. IC wanted to solve this problem, so it developed Chain-evolution technology: at the end of each epoch, all inputs and consensus information that have already been processed can be safely cleared from each Replica's memory, which greatly reduces each Replica's storage requirements and enables the IC to scale to support a large number of users and applications. In addition, Chain-evolution technology includes the CUP technique, which allows newly joined nodes to obtain the current state quickly without re-running the consensus protocol, greatly lowering the threshold and synchronization time for new nodes to join the IC network.

To sum up, **all of IC's underlying technologies are linked together, grounded in cryptography (from theory) and fully considering industry-wide problems such as fast node synchronization (from practice).** Truly masterful work!

ICP's Key Features

**Reverse Gas Model:** most traditional blockchain systems require users to first hold the native token, such as ETH or BTC, and then spend the native token to pay transaction fees. This raises the entry barrier for new users and does not match people's usage habits: why should I have to hold TikTok shares before I can use TikTok? ICP, by contrast, adopts a reverse-gas design: users can use the ICP network directly, and the project team is responsible for the fees. This lowers the threshold for use, is more in line with the habits of Internet services, is conducive to larger network effects, and thus supports more users joining.

**Stable Gas:** on other public chains on the market, people buy the native token for the security of the chain and for transfer needs, miners mine hard, or people hoard the native token, thereby contributing hash power to the chain (as with Bitcoin) or providing staking-based economic security (as with Ethereum). It can be said that our demand for BTC/ETH really comes from the Bitcoin/Ethereum chains' requirements for hash power/staking, which are essentially the chains' security requirements. So as long as the native token is used directly to pay gas on a chain, the chain will remain expensive in the future: the native token may be cheap now, but once the chain develops an ecosystem it will become more expensive. ICP is different. The gas consumed on the ICP blockchain is called Cycles, which are obtained by converting ICP. Cycles are stable under algorithmic adjustment and pegged to 1 SDR (the SDR can be regarded as a stable unit of account computed from a basket of national fiat currencies). So no matter how much the price of ICP rises in the future, the money you spend doing anything on ICP will be the same as today (setting inflation aside).
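A small, hedged illustration of the two fee ideas above: cycles are pegged to the SDR (the commonly cited conversion is 1 SDR worth of ICP for 1 trillion cycles), and under the reverse-gas model it is the canister's cycle balance, topped up by the project team, that gets charged. The ICP/SDR prices and the per-call cost below are made-up numbers for illustration only.

```python
# Illustration of "stable gas" + reverse-gas accounting described above.
# The ICP/SDR prices and per-call cost are illustrative, not real figures.

CYCLES_PER_SDR = 1_000_000_000_000        # cycles are pegged to the SDR

def icp_to_cycles(icp_amount, icp_price_in_sdr):
    # the more ICP is worth, the more cycles one ICP buys; the gas cost stays stable
    return int(icp_amount * icp_price_in_sdr * CYCLES_PER_SDR)

class Canister:
    def __init__(self, cycles_balance):
        self.cycles = cycles_balance      # the PROJECT tops this up, not the user

    def handle_user_call(self, cost_in_cycles=590_000):   # illustrative cost
        # reverse gas: the canister's balance is charged, the user pays nothing
        if self.cycles < cost_in_cycles:
            raise RuntimeError("canister out of cycles, developer must top up")
        self.cycles -= cost_in_cycles

# Whether ICP trades at 4 SDR or 40 SDR, 1 SDR of spend buys the same cycles:
print(icp_to_cycles(1, 4))    # 1 ICP at  4 SDR ->  4T cycles
print(icp_to_cycles(1, 40))   # 1 ICP at 40 SDR -> 40T cycles
app = Canister(icp_to_cycles(10, 4))
app.handle_user_call()        # the user calls for free; the canister is charged
```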

**Wasm:** using WebAssembly (Wasm) as the standard for code execution, developers can write code in a variety of popular programming languages (such as Rust, Java, C++, Motoko, etc.), **thus supporting more developers joining.**

**Support for running AI models:** Python can also be compiled to Wasm. Python has one of the largest user bases in the world and is also the first language of AI, used for things like matrix and big-integer calculations. Someone has already run the Llama2 model on the IC; I would not be surprised if the AI + Web3 narrative plays out on ICP in the future.

**Web2-grade user experience:** many applications on ICP have already achieved the impressive result of millisecond-level queries and second-level updates. If you don't believe it, try OpenChat directly, a fully on-chain decentralized chat application.

**Running the front end on chain:** you may have only heard of writing part of the back-end logic as simple smart contracts and running it on chain, so that core logic such as data assets cannot be tampered with. But the front end actually needs to run entirely on chain to be safe, because front-end attacks are a very typical and frequent problem. Imagine: everyone may think the Uniswap code is very safe, the smart contracts have been verified by so many people over so many years, and the code is simple, so surely nothing can go wrong. But then one day the Uniswap front end is hijacked, the contract you are interacting with is actually a malicious contract deployed by hackers, and you may go bankrupt in an instant. If, however, you store and deploy all the front-end code in an IC Canister, then at the very least the consensus security of the IC ensures that the front-end code cannot be tampered with by hackers. This protection is relatively complete, and the front end can be run and rendered directly on the IC without affecting the normal operation of the application. On IC, developers can build applications directly without traditional cloud services, databases, or payment interfaces; there is no need to buy a front-end server or worry about databases, load balancing, content distribution, firewalls, and so on. Users can directly access the front-end pages deployed on ICP through a browser or mobile app, such as a personal blog I deployed some time ago.

**DAO-controlled code upgrades:** in many DeFi protocols today, the project team has full control and can unilaterally make major decisions such as suspending operations or selling funds, without community voting or deliberation; I am sure everyone has witnessed or heard of such cases. In contrast, DAPP code in the ICP ecosystem runs in DAO-controlled containers. Even if a project team holds a large share of the vote, a public voting process is still enforced, which satisfies the necessary condition of blockchain transparency described at the beginning of this article. This process guarantee better reflects the will of the community and gives ICP **a better degree of governance** than other current public-chain projects.

**Automatic protocol upgrades:** when the protocol needs to be upgraded, a new threshold signature scheme can be added in the summary block, thereby realizing an automatic protocol upgrade. This approach ensures the security and reliability of the network while avoiding the inconvenience and risk of hard forks. Specifically, the Chain Key technology in ICP maintains the blockchain state machine through a special signature scheme: at the beginning of each epoch, the network uses a low-threshold signature scheme to generate random numbers and a high-threshold signature scheme to authenticate the replicated state of the subnet. This ensures the security and reliability of the network and also enables automatic protocol upgrades, completely avoiding the inconvenience and risks brought by hard forks.

(Proposal Voting)

**Fast forwarding:** a technique in the Internet Computer protocol for fast node state synchronization, which allows newly joined nodes to obtain the current state quickly without re-running the consensus protocol. Specifically, the process is as follows:

  1. The newly added node obtains the Catch-up package (CUP) of the current epoch, which contains the Merkle tree root, summary block and random number of the current epoch.

  2. The newly joined node uses the state sync subprotocol to obtain the complete state of the current epoch from other nodes, and uses the root of the Merkle tree in the CUP to verify the correctness of the state.

  3. The newly joined node uses the random number in the CUP and the protocol messages of other nodes to run the consensus protocol, so as to quickly synchronize to the current state.

The advantage of fast forwarding is that it lets newly joined nodes obtain the current state quickly, without having to build up from the genesis block like on some other public chains. This speeds up synchronization and expansion of the network, and it also reduces the communication traffic between nodes, thereby improving the efficiency and reliability of the network.

(fast forwarding)
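Here is a toy Python version of the check in step 2: the newly joined node verifies the state chunks it downloads from peers against the Merkle root carried in the CUP, instead of replaying consensus from genesis. The real state-sync subprotocol is far more elaborate; the four "chunks" and the tree construction below are illustrative assumptions.

```python
# Toy version of the CUP-based check: a new node downloads state chunks from
# peers and verifies them against the Merkle root carried in the catch-up
# package (which, in the real protocol, is signed by the subnet).

import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(chunks):
    level = [h(c) for c in chunks]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])              # duplicate last node if odd
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# What the CUP would carry:
state_chunks = [b"canister-a:state", b"canister-b:state", b"queues", b"metadata"]
cup_root = merkle_root(state_chunks)

# What a freshly joined node does after downloading chunks from its peers:
downloaded = list(state_chunks)
assert merkle_root(downloaded) == cup_root       # state accepted, no replay needed

downloaded[1] = b"canister-b:TAMPERED"
assert merkle_root(downloaded) != cup_root       # tampered state is rejected
```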

**Decentralized Internet Identity:** the identity system on IC really makes me feel that the DID problem can be completely solved, and solved thoroughly, in terms of both scalability and privacy. The identity system on IC currently has an implementation called Internet Identity, and a more powerful NFID has been developed on top of it.

Its principle is as follows:

  1. At registration, a public/private key pair is generated for the user. The private key is stored in the TPM security chip inside the user's device and never leaves it, while the public key is shared with services on the network.

  2. When a user wants to log in to a dapp, the dapp will create a temporary session key for the user. This session key will be signed by the user through an authorized electronic signature, so that the dapp has the authority to verify the user's identity.

  3. After the session key is signed, the dapp can use the key to access network services on behalf of the user, and the user does not need to sign electronically every time. This is similar to delegate authorization login in Web2.

  4. The session key is only valid for a short period of time. After the expiration, the user needs to re-sign the biometric authorization to obtain a new session key.

  5. The user's private key is always stored in the local TPM security chip and will not leave the device. This ensures the security of the private key and the anonymity of the user.

  6. By using temporary session keys, different dapps cannot track each other's user identities, enabling truly anonymous and private access.

  7. Users can easily manage their own Internet Identity synchronously among multiple devices, but the device itself also needs corresponding biometric identification or hardware key for authorization.
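To make steps 2-4 more tangible, here is a conceptual Python walk-through of the delegation flow: the device key never leaves the device and only signs a short-lived delegation for a per-dapp session key, which then authenticates individual calls until it expires. The hash-based "signature" and the 30-minute expiry are stand-ins of mine; real Internet Identity uses proper asymmetric signatures and per-dapp principals.

```python
# Conceptual walk-through of the delegation flow described in steps 2-4 above.
# A hash is used as a stand-in "signature" purely to show the flow; the real
# system uses asymmetric signatures verified with the public key.

import hashlib, os, time

def toy_sign(key: bytes, msg: bytes) -> bytes:        # stand-in for a real signature
    return hashlib.sha256(key + msg).digest()

# 1. device keypair, created at registration, stored in the secure chip
device_key = os.urandom(32)                            # never leaves the device

# 2. the dapp creates a temporary session key and asks the user to delegate to it
session_key = os.urandom(32)
expiry = int(time.time()) + 30 * 60                    # short-lived, e.g. 30 minutes
delegation_msg = b"delegate-to:" + session_key + b"|exp:" + str(expiry).encode()
delegation_sig = toy_sign(device_key, delegation_msg)  # user approves via biometrics

# 3. the dapp now acts for the user with the session key; no re-signing per call
def call_backend(payload: bytes):
    assert time.time() < expiry, "session expired: ask the user to re-authorize"
    return toy_sign(session_key, payload)              # each request is authenticated

print(call_backend(b"send_message:hello").hex()[:16], delegation_sig.hex()[:16])
# 4. once `expiry` passes, call_backend() refuses and a fresh delegation is needed.
```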

Some of the advantages of Internet Identity are as follows:

**1. No need to remember passwords.** Log in directly with biometric features such as fingerprint recognition; there is no need to set and remember complex passwords.

**2. The private key never leaves the device, which is more secure.** The private key is stored in the TPM security chip and cannot be stolen, which solves the problem of usernames and passwords being stolen in Web2.

**3. Anonymous login that cannot be tracked.** Unlike Web2, where email addresses serve as usernames that can be tracked across platforms, Internet Identity removes this tracking.

**4. More convenient multi-device management.** You can log in to the same account on any device that supports biometrics, not just a single device.

**5. No reliance on central service providers, achieving true decentralization.** Different from the Web2 model where usernames correspond to email service providers.

**6. Delegated authentication means there is no need to sign repeatedly every time you log in, so the user experience is better.**

**7. Support for logging in with dedicated security devices such as Ledger or YubiKey, improving security.**

**8. The user's actual public key is hidden,** so transaction records cannot be queried through the public key, protecting user privacy.

**9. Seamless compatibility with Web3 blockchains,** allowing secure and efficient login and signing for blockchain DApps and transactions.

The architecture is quite advanced, representing an organic integration of the advantages of Web2 and Web3, and a standard for future network accounts and logins.

In addition to providing a new user experience, the following technical measures are also taken to ensure its security:

  1. Use the TPM security chip to store the private key, which is designed so that even developers cannot access or extract the private key to prevent the private key from being stolen.

  2. Secondary authentication mechanisms such as biometric authentication, such as fingerprint or facial recognition, need to be verified in combination with the device, so that only the user holding the device can use the identity.

  3. The session key adopts a short-term expiration design to limit the time window for being stolen, and force the relevant ciphertext to be destroyed at the end of the session to reduce risks.

  4. The public key encryption technology makes the data in the transmission process encrypted, and the external listener cannot know the user's private information.

  5. It does not rely on third-party identity providers; the private key is generated and controlled by the users themselves, and no third party needs to be trusted.

  6. Combined with the tamper resistance brought by the IC blockchain's consensus mechanism, this ensures the reliability of the whole system's operation.

  7. Relevant cryptographic algorithms and security processes are being continuously updated and upgraded, such as adding more secure mechanisms such as multi-signatures.

  8. Open source code and decentralized design optimize transparency and facilitate community collaboration to improve security.

(Internet Identity)

Core Team

In terms of the team, there are over 200 employees in total, all highly elite talent. They have published 1,600+ papers, been cited 100,000+ times, and hold 250+ patents.

Founder Dominic Williams is a crypto theorist and serial entrepreneur.

Academically, his recent mathematical theories include Threshold Relay and PSC chains, Validation Towers and Trees, and USCID.

In terms of technical background, he has deep experience in technical R&D; he was engaged in big data and distributed computing research in his early years, which laid the technical foundation for building the complex ICP network.

Entrepreneurially, he previously ran an MMO game on his own distributed system hosting millions of users. Dominic started Dfinity in 2015, and he is also President and CTO of String Labs.

In terms of vision, he proposed the concept of a decentralized Internet more than 10 years ago. Pushing such a grand project forward over such a long time is not easy, and his design ideas remain very forward-looking.

**In terms of the technical team, Dfinity is very strong.** The Dfinity Foundation has assembled a large number of top cryptography and distributed-systems experts, such as Jan Camenisch, Timothy Roscoe, Andreas Rossberg, Maria D., and Victor Shoup; even the "L" among the authors of the BLS cryptographic algorithm, Ben Lynn, works at Dfinity. This provides strong support for ICP's technological innovation. The success of a blockchain project is inseparable from technology, and the gathering of top talent can bring technological breakthroughs, which is also a key advantage of ICP.

Dfinity Foundation Team

Fund-raising & Tokenomics

If I covered this topic as well, the article would be far too long, so I have decided to write a separate article later to analyze it in detail. This article focuses on why ICP has a great opportunity from the perspective of the blockchain industry's direction of development.

Applications

All types of applications can be developed on ICP: social platforms, creator platforms, chat tools, games, and even metaverse games.

Many people say that because it is difficult to achieve global state consistency on IC, it is naturally unsuitable for DeFi, but I think the question itself is framed wrongly. It is not global state consistency that is hard, it is global state consistency under low latency. If you can accept a one-minute delay, 10,000 machines around the world can also achieve global consistency. With so many nodes, aren't Ethereum and BTC effectively forced to achieve global state consistency under high latency? That is precisely why they currently cannot scale horizontally without limit. IC first solves the problem of unlimited horizontal scaling by splitting into subnets. As for global state consistency under low latency, it is also achievable with a strongly consistent distributed consensus algorithm, a well-designed network topology, high-performance distributed data synchronization, effective timestamp verification, and a mature fault-tolerance mechanism. To be honest, though, it will be harder to build a trading platform at the IC application layer that matches the high-performance trading platforms built by the Wall Street crowd today; it is not just a matter of reaching agreement across multiple data centers. However, difficult does not mean impossible: many technical problems have to be solved first, and eventually a reasonable middle ground will be found that ensures both safety and an acceptable experience. For example, ICLightHouse below.

**ICLightHouse**, a fully on-chain orderbook DEX. What does fully on-chain mean? How many technical difficulties need to be solved? On other public chains it is unthinkable, but on IC it is at least doable, which gives us hope.

**OpenChat**, a decentralized chat application with a great experience. I have not seen a second product like it in the entire blockchain industry; many other teams have tried this direction before, but ultimately failed because of various technical problems, and in the end users simply felt the experience was poor, for example taking 10 seconds to send a message and another 10 seconds to receive one. Yet a small team of three people has built such a successful product on ICP. You can experience for yourself how smooth it is. Feel free to join the organization, where you can enjoy the collision of ideas and, to a certain extent, the refreshing feeling of free speech.

**Mora**, a platform for super creators, where everyone can create their own planet and build their personal brand. The content you produce will always belong to you, and paid reading is even supported. It could be called a decentralized knowledge planet; I now find myself refreshing articles on it every day.

Easy - 0xkookoo

OpenChat and Mora are products I use almost every day; they give me a sense of ease I cannot do without, and two words describe it: freedom and fulfillment.

Some teams have already developed game applications on IC, and I think the narrative of fully on-chain games may eventually be taken over by IC. As I said in the GameFi section of the article I wrote earlier, playability and fun are things the project team must consider, and playability is easier to achieve on IC. I am looking forward to **Dragginz**'s masterpiece.

Summary

ICP is like the Earth, and Chain Key technology is like the Earth's core; its relationship to ICP is similar to that of the TCP/IP protocol to the entire Internet industry. Each Subnet is like a continent, Asia, Africa, or Latin America, or indeed the Pacific or Atlantic Ocean; on the continents and oceans there are different buildings and regions (Replicas and Nodes), plants (Canisters) can be grown on each region and building, and different animals live there happily.

ICP supports horizontal scaling: each subnet is autonomous while still communicating with other subnets. No matter what the application is, social media, finance, or even the metaverse, you can achieve eventual consistency through this distributed network. Achieving a global ledger under synchronous conditions is easy, but achieving "global state consistency" under asynchronous conditions is a great challenge. **At present, only ICP has a chance of doing this.**

Note the distinction: what is needed here is not strict "global state consistency" but "eventual global state consistency". Strict "global state consistency" requires all participating nodes to [agree on the order of all operations], [produce consistent final results], [be objectively consistent, independent of node failures], [have consistent clocks], and [be instantly consistent, with all operations processed synchronously]; this can be guaranteed within a single IC subnet. But to guarantee strict "global state consistency" across the network, all subnets taken as a whole would have to achieve the above for the same data and state, which is impossible to realize at low latency in practice; this is also the bottleneck that prevents public chains such as ETH from scaling horizontally. Therefore IC chooses to reach consensus within a single subnet, while other subnets quickly verify through communication that the results have not been falsified, thereby achieving "eventual global state consistency". This is equivalent to **combining the decentralization of large public chains with the high throughput and low latency of consortium chains, while achieving unlimited horizontal scaling of subnets through mathematical and cryptographic proofs.**

To sum up, according to the ultimate development direction for blockchain that I laid out at the beginning of this article, **[sovereignty] + [decentralized multi-point centralization] + [transparency] + [control over code execution] + [infinite scalability at linear cost]:**

Sovereignty is the only problem that the blockchain needs to solve, including asset sovereignty, data sovereignty, speech sovereignty, etc. Otherwise, there is no need for blockchain;

IC totally did it

  • Immutability is a sufficient condition, but not a necessary one. As long as my sovereignty is guaranteed not to be damaged, tampering does not matter much: if everyone's assets in the world were tampered with and doubled in the same proportion, what difference would it make?

IC did it too

**Complete decentralization is impossible.** No matter how the system is designed, there will always be "gifted" people or vested interests who occupy a greater voice, and there will always be people who actively choose not to participate. [Decentralized multi-point centralization] is the final pattern;

  • **IC is currently the best among all public chains in this respect: it maintains a certain degree of decentralization while making full use of the advantages of centralized entities, thereby better realizing the governance and operation of the network.**
  • Transparency is a must. This social experiment for all of humanity is precisely about letting everyone have a say and the right to protect their own sovereignty. There will always be people who are lazy, people who prefer to trust more professional people, and people who choose to abstain from voting to maximize efficiency, but that is a choice they make on their own initiative: they have the right and choose not to exercise it. As long as everything is transparent and there is no black-box operation, I am willing to accept the outcome even if I lose; if I lose, I am simply not as good as others, and survival of the fittest is in line with the market economy;

IC totally did it

  • Control over code execution is the core; otherwise it is just pointlessly going through the motions: a week of public voting, and then the project team deploys a malicious version of the code anyway, and even if it is not malicious, it still makes a fool of everyone.

Currently only IC can do it

  • Infinite scalability at linear cost. As blockchain becomes more and more closely integrated with real life, more and more people get involved and demand keeps growing; infrastructure that cannot support unlimited scale, or that is too expensive to scale, is not acceptable.

Currently only IC can do it

Based on the above facts and my thinking and analysis, I think that ICP = Blockchain 3.0.

This article only argues, from the perspective of the blockchain industry's future direction, why ICP is likely to be the innovation driver of blockchain 3.0. It is undeniable that there are indeed some problems in ICP's tokenomics design, and the ecosystem has not yet exploded; the current ICP is still far from the ultimate blockchain 3.0 in my mind and needs to keep working hard. But don't worry: this was always going to be difficult. Even the Dfinity Foundation has prepared a 20-year roadmap, and it has already achieved this much only 2 years after mainnet launch; it has also used cryptography to connect to the BTC and ETH ecosystems. I believe it will be even better in 3 years.

Future

  • IC has completed the bottom-up construction of its infrastructure, and top-down applications are beginning to take shape. My recent direct impression is that IC has more and more cards to play and is fully prepared for the next bull market.
  • IC is a paradigm update, not just a simple technology upgrade: a paradigm migration from stand-alone computing to distributed computing, and moreover from stand-alone systems to distributed systems. The concept of decentralized cloud computing lets many small companies enjoy a one-stop development experience from day one.
  • According to Yu Jun's product value formula: product value = (new experience – old experience) – migration cost. In the future, as long as some people find that the experiential benefit of joining the IC ecosystem outweighs the migration cost, IC will gain more participants, including project teams and users, and the economies of scale of "cloud computing" will show more easily. The "chicken or egg" problem gets solved, and IC's positive flywheel is established.
  • Of course, everyone's definition of experience is subjective, so some will choose to join earlier and others later. Those who join earlier take greater risks, but usually reap greater average returns.
