If the disruptive DLT still intrigues you at certain levels, read on to know what it is, how it works, its advantages and everything else that you need to know.
DLT stands for Distributed Ledger Technology: a system for maintaining digital records (ledgers) of transactions, or of any other information pertaining to assets or things of value, at multiple places at the same time. This distributed nature lends the technology its name.
The distributed nature of DLT poses a serious challenge. A record is useful to society only if it serves as a single source of truth. For multiple copies of the same information to be regarded as a single source of truth, every copy must be appended at the same point in time so that no copy falls out of sync with the others; all records have to be constantly updated, simultaneously and in step with every other copy. The records, known as ledgers, are registries of transactions, asset ownership, important social events, transfers of rights and obligations, and similar information, stored chronologically over time. This enables referencing and verification of facts in case of a dispute between the participants. Entries in a ledger must therefore be made carefully and responsibly so that no false information is recorded, and it is very important that such data is validated as authentic before it enters the records.
Until now, ledgers were validated and appended by a trusted person or organisation of authority. Known as the 'trusted third party', they had the final call on the claim to truth in case of disputes between participants. Kings, governments acting through bureaucracies, and large institutions were entrusted with such roles. In this process, however, the ledger was appended and maintained at one location by a central authority; there were no simultaneous copies distributed among the participants. The trusted third party system of record keeping had inherent problems: the rules of authenticity could change with a change in regime, old records could be disregarded entirely, and human bias could affect the validation of records or the judgement of disputes. To take the human element out of record keeping, a process was needed in which the community itself validates and appends the ledger. That is where DLT came in. Though it has its own challenges, the use of modern technologies (cryptography and advanced algorithms backed by strong computing power) makes it possible to validate the authenticity of a record and to append and update it at multiple locations simultaneously.
Such a method of record keeping has a multitude of social benefits:
Transparency: It is one of the most transparent ways of handling records, as information is shared with everyone and all entries are publicly witnessed.
Security: The fact that copies of the same ledger are distributed over a large number of participants reduces the vulnerability to data compromise compared with a central server. A central server needs to be breached only once to change, modify, delete or add data in the database. In a distributed system, however, all the individual copies would have to be altered simultaneously, making it nearly impossible for attackers to derive much benefit from hacking a DLT.
Speed: Middlemen, central third parties or trusted third parties have no role in DLT. This eliminates the need for a single person or organisation to validate and enter millions of transactions, a step that usually delays the appending of the ledger, and so essentially speeds up transactions between participants. As they rightly say, 'less government and maximum governance'.
Cost Reduction: The trusted third party employs a rather large workforce to enter information into the ledger, which attracts huge costs. When the participants themselves manage the entries, this cost is avoided, and such a system can execute transactions at a much lower cost than the traditional one.
The very first application of Distributed Ledger Technology was in recording transactions between participating individuals, famously known as Bitcoin. The concept of Bitcoin as a peer-to-peer electronic cash system has gained popularity worldwide and has successfully proven that Distributed Ledger Technology can work in our social ecosystem. Not only has the Bitcoin DLT proven its workability, it has also inspired many other social activities to be treated similarly: government and business dealings, financial transactions, tax collection, property deed transfers, social benefits distribution, voting procedures, healthcare records, the processing and execution of legal documents, and many similar exchanges. Thanks to well-demonstrated properties of Distributed Ledger Technology such as pseudonymity of transactions, individuals can use it to hold and control personal information, and then selectively share parts of those records when needed. Use cases here include individual medical records and corporate supply chains. Efforts are also being made to employ DLT to better track intellectual property rights and ownership for art, commodities, music, films and more.
Trust is a vital element in executing transactions and in the growth and development of a social economy. Until now, it has been achieved by entrusting an authoritative central third party who is held responsible when transactions go wrong. Can the need for such trust be eliminated? Can something restrain the participants in a social activity from cheating in the absence of a trusted authority? Can a system be made 'trustless'? A 'trustless system' is one in which the participants neither need to know nor trust the other individuals in the system, nor do they need a central trusted third party to carry out their functions. As no single entity controls the system, human bias is eliminated; instead, the system is controlled and constrained by mathematical algorithms and executable programs that function in a set, defined manner under all circumstances. Although we like to call it a 'trustless system', it does not completely eliminate trust but rather distributes it in a type of economy that incentivises honest behaviour; trust is minimised, not eliminated. Human society is, however, habituated to centralised systems, the environment that existed before 2009. The property of trustlessness in a peer-to-peer (P2P) network was introduced by Bitcoin, which allowed all transactional data to be verified and immutably stored on a publicly distributed ledger. The Bitcoin DLT, and a few others following it, achieved 'trustless system' status by providing economic incentives for honest behaviour. This makes the distributed ledger system highly resilient to vulnerabilities and attacks, while eliminating the need for a trusted single point of vulnerability.
Distributed Ledger Technology places the participants of an activity in a 'trustless' environment, where a fair ledger is maintained by fellow participants without personal information about, or trust in, one another. The absence of a trusted third party has its benefits, but there is an underlying risk that unknown participants could corrupt the ledger with false entries. The essential precondition of a trustless system is that the participants place their trust in the system itself. To achieve this, two things are needed. The first is complete transparency, so that everyone is aware of all transactions. The second is a social consensus regarding all transactions, taken as valid by the entire participating group or a majority of its members. In a DLT, this is achieved by what is known as a Consensus Protocol. Let us dive a bit deeper to understand this more closely. A consensus algorithm is a procedure through which all the participants of the distributed ledger network reach a common agreement about the present state of the ledger. In this way, consensus algorithms achieve reliability in the network and establish trust between unknown participants. Essentially, the consensus protocol makes sure that every new entry added to the ledger is the one and only version of the truth agreed upon by all the participants using the ledger.
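The majority rule described above can be sketched in a few lines of Python. This is an illustrative toy only, not any real protocol; the validator functions and entry fields are assumptions for the example.

```python
# Hypothetical sketch: a proposed ledger entry is accepted only when a
# majority of participants independently validate it.

def reach_consensus(entry, validators):
    """Accept `entry` if more than half of the validators approve it."""
    approvals = sum(1 for validate in validators if validate(entry))
    return approvals > len(validators) / 2

# Three honest validators that check the amount is positive,
# and one faulty validator that rejects everything.
honest = lambda entry: entry["amount"] > 0
faulty = lambda entry: False

entry = {"from": "alice", "to": "bob", "amount": 5}
print(reach_consensus(entry, [honest, honest, honest, faulty]))  # True: 3 of 4 approve
```

Real consensus protocols must also cope with validators that lie inconsistently to different peers, which is exactly the complication the Byzantine Generals' Problem (discussed later) formalises.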
The DLT Consensus Protocol has specific objectives, such as:
1. Agreement: Arrive at a common agreement for the majority of the group.
2. Collaboration: Make the group collaborate towards achieving a result, putting the group's interest ahead of individual interests.
3. Cooperation: Work together and try to achieve a common goal.
4. Equality: Treat all participants equally through a system where every vote has equal weightage. In other words, the system should be egalitarian and unbiased.
5. Inclusion: Encourage participation by all and give everyone an equal chance to benefit from the system.
6. Active Participation: Ensure more active participation, as it means a more robust mechanism and better security in the system.
7. Proper Validation: Enter only data validated to be true into the ledger.
8. Dependability: Aim to establish a consistent, reliable and fault-tolerant system.
First, the causes of a possible invalidation of an entry need to be eliminated. Every DLT system has its own rules for validating an entry. For example, a transaction is invalid if the spender does not have sufficient balance. To validate this particular condition, the validators simply check the spender's ledger balance (a public document, thanks to DLT) against the transaction request (broadcast over the network). If the transaction amount is less than or equal to the ledger balance, it is a valid transaction and is processed further; otherwise, it is discarded as invalid by the majority of the network.
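The balance check just described can be expressed as a tiny sketch. The ledger state and names here are illustrative assumptions, not part of any real DLT's data model:

```python
# Illustrative sketch of the balance check described above: a spend is
# valid only if the spender's recorded ledger balance covers the amount.

ledger_balances = {"alice": 10.0, "bob": 2.5}  # hypothetical public ledger state

def is_valid_transaction(spender, amount):
    """Valid when 0 < amount <= the spender's current ledger balance."""
    return 0 < amount <= ledger_balances.get(spender, 0.0)

print(is_valid_transaction("alice", 7.0))  # True: 7.0 <= 10.0
print(is_valid_transaction("bob", 5.0))    # False: insufficient balance
```

Because every participant holds a copy of the balances, each one can run this check independently before voting on the transaction.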
After this check, valid transactions must pass through the Consensus Protocol before they are added to the ledger. Different DLTs use different Consensus Protocols, but the common element is that they all attempt to achieve reliability through consensus of the majority of the network. A good Consensus Protocol fulfils the prerequisites of collaboration and cooperation among participants, equal weightage of votes and active participation. If a Consensus Protocol does not satisfy these preconditions, it may result in multiple copies of the same ledger being considered true by different sections of the participants (known as a fork in a blockchain). With the DLT failing to deliver accurate and desired results, there would be a general consensus failure in the network, reducing efficiency as more time is spent resolving conflicts.
The Byzantine Generals' Problem is a hypothetical situation that demonstrates the complexity of arriving at a consensus in a 'trustless' system involving possibly corrupt entities.
The problem assumes several divisions of an army laying siege to a city. The city can only be taken if all the divisions attack in a coordinated manner at the same time; if the attacks are uncoordinated, the city will repel each of them easily and the whole army will be defeated, division by division. The general therefore needs to communicate with the commanders of each division, via messengers, to coordinate an attack plan. Yet the messengers, the commanders and even the general are each vulnerable in their own way: messages can be intercepted or altered, and commanders may be traitors. So the problem is: how do the loyal commanders and the general reach a consensus and attack at the same precise time, eliminating the risk of misinformation? That is the Byzantine Generals' Problem.
Distributed Ledger Technology, functioning in a 'trustless' environment, requires unknown participants to share information across the network, validating and appending transactions to the ledger by arriving at a consensus. This is achieved by different protocols running solution algorithms for the Byzantine Generals' Problem in different DLTs. Each of these consensus protocols has its own advantages and disadvantages.
A timestamp server is a time-validating protocol that assigns a timestamp to every piece of authorised information (signed with the owner's private key or digital signature) passed onto the network, binding the information and the time together cryptographically.
In a DLT environment, using a timestamp server to record ledger entries chronologically is crucial. For transactional records, only the valid spends that arrive first and are noted in the ledger are considered; all later attempts are disregarded as double-spend attempts.
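A minimal sketch of the idea, assuming SHA-256 as the hash function (the record layout and field names are illustrative, not any real server's format):

```python
# Each piece of information is hashed together with the current time and the
# previous timestamp hash, binding the data and its time of arrival
# cryptographically. Chaining the hashes fixes the chronological order.

import hashlib
import json
import time

def timestamp(data, prev_hash=""):
    record = {"data": data, "time": time.time(), "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    return record, digest

first, h1 = timestamp("alice pays bob 5")
second, h2 = timestamp("bob pays carol 2", prev_hash=h1)
# Each hash commits to the data, the time, and all earlier timestamps,
# so reordering or backdating entries would change every later hash.
```

This chaining is why a later double-spend attempt can be rejected: its timestamp provably comes after the first valid spend.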
Some popular consensus algorithms are:
| Consensus Algorithm | Used in | Type |
| --- | --- | --- |
| Proof-of-Work | Bitcoin, Litecoin, Ethereum, Dogecoin | Competitive Consensus |
| Proof-of-Stake | Ethereum (2.0), Peercoin, NXT | Competitive Consensus |
| Delayed Proof-of-Work | Komodo | Collaborative Consensus |
| Delegated Proof-of-Stake | BitShares, Steemit, EOS, Lisk, Ark | Collaborative Consensus |
| Proof of Elapsed Time | Hyperledger Sawtooth | Collaborative Consensus |
| Byzantine Fault Tolerance | Hyperledger Fabric, Stellar, Ripple, Dispatch | Collaborative Consensus |
| Delegated Byzantine Fault Tolerance | Neo | Collaborative Consensus |
| Directed Acyclic Graphs | Iota, HashGraph, Byteball, RaiBlocks/Nano | Collaborative Consensus |
| Proof-of-Authority | POA Network, Ethereum Kovan Testnet, VeChain | Collaborative Consensus |
| Proof-of-Capacity | BurstCoin, Chia, Spacemint | Collaborative Consensus |
| Proof-of-Burn | Slimcoin, TGCoin | Competitive Consensus |
Consensus is an agreement that satisfies each of the parties involved, and is a key to democracy and decentralisation in general and DLTs in particular.
In cryptography, consensus is a voting procedure. Its goal is to ensure that all members of the network agree that the contents of the ledger are legitimate entries in its current state, even after the addition of new entries. A Consensus Protocol guarantees that the records in the ledger are true and that there are incentives to keep participants fair. It is a major framework for preventing a single entity from controlling the entire system, and it makes sure everybody follows the rules. Protocols, in turn, are a definite set of rules. They help ensure that the transactions in the network are viable, eliminate unauthorised or double spending, and validate that the participants of the network are honest.
As a Consensus Protocol in a DLT allows unknown participants to communicate with each other over the distributed network and reach a decision about the set of validated entries to be added to the ledger, it needs three basic features:
Security: Guarantee safety of information propagated through the network and considered to be valid in order to be appended in the ledger. This could only be achieved if all the participants offer the same output and the outputs are valid as per the set rules. This is also known as consistency of the shared state.
Real-time Value: Ensure real-time execution, with all members honestly participating and arriving at consensus during a given time interval
Fault-tolerance: Be resistant to alterations of its process or logic by external or internal attempts to force it to accept something it should not.
These three objectives are sometimes at loggerheads with one another: validation takes time and thus compromises the speed of authentication, while attempts to make things fast can unknowingly introduce vulnerabilities that reduce the fault-tolerance of the system. In 2009, when the first successful consensus protocol was designed and proposed through Bitcoin, it had a turnaround time (TAT) of approximately 10 minutes; the TAT of Ethereum, the second successful DLT in operation, is somewhere between 10 and 19 seconds. Bitcoin blocks could be generated much faster, but the protocol deliberately adjusts the difficulty of block generation to keep the average interval at about 10 minutes. In 2009, a transaction confirmation TAT of 10 minutes was acceptable given the robustness of the fault-tolerance and security exhibited by the first DLT; by 2015, however, it was considered very slow. The very features that made that protocol robust were now perceived as the factor slowing it down. Attempts to alter this through changes in protocol led to an entirely new DLT being proposed and launched, expanding applicability in the real world while also improving the TAT.
A number of Consensus Protocols are continuously being researched, suggested and implemented in various DLT environments. These are suited to the specific needs of certain enterprises and are therefore changed as those needs change.
The advantage of Consensus Protocols is that they offer options to optimise between security, real-time value and fault-tolerance capabilities of the DLT. The disadvantages are unique to the particular protocol in question and would require additional mechanisms to prevent misuse of the compromises that a protocol has to make over the three features discussed.
Proof-of-Work (PoW), in cryptographic parlance, is a Consensus Algorithm; there are several Consensus Algorithms, PoW being one of them. A Consensus Algorithm is a process used to validate the entries of a ledger before they are included in an immutable distributed ledger. The concept was invented by Cynthia Dwork and Moni Naor, as presented in a 1993 journal article. The term 'Proof-of-Work' was first coined and formalized in a 1999 paper by Markus Jakobsson and Ari Juels.
A Proof-of-Work consensus algorithm requires the network to find a piece of data. Finding this data must be difficult and costly, in proportion to the time consumed in the process, yet the result must be easy for other participants of the network to verify quickly. Verification involves checking the broadcast data against known parameters to confirm it satisfies certain requirements. Producing a Proof-of-Work is a random process with a low probability of success, so a lot of trial and error is required before a valid Proof-of-Work is generated.
The Proof-of-Work mechanism used by Bitcoin, in its first application to blockchain technology, is known as the Hashcash Proof-of-Work system. Hashcash was developed by Adam Back, initially to limit email spam and denial-of-service attacks: it was proposed in 1997 and described more formally in Back's 2002 paper 'Hashcash - A Denial of Service Counter-Measure'. In the Bitcoin Proof-of-Work system, network participants called miners must find an unknown random number, called a nonce, such that the hash of the block being generated satisfies pre-determined parametric conditions set by the difficulty assigned by the protocol; all other network participants then verify and accept the hash. The difficulty is tuned to limit block generation to approximately one block every 10 minutes. In such a system, the probability of any single attempt satisfying the PoW requirement is very low, so which CPU finds the solution first is unpredictable, and the chance of winning the incentive associated with block generation is equal for all CPUs connected to the network.
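A toy Hashcash-style proof-of-work illustrates the asymmetry described above. This is a simplification: real Bitcoin compares a double-SHA-256 of the block header against a 256-bit target, while here difficulty is just a required number of leading zero hex digits.

```python
# Finding a nonce is costly trial and error; verifying it is one hash.

import hashlib

def mine(block_data: str, difficulty: int = 4):
    """Try nonces until the block hash starts with `difficulty` zeros."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest
        nonce += 1

def verify(block_data: str, nonce: int, difficulty: int = 4) -> bool:
    """Verification is a single hash computation."""
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce, digest = mine("block #1: alice pays bob 5")
assert verify("block #1: alice pays bob 5", nonce)
```

Raising `difficulty` by one hex digit multiplies the expected number of attempts by 16, which is how a protocol can tune block times to a target such as 10 minutes.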
The most widely used Proof-of-Work scheme is based on SHA-256 and was introduced as a part of Bitcoin. SHA-256 is also used in NameCoin (NMC), Devcoin (DVC) and IxCoin (IXC), to name a few others. Some other hashing algorithms used for Proof-of-Work (and coins using those algorithms) include Scrypt (Litecoin (LTC), Dogecoin (DOGE), FeatherCoin (FTC), WorldCoin (WDC), Reddcoin (RDD)), Blake-256 (Decred, BlakeBitcoin, Blakecoin (BLC), Dirac (XDQ), Electron (ELT), Photon (PHO)), CryptoNight (Monero, Bytecoin, AEON, Electroneum and SumoKoin), HEFTY1 (Heavycoin (HVC), Mjollnircoin (MNR)), Quark (Animecoin (ANI), BitQuark (BTQ), Diamondcoin (DMC)), SHA-3 (MaxCoin (MAX), Slothcoin (SLOTH), Cryptometh (METH)), scrypt-jane (YaCoin (YAC), Ultracoin (UTC), Velocitycoin (VEL)), scrypt-n (Vertcoin (VTC), ExeCoin (EXE), GPUcoin (GPUC), ParallaxCoin (PLX), SiliconValleyCoin (XSV)) and combinations thereof.
One of the major advantages of a Proof-of-Work driven consensus mechanism is that it gives the process a robust defence against spam and DoS (denial-of-service) attacks. The work needed to validate records cannot be easily replicated; it demands effort at each step of entering a record into the DLT. An attack would require enormous computational power and time to redo the calculations needed to alter records already entered into the system. Such attacks, although possible, are rendered pointless: the costs would be too high, and with a fraction of that effort one could instead contribute honestly to the network and be rewarded. This system also makes the entire network and process egalitarian: every iteration of the block generation process gives all participating CPUs an equal chance to complete the PoW, irrespective of their previous success rates.
The PoW system, however, requires all participating CPUs to expend a huge amount of computational power in each iteration of the block generation process, while only one participant emerges successful. The rest of the expended computation goes unrewarded despite being honest to the purpose of the network. This unrewarded expenditure grows in proportion to network participation; it lends robustness to the network, but a lot of wasted expenditure is incurred. The PoW system is also vulnerable to a network dominance attack (a 51% attack), although the chances of such an attack grow slimmer with increased participation.
In a Proof-of-Work system, the validator who satisfies the PoW requirement and successfully appends the DLT by adding a new block is called the miner and is rewarded with freshly generated coins. In a Proof-of-Stake system, however, the process does not usually generate new coins. The validator, known as the forger, receives rewards in the form of transaction fees for validating the transactions. These fees can rise or fall, adjusting the incentive for participants to join the validator pool and become the selected forger.
To participate in the process, a participant needs to already own the coin, because they must stake a portion of their coins to gain entry into the validator pool, be selected, become the forger and win the rewards of validation. The forger is chosen in a two-step process that can be called semi-random. The first consideration is the amount of coins staked: the higher the stake, the better the chances of being selected as forger. This may seem skewed toward the richer participants of the network; the richer the participant, the bigger the stake, the better the chances of winning the forger role over and over, and growing richer in the process. However, the stake also ensures that the selected forger cannot play foul: the larger the stake, the larger the collateral at risk, which lowers the chance that the forger will act maliciously and lose it.
Part of this non-egalitarian bias, of the rich getting better chances to get richer, can be reduced by introducing an element of randomness into the selection process. This is done in the second step, and it varies widely from blockchain to blockchain. The two most commonly used processes are Randomised Block Selection and Coin Age Selection, or a combination of both. In Randomised Block Selection, the network scans for the stake-to-hash-value ratio; favouring a high stake combined with a low hash value increases the chances of high-staking, infrequent forgers being selected for the next block. Coin Age Selection chooses validators based on how long their tokens have been staked. These are by no means the only methods: some currencies combine them, while others experiment with their own. The selection process thus tries to undo the bias of higher stakes yielding a higher win probability.
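A hedged sketch of stake-weighted random selection, the common core of the schemes above. The names and weights are illustrative; real chains add coin age, hash-value criteria or verifiable randomness on top of this basic idea.

```python
# The chance of being chosen as forger is proportional to the stake put up.
# A coin-age variant could further weight each stake by how long it has
# been locked; this sketch shows only the stake-proportional step.

import random

def pick_forger(stakes, rng=random):
    """Select a validator with probability proportional to staked coins."""
    names = list(stakes)
    weights = [stakes[n] for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

stakes = {"alice": 50, "bob": 30, "carol": 20}
# Over many rounds, alice (50% of total stake) should win about half the time.
wins = sum(pick_forger(stakes) == "alice" for _ in range(10_000))
```

The proportionality is exactly the "rich get richer" bias the text describes, which is why production protocols layer extra randomness over it.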
A Proof-of-Stake system holds many advantages over the Proof-of-Work system. It is energy efficient and offers strong security: because a stake has been put up, participants must remain honest to the network's purpose or stand to lose it. PoS systems also move closer to true decentralisation. In PoW systems, large mining pools can control 51% of the network and become a threat; the possibility of a forger owning 51% of all the coins in a PoS system is remote, and devaluing those coins by attacking the network would be self-defeating.
However, no system is without drawbacks. One of the major drawbacks of a PoS system is the 'Nothing at Stake' (NAS) problem, which arises when the network faces a fork. Forgers have nothing to lose by supporting the different branches, putting the entire DLT in a conflicting state; on the contrary, they can collect rewards on all branches by duplicating the same stake on both ends of the forked blockchain. This problem does not arise in a PoW mechanism, since supporting a fork in PoW requires distributing real computing power across both ends of the fork.
A PoS network can also be vulnerable to long-range attacks, since duplicating the block generation process does not involve the costly computational expenditure of energy and time.
Another risk of a PoS system is called Stake Grinding. In a stake grinding attack, a malicious node will, little by little, exploit a manipulable consensus algorithm in order to accumulate more stake. This is possible when the attacker has the opportunity to sign a block: when their turn arrives, they create more than one candidate block and analyse which choice maximises their probability of signing future blocks. Progressively, the attacker gains more stake and thus signs more blocks, with the objective of becoming the major stakeholder in the network.
There can also be the problem of non-participation. The election of a new block signer should be random enough to avoid repeating the same signer for long stretches; otherwise, nodes that are rarely selected have no interest in staying online, as they will get no benefit from the network. If this situation continues for long periods of time, the same stakeholders will sign blocks more and more often, obtaining more and more stake.
Then there are other range attacks on the network, which consist of creating a sub-chain (fork) at some point in time with the aim of it becoming the chain selected by the other nodes in the system. With this, the attacker gains the option of modifying the history of the chain and obtaining a benefit from it.
Some of the coins using a PoS system are Binance Coin (BNB), Cardano (ADA), Stellar (XLM), Cosmos (ATOM), Neo (NEO), Dash (DASH), Ontology (ONT), Celo (CELO), Algorand (ALGO) and Qtum (QTUM), to name a few.
The two DLTs have come into existence for different reasons and differ at various levels.
Purpose: Bitcoin was introduced in 2008, during the financial crisis involving financial institutions. The turbulent economic environment was causing a drastic depletion of trust in authoritative third parties and their actions: public money was used to bail out institutions that were heavily leveraged on risky assets and then faced liquidation due to defaults. Bitcoin positioned itself as a decentralised peer-to-peer electronic cash system that would let participants have full control over their finances. Bitcoin also addressed the inflationary pressure of the unlimited supply of fiat currencies issued by central banks: it has a limited supply, which makes it a digital store-of-value asset, though it lacks the scalability of a currency system.
Ethereum was conceptualised to realise the full potential of DLT. Bitcoin was the first, most successful and most robust DLT built, but its purpose was limited to recording transactions, like a financial institution. A DLT can be much more: by leveraging it, developers can build and deploy real-world applications. Ethereum was built to support digital smart contracts and bring the benefits of DLT to every aspect of real life. Smart contracts on Ethereum can be used by participants to interact and transact directly with each other without involving an external third party. A smart contract executes itself through a chronological sequence of activities, one step at a time. A collection of such smart contracts can form a decentralised application, or DApp, and several coordinating DApps can form a Decentralised Autonomous Organisation, or DAO. In fact, the world is now witnessing DeFi, i.e. Decentralised Finance: traditional financial services offered over decentralised platforms.
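A conceptual sketch of a self-executing agreement may help here. Note the caveat: real Ethereum smart contracts are written in languages such as Solidity and run on the EVM; this toy Python class, with invented names, only mimics the idea of rules enforced by code rather than by a trusted third party.

```python
# Toy escrow "contract": funds are released to the seller only after the
# buyer confirms delivery. The rule is enforced by code, not an intermediary.

class EscrowContract:
    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.state = "FUNDED"

    def confirm_delivery(self, caller):
        # Only the buyer can trigger the release, and only once.
        if caller == self.buyer and self.state == "FUNDED":
            self.state = "RELEASED"
            return f"{self.amount} paid to {self.seller}"
        return "no-op"

contract = EscrowContract("alice", "bob", 5)
print(contract.confirm_delivery("mallory"))  # "no-op": rule enforced by code
print(contract.confirm_delivery("alice"))    # "5 paid to bob"
```

On a real DLT, every node executes these steps and the consensus protocol ensures they all agree on the resulting state, which is what makes the contract trustless.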
1.Currency Issuance: Bitcoin creates 6.75 new Bitcoins every 10 minutes (or 40.5/hr) while Ethereum creates 3 new Ethers every 15 seconds (or 720/hr).
2.Currency Cap: Bitcoin is limited to 21 million Bitcoins, of which 17m have been created so far. Ethereum has no hard cap currently, but there are plans to reduce or stop issuance in a year or two. There are currently 100 million Ethers approximately.
3.Block Creation: Bitcoin creates a new block every 10 minutes (on average). Ethereum creates a new block every 15 seconds.
4.Block Size: Each block in Bitcoin is limited to 1MB (or 8BM in the case of Bitcoin Cash). In Ethereum, blocks are capped by the gas-limit, the total overhead of all the operations in the block. In practice, Bitcoin can process 4 transactions per second, and Ethereum approximately 15.
5.Transactions Per Day: Bitcoin blockchain can process about 3,00,000 to 4,00,000 transactions per day. Maximum transactions recorded in a day were 4,98,327 on 15th December 2017. Ethereum blockchain can process about 8,00,000 to 12,00,000 transactions per day. Maximum transactions recorded in a day were 1.35 million on 4th January 2018.
6.Average Cost Per Transaction: Bitcoin is around US$ 27 per transaction while Ethereum is around US$ 1.5 per transaction. This parameter is price dependent.
7.Hash Rates: In August 2020, Bitcoin was experiencing its highest hash rate of all time, at about 135-140 EH/s (exahashes per second), while Ethereum's hash rate was around 190-210 TH/s.
8.Atomic Units: The atomic unit of Bitcoin is the Satoshi; 10^8 Satoshis make a Bitcoin. The atomic unit of Ether is the Wei; 10^18 Wei make an Ether.
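The issuance figures above follow directly from the block rewards and block times. A quick back-of-envelope check, using the approximate 2020 values quoted in this list:

```python
# Back-of-envelope check of the issuance and unit figures above.
# Rewards (6.25 BTC, 2 ETH) and block times (600 s, 15 s) are the
# approximate 2020 values quoted in the text.

SECONDS_PER_HOUR = 3600

btc_reward, btc_block_time = 6.25, 600   # BTC per block, seconds per block
eth_reward, eth_block_time = 2.0, 15     # ETH per block, seconds per block

btc_per_hour = btc_reward * SECONDS_PER_HOUR / btc_block_time
eth_per_hour = eth_reward * SECONDS_PER_HOUR / eth_block_time

# Atomic units: 1 BTC = 10^8 satoshis, 1 ETH = 10^18 wei.
satoshis_in_btc = 10**8
wei_in_eth = 10**18
```

Running the arithmetic gives 37.5 BTC and 480 ETH created per hour, matching the rates in point 1.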
1.Supply: Bitcoin supply is capped at 21 million coins while Ethereum has no supply cap. At the current reward of 2 ETH per block, Ethereum issues roughly 4-5 million new Ether per year. About 72 million Ether were pre-mined before launch for the purpose of raising a development fund. The total circulation of Ether is around 112 million, while that of Bitcoin is around 18.47 million as on 27th August 2020. This makes Bitcoin deflationary by nature, i.e. with the passage of time, as demand increases with new use cases, the remaining supply will sharply reduce, making Bitcoin more valuable. Ethereum, on the other hand, is mildly inflationary, i.e. as its use cases multiply, so will the generation of Ether, keeping its value range bound. It is anticipated that once Ethereum moves to the Casper-based PoS consensus protocol, its inflation would be limited to 1-2%.
2.Demand and Price Outlook: Bitcoin is expected to continue to make strides and become an accepted currency worldwide. According to Coinbase, one of the largest Bitcoin exchanges, around 20% of the activity on its network was payment related rather than speculative investment. While that percentage may seem small, it is growing as Bitcoin becomes more accepted as a currency. One thing the public expects from a currency is to be able to spend it easily; Bitcoin made strides in that area when Coinbase introduced the first Bitcoin debit card. With regulations warming up towards cryptocurrencies, more mature investments and use cases of Bitcoin can be expected, and cryptocurrencies could find a way into the mainstream. As more people use cryptocurrencies, their demand, and therefore their value, will increase. With a dominance of over 60% of the total digital currency market capitalisation as of July 2020, Bitcoin stands to be the biggest beneficiary of such proliferation of usage. It is expected to move into a period of greater stability and adoption as a currency, without the volatility of its infancy phase. The future of Ether depends on Ethereum's technology being widely used. Ethereum made great strides in having its technology accepted as the blockchain standard when a multitude of corporations adopted it and started offering it as a service. It is logical for corporations to choose Ethereum over Bitcoin because, despite its many great use cases, Bitcoin lacks the flexibility and extensibility that Ethereum offers. One industry that is already developing many uses for Ethereum is the Internet of Things (IoT). For example, an Ethereum smart contract could unlock doors when someone rents an office or apartment space. As more and more devices are connected to the internet, the ability for them to interact with one another using Ethereum's smart contracts becomes ever more valuable.
Ethereum has many uses in the financial services industry as well. Nearly every bank currently uses SWIFT messaging to securely process transactions, but Ethereum smart contracts could render such networks archaic.
3.Incentive: Bitcoin's block mining incentive started at 50 BTC and currently stands at 6.25 BTC per block. The Bitcoin mining incentive undergoes a process called Halving every 210,000 blocks. The last Halving happened in May 2020, bringing the mining reward down from 12.5 to 6.25 BTC per block. Ethereum's mining incentive remains constant at 2 ETH per block mined.
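The halving schedule is also what produces Bitcoin's 21 million cap: a geometric series of ever-smaller rewards sums to just under that figure. A minimal sketch, mirroring Bitcoin Core's integer arithmetic in satoshis:

```python
# Sketch of Bitcoin's halving schedule: the block subsidy starts at
# 50 BTC and halves every 210,000 blocks. Working in integer satoshis
# (as Bitcoin Core does) avoids floating-point rounding.

def subsidy_sats(block_height, interval=210_000):
    halvings = block_height // interval
    if halvings >= 64:
        return 0
    # 50 BTC in satoshis, right-shifted once per halving era.
    return (50 * 100_000_000) >> halvings

# Total coins ever issued = sum of each era's subsidy * era length.
total_sats = sum(subsidy_sats(era * 210_000) * 210_000
                 for era in range(64))
# total_sats comes out just under 21 million BTC.
```

The subsidy reaches zero after roughly 33 halvings, which is why the total supply never quite touches 21 million.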
1.Programming Languages: While Bitcoin has a scripting language built in, it is very limited in functionality, with only a few dozen operations. Ethereum integrates a full general-purpose language (known in computer-speak as Turing-complete). Programs written in this language are known as 'smart contracts'; they are most commonly written in Solidity, a Turing-complete, contract-oriented programming language, and compiled to bytecode that runs on the Ethereum Virtual Machine (EVM). The reference Bitcoin client is written in C++, while Ethereum clients have been implemented in several languages, including Go, Rust, C++, Python and Java.
2.Transaction Validation: Ethereum assigns a cost, known as gas, to every operation or use of storage on the blockchain; Bitcoin transaction costs are based simply on transaction size. Participants set a gas limit and a gas price when submitting Ethereum transactions, and higher gas prices incentivise faster inclusion. In the Bitcoin ecosystem, participants can likewise offer higher transaction fees to get faster validation.
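The two fee models can be contrasted in a few lines. This is a sketch only; the gas figure for a plain transfer (21,000) is Ethereum's documented base cost, but the prices and the transaction size are illustrative numbers, not live network values.

```python
# Sketch of the two fee models described above.

def ethereum_fee_wei(gas_used, gas_price_gwei):
    # Fee = gas consumed by the operations * price bid per unit of gas.
    GWEI = 10**9                      # 1 gwei = 10^9 wei
    return gas_used * gas_price_gwei * GWEI

def bitcoin_fee_sats(tx_size_bytes, sat_per_byte):
    # Fee depends only on the transaction's size, not on what it does.
    return tx_size_bytes * sat_per_byte

# A plain ETH transfer consumes 21,000 gas; at an assumed 50 gwei
# that works out to 0.00105 ETH.
eth_fee = ethereum_fee_wei(21_000, 50)
# An assumed ~250-byte Bitcoin tx at 40 sat/byte costs 10,000 satoshis.
btc_fee = bitcoin_fee_sats(250, 40)
```

The key difference the code makes visible: the Ethereum fee scales with computational work (gas), the Bitcoin fee with bytes on the wire.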
3.Code Structure and Accounts: Ethereum smart contract code lives at its own address on the blockchain, as opposed to being within a transaction as in the case of Bitcoin. Therefore, Ethereum has two account types: externally owned accounts, which hold user funds, and contract accounts, which hold code (and can also hold funds). Bitcoin accounts are basically public keys or wallet addresses to which values are assigned.
4.Orphan and Ommer Blocks: Ethereum rewards blocks that were valid but were outpaced by other newly accepted blocks. These almost-accepted blocks are known as 'uncles' or 'ommers', and referencing them provides added security to the chain and allows Ethereum to have shorter block times. Bitcoin simply discards its orphan (stale) blocks; the transactions they contain return to the pool to be mined again.
5.Cryptography: Bitcoin's hashing algorithm (SHA-256) can be performed efficiently with special-purpose hardware known as ASICs (application-specific integrated circuits). Ethereum uses Ethash, a DAG-based PoW algorithm built on the Keccak-256 hash. Ethash is far more memory-intensive and ASIC-resistant than Bitcoin's SHA-256, so it is much harder to build an economical special-purpose chip for it. This allows Ethereum to have greater mining decentralisation.
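Bitcoin's proof-of-work actually applies SHA-256 twice to the block header, and a miner searches for a nonce whose hash falls below a target. A minimal sketch of that double hash (the Keccak-256 used by Ethash is not shown, because Python's standard `hashlib.sha3_256` implements the finalised SHA-3 padding, which differs from Ethereum's Keccak-256):

```python
# Bitcoin hashes block headers with double SHA-256, which ASICs
# compute very efficiently.
import hashlib

def sha256d(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# A miner varies a nonce until the double hash falls below the target;
# here the "target" is simplified to a required count of leading zero
# bytes.
def meets_target(header: bytes, zero_bytes: int = 1) -> bool:
    return sha256d(header).startswith(b"\x00" * zero_bytes)

digest = sha256d(b"example block header")
```

Because SHA-256 is a small fixed circuit with no memory requirement, it maps naturally onto ASICs; Ethash's large DAG lookups are precisely what frustrates that mapping.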
6.Consensus Protocols: Ethereum plans to move away from mining altogether by changing the consensus algorithm from Proof-of-Work (PoW) to Proof-of-Stake (PoS). PoS creates blocks based on the token holdings of the nodes rather than computational power. In addition, Ethereum plans to tackle scalability by implementing 'sharding', which breaks the blockchain up into several interconnected sub-blockchains. Bitcoin currently has no such plans.
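The core idea of proof-of-stake is that the chance of proposing the next block is proportional to the tokens a validator has locked up, rather than to hash power. The sketch below illustrates only that stake-weighted selection; it is not Ethereum's actual Casper protocol, and the validator names and stakes are invented for the example.

```python
# Hedged sketch of stake-weighted block-proposer selection: a
# validator's chance of being picked is proportional to its stake.
import random

def pick_proposer(stakes, rng=random):
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    # random.choices draws one name proportionally to the weights.
    return rng.choices(validators, weights=weights, k=1)[0]

stakes = {"v1": 32, "v2": 64, "v3": 4}
proposer = pick_proposer(stakes)  # v2 is twice as likely as v1
```

Over many rounds the proposal frequency converges to each validator's share of the total stake, which is what replaces the "most hash power wins" rule of PoW.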
7.Recording Transactions: Bitcoin uses binary Merkle trees to compress the transaction records of a block and stores the Merkle root in the block header, while Ethereum uses hexary Merkle Patricia tries to commit to its state and state changes in each block.
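Bitcoin's binary Merkle tree is simple enough to sketch in full: hash the transactions, pair the hashes up, hash each pair, and repeat until a single 32-byte root remains. Following Bitcoin's convention, the last hash is duplicated when a level has an odd count.

```python
# A minimal binary Merkle tree of the kind Bitcoin uses to commit to a
# block's transactions in the header.
import hashlib

def sha256d(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(tx_hashes):
    level = list(tx_hashes)
    if not level:
        raise ValueError("empty block")
    while len(level) > 1:
        if len(level) % 2:               # odd count: duplicate the last
            level.append(level[-1])
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

txs = [sha256d(t) for t in (b"tx1", b"tx2", b"tx3")]
root = merkle_root(txs)  # 32-byte commitment stored in the header
```

Changing any transaction changes its leaf hash and therefore the root, which is why a single 32-byte value in the header commits to the whole block.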
8.Computational Logic: Bitcoin uses the UTXO (Unspent Transaction Output) method to compute and validate its transactions, while Ethereum uses an account-based state-transition model.
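The contrast between the two models is easiest to see side by side. In the UTXO model a transaction consumes whole previous outputs and creates new ones (with change returned to the sender); in the account model a transfer simply adjusts two balances. A toy sketch, with invented names and amounts:

```python
# --- UTXO-style transfer (Bitcoin) ---
# (txid, output index) -> (owner, amount)
utxos = {("tx0", 0): ("alice", 50)}

def utxo_transfer(spend, new_txid, recipient, amount):
    owner, value = utxos.pop(spend)      # the input is consumed whole
    outputs = {(new_txid, 0): (recipient, amount)}
    if value > amount:                   # change returns to the sender
        outputs[(new_txid, 1)] = (owner, value - amount)
    utxos.update(outputs)

# --- Account-style transfer (Ethereum) ---
balances = {"alice": 50, "bob": 0}

def account_transfer(sender, recipient, amount):
    assert balances[sender] >= amount
    balances[sender] -= amount           # global state mutated in place
    balances[recipient] += amount

utxo_transfer(("tx0", 0), "tx1", "bob", 30)  # bob gets 30, alice's change is 20
account_transfer("alice", "bob", 30)
```

The account model is what makes persistent smart-contract state natural on Ethereum, while the UTXO model makes each Bitcoin transaction independently verifiable from its inputs alone.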