The thing I find fascinating about ‘Forking’ is understanding why entrepreneurs have chosen to fork, who has previously forked, and who will fork next. How do developers decide the point at which to fork, whom do they fork with, and for what purpose?
Definition: Forking is to take the Source Code from an open source software program and develop an entirely new program.
No, I am completely serious…
Open Source software has many benefits. It is normally free for you to download and play with, and there is usually a community of custodians who look after it, augment and nurture it, upgrade it, and protect its integrity, and who are normally available to support your journey.
When Satoshi produced the first paper and the first Bitcoin version came to light, creating the genesis block, it was inevitable people would fork the code in an attempt to improve it and divert the architecture to suit their purpose. Like any emerging technology, early versions can be improved. With each new iteration of Open Source, people keep forking and, given the recent DAO situation, we saw the Ethereum community take a hard fork decision so that the people who lost out were not disadvantaged. But then this is quite normal with any new emerging technology and with Open Source software.
The world of Bitcoin, of course, came from a libertarian cypherpunk movement that wanted to create a more equitable financial system using crypto-currencies. It was no surprise this emerged in 2008 as the banking crisis took hold. The mission was to create a technology-based alternative (a protocol) to fiat currency, a tradable token of value; to exchange value without the need for a central body, where consensus is reached by investing computational work in solving what is known as the Byzantine Generals Problem.
The sheer effort (difficulty) expended is called Proof of Work. It ensures the network delivers consensus whilst making sure the miners have invested plenty of power (and money) to mine Bitcoin. Handy if you have your very own power station!
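The core of Proof of Work can be sketched in a few lines: keep trying nonces until the hash of the block data falls below a difficulty target. This is an illustrative simplification (real Bitcoin uses double SHA-256 over a specific 80-byte header format; the header bytes and difficulty here are invented for the example):

```python
import hashlib

def mine(block_header: bytes, difficulty_bits: int) -> int:
    """Search for a nonce whose SHA-256 hash has `difficulty_bits` leading zero bits.

    Toy version: Bitcoin actually double-hashes a structured 80-byte header.
    """
    target = 1 << (256 - difficulty_bits)   # hashes below this value "win"
    nonce = 0
    while True:
        digest = hashlib.sha256(block_header + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# 16 leading zero bits is trivial on a laptop; Bitcoin's real difficulty
# demands vastly more work, which is exactly the "investment" miners make.
nonce = mine(b"toy block", 16)
```

Raising `difficulty_bits` by one doubles the expected work, which is how the network tunes block times as hash power grows.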
The genius of the Bitcoin code was that it solved not only the double-spend issue but also allowed other genius people to fork the original, take it into new areas, and deliver new functionality to support key markets. The true genius of Bitcoin is often missed: a P2P network delivering consensus with a focus on security and assurance compromises transaction throughput, which is limited to approximately two to three transactions per second, with the ‘block’ updating every 10 minutes (the time to reach consensus on the queued transactions).
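That throughput ceiling follows from simple arithmetic. A rough sketch, assuming a 1 MB block cap and an average transaction of around 500 bytes (both round figures for illustration, not protocol constants):

```python
block_size = 1_000_000   # bytes: Bitcoin's block size cap circa 2016 (assumption)
avg_tx_size = 500        # bytes: a rough average transaction size (assumption)
block_interval = 600     # seconds: target time between blocks

tx_per_block = block_size // avg_tx_size      # ~2,000 transactions per block
tps = tx_per_block / block_interval           # ~3.3 transactions per second
```

Squeeze the average transaction size or raise the block cap and the number moves, but it stays orders of magnitude below card-network rates, which is the compromise the text describes.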
The performance of most software runs the gauntlet of delivering security and protection from Sybil attacks (which forge identities), as well as other cyber threats, as we enter the realms of cryptography. Not just any cryptography, though: Bitcoin uses elliptic curve cryptography, built on elliptic curve point multiplication, so it is time to brush up on your mathematics. I suddenly need my slide rule and log tables… not.
If you have been in computing for a while you will understand there are mathematical and physical limits that present a significant challenge when considering a distributed architecture, where the nodes are often both client and server. In a peer-to-peer network, the challenge is also who does the work and who provides the computing processing; to date, that is part of the investment in public mining.
But what about data storage, and whose responsibility is it to make the investment? What about the size of the blocks, currently capped at 1 MB? And, let's be realistic, miners are not going to store your data and hash transactions to the public Blockchain for ever-increasing volumes.
And, to make things tougher, the Bitcoin protocol periodically halves the rate at which new Bitcoins are created, against a fixed total of 21 million, on a schedule that runs until around 2140. The general concern amongst the Blockchain community is that this diminishing incentive will limit the investment miners put into nodes.
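The 21 million cap is not a separate rule; it falls out of the halving schedule itself. A short sketch of the issuance arithmetic (the subsidy starts at 50 BTC and halves every 210,000 blocks, with the protocol counting in satoshis, i.e. hundred-millionths of a coin):

```python
subsidy = 50 * 10**8          # initial block reward in satoshis (50 BTC)
blocks_per_halving = 210_000  # halving interval in blocks (~4 years each)

total = 0
while subsidy > 0:
    total += subsidy * blocks_per_halving
    subsidy //= 2             # integer halving, as the real protocol does

total_btc = total / 10**8     # just under 21,000,000 BTC
```

The geometric series converges to a shade under 21 million coins, which is why the supply is capped without any explicit cap in the code.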
So are we saying we could hit the limits?
Not at all, hence the need to do more forking. Despite the Bitcoin network having, by some measures, many times the processing power of the top 100 largest companies and the top 50 supercomputers, there remain limits… But with each new iteration of Blockchain that brings Altcoins (alternative crypto-coins as a financial transaction rail) we see improvements in performance, hybrid models that look like a distributed ledger, and new architectures that interact with other new tech such as the InterPlanetary File System (IPFS), another open source revelation discussed below.
Smart Contracts are the active code that sits above the ledger ‘plumbing’, conducting operations at the business logic layer. Whilst much has been done to ensure the security of the network and the ledger, Smart Contracts are the area that has been shown to be most vulnerable, and given their role they need to be verified.
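To see why the business logic layer is the vulnerable part, it helps to picture what a contract actually is: a small state machine with invariants that must hold no matter who calls it. A toy escrow, sketched in Python rather than real Solidity/EVM code, with all names invented for the example:

```python
# Illustrative only: a real smart contract lives on-chain and its invariants
# (who may call what, and when) are exactly what verification must prove.
class Escrow:
    def __init__(self, buyer: str, seller: str, amount: int):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.state = "FUNDED"

    def release(self, caller: str):
        # Invariant: only the buyer may release, and only once.
        # Contracts that miss checks like these are where funds get lost.
        if caller != self.buyer or self.state != "FUNDED":
            raise PermissionError("invalid release")
        self.state = "RELEASED"
        return (self.seller, self.amount)

deal = Escrow("alice", "bob", 10)
payee, amt = deal.release("alice")   # pays bob 10; a second call must fail
```

The DAO incident mentioned above was precisely a failure of this kind of invariant at the business logic layer, not a failure of the ledger beneath it.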
So there it is: the limits and compromises of Blockchain are real and laid bare. But then I am a Blockchain business operating model architect, not a coder, and I don’t have to deal with stuffing lots of data into a tiny block or storing your data.
Getting involved in Blockchain forking involves a major decision. It begins with the decision to fork or not. Well, I know what some of you are thinking: who gives a fork, right? Sorry, but it had to be said. My apologies, I just had to get that in.
The next decision is the financial rails, and it is tough to ignore Bitcoin given its longevity, liquidity, and elegance. But hashing transactions to the public block while using a forked Blockchain implementation takes some finessing and, above all, genius design, as Smart Contracts provide the glue and inspiration to make it work.
But when implementing Blockchain for real in any organization or market, there are big decisions to make. The biggest is consensus, and there is a growing number of choices emerging: proof of work, proof of stake (popular in China), and other hybrid options. My preference is hybrid forking… no, I am just kidding, but then again, what a thought. So public hashing to the public block gets you down the track, but then there is storage to consider.
In operating model design terms there are many things to consider and to test prior to implementation, hence the plethora of consortiums offering the obligatory proofs of concept, labs, and sandboxes. But then again, all of this really matters given the complexity and what is at stake.
NB: Blockchain operating models will be more efficient; they will have a cost base at least 50% lower; they will be faster and able to deploy and issue new products and services much more quickly. They will redefine markets and commerce, how customers are served, and how their value is treated, appreciated, and moved between parties.
When we look back, Bitcoin will be seen as an elegant solution, a definitive piece of work comparable to a Da Vinci work. But then I believe the best is yet to come…
So, the main question: how do you store data in a decentralized environment, in a robust way, secured of course (hashed with SHA-256), when the Bitcoin network has no means to store massive amounts of data, given each node holds a complete copy going back to the genesis block?
You could increase the size of the blocks, but miners still have to be paid, and storing data is no short-term commitment. Distributed Hash Tables have been around for a while, capable of indexing data across distributed copies, though they can become very complex.
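The idea behind a Distributed Hash Table is simple even if production systems are not: hash both keys and nodes onto the same circle, and store each key on the first node clockwise from its hash. A minimal consistent-hashing sketch, with node names and the key entirely invented for illustration:

```python
import hashlib
from bisect import bisect_right

def h(value: str) -> int:
    """Hash a name to a point on the ring (first 8 bytes of SHA-256)."""
    return int.from_bytes(hashlib.sha256(value.encode()).digest()[:8], "big")

class HashRing:
    """Toy DHT placement: each key belongs to the next node clockwise."""
    def __init__(self, nodes):
        self.ring = sorted((h(n), n) for n in nodes)

    def node_for(self, key: str) -> str:
        points = [p for p, _ in self.ring]
        i = bisect_right(points, h(key)) % len(self.ring)  # wrap the circle
        return self.ring[i][1]

ring = HashRing(["node-a", "node-b", "node-c"])
owner = ring.node_for("document-123")   # deterministic: every peer agrees
```

The appeal for decentralized storage is that any peer can compute where a piece of data lives without asking a central index, which is the property systems like IPFS build on (with far more machinery for replication and churn).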
How do you give people access to their data, the ability to file-share, and protection for their information? How do you monitor access and movement? What of version control? In the early days, BitTorrent showed us the way, building on Napster and others… Then there is HTTP with Dropbox and other data stores. But let's not, as I am not a fan.
Earlier this year I came across the InterPlanetary File System (IPFS), which may at some point replace http:// with ipfs://, because it offers an elegant solution for storing data in a distributed architecture, and it is again Open Source. Oh good, more forking potential… like its progeny Filecoin…
But then everyone is working on this, the next big challenge: distributed storage. Ethereum's Swarm caters for storage alongside its Turing-complete virtual machine, and others are working hard to find an elegant, massively scalable solution. It will come, and thank goodness this young man Vitalik Buterin gave us the Ethereum path (fork).
So what do you get when you Fork?
If you want to push the code in another direction to solve some of the initial challenges and underlying issues, then consider an alternative, called an Altcoin, or whatever you want to name it: LieCoin, CitiCoin, Ether, Steems. But then only Proof of Work protects you, assuming your stakeholders are OK with it being public. Or do they hold a stake as participants in your peer-to-peer network, and how will consensus deliver your version of governance? But then you have other chains and data sources… And then there is bootstrapping to get access to new sources of computing power, assuming your source is not unlimited.
Look, we are approaching the limits of Blockchain 1.0, and we will inevitably enter 2.0 as we glide into 2017. But then I already have my favorite first-division Blockchain players, my second division emerging, and my division-three wannabes in mind. But who will be the last man standing?
As we move to Blockchain 2.0 there will be a rush of patenting activity as each version looks to protect its ‘secret sauce’. Deciding when to fork requires genius: choosing the point and following a different path; linking libraries; developing and supporting rich languages and a consensus model; giving users the opportunity to create Smart Contracts; let alone finding the resources you will need and the user wallet to get things going.
New Forks and Plumbing
There are new forked versions everywhere. A few weeks ago I stumbled on Zooz tokens linked to the Uber business model, which is truly fascinating, and then there are some colleagues at eMunie and Interledger. But I will save this for another day.
The real focus of 2017 will be on Dapps (Distributed Applications), which have the potential to really unlock the full capability of the Blockchain at the process layer, supporting Smart Contracts that direct the logic traffic at the user interface.
Given much of what is discussed above, Blockchain is really the continuing development of the underlying protocol, the infrastructure (the plumbing) that underpins it all. This is where Microsoft and IBM are placing some very big bets, but doing it their way, with their own versions of Ethereum which they fork again and again, it seems. Forking the code is initially about plumbing, and Smart Contracts are the fuel for the boiler, or Dapp.
But why all the versions? Choice. Why all the forking activity to begin with? Because each industry, approach, and architecture focuses on a slightly different pattern of things it wants its version of Blockchain to deliver. In the end, there will be only a dozen or so versions of Blockchain that become dominant as standards are enforced and chains are connected together. A more significant battle than VHS versus Betamax, or Token Ring versus Ethernet, where in both cases the best technology didn’t win.
It has been a Privilege
I have met so many brilliant people in the world of Blockchain, a place where genius is just around the corner; it sits in front of you in a world where a PhD doesn’t get you as far as you may think… A world of polymaths and polyglots, of talent, vision, and guts to energize and amaze you.
They have the courage to build a new world of commerce; they see the future and they want to get there first.
And then there is our version: SmartLedger (www.smartledger.io) deploys a different Blockchain technology stack aimed at Capital Markets, Asset Management and Custody, Insurance and Reinsurance, Broking and Registration business models. We have our reasons!
© Digital BOOM 2016
Author: Nick Ayton
T: @NickAyton @SmartLedger