Now that we're a good couple of years into the fascination with the potential of blockchain, some breakthrough uses should be popping up, right? Instead, we're starting to see articles like this one from McKinsey that suggest scaling back ambitions, at least in the short run. What gives?
Some of the disappointment may be inevitable. We've all seen the hype curve and know that new technologies, especially ones with as much potential for fundamental change as blockchain, often produce massive expectations, only to then descend into the Slough of Despond for months or years, before reaching their destiny.
Some of the pushback comes because blockchain's limitations are becoming clear—it's computationally intensive enough that it's not well-suited to massive storage of information, for instance—and because alternative technologies can solve many of the problems that were initially assumed to be the province of blockchain.
But if you'll permit me a geeky analogy, harking back to my days covering the world of technology for the Wall Street Journal in the 1980s and 1990s, those driving adoption of blockchain are behaving too much like Unix and not enough like Linux.
Unix is a well-regarded operating system that became positively adored by almost every major computer company not named Microsoft. Windows had achieved a near-monopoly on all but the modest number of personal computers then sold by Apple. Companies needed a way to compete with Microsoft, so they rallied around Unix, a key version of which was in the public domain. The problem—and the lesson for the insurance industry—is that just about every company tweaked that version of Unix.
Something isn't really a standard, is it, if I have my version, you have yours and Sally down the street has another?
Eventually, people in the industry realized they were ceding the key advantage to Microsoft—any program written for Windows could run on any PC-compatible, while programs written for one version of Unix had to be revised before they would run properly on another version. So, the industry formed a consortium to produce a single "kernel" for Unix—and everyone tweaked that, too.
I can't even tell you how many presentations I sat through from IBM, HP, Sun, etc. about how their version of Unix was the best, or how little response I got when I argued that the fragmentation of the Unix effort was going to kill everybody's Unix and keep the market clear for Microsoft.
While insurance isn't showing the knuckleheadedness that I saw in the computer world in the '90s, there still is a lot of fragmentation in the efforts to develop blockchain technology. It's tempting to try to set the standard, because a company that sets the rules usually wins the game. There's a reason Bill Gates is still second on the Forbes list of richest people in the world even though he keeps giving his money away through his and his wife's foundation.
The industry would be much better off with a focused, joint effort to develop the core blockchain technology, at which point the competition could be about building the best uses on top of that technology. This is what happened with Linux. When Linus Torvalds wrote the kernel and released it freely in 1991, development became an open-source project for the entire coder community, not a series of one-off efforts by companies. Competition became about, for instance, developing the best tools for writing apps on top of that operating system, and there was plenty of profit there: Red Hat, for instance, recently agreed to be purchased by IBM for $34 billion. Not Bill Gates money, but I'd take it.
Look at how the telecommunications world collaborates on the standards for each new generation of wireless technology and how much business those new standards create. We'll all end up buying new phones once 5G rolls out; video and gaming companies will find new content and services to sell us; etc.
Blockchain will still have growing pains, but we as an industry can do better.