Top 10 Tech Breakthroughs in 2023

The development of a free industry standard for designing chips suggests that computing power will keep growing exponentially.

When I transferred to the Wall Street Journal's San Francisco bureau in 1996, the bureau chief had a story all lined up for me. He had sold the front page editor on the notion that Moore's law was dying. After decades during which the computing power of a chip doubled every year and a half or so at no increase in cost, the technology for etching devices on chips seemingly couldn't be improved further. 

Every reporter lives to be on the front page, and that was especially true in those days at the WSJ, where the stories in column one and column six (known as "leders") were the stars of each day's paper. But as I reported the story, I couldn't convince myself that Moore's law was dead. Yes, the current generation of etching technology was hitting the limits of physics, but there were other, albeit highly speculative, approaches that might lead to a breakthrough, and an entire industry needed to make one of them work. So I declined to write the story.

And I'm so glad I did. The question at the time was how much smaller the devices on computer chips could get than a width of 0.35 micron -- a size that already seemed impossibly small. But breakthroughs in etching technology did happen, and the generation of chips now being developed will use 2-nanometer technology, a factor of 175 smaller (350 nanometers down to 2 nanometers).

Imagine if I'd become known as the guy who declared on the front page of the Wall Street Journal in 1996 that progress in chip technology had stopped. Because chips are two-dimensional, that 175-fold improvement in each dimension means today's chips can contain 175 times 175 as many devices on the same-sized semiconductor. That's an improvement by a factor of, oh, 30,625. 
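
For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch in Python. The 350-nanometer and 2-nanometer figures are the ones above; the rest is just multiplication:

```python
# Back-of-the-envelope scaling math from the feature sizes discussed above.

old_feature_nm = 350  # 0.35 micron, the state of the art in 1996
new_feature_nm = 2    # today's 2-nanometer generation

linear_shrink = old_feature_nm / new_feature_nm  # 175x per dimension
area_gain = linear_shrink ** 2                   # chips are 2-D, so square it

print(f"Linear shrink: {linear_shrink:.0f}x")      # Linear shrink: 175x
print(f"Device-density gain: {area_gain:,.0f}x")   # Device-density gain: 30,625x
```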

There's been talk for the past few years that etching technology has, in fact, gone as far as physics will let it and that Moore's law is finally and truly dead. But the MIT Technology Review's annual look at the year's breakthroughs suggests chip technology will continue to improve at a furious pace, to the benefit of the insurance industry and our many customers.

I always enjoy the annual review because it stretches me beyond the sort of thing I typically think about. For instance, this year's list of the top 10 technologies covers the James Webb Space Telescope, which is peering more deeply into the universe than I ever thought possible; the ability to analyze ancient DNA to see what it tells us about our origins; and mass-market drones, which are changing warfare at least as much as machine guns did in World War I and tanks did in World War II.

This year, MIT highlights some developments that have the potential to revolutionize healthcare. The article talks about "organs on demand" -- initially by growing organs in pigs for transplant to humans, and, over time, by ending the need for human donors. The piece also describes the use of CRISPR, the gene-editing tool, to reduce cholesterol. That comes on top of the recent approval of a CRISPR-based treatment that cures sickle cell anemia, suggesting that all sorts of diseases could be cured -- not just treated. (Jennifer Doudna, who won a Nobel Prize for the development of CRISPR, describes the potential here.)

But the development that most directly affects insurance is the one that MIT Technology Review describes as "the chip that changes everything."

The key is that progress can continue at great speed even if the basic chip-making technology stalls. We've already seen the potential with AI. ChatGPT and other large language models are possible because specialized chips, initially designed to render graphics for computer games, have turned out to be some 100 times as good at AI as general-purpose processors. In other words, with zero improvement in the underlying technology, we've still had a 100X improvement for a specialized, very important purpose.
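
To get an intuition for why specialization pays off so handsomely, here's a rough illustration of my own (not from the MIT piece): the matrix multiplication at the heart of large language models runs dramatically faster through NumPy's optimized linear-algebra kernels than through a general-purpose Python loop -- and dedicated AI chips push the same idea much further, in silicon. The exact speedup will vary by machine:

```python
import time
import numpy as np

n = 200  # a modest matrix size keeps the slow version bearable
a = np.random.rand(n, n)
b = np.random.rand(n, n)

# General-purpose route: plain Python loops, one multiply-add at a time.
start = time.perf_counter()
slow_result = [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
               for i in range(n)]
slow = time.perf_counter() - start

# Specialized route: NumPy hands the same job to an optimized BLAS kernel.
start = time.perf_counter()
fast_result = a @ b
fast = time.perf_counter() - start

print(f"Python loops: {slow:.2f}s, optimized kernel: {fast:.5f}s, "
      f"speedup: {slow / fast:,.0f}x")
```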

MIT says that kind of improvement is now available to just about everybody because of a free, industry-standard tool that lets anyone easily design a chip for any purpose. Even if progress does finally stall on the basic technology for chips -- 30,625 times beyond where it was when I was encouraged to declare it dead -- we've only just begun optimizing that technology for the uses that matter to us.
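
The idea is easiest to see in miniature. Below is a toy sketch of my own, loosely modeled on two instructions from RISC-V, the open, royalty-free instruction standard of the kind MIT describes. Because the standard is published and free, anyone can build something that executes it -- a software simulator like this one, or an actual chip:

```python
# A toy illustration (mine, not MIT's) of what an open instruction standard
# means in practice. The two instructions below are modeled loosely on
# RISC-V's ADDI and ADD; a real implementation covers the full specification.

def run(program):
    regs = [0] * 32  # 32 general-purpose registers, as in RISC-V
                     # (register 0 is treated as a constant zero)
    for op, *args in program:
        if op == "addi":   # addi rd, rs, imm -> regs[rd] = regs[rs] + imm
            rd, rs, imm = args
            regs[rd] = regs[rs] + imm
        elif op == "add":  # add rd, rs1, rs2 -> regs[rd] = regs[rs1] + regs[rs2]
            rd, rs1, rs2 = args
            regs[rd] = regs[rs1] + regs[rs2]
    return regs

# Compute 5 + 7 the long way: load two constants, add them into register 3.
print(run([("addi", 1, 0, 5),
           ("addi", 2, 0, 7),
           ("add", 3, 1, 2)])[3])  # prints 12
```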

Cheers,

Paul