This Is the End

Moore’s Law is the observation by Gordon Moore that the number of transistors on a microchip doubles roughly every two years while the cost per transistor declines. It has fueled the rapid rise of technology in our daily lives. Many of the advances in so-called “artificial intelligence” are actually byproducts of the increasing power and declining price of hardware rather than improvements in the algorithms that make up the field of artificial intelligence. Moore’s Law is over.
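To get a feel for what that doubling implies, here is a minimal back-of-the-envelope sketch in Python (the starting point, roughly 2,300 transistors on the 1971 Intel 4004, is used purely for illustration):

    # Moore's Law as arithmetic: N(t) = N0 * 2**(t / 2),
    # i.e., the transistor count doubles every two years.
    N0 = 2300  # Intel 4004 (1971): roughly 2,300 transistors
    for year in range(1971, 2022, 10):
        t = year - 1971
        print(f"{year}: ~{N0 * 2 ** (t / 2):,.0f} transistors")

Fifty years of doubling takes that 2,300 to tens of billions, which is roughly where today’s largest chips sit. That is the exponential the rest of this post is about losing.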

From an article at MIT Technology Review:

“It’s over. This year that became really clear,” says Charles Leiserson, a computer scientist at MIT and a pioneer of parallel computing, in which multiple calculations are performed simultaneously. The newest Intel fabrication plant, meant to build chips with minimum feature sizes of 10 nanometers, was much delayed, delivering chips in 2019, five years after the previous generation of chips with 14-nanometer features. Moore’s Law, Leiserson says, was always about the rate of progress, and “we’re no longer on that rate.” Numerous other prominent computer scientists have also declared Moore’s Law dead in recent years. In early 2019, the CEO of the large chipmaker Nvidia agreed.

In truth, it’s been more a gradual decline than a sudden death. Over the decades, some, including Moore himself at times, fretted that they could see the end in sight, as it got harder to make smaller and smaller transistors. In 1999, an Intel researcher worried that the industry’s goal of making transistors smaller than 100 nanometers by 2005 faced fundamental physical problems with “no known solutions,” like the quantum effects of electrons wandering where they shouldn’t be.

For years the chip industry managed to evade these physical roadblocks. New transistor designs were introduced to better corral the electrons. New lithography methods using extreme ultraviolet radiation were invented when the wavelengths of visible light were too thick to precisely carve out silicon features of only a few tens of nanometers. But progress grew ever more expensive. Economists at Stanford and MIT have calculated that the research effort going into upholding Moore’s Law has risen by a factor of 18 since 1971.

Likewise, the fabs that make the most advanced chips are becoming prohibitively pricey. The cost of a fab is rising at around 13% a year, and is expected to reach $16 billion or more by 2022. Not coincidentally, the number of companies with plans to make the next generation of chips has now shrunk to only three, down from eight in 2010 and 25 in 2002.
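One aside on that 13% figure before continuing: growth compounding at 13% a year doubles roughly every five and a half years, which two lines of Python confirm:

    import math
    # Doubling time under 13% annual growth: ln(2) / ln(1.13)
    print(math.log(2) / math.log(1.13))  # ~5.67 years

That helps explain why the number of companies still in the game keeps shrinking.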

If we are to see improvements in technology like those we have seen over the last 40 years, they will need to come from more efficient software. That in turn means a transition from the present emphasis on producing software at the lowest possible cost to producing the best possible software.

6 comments
  • CuriosiOnlooker

    There was a lot of mismanagement by Intel – they squandered a 2-3 year lead on the rest of the industry.

    TSMC and Samsung didn’t have issues and continued to make progress for the past 5 years, and it looks like they will make progress for 5-10 more (which is as far ahead as the industry can see).

    As for software, it is already moving in that direction. The shift to the public cloud results in more efficient resource utilization.

  • Guarneri

    So it appears that the grind-it-out engineering improvements are converging with the limits of physics. The highway is, plain and simple, clogged?

    I don’t know jack about software, but I take it that your point is essentially: accomplish the task with fewer cars on the highway, because the highway is done.

  • Software consumes resources. Among these are main memory (RAM), storage (hard drive space), and execution time. Modern software frequently uses these resources profligately, a practice called “bloat”. Efficiency just hasn’t been that important because developers could always expect the next generation of hardware to make up for software bloat.

    If Moore’s Law truly ends, that expectation will no longer hold. The sketch below illustrates the difference between profligate and frugal code.
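    To make “bloat” concrete, here is a minimal Python sketch (mine, with a hypothetical line-counting task, not anything from the post). Both functions do the same job, but the first holds the entire file in RAM while the second streams it:

        # Profligate: reads the whole file into memory before counting.
        def count_lines_bloated(path):
            with open(path) as f:
                return len(f.readlines())  # every line held in RAM at once

        # Frugal: streams the file, holding only one line at a time.
        def count_lines_lean(path):
            with open(path) as f:
                return sum(1 for _ in f)   # memory use stays flat

    On yesterday’s hardware curve the difference rarely mattered; if hardware stops improving, it will.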

  • TarsTarkas

    A big barrier to circuit miniaturization is heat. The breakthrough a lot of people have been hoping and waiting for for many years is superconductors that work at least at liquid-nitrogen temperatures under normal atmospheric pressure. That’s been stalled for years, and although some progress has been made fabricating materials that superconduct at up to -70 °C, those compounds have to be squeezed to millions of atmospheres before they will superconduct.

  • That’s been just around the corner since I was in grad school, which is now a half century ago.

    I should make a list. Quite a few things have been said to be “just around the corner” since then: practical nuclear fusion, quantum computers (those may actually be just around the corner).

  • Guarneri

    Sounds like a good time to be a clever software guy, or venture capitalist.
