BeeLine to the Future: The End of Moore's Law

Robert Bee

In the mid-1960s Gordon E. Moore observed that the number of transistors on a chip doubles roughly every 18 months, with processor speed and memory growing alongside it. The prediction has held so well since then that it has become known as Moore's law; eventually, however, Moore's law will come grinding to a halt because of leakage, heat, and other inescapable physical limits.
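The arithmetic behind that doubling is easy to sketch. The 1971 Intel 4004 baseline of roughly 2,300 transistors is my illustrative assumption, not a figure from the article:

```python
# Illustrative growth under an 18-month doubling period (the figure the
# article uses). The 2,300-transistor baseline (Intel 4004, 1971) is an
# assumption chosen for illustration.

DOUBLING_MONTHS = 18

def transistors(start_count: int, years: float) -> float:
    """Projected transistor count after `years` of 18-month doublings."""
    doublings = years * 12 / DOUBLING_MONTHS
    return start_count * 2 ** doublings

# 2,300 transistors projected 40 years forward:
print(f"{transistors(2300, 40):.3g}")
```

Forty years of 18-month doublings is about 27 doublings, a factor of roughly one hundred million, which is why even a modest stretching of the doubling period compounds into enormous differences over decades.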

In a video the physicist and futurist Michio Kaku comments on the end of Moore’s law by pointing out that over the next ten years we will supplement silicon computing with three-dimensional chips (similar to one Intel recently released), but beyond that we will have to rely on molecular and quantum computers.

Kaku points out that the problem with Moore’s law is that

“a Pentium chip today has a layer almost down to 20 atoms across. When that layer gets down to about 5 atoms across, it’s all over. You have two effects. Heat--the heat generated will be so intense that the chip will melt. You can literally fry an egg on top of the chip, and the chip itself begins to disintegrate. And second of all, leakage--you don’t know where the electron is anymore. The quantum theory takes over. The Heisenberg Uncertainty Principle says you don’t know where that electron is anymore, meaning it could be outside the wire, outside the Pentium chip, or inside the Pentium chip. So there is an ultimate limit set by the laws of thermodynamics and set by the laws of quantum mechanics as to how much computing power you can do with silicon” ( ).

He concludes: “If I were to put money on the table, I would say that in the next ten years we’ll simply tweak Moore’s Law a bit with chip-like computers in three dimensions, but beyond that we may have to go to molecular computers and perhaps late in the 21st century quantum computers” ( ).

The end of Moore’s law is an important issue because futurists who predict the rapid development of technology, or who believe in a hard singularity, rely on the exponential growth of computer technology as the core of their argument. If computer processing capacity stops doubling every 18 months, that could dramatically slow down the future. In the long run, either molecular or quantum computers may replace silicon, but it will not be an easy transition, and we are likely to face speed bumps and obstacles on the road to the future.