The Singularity Q&A

Q: How does Moore's Law work?

A: In the mid-1960s, Gordon Moore -- who would later co-found Intel -- noticed that the number of transistors per square inch on a computer chip had been doubling every year for some time, and he predicted that advances in manufacturing would continue to produce similar steady gains.  As the industry became more established, the rate settled to about 18 months per transistor doubling -- a pace which soon came to officially define Moore's Law, and which has held true for the last three decades. 

The most noticeable effects of Moore's Law are smaller, cheaper, more energy-efficient, and, of course, faster computers.  In fact, each halving of transistor size yields roughly a quadrupling of computing power, for reasons that will be explained shortly.

But first, we should discuss why Moore's Law holds up as well as it does.  It is, after all, just a prediction; there is no law of cosmic physics shrinking the transistors in your computer, merely human engineers finding creative new ways to refine their product.

One reason is purely economic.  Moore's Law has held up for a long time in a very competitive industry; corporations that fail to steadily produce faster chips run the grave risk of being outflanked by a more aggressive or paranoid firm.  And so, each company attempts to keep pace with the Law -- or pull a little ahead of it -- which results in a leapfrogging progression of chip designs lining up very nicely with Moore's prediction.

But there must also be scientific reasons why engineers are continuously able to shrink their transistors.  Indeed, there are.  As small as transistors are -- and they are already the most microscopically tiny devices ever assembled by the human race -- they have plenty of room to become smaller.  Theory and early experiments suggest that computation can be performed within single atoms or even smaller particles, while today's transistors are still composed of millions of atoms.  We cannot jump straight to molecular computing with the tools we have now, but the faster computers and specialized devices produced by each phase of engineering consistently enable engineers to design the next phase, in a spiral of ever-shrinking designs.

Lest we glorify chip engineers too much, it should be mentioned that the natural consequences of shrinking transistor size do most of the work.  Why?  Because as transistors shrink, so does the time it takes them to perform their switching operations.  Smaller transistors, as a rule, are faster and more energy efficient, and can therefore do more work per given unit of time and energy.  Shrinking the transistors by 50% on a chip of a given size roughly quadruples the number that fit -- and since each one also switches faster, the total computing power grows even more. (And don't forget the power savings per calculation!)
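The arithmetic above can be sketched in a few lines of code.  This is only a back-of-the-envelope model with made-up numbers -- it assumes transistor count grows with the inverse square of the linear shrink and switching speed with the inverse of it, ignoring all real-world engineering details:

```python
def scaled_chip(transistors: int, clock_hz: float, linear_shrink: float):
    """Scale a hypothetical chip's stats when each transistor's linear
    size is multiplied by `linear_shrink` (e.g. 0.5 for a 50% shrink).
    Count grows with area (1/shrink^2); speed grows with 1/shrink."""
    count = int(transistors * (1 / linear_shrink) ** 2)
    speed = clock_hz * (1 / linear_shrink)
    return count, speed

# Halving transistor size quadruples the count and doubles the speed:
count, speed = scaled_chip(1_000_000, 33e6, 0.5)
print(count)  # 4000000
print(speed)  # 66000000.0 (33 MHz -> 66 MHz)
```

The 1,000,000-transistor, 33 MHz starting chip is purely illustrative; the point is that the count and speed gains multiply together.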

Here is an analogy that works because its arithmetic is identical to that used to estimate the performance of a simple, hypothetical computer chip:

Suppose you have an ice-cube tray, and your goal is to produce as many ice cubes as possible without changing the size of the tray -- only the size of the boxes where the cubes will form.  Here is your word problem for the day:

You start with boxes small enough that you can pack them in 10 high and 10 wide.  This 10 x 10 grid gives you 100 ice cubes.  How many boxes high and wide would you need to squeeze in if you hoped to produce at least 200 ice cubes? 

If 20 x 20 came to mind... bzzzt!  The correct answer, of course, is 15 x 15, which will produce 225 cubes.  Hence, shrinking the size of each box by a third will give you more than double the number of ice cubes.  If you managed to shrink them by 50%, as would be the case in a 20 x 20 grid, you could make 400 ice cubes -- quadruple the original number.

For the fun of it, we can take this analogy into three dimensions, to simulate what might happen if or when engineers are able to construct chips using a fully three-dimensional process -- as opposed to the layered, essentially two-dimensional "wafering" done now.  As you will see, the effects of transistor size reduction are even more pronounced:

A 10 x 10 x 10 ice machine -- 10 x 10 trays stacked 10 high -- gives you 1,000 ice cubes.  But shrinking each cube by 33% to make a 15 x 15 x 15 machine would more than triple the number of cubes produced: 3,375.  A 50% reduction in cube size yields 8,000 cubes -- eight times the original number!
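The ice-cube word problems above can be checked directly with a one-line helper -- raising the per-side count to the number of dimensions of the tray:

```python
def cubes(per_side: int, dimensions: int = 2) -> int:
    """Ice cubes produced by a fixed-size tray with `per_side` boxes
    along each axis, in the given number of dimensions."""
    return per_side ** dimensions

print(cubes(10))      # 100
print(cubes(15))      # 225  -- a one-third shrink more than doubles
print(cubes(20))      # 400  -- a 50% shrink quadruples
print(cubes(10, 3))   # 1000
print(cubes(15, 3))   # 3375
print(cubes(20, 3))   # 8000 -- eight times the original
```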

Three-dimensional engineering is just one of a number of possible ways Moore's Law could be extended once current techniques peter out -- as they are expected to do at some point, thanks to the "strange" physics that begin to come into play at smaller scales.  In fact, although many futurists and industry specialists say that the impending need for a completely different manufacturing approach means Moore's Law can only be "assured" for another five or ten years, very few of these prognosticators doubt that a new method will pick up the baton; by one tally of computing history, it has already been passed five times.

Naturally, when you keep doubling a number at regular intervals, you end up with a nifty exponential progression, similar to the one discussed in the previous question.  Over periods of a decade or so the results can be pretty astonishing.  If you have been using computers for at least ten years, wax nostalgic with me for a moment.

In 1992 you might have used a computer running at about 33 MHz.  (Megahertz isn't quite the same thing as computing power, but it is a useful measure of speed which tends to double along with transistor density.) By 2002, in contrast, you easily could have used a computer running closer to 2 gigahertz (2,000 MHz) -- a roughly sixty-fold increase, in the neighborhood of the hundred-fold gain (about 6.67 doublings) that Moore's Law predicts over ten years at one doubling per 18 months.  The same formula predicts you will have a machine of roughly 200 gigahertz (200,000 MHz) in 2012, and around 20 terahertz (20 million MHz) by 2022.
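This extrapolation is easy to reproduce.  The sketch below simply compounds doublings at the classic 18-month period; the starting figures match the 2 GHz machine of 2002 mentioned above, and the results are projections of the formula, not predictions about real hardware:

```python
def projected_mhz(base_mhz: float, base_year: int, target_year: int,
                  doubling_years: float = 1.5) -> float:
    """Clock speed projected forward from a base year, assuming one
    doubling every `doubling_years` years (18 months by default)."""
    doublings = (target_year - base_year) / doubling_years
    return base_mhz * 2 ** doublings

# Starting from a 2,000 MHz (2 GHz) machine in 2002:
print(round(projected_mhz(2000, 2002, 2012)))  # ~203,000 MHz (about 200 GHz)
print(round(projected_mhz(2000, 2002, 2022)))  # ~20,600,000 MHz (about 20 THz)
```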

But no discussion of computer improvements would be complete without mentioning all of the related innovation in the industry, much of which has outpaced Moore's Law by a considerable margin.  My computer is about 50 times faster than it was ten years ago, but it has about 60 times as much memory and 200 times as much hard drive space.  The writeable CD-ROM discs I use each hold more than 400 times the data of an old floppy disk.  My computer can also read DVD discs, which hold more than 2,400 times as much as those old floppies!

Finally, since this is a Singularity Q&A, I should mention that Moore's Law predicts the day when computers will be able to match -- and then quickly exceed -- the number of operations per second performed by the human brain.  Depending on whose estimate of brain activity you use, the date when the world's most powerful computer will achieve this feat could be as early as 2004 or as late as 2024.  But in any case, the event is purely symbolic; brains and computers perform very different types of operations, a point I expect to belabor exhaustively in upcoming responses to questions on Artificial Intelligence.

[Back to Futurism] [Back to Main]  ©2002 by Mitchell Howe