Saturday, March 7, 2009

Living Moore's Law

We're reaching the limit of Moore's Law. I'm living it.

Just in case you don't know, Moore's "Law" is by no means a physical law; it's an observation made in 1965 by Gordon Moore, then at Fairchild Semiconductor and later a co-founder (and eventually CEO) of Intel. He noticed that the number of transistors being crammed onto a chip was doubling roughly every two years. He predicted that this would continue, and he was right.
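That kind of compounding is easy to underestimate. Here's a minimal sketch of the arithmetic, starting from a real data point (the 1971 Intel 4004 had roughly 2,300 transistors) and simply carrying the doubling assumption forward:

    #include <cstdio>

    int main() {
        // Start from a real data point: the Intel 4004 (1971) had ~2,300 transistors.
        double transistors = 2300.0;
        for (int year = 1971; year <= 2009; year += 2) {
            std::printf("%d: ~%.0f transistors\n", year, transistors);
            transistors *= 2.0; // Moore's observation: double every two years
        }
        return 0;
    }

Nineteen doublings between 1971 and 2009 works out to a factor of about half a million, which is why the prediction sounds absurd when extended over decades - and yet it held.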

In practice, that observation means we all get more powerful computers for less money every year.

I bought my first personal computer for home in 1994. I had taught a graduate mathematics course during summer session as adjunct faculty, and the two thousand dollars I earned by working all summer to come up with notes, homework, exams and grades was earmarked from day one. I chose a Gateway 2000 - how quaint. I'm fuzzy on the details now, but I believe it had something on the order of a 90MHz CPU, 8MB of RAM, and a 500MB hard drive. It seemed like a supercomputer compared to the VAX machines I shared with the rest of my department when I started my engineering career. It ran Windows 3.1 for its operating system and came with a heavy CRT monitor. I had Microsoft Office and the Borland C++ development environment. It was the machine I used when I started taking computer science classes. I was used to Unix workstations, so the limited memory and disk made software development at home painful. The hard drive was soon full; I had to buy a 1GB slave drive to extend the 500MB master. By the time I was able to replace the machine in 1998 I couldn't wait to get rid of it.

The successor came from Dell. It ran Windows 95 on a CPU that was 5X faster than the Gateway's. The hard drive was a green field; it seemed impossible to ever fill up a device that huge. The monitor was still a big tube affair, but the view was more expansive. I was able to do all sorts of things - for a while. As time went on, the accumulated cruft made the wait times interminable. My work machines were still far superior to my kit at home. I took all the usual steps: I upgraded the memory, I added a slave hard drive. The few service calls I had to make to Dell were infuriating. Never again!

After seven years I couldn't bear it anymore. I was working for an employer who gave me a bonus, so I earmarked some of that money and bought an HP. It had a dual-core 2.2GHz AMD CPU with 4GB of RAM and a 120GB hard drive, running Windows XP. It's the machine I'm writing this blog on today.

Here's where Moore's Law comes in: four years later I'm still happy with this machine. The hard drive is augmented by Passport external drives, but it's still doing fine. The memory is the most the machine can use without moving to a 64-bit operating system. The specs compare favorably in every way to the machine I use at work, so there's little drop-off when I switch between the two locations.

I could go out and buy a netbook or a laptop today for a price that would have made me weep years ago. I'm sure there are blazing machines for gaming and video production that would eclipse my current setup. But the curve has flattened out. I'm not feeling too far behind after four years. I don't hate the machine. I don't have that sense that if I don't upgrade right away I'll burst.

What happened?

Heat is the great enemy. A chip's dynamic power grows in proportion to its clock frequency and to the square of its supply voltage, and higher clock speeds generally demand higher voltages, so pushing the clock up makes the heat climb steeply.
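Here's a minimal sketch of that relationship, using the standard CMOS dynamic-power approximation P ≈ a·C·V²·f. The activity factor, capacitance, voltage, and frequency figures below are made-up illustrations, not any real chip's specs:

    #include <cstdio>

    // Standard CMOS dynamic-power approximation: P = a * C * V^2 * f,
    // where a = activity factor, C = switched capacitance,
    // V = supply voltage, f = clock frequency.
    double dynamicPowerWatts(double a, double capacitanceFarads,
                             double voltageVolts, double frequencyHz) {
        return a * capacitanceFarads * voltageVolts * voltageVolts * frequencyHz;
    }

    int main() {
        // Illustrative values only.
        double a = 0.2, C = 100e-9; // 100 nF of switched capacitance
        std::printf("1.0 GHz @ 1.2 V: %.0f W\n", dynamicPowerWatts(a, C, 1.2, 1.0e9));
        std::printf("2.0 GHz @ 1.5 V: %.0f W\n", dynamicPowerWatts(a, C, 1.5, 2.0e9));
        return 0;
    }

In this toy model, doubling the clock while nudging the voltage from 1.2V to 1.5V roughly triples the power drawn - and nearly all of it leaves the chip as heat.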

We're arguably running up against physical law. Decreasing the distance between transistors makes it harder to avoid unwanted interactions, and it is terribly expensive. Last I heard it cost Intel $1B to open a new factory with a new process; I'm sure that figure is higher now.

The marginal utility of faster CPUs is decreasing because the things we use computers for don't tax the processor very much. Most of the time it's doing nothing, waiting for something to operate on. The trend instead is to put more CPU cores on the chip and divide the work between them. Software developers will have to become more adept at writing multi-threaded code; otherwise all those extra cores will sit idle.
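As a minimal sketch of what that shift looks like in practice, here's a sum over a large array split across one thread per core using C++'s standard threading facilities (the array size here is arbitrary):

    #include <cstdio>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        // A big array of values to sum; the size is arbitrary.
        std::vector<double> data(10'000'000, 1.0);

        unsigned n = std::thread::hardware_concurrency(); // one worker per core
        if (n == 0) n = 2;

        std::vector<double> partial(n, 0.0);
        std::vector<std::thread> workers;

        // Give each thread its own contiguous slice and its own output slot,
        // so no locking is needed.
        std::size_t chunk = data.size() / n;
        for (unsigned i = 0; i < n; ++i) {
            std::size_t begin = i * chunk;
            std::size_t end = (i + 1 == n) ? data.size() : begin + chunk;
            workers.emplace_back([&, i, begin, end] {
                partial[i] = std::accumulate(data.begin() + begin,
                                             data.begin() + end, 0.0);
            });
        }
        for (auto& w : workers) w.join();

        double total = std::accumulate(partial.begin(), partial.end(), 0.0);
        std::printf("sum = %.0f using %u threads\n", total, n);
        return 0;
    }

The design point worth noticing: each thread writes only to its own output slot, so no locks are needed. That kind of planning is exactly what single-threaded code never asked of us.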

Even number-crunching applications like scientific computing are finding that memory, and the bus that carries data to the CPU, is the bottleneck. Something fundamental will have to change.
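Here's a minimal sketch of what "memory-bound" means (the array size is arbitrary and actual timings depend entirely on the machine): the same pass over a large array is timed twice, once doing almost no arithmetic per element and once doing considerably more. When the data won't fit in cache, the two times are often surprisingly close, because fetching the data dominates.

    #include <chrono>
    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Time a full pass over the data, applying f to each element.
    template <typename F>
    double timedPass(const std::vector<double>& data, F f) {
        auto start = std::chrono::steady_clock::now();
        double sink = 0.0;
        for (double x : data) sink += f(x);
        auto stop = std::chrono::steady_clock::now();
        std::printf("checksum %.0f, ", sink); // keep the loop from being optimized away
        return std::chrono::duration<double>(stop - start).count();
    }

    int main() {
        std::vector<double> data(50'000'000, 1.5); // ~400MB, far bigger than cache

        // Pass 1: almost no arithmetic per element - limited by memory bandwidth.
        double t1 = timedPass(data, [](double x) { return x; });
        std::printf("light work: %.2f s\n", t1);

        // Pass 2: noticeably more arithmetic per element.
        double t2 = timedPass(data, [](double x) { return std::sqrt(x) * 2.0 + 1.0; });
        std::printf("heavier work: %.2f s\n", t2);
        return 0;
    }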

It's bad news for the computer manufacturers out there. I think my requirements and expectations as a software developer are far beyond those of the general population that uses computers for little more than Internet access, e-mail, and doing their taxes. If my experience is typical, and other people don't have that feeling of "gotta buy or I'll explode", one of the most compelling reasons for buying computers is dead.

Corporate sales must be plummeting. My employer has decided to bypass Vista altogether and wait to see how Windows 7 turns out. Until then, we'll all be running Windows XP on the desktop. No profits, no capital investment.

Progress continues to be made, but the rate of change has declined. Where will it end?

1 comment:

Unknown said...

When CPU speeds were increasing rapidly, it seemed like the PC in your home was becoming slower. And in a way, it was. It wasn't just envy from knowing that a faster PC was available: software vendors raised their expectations, effectively making your computer slower.

Something similar has happened with bandwidth. A 56k modem is effectively slower now than it used to be, because web sites serve images that assume you have high bandwidth.