The End of Moore
Moore's Law governs the advancement of technology in the computer industry. The original statement is rather technical and talks about transistor densities, but the commonly accepted snappy, quotable version is something along the lines of "computer power doubles every two years".
And it applies to pretty much everything in computers. A hard drive that you buy in two years is likely to be twice the size of one you buy now. Computer processors are likely to be twice as fast then too. Memory will be twice as plentiful, flash drives twice as big and internet connections twice as fast. So far, Moore's Law has held firm for more than forty years - with just the occasional plateau as some limit is reached that requires a little extra thought to get around.
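To put a number on that, here is a minimal sketch of the doubling arithmetic. The 500 GB starting drive is purely an illustrative figure, not a claim about any particular product:

```python
# Project a capacity forward under Moore's Law: it doubles every two years.
def moores_law(starting_value, years, doubling_period=2):
    return starting_value * 2 ** (years / doubling_period)

# A hypothetical 500 GB drive bought today, projected ten years out:
print(moores_law(500, 10))   # 16000.0 GB - five doublings, thirty-two times bigger
```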
However, Moore's law is coming to an end.
Yes, yes. People have said so before and Moore's Law has always prevailed past whatever limitation seemed to be looming. However, this is different. This time the stopping point won't be a technical problem; it will be us. The precedent is there. Moore's Law has already ground to a halt in some areas.
Sound cards for computers used to follow Moore's Law. They started out pretty miserable and got better and better over time. As with any computer hardware, you had to keep buying new ones to keep up, and the price was high. Sound cards cost upwards of two hundred dollars (Australian).
Moore's Law pushed sound cards not only to be better but, through the sheer momentum of the upgrade cycle, to be better than the human ear is capable of distinguishing. One day, someone realised that the difference between their new sound card and their old one was only audible to dogs. Sound cards were as good as they needed to be - limited not by the technology, but by the needs of the users. Moore's Law shuddered to a standstill and, beyond specialist needs - sound cards for professional musicians and so forth - that was that.
Colour on video cards followed suit. All video cards now use thirty-two-bit colour. Without getting technical, allow me to just say that this means they are capable of displaying around sixteen million different colours. Unfortunately, the human eye can only manage to distinguish about ten million. Do we need to go any higher? Not really, no. Scanners have pretty much topped out, too, and so have bubblejet printers. We simply don't need an everyday consumer-level scanner or bubblejet to be any better than it already is.
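For anyone who does want the technical detail, the sixteen-million figure falls straight out of the bit count. Thirty-two-bit colour typically spends eight bits each on red, green and blue (the remaining bits go to transparency), so:

```python
# Eight bits each for red, green and blue gives the number of displayable colours.
colours = 2 ** (8 * 3)
print(colours)   # 16777216 - about sixteen million, versus the roughly ten million the eye can tell apart
```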
And computers are rapidly approaching the same point. How much hard drive space do you really need? Once we have enough to store, oh, let's say, four hundred high-definition movies, is there really much need to go beyond that? And what about processor speed?
Most people will say computers can always use extra speed. However, as I type this, I'm running two operating systems simultaneously on the same mid-range desktop computer (Mac OS X Leopard and Windows XP SP2 on an Apple iMac). Both operating systems can run Photoshop, probably one of the most processor-hungry applications in common use, and they can even both run it at the same time. More impressively, I can switch between the two operating systems (and the two Photoshops) instantly with a single key press.
I think that's probably enough processor speed for now. Even power users are unlikely ever to need that much software running simultaneously across two operating systems. An average home user could get by on a quarter of that power to surf the web, dabble in Office and check their email.
Computer games, though... Computer games can always use more power, surely?
Take a recent game like Crysis, an amazing-looking title with highly detailed environments that you can interact with in very impressive ways. You can, for example, realistically demolish buildings or flatten tracts of jungle, splintering tree trunks with bullets and watching them topple. One enterprising person on the web has even stacked up 3,000 barrels and filmed them toppling. That's pretty realistic as games go. It's not perfect yet, but it's not far off. Double the processor speed just once (which would take it up to around seven gigahertz) and games like Crysis will be the norm.
So, how many more times do we need to double it? If Crysis can already create a reasonably realistic, detailed environment on a 3.4 gigahertz computer, surely we must be getting close to speed being irrelevant? I mean, realistic is... well, realistic. There's nothing more to do after that.
So... seven gigahertz? Ten? Twenty? Twenty is nearly six times what Crysis is using now. That's going to make for a pretty real game. Games programmers are exceptionally good at finding new ways to use up processing power, and I doubt they'll let go of Moore's Law without a fight, but there is definitely an upper limit here somewhere. There is a point where there is no further purpose in increasing the power or capacity of computer hardware, even for games.
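For what it's worth, even twenty gigahertz is closer than it sounds. A rough back-of-the-envelope count, taking the 3.4 gigahertz figure above and the two-year doubling rule from earlier:

```python
# How many doublings from 3.4 GHz to 20 GHz, and how long at two years per doubling?
import math

start, target = 3.4, 20.0
doublings = math.log2(target / start)
print(round(doublings, 1))       # about 2.6 doublings
print(round(doublings * 2, 1))   # roughly five years at Moore's Law pace
```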
And what happens then? Well, the same thing that happened to sound cards, scanners and bubblejet printers. The bottom will drop out of the market, prices will plummet and computers will become commodity items. I can buy a sound card for twenty bucks now (again, Australian) - ten percent of their old price. Off the top of my head, it seems reasonable to expect computers to drop by the same proportion, and ten percent of what a decent computer costs today is only a few hundred dollars.
It might be another ten years yet, but eventually there will be a three-hundred-dollar computer that does everything you will ever want of it.
