Enough with the terrorism for now; let's turn our attention to a more interesting topic. I'm sure I don't need to introduce anyone here to Moore's Law, according to which computing power can reliably be expected to double every 2 years or so. This rule of thumb has held up for going on 35 years now, vagaries of CPU architecture notwithstanding, and there's good reason to believe there's still some life left in it yet. Nonetheless, anyone familiar with the amazing rapidity of exponential growth will realize that this is a trend which must come to a halt someday, if for no other reason than that the laws of physics say so. It is therefore worth asking just what fundamental limits those laws impose on us: how fast can computers possibly get, even under ideal conditions, and how densely can we store the information we manipulate with them?
The first time I asked myself this question was about 10 years ago, and one answer I was able to come up with at the time was the Bekenstein bound, which sets a firm upper limit on the amount of information that can be stored in a given region of space containing a given amount of energy; beyond that, I made little further progress. It therefore comes as a very pleasant surprise to run into this page by Michael Frank, who is currently an Assistant Professor at FSU. Not only does Frank give an extensive reading list of research articles one can consult for more information, but he also provides lecture notes of his own, as well as a pointer to this page of related material by Warren D. Smith.
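For reference, since I'm linking to discussions of the bound rather than stating it: the Bekenstein bound says that a sphere of radius R containing total energy E can hold at most 2πRE/(ħc ln 2) bits of information. Here's a minimal back-of-the-envelope sketch in Python; the 1 kg, 1 litre device is an illustrative choice of my own, not anything taken from Frank's or Smith's pages:

import math

# Physical constants (SI units)
hbar = 1.0545718e-34   # reduced Planck constant, J*s
c = 2.99792458e8       # speed of light, m/s

def bekenstein_bound_bits(radius_m, energy_j):
    # Maximum information (bits) in a sphere of the given radius and total energy:
    # I <= 2*pi*R*E / (hbar * c * ln 2)
    return 2 * math.pi * radius_m * energy_j / (hbar * c * math.log(2))

# Illustrative example: 1 kg of mass-energy (E = m*c^2) confined to a 1-litre sphere.
mass_kg = 1.0
energy_j = mass_kg * c ** 2                       # ~9e16 J
radius_m = (3 * 1e-3 / (4 * math.pi)) ** (1 / 3)  # radius of a 1-litre sphere, ~0.062 m

print(f"Bekenstein bound: ~{bekenstein_bound_bits(radius_m, energy_j):.1e} bits")
# prints ~1.6e+42 bits -- enormous, but emphatically finite

That is a number far beyond anything achievable with matter in any ordinary state, but it is still a ceiling, and a ceiling is all the argument needs.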
It may well be that the limits of computation are so great that we are unlikely to approach them at any point in time we can currently envision, but I have a feeling that the opposite is true, and that all the extropians, transhumanists and other assorted techno-boosters are going to find their fondest hopes disappointed in the space of a few decades. At any rate, the seemingly straightforward question of just how fast and how big computers can get turns out to be intimately tied to the question of just what the fundamental laws governing our universe are ...
PS: Check out this paper by Seth Lloyd, and take a look at this blog entry by Sun Distinguished Engineer Jeff Bonwick on why 128 bits really should be enough for any conceivable filesystem.
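For those who don't feel like chasing the links: Lloyd's paper puts the storage capacity of a 1 kg, 1 litre computer running at the quantum limit at roughly 10^31 bits, and Bonwick's argument, as I recall it, is essentially that a fully populated 128-bit storage pool (2^128 blocks of, say, 512 bytes each, i.e. around 2^140 bits) would therefore require a machine whose sheer mass-energy exceeds what it takes to boil the oceans. A rough reconstruction of that arithmetic, with round numbers of my own where I don't have the originals to hand:

# Rough, order-of-magnitude figures only
LLOYD_BITS_PER_KG = 1e31              # ~1e31 bits of storage per kg at the quantum limit (Lloyd)
C = 3e8                               # speed of light, m/s
OCEAN_MASS_KG = 1.4e21                # approximate mass of Earth's oceans
BOIL_J_PER_KG = 4186 * 75 + 2.26e6    # heat water 25->100 C, then vaporize it (J/kg)

# A "full" 128-bit pool: 2**128 blocks at an assumed 512 bytes per block.
pool_bits = 2 ** 128 * 512 * 8                     # ~1.4e42 bits

mass_needed_kg = pool_bits / LLOYD_BITS_PER_KG     # ~1.4e11 kg of matter at the limit
rest_energy_j = mass_needed_kg * C ** 2            # E = m*c^2, ~1.3e28 J
boil_oceans_j = OCEAN_MASS_KG * BOIL_J_PER_KG      # ~3.6e27 J

print(f"mass-energy of the storage medium: ~{rest_energy_j:.1e} J")
print(f"energy needed to boil the oceans:  ~{boil_oceans_j:.1e} J")

The exact constants hardly matter; the point is that the two numbers land within a factor of a few of each other, with the storage pool on the wrong side.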
Here is Feynman on the subject, from years ago. There's plenty of room at the bottom:
http://www.zyvex.com/nanotech/feynman.html
Posted by: eoin | July 13, 2005 at 01:40 PM
Sure there's plenty of room at the bottom, but "plenty" isn't "infinite", and it doesn't take more than a few generations of doubling to make that plenty seem pretty cramped (the rough sketch below this comment puts some numbers on it). The point of looking at the fundamental limits set by the laws of physics is that no new idea, however clever, can evade them: for instance, neither nanotechnology nor quantum computing will enable us to surpass the storage density restrictions imposed by the Bekenstein bound.
Posted by: Abiola | July 13, 2005 at 02:10 PM
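To put some rough numbers on that "few generations of doubling" point: granting a generous ballpark of 10^13 bits per kilogram for today's commodity storage (my own illustrative figure, nothing more), the gap between that and a Bekenstein-limited device of the same mass is only about a hundred doublings, which at Moore's Law rates is a couple of centuries rather than an eternity:

import math

current_bits_per_kg = 1e13        # ballpark for mid-2000s hard disks (illustrative assumption)
bekenstein_bits_per_kg = 1.6e42   # Bekenstein bound for ~1 kg in ~1 litre (see the post above)

doublings = math.log2(bekenstein_bits_per_kg / current_bits_per_kg)
years = 2 * doublings             # one doubling every ~2 years, per Moore's Law

print(f"~{doublings:.0f} doublings, i.e. roughly {years:.0f} years at the current pace")
# ~97 doublings, roughly 194 years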
...limited by the laws of physics that we -currently- know. It doesn't seem likely to me, but there's still that possibility.
Posted by: Mitch | July 13, 2005 at 03:52 PM
Rather than density or speed, most transhumanist singularity types project their /hopes|fears/ onto emergent phenomena arising in complex networks. This might not need individual node power or link bandwidth any higher than present. I personally think it unlikely because of bugs. When somebody demonstrates a millions of loc program that can figure out on its own how to fix a bug provoked by unpredicted input, I might begin to worry a bit.
Posted by: Russell L. Carter | July 13, 2005 at 06:17 PM