In 1965 Gordon Moore, who would go on to co-found Intel three years later, wrote a paper observing that the number of individual switches (transistors) on an integrated circuit was doubling roughly every eighteen to twenty-four months. This observation soon became known as Moore's Law, and remarkably the prediction has held true ever since: today's unbelievably fast processors contain roughly twice as many transistors as those of two years ago. The greater the complexity, and the smaller the components, the faster a chip can operate. The trend has other benefits as well; in the case of computer memory, more transistors can be fitted on the same-size memory chip.
The end of Moore's Law has been predicted since at least the early eighties, yet it has never come to pass. Each time a limitation has been approached, engineers have improved their processes and pushed it back. Unfortunately this cannot continue forever. In a previous post, I wrote about how the speed of light is becoming a limitation on the clock speed - and therefore the size - of computer chips.
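To make the speed-of-light point concrete, here is a minimal back-of-the-envelope sketch. The numbers are my own illustration, not from that post, and real on-chip signals travel slower than light in a vacuum:

```python
# How far can a signal travel in one clock cycle, at best?
C = 299_792_458          # speed of light in a vacuum, metres per second

for clock_hz in (1e9, 3e9, 10e9):                # 1, 3 and 10 GHz
    cycle_time = 1 / clock_hz                    # seconds per clock tick
    max_distance_cm = C * cycle_time * 100       # metres -> centimetres
    print(f"{clock_hz / 1e9:>4.0f} GHz: light travels {max_distance_cm:.1f} cm per cycle")
```

At 3 GHz, light covers only about ten centimetres per tick, so a chip much bigger than that cannot hope to stay synchronised within a single clock cycle.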
There are several other limitations, perhaps the most fundamental of which is Heisenberg's uncertainty principle. This is a complex topic that can perhaps best be summarised as follows: it is impossible to know both the position and the momentum of an individual particle with complete certainty; the more precisely one is pinned down, the less precisely the other can be known. If that sounds confusing, then consider the following.
You have a simple light switch. Flick it into one position, and bulb 1 will light. Flick it into the second position, and bulb 2 will light. As long as a mechanical or electrical problem does not interrupt the circuit, you can always guarantee that the correct bulb will light according to the switch position.
Unfortunately, a man called Werner Heisenberg worked out in 1927 that this consistency does not hold at the level of an individual electron or other particle. This does not matter at the scale of a light switch, as other factors massively outweigh the uncertainty. As computer chips get smaller, however, the individual transistors shrink and the uncertainty principle starts to take effect. In terms of the analogy above, you could flick the switch without knowing with any certainty which bulb will light. Obviously this is a very bad thing for chips.
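A rough sketch shows why shrinking matters. Confining an electron to a gate of width dx forces a minimum spread dp in its momentum, via dx * dp >= hbar / 2. The gate widths below are illustrative values I have picked, not figures from any particular process:

```python
HBAR = 1.054_571_817e-34          # reduced Planck constant, J*s
ELECTRON_MASS = 9.109_383_7e-31   # electron rest mass, kg

for gate_nm in (100, 10, 1):
    dx = gate_nm * 1e-9           # gate width in metres
    dp = HBAR / (2 * dx)          # minimum momentum uncertainty
    dv = dp / ELECTRON_MASS       # corresponding velocity spread, m/s
    print(f"{gate_nm:>4} nm gate: velocity uncertainty ~ {dv:,.0f} m/s")
```

Squeezing the gate from 100 nm down to 1 nm multiplies the electron's unavoidable velocity spread a hundredfold, which is exactly the unreliable-switch problem described above.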
Today I came across a short article in the August 2008 Proceedings of the IEEE, entitled 'The Quantum Limits to Moore's Law' (available to subscribers on the IEEE website). In it, the author performs calculations to show when the uncertainty limit will be reached if Moore's Law continues to hold. There is little point in reproducing the equations here, but the end result is noteworthy: even if chip technology were altered to use electron spin as a transistor (a technology demonstrated in labs, but a long way from production), the uncertainty limit would be reached in 2036.
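The flavour of such a projection is easy to sketch, even without the article's equations. The starting size, limit and doubling period below are my own illustrative guesses, not the author's assumptions, which is why the date comes out earlier than the article's 2036:

```python
from math import log2

start_year = 2008
start_nm = 45.0        # roughly state of the art when the article appeared
limit_nm = 0.1         # assumed quantum-limited scale, about an atom's width
doubling_years = 2     # one halving of feature size per Moore's Law period

halvings = log2(start_nm / limit_nm)   # halvings needed to reach the limit
print(f"Limit reached around {start_year + halvings * doubling_years:.0f}")
```

The point of the exercise is that exponential shrinking eats through orders of magnitude quickly: a few decades of steady halving is all it takes to go from today's scale to atomic scale.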
It should be noted that this is a best-case estimate; there are many other physical limitations, such as heat and noise (*), that could stop chips from becoming more powerful before then. As noted above, however, engineers have proved remarkably adept at pushing back these physical limitations.
As you might have gathered, I am fascinated by the ultimate limits of the amazing technology we have today. Perhaps the most important of these is not physical at all, but economic: it may simply cost too much to work around the physical barriers. When this happens, engineers will have to look elsewhere in their never-ending quest for more speed.
(*) There are many types of electronic noise. Particularly important with respect to chips is thermal noise: the noise generated by equilibrium fluctuations of the electric current inside an electrical conductor, which occurs regardless of any applied voltage, due to the random thermal motion of electrons in the conducting medium (from http://thermalnoise.wordpress.com/about/). This noise can cause problems both in the circuit itself and in adjacent circuits.
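For the curious, the size of this effect is given by the standard Johnson-Nyquist formula for RMS noise voltage across a resistance, v = sqrt(4 * k * T * R * bandwidth). The resistance and bandwidth below are illustrative values I have chosen:

```python
from math import sqrt

BOLTZMANN = 1.380_649e-23   # Boltzmann constant, J/K

def thermal_noise_volts(resistance_ohm, bandwidth_hz, temp_k=300.0):
    """RMS thermal noise voltage across a resistor at temperature temp_k."""
    return sqrt(4 * BOLTZMANN * temp_k * resistance_ohm * bandwidth_hz)

# A 1 kilo-ohm conductor observed over a 1 GHz bandwidth at room temperature:
print(f"{thermal_noise_volts(1e3, 1e9) * 1e6:.0f} microvolts RMS")
```

That works out to roughly 129 microvolts, which sounds tiny until you remember that signal voltages shrink along with the transistors themselves.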