It has lately occurred to me that I cannot live without the Internet for even one day, and perhaps not without the cell phone either. Most of you probably feel the same way. This makes me worry about the future, when the silicon era comes to an end for technological or other reasons. According to many scientists, the silicon era will end within our generation, in 15 to 20 years.
The great advance in electronics technology has been driven by miniaturization: packing more and more transistors and circuits onto a small piece of silicon wafer called a chip. Current technology allows printing hundreds of millions of transistors onto a chip the size of your fingernail. The result is a doubling of computing power every 18 months, as described by Moore's law (Gordon Moore is a co-founder of Intel Corporation). Can this continue indefinitely? If not, what will replace silicon?
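To get a feel for what that doubling rate implies, here is a quick back-of-envelope sketch (my own illustration, not a figure from the article) of Moore's-law growth:

```python
# Illustrative arithmetic only: project transistor counts under
# Moore's law, i.e. a doubling every 18 months.
def moores_law(initial_transistors, years, doubling_months=18):
    """Return the projected transistor count after `years` years."""
    doublings = (years * 12) / doubling_months
    return initial_transistors * 2 ** doublings

# Starting from 100 million transistors, one decade of doublings
# multiplies the count by more than a hundred:
print(f"{moores_law(100_000_000, 10):,.0f}")
```

Ten years is 120 months, or about 6.7 doublings, so the count grows by a factor of roughly a hundred per decade, which is why even a modest slowdown in the doubling rate matters so much.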
One obvious limiting factor is heat. Increasing circuit density means more heat is generated on the silicon chip, until a point is reached where the chip melts. A new cooling system will have to be invented to replace the present one, a small electric fan.
The other limiting factor has to do with the laws of physics. Current manufacturing technology employs ultraviolet light, which can potentially carve a transistor as small as 30 atoms across onto the silicon wafer. When transistors reach that minute size, electrons begin to behave differently, according to the laws of quantum physics. The Heisenberg uncertainty principle states that you cannot precisely determine both the position and the velocity of an electron at the same time. This means that an electron passing through the transistor may leak out, causing a short circuit. In other words, nature puts a limit on the minimum size of the transistor.
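To see roughly why 30 atoms is where quantum effects take over, here is a rough estimate I have added (the 0.2-nanometre atom spacing is my assumption, not a figure from the article) applying the uncertainty principle to an electron confined in such a transistor:

```python
# Back-of-envelope estimate: the Heisenberg uncertainty principle,
# dx * dp >= hbar / 2, applied to an electron confined in a channel
# about 30 atoms across.
HBAR = 1.054_571_817e-34   # reduced Planck constant, J*s
M_E = 9.109_383_7e-31      # electron mass, kg
ATOM_SPACING = 0.2e-9      # assumed spacing between atoms, metres

dx = 30 * ATOM_SPACING     # confinement width: about 6 nanometres
dp = HBAR / (2 * dx)       # minimum momentum uncertainty
dv = dp / M_E              # corresponding velocity uncertainty

print(f"dx = {dx * 1e9:.0f} nm, dv >= {dv:,.0f} m/s")
```

The velocity uncertainty comes out on the order of ten kilometres per second: a confined electron is so jittery that it can leak right out of the transistor, which is the short-circuit problem described above.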
The two limiting factors described above present a real hardware problem for Moore's law to keep delivering twice the computing power every 18 months.
With the future slowdown of Moore's law on silicon chips, some experts suggest using software to compensate, that is, employing parallel processing techniques in software writing. As you know, the human brain does a lot of parallel processing, such as talking, thinking, and listening to music while driving, sometimes even texting too! However, the human brain does not depend on software to function. Its complexity evolved over millions of years to arrive at its present stage. Software programmers do not have this luxury of time to perfect parallel processing before the silicon era expires. Therefore, the solution has to come from hardware, not software.
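For readers curious what software parallelism looks like in practice, here is a minimal toy sketch (my own example, not from the article) that splits one job into pieces handled by several workers at once:

```python
from concurrent.futures import ThreadPoolExecutor

# A toy sketch of software parallelism: split one job into chunks
# and hand them to a pool of workers running at the same time,
# loosely like the brain handling talking and driving at once.
# (Illustrative only; real speed-ups need carefully written code.)
def count_vowels(text):
    """Count the vowels in one chunk of text."""
    return sum(ch in "aeiou" for ch in text.lower())

chunks = ["talking", "thinking", "listening to music", "driving"]
with ThreadPoolExecutor(max_workers=4) as pool:
    counts = list(pool.map(count_vowels, chunks))

print(counts, sum(counts))
```

Even this tiny example hints at the difficulty: the programmer must decide how to split the work, coordinate the workers, and combine the results, which is exactly the part the brain does effortlessly and software does not.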
Some companies are presently researching the creation of atomic or molecular transistors. The substances under study include rotaxane, benzenethiol, and graphene. Graphene, an extremely thin sheet of carbon atoms, shows the most promise. The biggest hurdle is how to wire such transistors up and mass-produce the circuits for computing purposes. The uncertainty principle remains a nuisance at the atomic level, but some new standards could be devised using probability theory (expressed in percentages) instead of the absolute 1's and 0's of current digital signals.
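The article only gestures at what a probability-based standard might mean; one hypothetical way to picture it (my own sketch, not an actual proposal) is a transistor that reports the correct bit only some percentage of the time, with the true 0 or 1 recovered by sampling it repeatedly and taking a majority vote:

```python
import random

# Hypothetical illustration: if an atomic transistor reports the
# correct bit only with probability p_correct, sample it many times
# and take a majority vote to recover a reliable 0 or 1.
def read_noisy_bit(true_bit, p_correct, samples, rng):
    """Sample a noisy transistor `samples` times; return the majority."""
    votes = sum(
        true_bit if rng.random() < p_correct else 1 - true_bit
        for _ in range(samples)
    )
    return 1 if votes > samples / 2 else 0

rng = random.Random(42)
# A 75%-reliable device, sampled 101 times, almost always reads right:
reads = [read_noisy_bit(1, 0.75, 101, rng) for _ in range(1000)]
print(sum(reads) / len(reads))   # fraction of correct reads
```

The trade-off is clear: certainty is bought with redundancy, many physical readings per logical bit, rather than with the absolute voltages of today's digital circuits.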
Besides the atomic transistor on a non-silicon wafer, other hardware solutions for the future include:
Optical computer: Uses light beams of high frequencies instead of electrons to do computing. A light beam carries far more data than a moving electron. Furthermore, light needs no wires or circuits to pass through.
Quantum dot computer: Silicon chips can be etched into tiny dots of about 100 atoms each. At this scale, the atoms begin to vibrate in unison, and a new computing standard can be based on the vibrations of these tiny dots.
DNA computer: A strand of DNA carries encoded data represented by the letters A, T, C, and G, which offer more capacity per symbol than the digital 1's and 0's. A major challenge is that DNA computes in a liquid solution, which would have to be stored inside the computer.
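As a small illustration of that extra capacity (my own sketch, not from the article): each of the four DNA letters can stand for two binary digits, so a strand encodes twice as many bits as it has letters:

```python
# Each DNA letter is one of four symbols, so it can encode two
# binary digits: twice the capacity of a single 0-or-1 bit.
TO_DNA = {"00": "A", "01": "T", "10": "C", "11": "G"}
FROM_DNA = {base: bits for bits, base in TO_DNA.items()}

def bits_to_dna(bits):
    """Encode a binary string (even length) as a DNA-letter strand."""
    return "".join(TO_DNA[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_bits(strand):
    """Decode a DNA-letter strand back into the binary string."""
    return "".join(FROM_DNA[base] for base in strand)

strand = bits_to_dna("0110001011")
print(strand)                          # five letters carry ten bits
print(dna_to_bits(strand) == "0110001011")
```

The mapping of letter pairs to bits here is arbitrary, chosen only to show the two-bits-per-letter arithmetic; real DNA-computing schemes encode data quite differently.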
The limits of miniaturization of the silicon chip are real and approaching, but the future computer still looks like science fiction. It will be interesting to see how things develop in the next ten years. For more details, see Michio Kaku, “Physics of the Future”, 2011.