Thursday, August 27, 2009

The quantum leap

In 1965, Intel co-founder Gordon E. Moore observed that the number of transistors that can feasibly be placed on an integrated circuit had been increasing at a steady exponential rate since the integrated circuit's invention in 1958. To the present day, we have seen processing power double roughly every 18 months. This trend, which has come to be known as Moore's Law, underwrote the inception of personal computing devices in the 80s, their ascent in the 90s, and their widespread networking in the 2000s. However, the trend is not without bounds.
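
To put a number on that doubling rate, here is a small illustrative Python snippet (my own sketch, assuming the 18-month doubling figure quoted above) showing how quickly such growth compounds:

    # Compound growth under an assumed 18-month doubling period,
    # the figure commonly quoted for Moore's Law.
    DOUBLING_PERIOD_YEARS = 1.5

    for years in (5, 10, 20, 30):
        factor = 2 ** (years / DOUBLING_PERIOD_YEARS)
        print(f"after {years:2d} years: roughly {factor:,.0f}x")

Thirty years of 18-month doublings works out to 2^20, roughly a million-fold increase.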

The generally held belief is that the trend described by Moore's Law will continue for at least another decade; but then what? Physicists predict that Moore's Law will break down within a few decades because of the physical limits of silicon processors. The smallest chips today have silicon layers no more than about 20 atoms across, and before long that figure is expected to drop to a mere 5 atoms. At that scale the uncertainty principle takes over: electrons are no longer reliably confined to such a thin layer and leak through it, making the silicon unstable and causing short circuits.

Of course this seeming inevitability does not mean the end of computing. Rather, where silicon semiconductors break down is precisely where quantum computers operate. Quantum computing was first proposed in the early 1980s by Paul Benioff and Richard Feynman as a way of computing that exploits the strangeness of quantum mechanics. That is to say, quantum computers work at the level of individual atoms, using the principle of superposition, which says that the state of a particle such as an electron is not fixed until it is measured. In a quantum computer the basic unit of information, the quantum bit or qubit, can therefore hold a blend of 0 and 1 at once: two qubits can hold a superposition of four values (00, 01, 10, 11 in binary), five qubits a superposition of thirty-two values (2^5), and n qubits of 2^n values. When working with registers of any appreciable size, this exponential scaling would let quantum processors far surpass current digital processors on certain problems.
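
To make that scaling concrete, here is a minimal state-vector sketch in Python with NumPy (my own illustration, not part of the original post): n qubits are described by 2^n complex amplitudes, and applying a Hadamard gate to each qubit puts the register into an equal superposition over all 2^n basis states.

    import numpy as np

    def equal_superposition(n_qubits):
        """State vector of n qubits after a Hadamard gate on each one:
        an equal superposition over all 2**n basis states."""
        hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        state = np.array([1.0])  # empty register
        for _ in range(n_qubits):
            # Tensor in a fresh qubit in |0> = (1, 0), then apply H to it.
            state = np.kron(state, hadamard @ np.array([1.0, 0.0]))
        return state

    for n in (2, 5):
        psi = equal_superposition(n)
        print(f"{n} qubits -> {len(psi)} basis states, "
              f"each measured with probability {abs(psi[0])**2:.5f}")

Two qubits give 4 amplitudes (for 00, 01, 10, 11) and five qubits give 32, each carrying probability 1/2^n; the state space, and with it the potential parallelism, doubles with every qubit added.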

I see the upshot of quantum computing as twofold. First, human thought operates at a speed of about five hundred trillion bits per second, a level that would take roughly another 50 years of conventional scaling to reach, longer than Moore's Law can plausibly be sustained. It is likely we will see serious and considerable breakthroughs in artificial intelligence only by harnessing quantum computing. Although current efforts in the realm of quantum computing have been limited to qubit counts in the single digits, many believe it is only a matter of time before the field gains momentum. Second, beyond the implications for artificial intelligence, a quantum computing revolution would first cause demand for silicon to drop, and its price with it. Before Silicon Valley becomes the new Rust Belt, that cheap silicon is likely to find widespread applications and give rise to ubiquitous computing -- putting processors in pretty much anything and everything.
