Quantum theory is the theoretical basis of modern physics that explains the nature and behavior of matter and energy at the atomic and subatomic level. In 1900, physicist Max Planck presented his papers on quantum theory to the German Physical Society. Planck had sought to discover why the radiation from a glowing body changes in color from red to orange and finally to blue as its temperature rises.

He found the answer by assuming that energy, like matter, exists in individual units rather than solely as a continuous electromagnetic wave, as had previously been assumed, and is therefore quantifiable. The existence of these units became the first assumption of quantum theory. Planck wrote a mathematical equation involving a figure to represent these individual units of energy, which he called quanta. The equation explained the phenomenon very well: Planck found that at certain discrete energy levels (exact multiples of a basic minimum value), the radiation from a glowing body occupies different regions of the color spectrum.

Planck assumed there was a theory yet to emerge from the discovery of quanta, but, in fact, their very existence implied a completely new and fundamental understanding of the laws of nature. Planck won the Nobel Prize in Physics for his theory in 1918, but developments by various scientists over a thirty-year period all contributed to the modern understanding of quantum theory.

As we can see, this theory has many untapped regions that could yield a wealth of innovations for the scientific community. A million side streams have grown out of it, and foremost among them is quantum technology, under which comes quantum computation.

The history of computer technology has involved a sequence of changes from one type of physical realisation to another --- from gears to relays to valves to transistors to integrated circuits and so on. Today's advanced lithographic techniques can create chips with features only a fraction of a micron wide. Soon they will yield even smaller parts and inevitably reach a point where logic gates are so small that they are made out of only a handful of atoms.

On the atomic scale matter obeys the rules of quantum mechanics, which are quite different from the classical rules that determine the properties of conventional logic gates. So if computers are to become smaller in the future, new quantum technology must replace or supplement what we have now. The point, however, is that quantum technology can offer much more than cramming more and more bits onto silicon and multiplying the clock speed of microprocessors. It can support an entirely new kind of computation, with qualitatively new algorithms based on quantum principles!

From a physical point of view a bit is a physical system which can be prepared in one of two different states representing two logical values --- no or yes, false or true, or simply 0 or 1.

Quantum bits, called qubits, are implemented using quantum-mechanical two-state systems; these are not confined to their two basic states but can also exist in superpositions: effectively, this means the qubit is in state 0 and state 1 at the same time.

We can push this idea further. Any classical register composed of three bits can store in a given moment of time only one out of eight different numbers. A quantum register composed of three qubits can store in a given moment of time all eight numbers in a quantum superposition.

Once the register is prepared in a superposition of different numbers one can perform operations on all of them.
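To make this concrete, here is a minimal sketch in plain Python (no quantum libraries; the variable names are my own, chosen for illustration). A three-qubit register is represented by a vector of 2^3 = 8 complex amplitudes, and a single operation on that vector touches all eight stored numbers at once:

```python
import math

# A 3-qubit register holds 2**3 = 8 complex amplitudes,
# one for each basis state |000>, |001>, ..., |111>.
n = 3
dim = 2 ** n

# Equal superposition: every basis state gets amplitude 1/sqrt(8).
amp = 1 / math.sqrt(dim)
state = [amp] * dim

# The squared amplitudes are probabilities, so they must sum to 1.
total = sum(abs(a) ** 2 for a in state)
print(f"total probability: {total:.3f}")  # -> 1.000

# One "operation": flip the sign (phase) of every odd basis state.
# A single pass updates all eight numbers in the register at once
# (on real hardware this would be one gate, not a Python loop).
state = [a if i % 2 == 0 else -a for i, a in enumerate(state)]
print(state[0], state[1])
```

Of course, a classical simulation like this pays for the parallelism explicitly (the list really has 2^n entries), which is exactly why a genuine quantum register is interesting.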

Thus quantum computers can perform many different calculations in parallel: a system with n qubits can perform 2^n calculations at once! This has an impact on the execution time and memory required for a computation and determines the efficiency of algorithms.

For an algorithm to be efficient, the time it takes to execute the algorithm must increase no faster than a polynomial function of the size of the input. Think about the input size as the total number of bits needed to specify the input to the problem—for example, the number of bits needed to encode the number we want to factorize. If the best algorithm we know for a particular problem has the execution time (viewed as a function of the size of the input) bounded by a polynomial then we say that the problem belongs to class P.
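To get a feel for the gap between polynomial and non-polynomial scaling, here is a rough sketch (the step counts and function names are my own simplifications, not exact complexity results): schoolbook multiplication of two b-bit numbers takes on the order of b^2 steps, while naive trial-division factoring of a b-bit number can need up to sqrt(2^b) = 2^(b/2) divisions.

```python
# Crude operation counts as a function of input size b (in bits),
# purely to illustrate polynomial vs exponential growth.

def multiply_steps(b: int) -> int:
    # Schoolbook multiplication: roughly b * b digit operations.
    return b * b

def trial_division_steps(b: int) -> int:
    # Trial division up to sqrt(N) for a b-bit N: about 2**(b/2) tries.
    return 2 ** (b // 2)

for b in (16, 32, 64, 128):
    print(b, multiply_steps(b), trial_division_steps(b))
```

At 128 bits the multiplication count is a modest 16384, while the trial-division count has 20 digits; doubling the input size squares the hard problem's cost but only quadruples the easy one's. That is the difference the definition of class P is capturing.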

Problems outside class P are known as hard problems. Thus we say, for example, that multiplication is in P whereas factorization is not in P. "Hard" in this case does not mean "impossible to solve" or "non-computable." It means that the physical resources needed to factor a large number scale up so quickly that, for all practical purposes, the problem can be regarded as intractable. However, some quantum algorithms can turn hard mathematical problems into easy ones --- factoring being the most striking example so far. The difficulty of factorisation underpins the security of what are currently the most trusted methods of public key encryption, in particular the RSA (Rivest, Shamir and Adleman) system (a quick web search will turn up plenty about RSA), which is often used to protect electronic bank accounts. Once a quantum factorisation engine (a special-purpose quantum computer for factorising large numbers) is built, all such cryptographic systems will become insecure.
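A toy RSA example makes the connection explicit (the primes here are absurdly small and the scheme is stripped of the padding a real implementation needs --- this is only to show that factoring the public modulus is exactly what breaks the cipher):

```python
# Toy RSA with tiny primes -- insecure by design, for illustration only.
p, q = 61, 53          # the secret primes
n = p * q              # public modulus: 3233
phi = (p - 1) * (q - 1)
e = 17                 # public exponent, coprime to phi
d = pow(e, -1, phi)    # private exponent (modular inverse, Python 3.8+)

m = 42                         # the message
c = pow(m, e, n)               # encrypt with the public key (e, n)
assert pow(c, d, n) == m       # decrypt with the private key

# An attacker who can factor n recovers p and q, hence phi and d:
for candidate in range(2, n):
    if n % candidate == 0:
        cracked_p = candidate
        break
cracked_q = n // cracked_p
cracked_d = pow(e, -1, (cracked_p - 1) * (cracked_q - 1))
print(pow(c, cracked_d, n))    # -> 42, the secret message
```

With a 2048-bit modulus the brute-force loop above becomes hopeless for any classical machine --- which is the entire point, and exactly what Shor's quantum factoring algorithm would undo.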

Potential use of quantum factoring for code-breaking purposes has raised the obvious suggestion of building a quantum computer.

**Quantum networks**

In principle we know how to build a quantum computer: we start with simple quantum logic gates and connect them up into quantum networks. A quantum logic gate, like a classical gate, is a very simple computing device that performs one elementary quantum operation, usually on two qubits, in a given time. Of course, quantum logic gates differ from their classical counterparts in that they can create and perform operations on quantum superpositions. As the number of quantum gates in a network increases, we quickly run into some serious practical problems. The more interacting qubits are involved, the harder it is to engineer interactions that preserve their quantum properties. And the more components there are, the more likely it is that quantum information will leak out of the quantum computer and be lost to the environment, spoiling the computation. This process is called decoherence. Our task, then, is to engineer sub-microscopic systems in which qubits interact with each other but not with the environment.

It is not clear which technology will support quantum computation in the future. Today, simple quantum logic gates involving two qubits are being realised in laboratories. Current experiments range from trapped ions, via atoms in an array of potential wells created by a pattern of crossed laser beams, to electrons in semiconductors. The next decade should bring control over several qubits and, without any doubt, we shall begin to benefit from this new way of harnessing nature.

All this implies a better future for mankind. Of course, man always finds a way of stealing science's thunder by using a new technology for evil purposes. But that is something outside the scope of this topic. The advances in quantum computation mean that within a few years we could be operating computers working at quantum speeds, and chatting with friends on a quantum-based internet, which I would like to call the Quantranet. I hope somebody thinks of using this word when it is invented. On that highly optimistic note, I am signing off; until my next post, bye and take care.

*Rohit*