By Mukhtar Ahmad Farooqi
Quantum computing has recently been generating buzz in the world of technology, but what exactly is it? Quantum computing makes direct use of quantum physics, exploiting properties of atoms or nuclei that allow them to work together as quantum bits, or qubits, to perform operations on data. Qubits do not rely on the traditional binary nature of computing. While traditional computers encode information into bits as binary numbers, either a 0 or a 1, and can only perform calculations on one set of numbers at a time, quantum computers encode information as quantum-mechanical states, such as the spin direction of an electron or the polarization orientation of a photon. Such a state might represent a 1 or a 0, a combination of the two, a value somewhere between 1 and 0, or a superposition of many different values at once.
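To make the idea concrete, here is a minimal sketch (in Python with NumPy, not tied to any particular quantum hardware) of a single qubit as a two-component state vector. The Hadamard gate used below is a standard one-qubit operation that turns a definite 0 into an equal superposition of 0 and 1:

```python
import numpy as np

# A qubit is a unit vector in C^2: |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)   # the definite state |0>

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ ket0
probs = np.abs(psi) ** 2   # Born rule: probabilities of reading 0 or 1
print(probs)               # [0.5 0.5] -- equally likely to read 0 or 1
```

Measuring this qubit collapses the superposition: it yields 0 half the time and 1 half the time, which is the "somewhere between 1 and 0" behavior described above.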
First proposed in the 1970s, the field of quantum computing was initiated by the work of Paul Benioff and Yuri Manin in the early 1980s, Richard Feynman in 1982, and David Deutsch in 1985. A quantum computer with spins as quantum bits had also been formulated for use as a quantum space-time in 1968. However, it was the company D-Wave that introduced what it called “the world’s first commercially available quantum computer” in 2010. Since then the company has roughly doubled the number of qubits in its machines every year; today, its D-Wave 2X system boasts more than 1,000.
A quantum computer can perform an arbitrary reversible classical computation on all these numbers simultaneously, which a binary system cannot do, and it can also produce interference between the various numbers. By carrying out a computation on many different numbers at once and then interfering the results to obtain a single answer, a quantum computer has the potential to be far more powerful than a classical computer of the same size: using only a single processing unit, it naturally performs myriad operations in parallel. Quantum computing is not well suited to tasks such as word processing and email, but it is ideal for tasks such as cryptography, modeling, and indexing very large databases.
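The interference idea can be seen even with a single qubit. In this NumPy sketch (an illustration, not the experiment discussed later in the article), applying the Hadamard gate twice first creates a superposition and then interferes the two computational paths, so the amplitudes for 1 cancel and the qubit returns to a definite 0:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
ket0 = np.array([1, 0], dtype=complex)

# First Hadamard: |0> becomes an equal superposition of |0> and |1>.
# Second Hadamard: the two paths interfere; the |1> amplitudes cancel out.
psi = H @ (H @ ket0)
print(np.abs(psi) ** 2)   # [1. 0.] -- destructive interference removed the 1
```

This cancellation of unwanted answers, scaled up to many qubits, is what quantum algorithms exploit to extract a single result from a massively parallel computation.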
David DiVincenzo, of IBM, listed the following requirements for a practical quantum computer:
Scalable physically to increase the number of qubits
Qubits that can be initialized to arbitrary values
Quantum gates that are faster than decoherence time
Universal gate set
Qubits that can be read easily
According to a report recently published in PC World, many approaches are being tried in the race to develop a working quantum computer, but Google this week reported particularly promising results from a combination of techniques. In a paper published Wednesday in the journal Nature, a team of researchers from Google and several academic institutions describes a method they call “quantum annealing with a digital twist.” Essentially, they combined the quantum annealing approach with the “gate” model of quantum computing and found that they could get the best of both worlds.
IBM is one of the best-known companies associated with quantum computing today, not least because of its big announcement a few weeks ago of the five-qubit quantum processor it has developed and plans to make available via the cloud. To create this technology, IBM used the gate model, in which qubits are linked together to form circuits. One of the key advantages of that approach is that it includes error correction.
A competing model, used by quantum specialist D-Wave, relies on quantum annealing. Also known as the adiabatic approach, this method focuses on finding and maintaining the lowest energy state in a gradually evolving quantum system. In their combined approach, the researchers essentially take the adiabatic approach and add the error-correction capabilities of the gate model. In their experiment, they tested it out on a system of nine qubits in which each qubit is connected to its neighbors and individually controlled.
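For intuition about what "lowest energy state" means here, the toy sketch below (plain Python, a classical caricature rather than real quantum annealing) brute-forces the ground state of a hypothetical nine-spin chain in which each spin is coupled to its neighbor, echoing the nine-qubit layout described above. A quantum annealer is meant to settle into this ground state directly, without enumerating all the configurations:

```python
from itertools import product

# Toy problem: nine spins in a chain, each coupled to its neighbor.
# The annealer's job is to find the spin configuration of lowest energy.
N = 9
J = 1.0  # ferromagnetic coupling: neighboring spins want to align

def energy(spins):
    return -J * sum(spins[i] * spins[i + 1] for i in range(N - 1))

# Brute force over all 2^9 = 512 configurations -- exactly the exhaustive
# search that becomes hopeless as the number of spins grows.
best = min(product([-1, 1], repeat=N), key=energy)
print(energy(best))   # -8.0: all nine spins aligned is the ground state
```

For nine spins this takes 512 evaluations; for a thousand qubits, as in D-Wave's machines, the same search would require 2^1000 evaluations, which is why settling into the ground state physically is attractive.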
Rami Barends and Alireza Shabani, quantum electronics engineers at Google, wrote in a blog post: “The crucial advantage for the future is that this digital implementation is fully compatible with known quantum error correction techniques, and can therefore be protected from the effects of noise.”
The various quantum computing models developed so far are distinguished by the basic elements into which the computation is decomposed. Four models of practical importance are:
Quantum gate array (computation decomposed into sequence of few-qubit quantum gates)
One-way quantum computer (computation decomposed into sequence of one-qubit measurements applied to a highly entangled initial state or cluster state)
Adiabatic quantum computer, based on quantum annealing (computation decomposed into a slow continuous transformation of an initial Hamiltonian into a final Hamiltonian whose ground state contains the solution)
Topological quantum computer (computation decomposed into the braiding of anyons in a 2D lattice)
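As a sketch of the first of these models, the gate array, the following NumPy example (illustrative only; the Hadamard and CNOT gates are standard, and nothing here is tied to a specific machine) decomposes a tiny computation into a sequence of two few-qubit gates that entangle a pair of qubits:

```python
import numpy as np

# Gate-array model: the computation is a sequence of few-qubit gates.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],      # flips the second qubit
                 [0, 1, 0, 0],      # when the first qubit is 1
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.zeros(4, dtype=complex)
state[0] = 1                        # start in |00>
state = np.kron(H, I) @ state       # gate 1: Hadamard on the first qubit
state = CNOT @ state                # gate 2: CNOT entangles the pair
print(np.abs(state) ** 2)           # [0.5 0. 0. 0.5]
```

The result is the Bell state: measuring the two qubits always yields 00 or 11, each with probability one half, and never 01 or 10. Chaining such gates into circuits is how the gate model, including IBM's five-qubit processor, expresses a computation.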
The quantum Turing machine is theoretically important, but direct implementation of this model is not pursued. All four models of computation have been shown to be equivalent; each can simulate the others with no more than polynomial overhead.
Although quantum computing is still in its infancy, the day is not far off when quantum computers, such as the one already introduced by IBM, will do things that digital systems are not capable of doing.
—The author can be reached at: email@example.com