The nature of computing has changed dramatically over the last decade, and more innovation is needed to weather the gathering data storm.

When subatomic particles smash together at the Large Hadron Collider in Switzerland, they create showers of new particles whose signatures are recorded by four detectors. The LHC captures 5 trillion bits of data — more information than all of the world’s libraries combined — every second. After the judicious application of filtering algorithms, more than 99 percent of those data are discarded, but the four experiments still produce a whopping 25 petabytes (25×10¹⁵ bytes) of data per year that must be stored and analyzed. That is a scale far beyond the computing resources of any single facility, so the LHC scientists rely on a vast computing grid of 160 data centers around the world, a distributed network capable of transferring as much as 10 gigabytes per second at peak performance.

A quantum computer can solve tasks that are not tractable with conventional supercomputers. The question of how one can nevertheless verify the reliability of a quantum computer was recently answered in an experiment at the University of Vienna. The conclusions are published in the renowned scientific journal Nature Physics.

The image is an illustration of the fundamental question: can quantum computations be verified by entities that are inherently unable to compute the results themselves? (Copyright: EQUINOX GRAPHICS)

The harnessing of quantum phenomena, such as superposition and entanglement, holds great promise for constructing future supercomputers. One huge advantage of such quantum computers is that they are capable of performing a variety of tasks much quicker than their conventional counterparts. The use of quantum computers for these purposes raises a significant challenge: how can one verify the results provided by such a computer?
It is only recently that theoretical developments have provided methods to test a quantum computer without having an additional quantum computer at hand. The international research team led by Philip Walther at the University of Vienna has now demonstrated a new protocol in which the quantum computational results can be verified without using additional quantum-computer resources.

An international research collaboration led by scientists from the University of Bristol, UK, has developed a new approach to quantum computing that could lead to the mass-manufacture of new quantum technologies.

The alphabet of data processing could include more elements than the “0” and “1” of today. An international research team has realized a new kind of bit with single electrons: the quantum bit, or qubit. With qubits, considerably more than two states can be defined. So far, quantum bits have existed only in relatively large vacuum chambers; the team has now generated them in semiconductors, putting into practice an effect that the RUB physicist Prof. Dr. Andreas Wieck had predicted theoretically 22 years ago. This represents another step along the path to quantum computing.

Together with colleagues from Grenoble and Tokyo, Wieck from the Chair of Applied Solid State Physics reports on the results in the journal Nature Nanotechnology.

- Latest results bring device performance near the minimum requirements for implementation of a practical quantum computer.
- Scaling up to hundreds or thousands of quantum bits becomes a possibility.

Scientists at IBM Research have achieved major advances in quantum computing device performance that may accelerate the realization of a practical, full-scale quantum computer. For specific applications, quantum computing, which exploits the underlying quantum mechanical behavior of matter, has the potential to deliver computational power that is unrivaled by any supercomputer today.

The most extensive quantum computation in history took just 270 milliseconds, say quantum physicists.

Quantum computers are in danger of losing their lustre. These machines exploit the strange rules of quantum mechanics to carry out calculations that are vastly more powerful than anything that conventional computers can do.

Or so we’re told. Quantum computers in one form or another have been carrying out calculations for more than a decade. But far from putting conventional computers to shame, these devices have yet to outperform the calculating abilities of a primary school child.

Ten years ago, physicists used a quantum computer to factorise the number 15 using seven quantum bits or qubits. The result received great acclaim. Last year, they beat this record by factorising the number 143 using four qubits. Hardly a meteoric improvement.
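To put those records in perspective, both numbers yield instantly to the most naive classical method. A minimal sketch (trial division; not the quantum algorithm used in the experiments) factorises both in microseconds:

```python
# Trial division: a few microseconds of classical work suffice for the
# numbers factorised by quantum hardware to date (15 and 143).
def factorise(n):
    """Return the prime factorisation of n, smallest factors first."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(factorise(15))   # [3, 5]
print(factorise(143))  # [11, 13]
```

The promise of quantum factoring lies in its asymptotic scaling, not in these toy instances: trial division becomes hopeless for numbers hundreds of digits long, which is where Shor's algorithm would, in principle, win.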

Researchers at the University of Pittsburgh have made advances in better understanding correlated quantum matter that could change technology as we know it, according to a study published in the Nov. 20 edition of Nature.

W. Vincent Liu, associate professor of physics in Pitt’s Department of Physics and Astronomy, has been studying topological states in collaboration with researchers from the University of Maryland and the University of Hamburg in Germany, in order to advance quantum computing, a method that harnesses the power of atoms and molecules for computational tasks. With more than $1 million in funding from two consecutive four-year grants from the U.S. Army Research Office and a five-year shared grant from the DARPA Optical Lattice Emulator Program, Liu and his team have been studying orbital degrees of freedom and nano-Kelvin cold atoms in optical lattices (sets of standing-wave lasers) to better understand new quantum states of matter.

UCSB Physicists Demonstrate the Quantum von Neumann Architecture, a Quantum Processor, and a Quantum Memory on a Chip.

The quantum von Neumann machine: Two qubits are coupled to a quantum bus, realizing a quCPU. Each qubit is accompanied by a quantum memory as well as a zeroing register. The quantum memories together with the zeroing register realize the quRAM. Credit: Peter Allen, UCSB

(Santa Barbara, Calif.) –– A new paradigm in quantum information processing has been demonstrated by physicists at UC Santa Barbara. Their results are published in this week’s issue of Science Express online.

UCSB physicists have demonstrated a quantum integrated circuit that implements the quantum von Neumann architecture. In this architecture, a long-lived quantum random access memory can be programmed using a quantum central processing unit, all constructed on a single chip, providing the key components for a quantum version of a classical computer.

The UCSB hardware is based on superconducting quantum circuits, and must be cooled to very low temperatures to display quantum behavior. The architecture represents a new paradigm in quantum information processing, and shows that quantum large-scale integration is within reach.

Matteo Mariantoni Credit: George Foulsham, Office of Public Affairs, UCSB

The quantum integrated circuit includes two quantum bits (qubits), a quantum communication bus, two bits of quantum memory, and a resetting register, together comprising a simple quantum computer. “Computational steps take a few billionths of a second, comparable to a classical computer, but the great power is that a quantum computer can perform a large number of calculations simultaneously,” said Matteo Mariantoni, postdoctoral fellow in the Department of Physics. “In our new UCSB architecture we have explored the possibility of writing quantum information to memory, while simultaneously performing other quantum calculations.

“On the quantum von Neumann architecture, we were able to run the quantum Fourier transform and a three-qubit Toffoli gate –– key quantum logic circuits for the further development of quantum computing,” said Mariantoni.
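The Toffoli gate mentioned above is easy to state concretely: it flips a target qubit only when both control qubits are 1. A minimal sketch of its action on computational basis states, using a plain NumPy matrix (an illustration of the gate's logic, not a simulation of the UCSB superconducting hardware):

```python
import numpy as np

# The Toffoli (controlled-controlled-NOT) gate acts on three qubits as
# an 8x8 permutation matrix: it swaps the basis states |110> and |111>
# (indices 6 and 7) and leaves the other six basis states untouched.
toffoli = np.eye(8)
toffoli[[6, 7]] = toffoli[[7, 6]]  # swap rows for |110> <-> |111>

def basis_state(index, n_qubits=3):
    """Computational basis state |index> as a length-2^n vector."""
    v = np.zeros(2 ** n_qubits)
    v[index] = 1.0
    return v

# Both controls on, target 0: |110> maps to |111>
out = toffoli @ basis_state(0b110)
print(np.argmax(out))  # 7, i.e. |111>
```

Together with single-qubit rotations, this gate is universal for classical reversible logic, which is why it is a key benchmark circuit for any quantum processor.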

The UCSB experiment was pursued primarily by Mariantoni, under the direction of Andrew N. Cleland and John M. Martinis, both professors of physics. Mariantoni was supported in this work by an Elings Prize Fellowship in Experimental Science from UCSB’s California NanoSystems Institute.

The quantum bit (blue) is entangled with the auxiliary qubits (red). If an error occurs, the state of the defective quantum bit is corrected. (Credit: Harald Ritsch)

A general rule in data processing is that disturbances cause the distortion or deletion of information during data storage or transfer. Methods for conventional computers were developed that automatically identify and correct errors: Data are processed several times and if errors occur, the most likely correct option is chosen. As quantum systems are even more sensitive to environmental disturbances than classical systems, a quantum computer requires a highly efficient algorithm for error correction. The research group of Rainer Blatt from the Institute for Experimental Physics of the University of Innsbruck and the Institute for Quantum Optics and Quantum Information of the Austrian Academy of Sciences (IQOQI) has now demonstrated such an algorithm experimentally.
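The classical scheme described above — store the data several times and pick the most likely option — can be sketched in a few lines (a generic repetition code with majority voting, shown only as a baseline for the quantum case that follows):

```python
from collections import Counter

# Classical repetition code: store each bit three times and, on readout,
# take a majority vote so any single flipped copy is corrected.
def encode(bits):
    return [b for b in bits for _ in range(3)]

def decode(coded):
    out = []
    for i in range(0, len(coded), 3):
        triple = coded[i:i + 3]
        out.append(Counter(triple).most_common(1)[0][0])  # majority vote
    return out

coded = encode([1, 0, 1])   # [1,1,1, 0,0,0, 1,1,1]
coded[4] = 1                # a single-bit error in the second triple
print(decode(coded))        # [1, 0, 1] -- the error is voted away
```

The catch, as the next paragraph explains, is that this strategy rests on copying the data, which the no-cloning theorem forbids for quantum information.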

“The difficulty arises because quantum information cannot be copied,” explains Schindler. “This means that we cannot save information repeatedly and then compare it.” The physicists therefore exploit a peculiarity of quantum physics, quantum mechanical entanglement, to perform the error correction.

Quick and efficient error correction

The Innsbruck physicists demonstrate the mechanism by storing three calcium ions in an ion trap. All three particles are used as quantum bits (qubits): one ion represents the system qubit, while the other two serve as auxiliary qubits. “First we entangle the system qubit with the other qubits, which transfers the quantum information to all three particles,” says Philipp Schindler. “Then a quantum algorithm determines whether an error occurs and if so, which one. Subsequently, the algorithm itself corrects the error.” After the correction has been made, the auxiliary qubits are reset using a laser beam. “This last point is the new element in our experiment, which enables repetitive error correction,” says Rainer Blatt. “Some years ago, American colleagues demonstrated the general functioning of quantum error correction. Our new mechanism allows us to repeatedly and efficiently correct errors.”
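The logic of the three-qubit scheme can be sketched classically for the simplest error type, a bit flip. Two parity checks between neighbouring qubits locate a single flipped qubit without ever reading the data value directly (a toy model of the bit-flip code, not a simulation of the Innsbruck trapped-ion experiment):

```python
import random

# Three-qubit bit-flip code, simulated classically: the logical bit is
# spread over three qubits (via entanglement, in the real experiment).
# Two parity checks pinpoint which single qubit flipped, and the flip
# is undone without measuring the data value itself.
def correct(qubits):
    s1 = qubits[0] ^ qubits[1]   # parity of qubits 0 and 1
    s2 = qubits[1] ^ qubits[2]   # parity of qubits 1 and 2
    if s1 and not s2:
        qubits[0] ^= 1           # syndrome (1,0): qubit 0 flipped
    elif s1 and s2:
        qubits[1] ^= 1           # syndrome (1,1): qubit 1 flipped
    elif s2:
        qubits[2] ^= 1           # syndrome (0,1): qubit 2 flipped
    return qubits                # syndrome (0,0): no error

logical = 1
qubits = [logical] * 3            # encoding step
qubits[random.randrange(3)] ^= 1  # a single bit-flip error
print(correct(qubits))            # [1, 1, 1] -- the error is corrected
```

Resetting the auxiliary qubits after each round, the step highlighted by Blatt, is what lets such a cycle be run over and over on the same hardware.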

Leading the field

“For a quantum computer to become reality, we need a quantum processor with many quantum bits,” explains Schindler. “Moreover, we need quantum operations that work nearly error-free. The third crucial element is efficient error correction.” For many years Rainer Blatt’s research group, which is one of the global leaders in the field, has been working on realizing a quantum computer. Three years ago they presented the first quantum gate with a fidelity of more than 99 percent. Now they have realized another key element: repetitive error correction.

This research work is supported by the Austrian Science Fund (FWF), the European Commission, the European Research Council and the Federation of Austrian Industries Tyrol and is published in the scientific journal Science.