System Bits: Aug. 28

Characterizing quantum computers
To accelerate and simplify the imposing task of diagnosing quantum computers, a Rice University computer scientist and his colleagues have proposed a new method.

Anastasios Kyrillidis, an assistant professor of computer science, led the development of a nonconventional diagnostic tool for powerful, next-generation computers that depend on the spooky actions of quantum bits, aka qubits: switches that operate under rules that differ from the 1s and 0s of classical computers.

An illustration shows rubidium atom qubits isolated by scientists at the National Institute of Standards and Technology and proposed for use in quantum computers. A team led by Rice University computer scientist Anastasios Kyrillidis has proposed a scalable algorithm to significantly accelerate the task of validating the accuracy of quantum computers. Source: Rice University

The team noted that quantum computers exploit the principles of quantum mechanics to quickly solve tough problems that would take far longer on conventional supercomputers, promising future breakthroughs in drug design, advanced materials, cryptography and artificial intelligence.

Kyrillidis said that, like any new hardware, quantum computer systems are prone to bugs that need to be squashed, which takes continuous testing to validate their capabilities. The sheer complexity of quantum computers, which do exponentially more with every added qubit, requires an immense amount of validation, he said.

The method focuses on quantum state tomography, a process inspired by medical tomography in which images of a body are captured in slices that are later reassembled into a three-dimensional map, he said. “Quantum state tomography differs as it takes ‘images’ of the state of a quantum computer’s qubits. When a quantum computer executes an algorithm, it starts at a specific state; think of it as the input to the algorithm. As the computer progresses through steps of the algorithm, it’s going through many states. The state at the very end is the answer to your algorithm’s question.”
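As a rough illustration of states changing over the course of an algorithm (a minimal sketch, not the team's tooling), the following Python snippet steps a single qubit through one gate; the measurement probabilities at the end are the kind of quantity tomography-style measurements estimate:

```python
import math

# A one-qubit state as amplitudes over |0> and |1>; start in |0>.
state = [1.0, 0.0]

def hadamard(s):
    # One algorithm step: the Hadamard gate maps |0> to an equal superposition.
    h = 1.0 / math.sqrt(2.0)
    return [h * (s[0] + s[1]), h * (s[0] - s[1])]

state = hadamard(state)                  # an intermediate state of the "algorithm"
probabilities = [a * a for a in state]   # measurement statistics, here ~[0.5, 0.5]
print(probabilities)
```

Real tomography reconstructs the full state (not just these probabilities) by combining many such measurement statistics taken in different bases.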

By reassembling the full state from these measurements, Kyrillidis said one can later pinpoint hardware or software errors that may have caused the computer to deliver unexpected results.

That takes a lot of measurements, and the computational cost of reconstruction can be high, even for classical computers, he said. Tomography-based analysis of quantum computers with even as few as five or six qubits would be prohibitive without somehow simplifying the task. State-of-the-art machines have 50 qubits or more, and even a modest increase in the number of qubits dramatically increases a computer's power.

“In a system with five qubits, the state can be represented by a 2-to-the-5 times 2-to-the-5 matrix, so it’s a 32-by-32 matrix,” he said. “That’s not big. But in a 20-qubit system like the one at IBM, the state can be characterized by a million-by-million matrix. If we were taking full measurements with regular tomography techniques, we would need to poll the system roughly a million-squared times in order to get enough information to recover its state.”
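The dimensions Kyrillidis quotes follow directly from the 2-to-the-n scaling; a quick sanity check in Python:

```python
# The density matrix describing an n-qubit state is d x d, with d = 2^n.
def state_matrix_shape(n_qubits: int) -> tuple[int, int]:
    d = 2 ** n_qubits
    return (d, d)

print(state_matrix_shape(5))   # (32, 32) -- the 32-by-32 matrix in the quote
print(state_matrix_shape(20))  # (1048576, 1048576) -- roughly a million by a million
```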

Kyrillidis reported that he and his team solved the validation problem with an algorithm they call Projected Factored Gradient Descent (ProjFGD), which takes advantage of compressed sensing, a method that minimizes the amount of incoming data while still ensuring accurate results. He said the method would cut the number of measurements for a 20-qubit system to a mere million or so. “That’s still a big number, but much smaller than a million squared.”
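The article does not give ProjFGD's exact measurement bound. As a hedged illustration of why compressed sensing helps, known compressed-sensing tomography results scale roughly like r·d·log d for a near rank-r state of dimension d = 2^n (ignoring constants and extra log factors), versus d² for full tomography:

```python
import math

def full_tomography_measurements(n_qubits: int) -> int:
    # Informationally complete tomography scales with d^2, where d = 2^n.
    d = 2 ** n_qubits
    return d * d

def compressed_sensing_measurements(n_qubits: int, rank: int = 1) -> int:
    # Rough compressed-sensing scaling for a near rank-r state: O(r * d * log d).
    # Constants and extra log factors are ignored; this is order-of-magnitude only.
    d = 2 ** n_qubits
    return math.ceil(rank * d * math.log(d))

# 20 qubits: ~1.1e12 measurements ("a million squared") vs. tens of millions.
print(full_tomography_measurements(20), compressed_sensing_measurements(20))
```

The second number lands in the millions rather than the trillions, matching the order-of-magnitude saving described above.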

IBM, where Kyrillidis spent a year as a research scientist before coming to Rice, has put a quantum computer in the cloud where anyone can access it and run programs. He noted that IBM reasons that the more people learn about programming for quantum computers now, the more mature their skills will be when the platform comes of age. And there’s a side benefit for him, as it gives him a ready platform to test ProjFGD.

“The quantum state tomography tool is generic, and has more to do with the nature of the qubit rather than the specific architecture. As quantum computers get more powerful, it can definitely be scaled up to certify systems,” Kyrillidis added.

Control system simulator fights hackers
A simulator developed by Georgia Tech researchers, complete with a virtual explosion, could help the operators of chemical processing plants and other industrial facilities learn to detect attacks by hackers bent on causing mayhem. The simulator will also help students and researchers better understand the security issues of industrial control systems.

The team pointed out that facilities such as electric power networks, manufacturing operations and water purification plants are among the potential targets for malicious actors because they use programmable logic controllers (PLCs) to open and close valves, redirect electricity flows and manage large pieces of machinery. As such, efforts are underway to secure these facilities, and helping operators become more skilled at detecting potential attacks is a key part of improving security, they said.

Raheem Beyah, the Motorola Foundation Professor in the School of Electrical and Computer Engineering at the Georgia Institute of Technology, said, “The goal is to give operators, researchers and students experience with attacking systems, detecting attacks and also seeing the consequences of manipulating the physical processes in these systems. This system allows operators to learn what kinds of things will happen. Our goal is to make sure the good guys get this experience so they can respond appropriately.”

The simulator was developed in part by Atlanta security startup company Fortiphyd Logic, and supported by the Georgia Research Alliance.

Improving bandwidth with ultrathin optic cavities
Researchers at Purdue University noted that the rainbow is not just colors: each color of light has its own frequency, and the more frequencies there are, the higher the bandwidth for transmitting information.

However, using only one color of light at a time on an electronic chip currently limits technologies based on sensing changes in scattered color, such as detecting viruses in blood samples or processing airplane images of vegetation when monitoring fields or forests, they said.

As such, putting multiple colors into service at once would mean deploying multiple channels of information simultaneously, broadening the bandwidth of not only today’s electronics, but also of the even faster upcoming “nanophotonics” that will rely on photons – fast and massless particles of light – rather than slow and heavy electrons to process information with nanoscale optical devices. To this point, IBM and Intel have already developed supercomputer chips that combine the higher bandwidth of light with traditional electronic structures.

Now, as researchers engineer solutions for eventually replacing electronics with photonics, a Purdue University-led team has simplified the manufacturing process that allows multiple colors to be used simultaneously on an electronic chip, instead of a single color at a time.

New ultrathin nanocavities with embedded silver strips have streamlined color production, and therefore broadened possible bandwidth, for both today’s electronics and future photonics. Source: Purdue University

At the same time, the researchers said they also addressed another issue in the transition from electronics to nanophotonics: The lasers that produce light will need to be smaller to fit on the chip.

Alexander Kildishev, associate professor of electrical and computer engineering at Purdue University, explained, “A laser typically is a monochromatic device, so it’s a challenge to make a laser tunable or polychromatic. Moreover, it’s a huge challenge to make an array of nanolasers produce several colors simultaneously on a chip.”

This requires downsizing the “optical cavity,” which is a major component of lasers. For the first time, researchers from Purdue, Stanford University and the University of Maryland embedded so-called silver “metasurfaces” – artificial materials thinner than light waves – in nanocavities, making lasers ultrathin.

“Optical cavities trap light in a laser between two mirrors. As photons bounce between the mirrors, the amount of light increases to make laser beams possible. Our nanocavities would make on-a-chip lasers ultrathin and multicolor,” Kildishev said.

Currently, a different thickness of an optical cavity is required for each color, and by embedding a silver metasurface in the nanocavity, the researchers achieved a uniform thickness for producing all desired colors.
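The one-thickness-per-color constraint follows from the textbook Fabry-Perot resonance condition (a standard relation, not the specifics of the Purdue design): a cavity of refractive index n resonates when its thickness holds an integer number of half-wavelengths, t = m·λ/(2n), so each wavelength demands its own thickness:

```python
def resonant_thickness_nm(wavelength_nm: float, refractive_index: float,
                          order: int = 1) -> float:
    # Fabry-Perot resonance: the cavity spans m half-wavelengths in the medium,
    # so t = m * lambda / (2 * n). Each color therefore needs its own thickness.
    return order * wavelength_nm / (2.0 * refractive_index)

# Conventional cavity thicknesses for three colors in a medium with n ~ 1.5:
for color, wl in [("blue", 450.0), ("green", 530.0), ("red", 650.0)]:
    print(color, round(resonant_thickness_nm(wl, 1.5), 1), "nm")
```

Because each color demands a different thickness in this conventional picture, a single uniform-thickness cavity for all colors, as the metasurface approach provides, is a significant simplification.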

Instead of adjusting the optical cavity thickness for every single color, the team adjusted the widths of metasurface elements. Optical metasurfaces could also ultimately replace or complement traditional lenses in electronic devices. “What defines the thickness of any cell phone is actually a complex and rather thick stack of lenses. If we can just use a thin optical metasurface to focus light and produce images, then we wouldn’t need these lenses, or we could use a thinner stack,” Kildishev added.