MIT researchers check on quantum chips’ accuracy

Technology News | By eeNews Europe

Quantum chips perform computations using quantum bits (qubits), which can represent not only the two classical binary states 0 and 1 but also an arbitrary quantum superposition of both states simultaneously. This vastly expands compute capacity and is expected to enable quantum computers to solve problems that are impossible for classical computers on a practical time scale.
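
As a minimal illustration of why superposition expands capacity, the sketch below (plain NumPy, not tied to any particular hardware) represents a qubit as a normalized pair of complex amplitudes and shows the exponential growth in the number of amplitudes needed to describe many qubits.

```python
import numpy as np

# A single qubit is a normalized 2-component complex vector:
# |psi> = alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)  # an equal superposition
state = np.array([alpha, beta])

# Measurement probabilities follow the Born rule: |amplitude|^2.
probs = np.abs(state) ** 2
print(probs)        # chance of reading out 0 or 1
print(probs.sum())  # sums to 1 (up to floating point) for a valid state

# n qubits live in a 2**n-dimensional space, so describing a modest
# NISQ-scale register classically already takes an enormous state vector.
n = 50
print(2 ** n)       # amplitudes needed to describe a 50-qubit state
```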

Although full-scale quantum computers will require millions of qubits, and much progress remains before that scale is reached, researchers have already started to develop “Noisy Intermediate Scale Quantum” (NISQ) chips, which contain around 50 to 100 qubits, just enough to demonstrate “quantum advantage” over classical computers.

However, the chip’s outputs can look entirely random, and simulating the computation step by step to check that everything went according to plan takes a long time, making verification very inefficient. This is the problem tackled by researchers at MIT in a Nature Physics paper co-authored with physicists from the Google Quantum AI Laboratory, Elenion Technologies, Lightmatter, and Zapata Computing. They describe a novel protocol to efficiently verify that an NISQ chip has performed all the right quantum operations, and validate it on a notoriously difficult quantum problem running on a custom quantum photonic chip.

The researchers’ work essentially traces an output quantum state generated by the quantum circuit back to a known input state. Doing so reveals which circuit operations were performed on the input to produce the output. Those operations should always match what researchers programmed. If not, the researchers can use the information to pinpoint where things went wrong on the chip.

At the core of the new protocol, called “Variational Quantum Unsampling,” lies a “divide and conquer” approach, explains first author Jacques Carolan, a postdoc in the Department of Electrical Engineering and Computer Science (EECS) and the Research Laboratory of Electronics (RLE) at MIT. This approach breaks the output quantum state into chunks. “Instead of doing the whole thing in one shot, which takes a very long time, we do this unscrambling layer by layer. This allows us to break the problem up to tackle it in a more efficient way,” Carolan says.
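
The unscrambling idea can be sketched in miniature. The toy below is not the paper’s protocol: it uses a single parametrized layer in plain NumPy, with a brute-force parameter sweep standing in for the variational optimizer. It searches for operations whose inverse maps the measured output back to the known input; the real protocol does this layer by layer on a much larger circuit.

```python
import numpy as np

def u(theta, phi):
    """A simple parametrized 2x2 unitary: one 'layer' of operations."""
    return np.array([
        [np.cos(theta), -np.exp(1j * phi) * np.sin(theta)],
        [np.exp(-1j * phi) * np.sin(theta), np.cos(theta)],
    ])

rng = np.random.default_rng(0)
true_theta, true_phi = rng.uniform(0, np.pi / 2, size=2)
U = u(true_theta, true_phi)      # the chip's (unknown) operation
ket_in = np.array([1.0, 0.0])    # known input state |0>
ket_out = U @ ket_in             # scrambled output, measured off-chip

def infidelity(theta, phi):
    # Apply the candidate layer's inverse to the output; a value of 0
    # means the recovered operations map the output back to the input.
    recovered = u(theta, phi).conj().T @ ket_out
    return 1 - np.abs(np.vdot(ket_in, recovered)) ** 2

# Brute-force sweep over the two knobs (fine for a single toy layer).
grid = np.linspace(0, np.pi / 2, 201)
best = min((infidelity(t, p), t, p) for t in grid for p in grid)
print(best[0])  # near 0: the recovered layer undoes the circuit
```

Matching the recovered parameters against the programmed ones is what lets the researchers pinpoint where things went wrong on the chip.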

For this, the researchers took inspiration from neural networks, which solve problems through many layers of computation, to build a novel “quantum neural network” (QNN), where each layer represents a set of quantum operations.

To run the QNN, they used traditional silicon fabrication techniques to build a 2-by-5-millimeter NISQ chip with more than 170 control parameters, such as tunable circuit components that make it easier to manipulate the photons’ paths. Pairs of photons are generated at specific wavelengths by an external component and injected into the chip; as the photons travel through the chip’s phase shifters, they interfere with each other. This produces a random quantum output state that represents what would happen during computation. The output is measured by an array of external photodetector sensors.
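
A toy model of such a mesh, assuming a generic layout of phase shifters and 50:50 beamsplitters rather than the actual chip design, shows how tunable control parameters compose into one overall unitary acting on the photon modes:

```python
import numpy as np

def phase_layer(phases):
    """A column of phase shifters: a diagonal of unit-magnitude phases."""
    return np.diag(np.exp(1j * np.asarray(phases)))

def beamsplitter_layer(m):
    """50:50 beamsplitters mixing nearest-neighbour mode pairs."""
    T = np.eye(m, dtype=complex)
    bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)
    for i in range(0, m - 1, 2):
        T[i:i + 2, i:i + 2] = bs
    return T

rng = np.random.default_rng(0)
m, depth = 4, 3
U = np.eye(m, dtype=complex)
for _ in range(depth):
    # Each layer: set the phase-shifter knobs, then let modes interfere.
    U = beamsplitter_layer(m) @ phase_layer(rng.uniform(0, 2 * np.pi, m)) @ U

# Tuning the phases changes how photons interfere on the way through,
# but the composite transformation always stays unitary.
print(np.allclose(U.conj().T @ U, np.eye(m)))  # True
```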

That output is then sent to the QNN. The first layer uses complex optimization techniques to dig through the noisy output to pinpoint the signature of a single photon among all those scrambled together. Then, it “unscrambles” that single photon from the group to identify what circuit operations return it to its known input state. Those operations should match exactly the circuit’s specific design for the task. All subsequent layers do the same computation — removing from the equation any previously unscrambled photons — until all photons are unscrambled.

In experiments, the team successfully ran a popular computational task used to demonstrate quantum advantage, called “boson sampling,” which is usually performed on photonic chips. In this exercise, phase shifters and other optical components manipulate and convert a set of input photons into a different quantum superposition of output photons. Ultimately, the task is to calculate the probability that a certain input state will match a certain output state; that probability is essentially a sample from some probability distribution. But it is nearly impossible for classical computers to compute those samples, due to the unpredictable behavior of photons. It has been theorized that NISQ chips can compute them fairly quickly. Until now, however, there has been no way to verify that quickly and easily, because of the complexity of the NISQ operations and of the task itself.
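
For collision-free outputs, the boson sampling probability is the squared magnitude of the permanent of a submatrix of the interferometer’s unitary, and computing permanents is what makes classical simulation hard. The small sketch below (a random interferometer in NumPy, not the experiment’s actual circuit) computes one such probability, and also reproduces the Hong-Ou-Mandel effect on a single 50:50 beamsplitter:

```python
import numpy as np
from itertools import permutations

def permanent(M):
    """Permanent via brute force; fine for the tiny matrices used here."""
    n = M.shape[0]
    return sum(np.prod([M[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

# A random 5-mode interferometer (QR of a complex Gaussian gives a unitary).
rng = np.random.default_rng(1)
m = 5
A = rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))
U, _ = np.linalg.qr(A)

# Two photons enter modes 0 and 1; the probability of detecting one photon
# in mode 2 and one in mode 3 is |perm(U_sub)|^2 for the submatrix built
# from the input and output modes.
U_sub = U[np.ix_([0, 1], [2, 3])]
p = abs(permanent(U_sub)) ** 2
print(p)

# Hong-Ou-Mandel: on a 50:50 beamsplitter the permanent vanishes, so two
# photons never exit in different ports.
BS = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
print(abs(permanent(BS)) ** 2)  # 0.0: the photons always bunch
```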

“The very same properties which give these chips quantum computational power make them nearly impossible to verify,” Carolan observes.

In experiments, the researchers were able to “unsample” two photons that had run through the boson sampling problem on their custom NISQ chip, in a fraction of the time traditional verification approaches would take.
While the method was designed for quantum verification purposes, it could also help capture useful physical properties of photon-emitting materials.

