
25 June 2022

Algorithmic Warfare: DARPA Probing Quantum Computing Capabilities

Meredith Roaten

The Defense Advanced Research Projects Agency recently funded the second phase of a quantum computing project that aims to expand the utility of emerging technology, according to one of the lead researchers on the project.

The second phase of the Georgia Tech Research Institute-led project brought its funding total to $9.2 million, allowing the scientists to run additional experiments on a quantum computing system configured to potentially string together more computing units than ever.

The DARPA project — Optimization with Noisy Intermediate-Scale Quantum devices — aims to “demonstrate the quantitative advantage of quantum information processing by leapfrogging the performance of classical-only systems in solving optimization challenges.”

Researcher Creston Herold said one of the classic optimization problems that quantum computing systems could solve is known as the traveling salesperson problem.

“One famous one is this traveling salesperson problem, where you have a list of addresses you need to take a path to and packages to deliver, for example,” he said. “And you want to find the most efficient route, whether that’s in time or distance traveled, or fewest left turns made, or least gas used.”

This type of problem shows up in a wide variety of logistics issues in defense and other government business, he noted.
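As a rough illustration (not code from the DARPA project), the brute-force sketch below checks every possible ordering of a handful of hypothetical delivery stops. The number of routes grows factorially with the number of stops, which is why exact classical search quickly becomes impractical.

import itertools

# Hypothetical symmetric distance matrix (in miles) for five delivery stops.
DIST = [
    [0, 2, 9, 10, 7],
    [2, 0, 6, 4, 3],
    [9, 6, 0, 8, 5],
    [10, 4, 8, 0, 6],
    [7, 3, 5, 6, 0],
]

def route_length(order):
    # Total distance of a closed tour that starts and ends at stop 0.
    stops = (0,) + order + (0,)
    return sum(DIST[a][b] for a, b in zip(stops, stops[1:]))

# Brute force: evaluate every ordering of the remaining stops.
best = min(itertools.permutations(range(1, len(DIST))), key=route_length)
print(best, route_length(best))

With five stops there are only 24 closed routes to check; with 20 stops there are roughly 1.2 x 10^17, which is why realistic instances call for heuristics or, eventually, quantum-assisted optimization.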

Quantum computers use basic units known as qubits rather than the 1s and 0s of traditional computers. Their computing power stems from the potential for each qubit to be both 1 and 0 simultaneously, rather than being restricted to one or the other. As a result, a quantum computer could run more complicated algorithms and operate much faster than a traditional computer.
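As a textbook illustration (not specific to the GTRI system), a single qubit’s state is described by two complex amplitudes, and the chances of reading out 0 or 1 are the squared magnitudes of those amplitudes; each added qubit doubles the number of amplitudes needed to describe the machine.

import numpy as np

# A single qubit in an equal superposition of 0 and 1.
state = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
print(np.abs(state) ** 2)        # [0.5 0.5]

# Two such qubits are described by 4 amplitudes, n qubits by 2**n.
two_qubits = np.kron(state, state)
print(two_qubits.shape)          # (4,)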

This research aims to go beyond most quantum computing advances made so far, Herold explained. Quantum computers exist today, but they are as big as the early traditional computers and haven’t yet developed the computing power to rival their conventional counterparts.

While most trapped-ion quantum computing systems use radio-frequency traps to isolate ions, one of the team’s researchers, Brian McMahon, developed a “unique” configuration optimized for a more efficient process.

The device, known as a Penning trap, uses a combination of a magnetic field and an electric field to confine two-dimensional ion crystals that perform quantum operations.

“The use of rare earths is actually in the permanent magnets, which form the trap,” Herold said. “There are magnets like neodymium or samarium cobalt. They’re very, very strong magnets.”

The trap uses these rare earth metals in place of “bulky, cryo-cooled superconducting magnets,” according to the team.
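For orientation, the textbook Penning trap confines an ion of charge q and mass m radially with a static magnetic field of strength B and axially with a quadrupole electric potential set by a voltage V_0 and a characteristic trap dimension d. The standard motional frequencies (general textbook results, up to a geometry-dependent factor in the axial term, and not parameters of the GTRI device) are:

\omega_c = \frac{qB}{m}, \qquad
\omega_z \simeq \sqrt{\frac{qV_0}{m d^{2}}}, \qquad
\omega_\pm = \frac{\omega_c}{2} \pm \sqrt{\frac{\omega_c^{2}}{4} - \frac{\omega_z^{2}}{2}}

Stronger magnets raise the cyclotron frequency \omega_c, which is part of why compact rare-earth permanent magnets can stand in for superconducting coils.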

The team has already put in 18 months of trials and experiments. During that time, researchers built an ion chain of 10 qubits, the basic units of a quantum computing system.

Herold said the short chain lays the foundation for the research, but the work will ultimately go much further.

“It really was about testing out the control scheme and showing that this way of operating the device would solve these problems as expected,” he said.

Adding thousands more qubits to the chain would let the computer calculate more accurate solutions, Herold said. Without substantially more qubits, the quantum computer would have roughly the same power as a classical machine, he said.

“At the outset of the project, we knew that we would need hundreds of qubits to really move the needle on solving an important problem,” he said. “We can still simulate everything that’s happening on a quantum device, and it’s just too small to attack a large enough optimization problem that we don’t already know the answer easily.”
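Herold’s point about simulability reduces to simple arithmetic: a full classical description of n qubits requires 2^n complex amplitudes, so today’s small devices can be checked exactly on conventional machines while a few hundred qubits cannot. A back-of-the-envelope sketch:

# Memory needed to store a full n-qubit state vector at 16 bytes per amplitude.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    print(f"{n} qubits: {amplitudes:.3e} amplitudes, "
          f"{amplitudes * 16 / 2**30:,.1f} GiB")
# 10 qubits fit in kilobytes, 30 qubits need about 16 GiB,
# and 50 qubits would need roughly 16 million GiB (16 PiB).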

But that doesn’t mean traditional computing doesn’t play a role in the project. Researchers are using classical computing hardware to guide the quantum hardware to a better starting point, so the system doesn’t have to check every possible solution.

“The classical nature of it is that we are using a classical process to kind of monitor the quantum hardware and decide what to do next,” Herold said.
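The article does not detail the team’s specific algorithm, but the general hybrid pattern Herold describes is a classical loop that proposes candidate parameters, asks the quantum hardware how good the resulting solution is, and uses that feedback to decide what to try next. A minimal sketch of that loop, with the quantum evaluation stubbed out as an ordinary function:

import random

def quantum_cost(params):
    # Stand-in for running a parameterized circuit on quantum hardware and
    # estimating the optimization objective from measurement statistics.
    return sum((p - 0.5) ** 2 for p in params)  # toy objective

def hybrid_optimize(n_params=4, iterations=200, step=0.05):
    params = [random.random() for _ in range(n_params)]
    best = quantum_cost(params)
    for _ in range(iterations):
        # Classical side: propose a small perturbation of the current parameters.
        candidate = [p + random.uniform(-step, step) for p in params]
        cost = quantum_cost(candidate)   # quantum side: evaluate the candidate
        if cost < best:                  # classical side: keep improvements
            params, best = candidate, cost
    return params, best

print(hybrid_optimize())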

However promising the project has proven so far, researchers still face daunting technical challenges. For example, the more complex the quantum system becomes, the more likely it is to have a significant error rate caused by “noise” — a term meaning interference with the state of the qubits in the quantum computer.
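The scaling of those errors is easy to state, if hard to beat. If each operation independently succeeds with probability 1 - p, a computation of N operations succeeds with probability roughly:

P_{\text{success}} \approx (1 - p)^{N}, \qquad (1 - 0.001)^{1000} \approx e^{-1} \approx 0.37

So even an illustrative error rate of 0.1 percent per operation leaves a thousand-operation computation succeeding only about a third of the time, which is why error mitigation is central to scaling up.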

The research team includes scientists at Oak Ridge National Laboratory, who are using a supercomputer there to map the best pathway to minimizing noise in the quantum system as it is scaled up.

“With quantum hardware, we’re always fighting noise, and at some point, there will be too many errors that we can’t actually make the hardware larger,” Herold said.

While part of the research is figuring out how to mitigate those errors, the amount of noise will eventually limit how many qubits long the chain can be, and therefore the complexity of the system, he explained.

However, if the researchers can solve these challenges, the results will be significant across industries, Herold said.

“This project will show that larger collections of qubits can solve optimization problems and in a better way than we know how to now, and that would have a really transformative impact on the way these problems are solved,” he said.
