Quantum annealing shows promise for continuous-variable optimization

Quantum annealing (QA) is a flagship quantum-computing algorithm for tackling hard combinatorial optimization problems, that is, problems whose objective functions take discrete-valued variables. Because quantum computers operate under the rules of quantum physics, they can explore many candidate solutions at once and may solve such problems faster than classical machines, chiefly thanks to quantum tunneling, which lets the system pass through energy barriers rather than climb over them.
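As a toy illustration (not taken from the study), the Python snippet below sketches the kind of discrete-variable cost function a quantum annealer minimizes: an Ising energy over spins that take the values +1 or −1, with made-up couplings and fields, minimized here by brute force.

```python
# Toy illustration: an Ising energy E(s) = sum_ij J_ij s_i s_j + sum_i h_i s_i
# over spins s_i in {-1, +1}. The couplings J and fields h below are made up.
import itertools

J = {(0, 1): 1.0, (1, 2): -0.5, (0, 2): 0.3}
h = {0: 0.2, 1: -0.1, 2: 0.0}

def ising_energy(spins):
    """Energy of one spin configuration."""
    e = sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    e += sum(hi * spins[i] for i, hi in h.items())
    return e

# A quantum annealer relaxes toward the lowest-energy configuration; for
# three spins we can simply enumerate all 2**3 possibilities.
best = min(itertools.product([-1, 1], repeat=3), key=ising_energy)
print(best, ising_energy(best))
```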

Until recently, QA research focused almost exclusively on problems with discrete variables, leaving the optimization of continuous-variable functions largely unexplored. To fill this gap, a team of researchers from Japan, namely Shunta Arai and Hidetoshi Nishimori of Tokyo Institute of Technology together with Hiroki Oshiyama of Tohoku University, assessed QA’s performance on continuous-variable optimization using the D-Wave 2000Q quantum computer. They compared the results with those of classical algorithms and published their findings in the journal Physical Review A.

Professor Nishimori explains, “We systematically investigated whether QA has an advantage over classical algorithms in optimizing the one-dimensional Rastrigin function, a standard continuous-variable benchmark for optimization algorithms.” Using a technique known as “domain-wall encoding,” the team mapped the continuous variable onto discrete Ising variables and ran two sets of benchmark tests.
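To make the setup concrete, here is a minimal sketch, with assumed discretization parameters rather than the paper’s exact settings, of the one-dimensional Rastrigin function and of how a single-domain-wall spin chain decodes back into a discrete value of the continuous variable.

```python
# Sketch with assumed parameters (N = 64 levels on [-5.12, 5.12]); not the
# exact discretization used in the paper.
import numpy as np

A = 10.0

def rastrigin_1d(x):
    """One-dimensional Rastrigin function; global minimum f(0) = 0."""
    return A + x**2 - A * np.cos(2 * np.pi * x)

# Domain-wall encoding: a variable with N discrete levels is represented by
# N - 1 ferromagnetically coupled Ising spins arranged as (-1,...,-1,+1,...,+1);
# the position of the -1/+1 boundary (the "domain wall") selects the level.
N = 64
levels = np.linspace(-5.12, 5.12, N)

def decode_domain_wall(spins):
    """Map a single-domain-wall spin string back to a value of x."""
    wall = int(np.searchsorted(spins, 0))  # index of the first +1 spin
    return levels[wall]

# Example: a wall after the 10th spin selects levels[10].
spins = np.array([-1] * 10 + [+1] * (N - 1 - 10))
x = decode_domain_wall(spins)
print(x, rastrigin_1d(x))
```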

In the first set, they pitted D-Wave’s QA against several state-of-the-art classical algorithms designed for continuous-variable functions: Nelder-Mead, conjugate gradient descent, basin-hopping, and differential evolution. D-Wave matched or even outperformed the classical algorithms, but only at short run times; as the execution time grew, the classical algorithms pulled ahead while D-Wave’s performance plateaued.
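For a feel for these baselines, the sketch below runs SciPy’s off-the-shelf implementations of the four classical methods on the one-dimensional Rastrigin function; the starting point, bounds, and iteration counts are illustrative choices, not the settings used in the study.

```python
# Illustrative run of the four classical baselines via SciPy; parameters
# (starting point, bounds, niter) are arbitrary choices for demonstration.
import numpy as np
from scipy.optimize import minimize, basinhopping, differential_evolution

A = 10.0
rastrigin = lambda x: A + x[0]**2 - A * np.cos(2 * np.pi * x[0])

x0 = np.array([3.0])                                # arbitrary starting point
nm = minimize(rastrigin, x0, method="Nelder-Mead")  # simplex search
cg = minimize(rastrigin, x0, method="CG")           # conjugate gradient
bh = basinhopping(rastrigin, x0, niter=100)         # global + local search
de = differential_evolution(rastrigin, bounds=[(-5.12, 5.12)])

for name, res in [("Nelder-Mead", nm), ("CG", cg),
                  ("basin-hopping", bh), ("diff. evolution", de)]:
    print(f"{name:>15}: x = {res.x[0]:+.4f}, f = {res.fun:.4f}")
```

Started away from the origin, local methods such as Nelder-Mead and conjugate gradient typically get stuck in a nearby local minimum of the Rastrigin function, while the global methods usually reach the minimum at x = 0; that contrast is part of what makes this function a useful benchmark.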

In the second phase, the researchers compared D-Wave against classical discrete-variable optimization algorithms: simulated annealing (SA), simulated quantum annealing (SQA), and spin-vector Monte Carlo (SVMC). They also included the time-evolving block decimation (TEBD) algorithm, which simulates noise-free, coherent QA on a classical computer. TEBD outperformed the rest, while all the other algorithms, including D-Wave, were sensitive to the height of the energy barrier, a dependence that is expected for SA, SQA, and SVMC.
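As a rough sketch of the simplest of these discrete-variable baselines, the snippet below runs a bare-bones simulated annealing loop, with an arbitrary linear cooling schedule, on the same discretized Rastrigin problem; it is not the annealing protocol or parameter set used in the paper.

```python
# Bare-bones simulated annealing on a discretized 1-D Rastrigin function.
# Cooling schedule, step count, and discretization are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
A = 10.0
levels = np.linspace(-5.12, 5.12, 64)
f = lambda x: A + x**2 - A * np.cos(2 * np.pi * x)

idx = int(rng.integers(len(levels)))                 # random starting level
for step in range(20_000):
    T = max(1e-3, 2.0 * (1 - step / 20_000))         # linear cooling
    prop = int(np.clip(idx + rng.choice([-1, 1]), 0, len(levels) - 1))
    dE = f(levels[prop]) - f(levels[idx])
    if dE <= 0 or rng.random() < np.exp(-dE / T):    # Metropolis rule
        idx = prop

print(levels[idx], f(levels[idx]))
```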

Importantly, the dependence of D-Wave’s performance on barrier height points to the influence of thermal noise and hardware imperfections. Professor Nishimori notes, “Classical algorithms may struggle to find solutions when the energy barriers are higher than those in our tests, whereas QA, when executed coherently, would be less affected.” These findings suggest that QA could outperform classical algorithms in continuous-variable optimization once improved hardware mitigates noise and imperfections.

In sum, this study is a notable step toward a systematic, quantitative assessment of continuous-variable optimization with QA, benchmarked against a suite of well-established classical algorithms.

Source: Tokyo Institute of Technology
