DOI: 10.1145/3458817.3476219

Research article | Public Access

Overcoming barriers to scalability in variational quantum Monte Carlo

Published: 13 November 2021

Abstract

The variational quantum Monte Carlo (VQMC) method has received significant attention in recent years because of its ability to overcome the curse of dimensionality inherent in many-body quantum systems. Close parallels exist between VQMC and the emerging hybrid quantum-classical computational paradigm of variational quantum algorithms. VQMC overcomes the curse of dimensionality by performing alternating steps of Monte Carlo sampling from a parametrized quantum state followed by gradient-based optimization. While VQMC has been applied to solve high-dimensional problems, it is known to be difficult to parallelize, primarily owing to the Markov chain Monte Carlo (MCMC) sampling step. In this work, we explore the scalability of VQMC when autoregressive models, with exact sampling, are used in place of MCMC. This approach can exploit distributed-memory, shared-memory and/or GPU parallelism in the sampling task without any bottlenecks. In particular, we demonstrate the GPU scalability of VQMC for solving up to ten-thousand-dimensional combinatorial optimization problems.
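To make the two alternating steps concrete, the following is a minimal sketch of the idea in plain NumPy, not the authors' implementation: a toy autoregressive model over spin configurations is sampled exactly, one spin at a time and with no Markov chain, and its parameters are updated with a score-function gradient estimate of a MaxCut-style energy. All names and modeling choices here (n_spins, W_adj, the logistic conditionals, the REINFORCE-style update) are illustrative assumptions; the paper's actual architectures, objectives, and GPU/distributed parallelization are more elaborate.

```python
# Minimal VQMC-style loop with exact autoregressive sampling (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(0)

# Toy combinatorial problem: minimize s^T W s over s in {-1,+1}^n (a MaxCut-type objective).
n_spins = 20
W_adj = rng.random((n_spins, n_spins))
W_adj = np.triu(W_adj, 1)
W_adj = W_adj + W_adj.T

# Autoregressive parameters: spin i is conditioned only on spins 0..i-1 (strict lower triangle).
weights = np.zeros((n_spins, n_spins))
biases = np.zeros(n_spins)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sample_batch(batch):
    """Draw exact, independent samples and their log-probabilities, one spin at a time."""
    s = np.zeros((batch, n_spins))
    logp = np.zeros(batch)
    for i in range(n_spins):
        p_up = sigmoid(s[:, :i] @ weights[i, :i] + biases[i])   # P(s_i = +1 | s_<i)
        up = rng.random(batch) < p_up
        s[:, i] = np.where(up, 1.0, -1.0)
        logp += np.where(up, np.log(p_up + 1e-12), np.log(1.0 - p_up + 1e-12))
    return s, logp

def energy(s):
    """Batched objective E(s) = s^T W s; lower energy corresponds to a larger cut."""
    return np.einsum('bi,ij,bj->b', s, W_adj, s)

# Score-function (REINFORCE-style) gradient descent on the expected energy.
lr, batch = 0.05, 512
for step in range(200):
    s, _ = sample_batch(batch)
    E = energy(s)
    baseline = E.mean()                      # simple variance-reduction baseline
    grad_b = np.zeros(n_spins)
    grad_W = np.zeros_like(weights)
    for i in range(n_spins):
        p_up = sigmoid(s[:, :i] @ weights[i, :i] + biases[i])
        dlogp_dz = np.where(s[:, i] > 0, 1.0 - p_up, -p_up)     # d log P / d z_i
        grad_b[i] = np.mean((E - baseline) * dlogp_dz)
        grad_W[i, :i] = np.mean(((E - baseline) * dlogp_dz)[:, None] * s[:, :i], axis=0)
    biases -= lr * grad_b
    weights -= lr * grad_W
    if step % 50 == 0:
        print(f"step {step:3d}  mean energy {E.mean():.3f}")
```

Because every sample is drawn independently with an exact log-probability, the batch dimension can be split freely across threads, GPUs, or MPI ranks, which is the property the abstract identifies as removing the MCMC bottleneck.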

Supplementary Material

MP4 File (Overcoming Barriers to Scalability in Variational Quantum Monte Carlo.mp4)
Presentation video




Published In

SC '21: Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis
November 2021
1493 pages
ISBN:9781450384421
DOI:10.1145/3458817


In-Cooperation

  • IEEE CS

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. GPU parallelization
  2. density estimation
  3. generative models
  4. neural networks
  5. normalizing flows
  6. variational inference

Qualifiers

  • Research-article


Conference

SC '21

Acceptance Rates

Overall Acceptance Rate 1,516 of 6,373 submissions, 24%


