A Hybrid Stochastic Gradient Hamiltonian Monte Carlo Method
DOI:
https://doi.org/10.1609/aaai.v35i12.17295
Keywords
Bayesian Learning
Abstract
Recent theoretical analyses reveal that existing Stochastic Gradient Markov Chain Monte Carlo (SG-MCMC) methods require large mini-batches of samples (with size exponentially dependent on the dimension) to reduce the mean square error of gradient estimates and to ensure non-asymptotic convergence guarantees when the target distribution has a nonconvex potential function. In this paper, we propose a novel SG-MCMC algorithm, called the Hybrid Stochastic Gradient Hamiltonian Monte Carlo (HSG-HMC) method, which requires merely one sample per iteration and has a simple structure with only one hyperparameter. This improvement leverages a hybrid stochastic gradient estimator that exploits historical stochastic gradient information to control the mean square error. Theoretical analyses show that our method achieves the best-known overall sample complexity for epsilon-accuracy in terms of the 2-Wasserstein distance when sampling from distributions with nonconvex potential functions. Empirical studies on both simulated and real-world datasets demonstrate the advantage of our method.
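The abstract's key ingredient is a hybrid gradient estimator that blends the fresh one-sample stochastic gradient with a recursively corrected running estimate, then feeds it into an underdamped-Langevin (SGHMC-style) update. The paper's exact update rule is not given here, so the sketch below is an illustrative assumption: it uses a STORM-style hybrid estimator with a single mixing hyperparameter `alpha`, a toy quadratic potential, and hypothetical parameter names (`eta`, `friction`); it is not the authors' algorithm verbatim.

```python
import numpy as np

def grad_U(theta, x):
    # Stochastic gradient of a toy potential U(theta) = 0.5 * ||theta - x||^2,
    # evaluated with a single data sample x (illustrative choice only).
    return theta - x

def hsg_hmc_sketch(data, dim, n_iter=1000, eta=1e-2, alpha=0.1,
                   friction=1.0, seed=0):
    """Hedged sketch of an HSG-HMC-style sampler: one sample per iteration,
    hybrid (STORM-like) gradient estimator, SGHMC momentum dynamics."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(dim)
    momentum = np.zeros(dim)
    theta_prev = theta.copy()
    v = grad_U(theta, data[rng.integers(len(data))])  # initial estimate
    samples = []
    for _ in range(n_iter):
        x = data[rng.integers(len(data))]  # merely one sample per iteration
        # Hybrid estimator: fresh gradient plus a recursive correction that
        # reuses historical gradient information to shrink the mean square
        # error; alpha is the single mixing hyperparameter assumed here.
        v = grad_U(theta, x) + (1.0 - alpha) * (v - grad_U(theta_prev, x))
        theta_prev = theta.copy()
        # SGHMC-style update: friction on the momentum plus injected noise.
        momentum = (momentum - eta * (friction * momentum + v)
                    + np.sqrt(2.0 * friction * eta) * rng.standard_normal(dim))
        theta = theta + eta * momentum
        samples.append(theta.copy())
    return np.array(samples)
```

With `alpha = 1` the estimator reduces to a plain one-sample stochastic gradient; smaller `alpha` retains more historical information, which is the mechanism the abstract credits for controlling the gradient estimate's mean square error without large mini-batches.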
Published
2021-05-18
How to Cite
Zhang, C., Li, Z., Shen, Z., Xie, J., & Qian, H. (2021). A Hybrid Stochastic Gradient Hamiltonian Monte Carlo Method. Proceedings of the AAAI Conference on Artificial Intelligence, 35(12), 10842-10850. https://doi.org/10.1609/aaai.v35i12.17295
Section
AAAI Technical Track on Machine Learning V