Statistical Inference Using SGD

Authors

  • Tianyang Li, University of Texas at Austin
  • Liu Liu, University of Texas at Austin
  • Anastasios Kyrillidis, IBM T.J. Watson Research Center, Yorktown Heights
  • Constantine Caramanis, University of Texas at Austin

DOI:

https://doi.org/10.1609/aaai.v32i1.11686

Keywords:

statistical inference, confidence interval, hypothesis testing, stochastic gradient descent

Abstract

We present a novel method for frequentist statistical inference in M-estimation problems, based on stochastic gradient descent (SGD) with a fixed step size: we demonstrate that the average of such SGD sequences can be used for statistical inference, after proper scaling. An intuitive analysis using the Ornstein-Uhlenbeck process suggests that such averages are asymptotically normal. To show the merits of our scheme, we apply it to both synthetic and real data sets, and demonstrate that its accuracy is comparable to classical statistical methods, while requiring potentially far less computation.
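The abstract's idea — that averages of fixed-step-size SGD iterates concentrate around the estimator and can support normal-approximation confidence intervals — can be illustrated with a minimal sketch. This is not the authors' exact procedure (the paper's scaling and segmenting scheme is more refined); here we simply run several independent fixed-step SGD trajectories on a synthetic least-squares problem, average the iterates of each run, and form a naive normal-based interval from the spread of those averages. All names (`sgd_average`, the step size `eta`, the run counts) are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear regression: y = X @ theta_true + noise (an M-estimation problem)
n, d = 2000, 3
theta_true = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(n, d))
y = X @ theta_true + rng.normal(scale=0.5, size=n)

def sgd_average(X, y, steps, eta, rng):
    """Run SGD with a FIXED step size on squared loss; return the iterate average."""
    theta = np.zeros(X.shape[1])
    avg = np.zeros_like(theta)
    for _ in range(steps):
        i = rng.integers(len(y))                  # sample one data point
        grad = (X[i] @ theta - y[i]) * X[i]       # per-sample gradient of squared loss
        theta -= eta * grad                       # fixed step size (no decay)
        avg += theta
    return avg / steps

# Independent replicate runs: per the abstract, each run's iterate average is
# approximately normal, so their spread yields a rough confidence interval.
reps = np.array([sgd_average(X, y, steps=4000, eta=0.01, rng=rng)
                 for _ in range(30)])
est = reps.mean(axis=0)
se = reps.std(axis=0, ddof=1) / np.sqrt(len(reps))
ci_low, ci_high = est - 1.96 * se, est + 1.96 * se
print("estimate:", est)
```

The computational appeal mentioned in the abstract comes from reusing cheap SGD passes instead of inverting Hessians or bootstrapping full re-fits; the Ornstein-Uhlenbeck analysis justifies treating the averaged iterates as asymptotically normal.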

Published

2018-04-29

How to Cite

Li, T., Liu, L., Kyrillidis, A., & Caramanis, C. (2018). Statistical Inference Using SGD. Proceedings of the AAAI Conference on Artificial Intelligence, 32(1). https://doi.org/10.1609/aaai.v32i1.11686