We present MASAGA, an extension of the stochastic average gradient variant SAGA to optimize finite sums over Riemannian manifolds. SAGA is a variance-reduction technique that avoids the expensive full-gradient computations required by methods such as SVRG. Similar to RSVRG, we show that MASAGA converges linearly when the objective is geodesically smooth and strongly convex.
The experiments show that MASAGA is faster than the recent Riemannian stochastic gradient descent algorithm on the classic problem of computing the leading eigenvector of a matrix.
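To make the method concrete, the following is a minimal sketch of a MASAGA-style update applied to that leading-eigenvector problem, phrased as maximizing (1/n) * sum_i (z_i^T x)^2 over the unit sphere. This is not the authors' implementation (their official code is noted below): the name masaga_sphere and all parameters are illustrative, stored gradients are kept in the ambient space and projected onto the current tangent space in place of the vector transport used in the paper, and the retraction is plain renormalization.

```python
import numpy as np

def masaga_sphere(Z, steps=2000, eta=0.01, seed=0):
    """MASAGA-style sketch for max_x (1/n) * sum_i (z_i^T x)^2 on the unit
    sphere, i.e. the leading eigenvector of (1/n) * Z^T Z.

    Simplifications relative to the paper: gradients are stored in the
    ambient space and projected to the current tangent space (standing in
    for vector transport), and the retraction is renormalization.
    """
    rng = np.random.default_rng(seed)
    n, d = Z.shape
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)

    def egrad(i, x):
        # Euclidean gradient of f_i(x) = -(z_i^T x)^2 (we minimize -objective).
        return -2.0 * (Z[i] @ x) * Z[i]

    memory = np.stack([egrad(i, x) for i in range(n)])  # SAGA gradient table
    avg = memory.mean(axis=0)                           # running table average

    for _ in range(steps):
        i = rng.integers(n)
        g = egrad(i, x)
        v = g - memory[i] + avg            # SAGA variance-reduced direction
        v -= (x @ v) * x                   # project onto the tangent space at x
        avg += (g - memory[i]) / n         # keep the running average consistent
        memory[i] = g
        x -= eta * v                       # step size is ad hoc; tune per problem
        x /= np.linalg.norm(x)             # retract back onto the sphere
    return x
```

On data Z with rows z_i, the returned x should align (up to sign) with the leading eigenvector of Z.T @ Z / n; a quick sanity check is to compare it against numpy.linalg.eigh.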
The authors also provide official code for the paper "MASAGA: A Linearly-Convergent Stochastic First-Order Method for Optimization on Manifolds".
We analyzed the algorithm and showed that it converges linearly when the objective function is geodesically Lipschitz smooth and strongly convex. We also showed empirically that it outperforms Riemannian stochastic gradient descent on the leading-eigenvector problem.
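For reference, the geodesic forms of these two assumptions are standard in the Riemannian optimization literature; stated generically (Exp_x is the exponential map, d the geodesic distance, and the constants L and mu are illustrative rather than the paper's):

```latex
% Geodesic L-smoothness and geodesic mu-strong convexity of f.
\begin{aligned}
f(y) &\le f(x) + \big\langle \operatorname{grad} f(x),\, \operatorname{Exp}_x^{-1}(y) \big\rangle + \tfrac{L}{2}\, d(x,y)^2 ,\\
f(y) &\ge f(x) + \big\langle \operatorname{grad} f(x),\, \operatorname{Exp}_x^{-1}(y) \big\rangle + \tfrac{\mu}{2}\, d(x,y)^2 .
\end{aligned}
```

Linear convergence then means the expected squared distance to the minimizer contracts geometrically, E[d(x_k, x*)^2] <= (1 - rho)^k d(x_0, x*)^2 for some rho in (0, 1); the precise rate constant is derived in the paper.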
Reza Babanezhad, Issam H. Laradji, Alireza Shafaei, Mark Schmidt. MASAGA: A Linearly-Convergent Stochastic First-Order Method for Optimization on Manifolds. ECML-PKDD, 2018.
The setting throughout is the stochastic optimization of finite sums over a Riemannian manifold where the component functions are smooth and convex.
Prior work developed a procedure extending stochastic gradient descent to the case where the function being minimized is defined on a Riemannian manifold, establishing convergence to a critical point of the cost function.
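For contrast with the MASAGA sketch above, plain Riemannian SGD takes one stochastic gradient per step with no gradient table. A minimal sketch on the same sphere problem, under the same simplifications (illustrative names, renormalization retraction, an ad hoc 1/t step-size decay):

```python
import numpy as np

def rsgd_sphere(Z, steps=2000, eta0=0.1, seed=0):
    """Plain Riemannian SGD sketch: one stochastic gradient per step,
    decaying step size, retraction by renormalization."""
    rng = np.random.default_rng(seed)
    n, d = Z.shape
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)
    for t in range(steps):
        i = rng.integers(n)
        g = -2.0 * (Z[i] @ x) * Z[i]       # Euclidean gradient of f_i
        g -= (x @ g) * x                   # project: Riemannian gradient at x
        x -= (eta0 / (1 + t)) * g          # decaying step, no variance reduction
        x /= np.linalg.norm(x)             # retract back onto the sphere
    return x
```

Unlike MASAGA, the stochastic gradient variance here does not vanish at the optimum, which is why the step size must decay and the resulting rate is sublinear rather than linear.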