Sep 12, 2011 · We consider the problem of optimizing the sum of a smooth convex function and a non-smooth convex function using proximal-gradient methods. We show that both the basic proximal-gradient method and the accelerated proximal-gradient method achieve the same convergence rate as in the error-free case.
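As a minimal sketch of the basic method (not the paper's exact algorithm), consider minimizing f(x) + g(x) with smooth f(x) = ½‖Ax − b‖² and non-smooth g(x) = λ‖x‖₁, whose proximity operator is soft-thresholding; the step size 1/L and the helper names below are illustrative:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximity operator of t*||.||_1: shrink each coordinate toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(A, b, lam, steps=500):
    # Basic proximal-gradient iteration for f(x) + g(x) with
    # f(x) = 0.5*||Ax - b||^2 (smooth) and g(x) = lam*||x||_1 (non-smooth).
    # L = ||A||_2^2 is a Lipschitz constant for grad f; step size is 1/L.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)                    # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)   # proximal step on g
    return x
```

For an inexact variant in the paper's sense, the gradient or the prox step above would be computed only approximately, with the error controlled across iterations.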
An inexact proximal-gradient method will have a faster convergence rate than the exact basic proximal-gradient method provided that Q0 < (1 − γ). Oddly, in our ...
Proximal-gradient methods have the same convergence rates as [accelerated] gradient methods for smooth optimization [Nesterov, 2007; Beck & Teboulle, 2009].
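The accelerated method takes the prox-gradient step from an extrapolated point rather than the current iterate, in the style of Beck & Teboulle's FISTA; a sketch in the same ℓ₁ setting as above (names and step-size choice are illustrative):

```python
import numpy as np

def soft_threshold(x, t):
    # Proximity operator of t*||.||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def accelerated_proximal_gradient(A, b, lam, steps=500):
    # FISTA-style acceleration: prox-gradient step at an extrapolated
    # point y, with momentum weight (t - 1) / t_next.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(steps):
        grad = A.T @ (A @ y - b)
        x_next = soft_threshold(y - grad / L, lam / L)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum step
        x, t = x_next, t_next
    return x
```

The only change from the basic method is the extrapolation sequence; this is what improves the rate from O(1/k) to O(1/k²) on the smooth-plus-nonsmooth objective.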