Adaptive gradient descent for convex and non-convex stochastic optimization

Date
2019
Volume
2655
Publisher
Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik
Abstract

In this paper, we propose several adaptive gradient methods for stochastic optimization. Our methods are based on Armijo-type line search and simultaneously adapt to the unknown Lipschitz constant of the gradient and to the variance of the stochastic gradient approximation. We consider accelerated gradient descent for convex problems and plain gradient descent for non-convex problems. In our experiments, we demonstrate that our methods outperform existing adaptive methods such as AdaGrad and Adam.
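To illustrate the core idea of an Armijo-type line search that adapts to an unknown Lipschitz constant, here is a minimal Python sketch of the deterministic, non-accelerated case. The function name armijo_adaptive_gd and its parameters are illustrative and not taken from the paper, whose methods additionally adapt to the variance of the stochastic gradient approximation (e.g. via mini-batch sizes) and include an accelerated variant for convex problems.

```python
import numpy as np

def armijo_adaptive_gd(f, grad, x0, n_iter=100, L0=1.0):
    """Gradient descent with step size 1/L, where L is a backtracking
    estimate of the unknown Lipschitz constant of the gradient.
    Illustrative sketch only, not the paper's exact algorithm."""
    x = np.asarray(x0, dtype=float).copy()
    L = L0
    for _ in range(n_iter):
        g = grad(x)
        L = max(L / 2.0, 1e-12)  # optimistically shrink the estimate
        while True:
            x_new = x - g / L    # gradient step with step size 1/L
            d = x_new - x
            # Accept the step if the L-smoothness upper bound holds:
            #   f(x_new) <= f(x) + <g, d> + (L/2) ||d||^2
            if f(x_new) <= f(x) + g @ d + 0.5 * L * (d @ d):
                break
            L *= 2.0             # bound violated: double L and retry

        x = x_new
    return x

# Usage example: minimize the least-squares objective f(x) = ||Ax - b||^2 / 2.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)
x_opt = armijo_adaptive_gd(f, grad, np.zeros(5))
```

The halve-then-double scheme keeps the estimate of L within a constant factor of the smallest value satisfying the smoothness bound along the trajectory, so no step size needs to be hand-tuned.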

Keywords
Convex and non-convex optimization, stochastic optimization, first-order method, adaptive method, gradient descent, complexity bounds, mini-batch
Citation
Ogaltsov, A., Dvinskikh, D., Dvurechensky, P., Gasnikov, A., & Spokoiny, V. (2019). Adaptive gradient descent for convex and non-convex stochastic optimization (WIAS Preprint No. 2655). Berlin: Weierstraß-Institut für Angewandte Analysis und Stochastik. https://doi.org/10.20347/WIAS.PREPRINT.2655
License
This document may be downloaded, read, stored, and printed for your own use within the limits of § 53 UrhG, but it may not be distributed via the internet or passed on to external parties.