Search Results

  • Item
    Stopping rules for accelerated gradient methods with additive noise in gradient
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2021) Vasin, Artem; Gasnikov, Alexander; Spokoiny, Vladimir
    In this article, we investigate an accelerated first-order method, namely the method of similar triangles, which is optimal for the class of convex (strongly convex) problems with a Lipschitz gradient. The paper considers a model of additive noise in the gradient and a Euclidean prox-structure for not necessarily bounded sets. Convergence estimates are obtained both with and without strong convexity, and a stopping criterion is proposed for problems that are not strongly convex.
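    For context, below is a minimal sketch of the similar triangles iteration with a noisy gradient oracle. The quadratic test problem, the noise model, and the gradient-norm stopping test are illustrative assumptions for this sketch, not the criterion derived in the paper.
    ```python
    import numpy as np

    def similar_triangles(grad, L, x0, delta, max_iter=3000):
        """Method of similar triangles with an inexact (noisy) gradient oracle.

        Standard iteration for convex f with L-Lipschitz gradient; the early
        stop at the observed noise level is an illustrative criterion, not
        the stopping rule derived in the paper.
        """
        A = 0.0
        u, y = x0.copy(), x0.copy()
        for k in range(max_iter):
            alpha = (1.0 + np.sqrt(1.0 + 4.0 * L * A)) / (2.0 * L)
            A_next = A + alpha
            x = (A * y + alpha * u) / A_next      # extrapolation point
            g = grad(x)                           # inexact gradient, error <= delta
            if np.linalg.norm(g) <= 2.0 * delta:  # hypothetical noise-level test
                return y, k
            u = u - alpha * g                     # momentum ("dual") sequence
            y = (A * y + alpha * u) / A_next      # primal iterate
            A = A_next
        return y, max_iter

    # Toy problem: f(x) = 0.5 ||x||^2 with additive gradient noise of norm delta.
    rng = np.random.default_rng(0)
    delta = 1e-3

    def noisy_grad(x):
        e = rng.standard_normal(x.shape)
        return x + delta * e / np.linalg.norm(e)

    y, k = similar_triangles(noisy_grad, L=1.0, x0=np.ones(10), delta=delta)
    print(f"stopped at iteration {k}, ||y|| = {np.linalg.norm(y):.2e}")
    ```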
  • Item
    Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2021) Matyukhin, Vladislav; Kabanikhin, Sergey; Shishlenin, Maxim; Novikov, Nikita; Vasin, Artem; Gasnikov, Alexander
    In this paper we propose gradient descent type methods for solving convex optimization problems in Hilbert space. We apply them to the ill-posed Cauchy problem for the Poisson equation and give a comparative analysis against the Landweber iteration and the steepest descent method. The theoretical novelty of the paper lies in the development of a new stopping rule for accelerated gradient methods with an inexact gradient (additive noise). Note that up to the moment of stopping the method "doesn't feel the noise", but after this moment the noise starts to accumulate and the quality of the solution degrades with further iterations.
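    The "doesn't feel the noise, then degrades" behaviour described here is the classical semiconvergence of iterative methods on ill-posed problems, which a small experiment can reproduce. In the sketch below, the random ill-conditioned linear system is an assumed stand-in for the discretized Cauchy problem, and the Landweber iteration is written in its usual form as gradient descent on the least-squares objective; none of the parameters are taken from the paper.
    ```python
    import numpy as np

    # Illustrative ill-posed linear system A x = b (an assumed stand-in for
    # the discretized Cauchy problem; matrix and noise level are not the
    # paper's setup).
    rng = np.random.default_rng(1)
    n = 50
    U, _ = np.linalg.qr(rng.standard_normal((n, n)))
    V, _ = np.linalg.qr(rng.standard_normal((n, n)))
    s = np.logspace(0, -6, n)          # rapidly decaying singular values
    A = U @ np.diag(s) @ V.T
    x_true = V @ np.sqrt(s)            # smooth solution: coefficients decay
    delta = 1e-2
    b = A @ x_true + delta * rng.standard_normal(n)   # noisy data

    # Landweber iteration = gradient descent on f(x) = 0.5 ||A x - b||^2.
    omega = 1.0 / s[0] ** 2            # step size 1/L with L = ||A||^2
    x = np.zeros(n)
    errors = [np.linalg.norm(x - x_true)]
    for k in range(5000):
        x = x + omega * A.T @ (b - A @ x)
        errors.append(np.linalg.norm(x - x_true))

    k_best = int(np.argmin(errors))
    print(f"best error {errors[k_best]:.2e} at iteration {k_best}; "
          f"error after 5000 iterations {errors[-1]:.2e}")
    # Typical behaviour: the error to x_true first decreases, then grows as
    # the iterates begin to fit the noise -- the semiconvergence that makes
    # a stopping rule necessary.
    ```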