Search Results

  • Item
    Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2021) Matyukhin, Vladislav; Kabanikhin, Sergey; Shishlenin, Maxim; Novikov, Nikita; Vasin, Artem; Gasnikov, Alexander
    In this paper we propose gradient descent type methods for solving convex optimization problems in Hilbert space. We apply them to the ill-posed Cauchy problem for the Poisson equation and give a comparative analysis with the Landweber iteration and the steepest descent method. The theoretical novelty of the paper consists in the development of a new stopping rule for accelerated gradient methods with an inexact gradient (additive noise). Note that up to the moment of stopping the method ``doesn't feel the noise''; after this moment the noise starts to accumulate and the quality of the solution degrades with further iterations. (A minimal illustrative sketch of this stopping behaviour appears after this list.)
  • Item
    Gradient methods for problems with inexact model of the objective
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2020) Stonyakin, Fedor; Dvinskikh, Darina; Dvurechensky, Pavel; Kroshnin, Alexey; Kuznetsova, Olesya; Agafonov, Artem; Gasnikov, Alexander; Tyurin, Alexander; Uribe, Cesar A.; Pasechnyuk, Dmitry; Artamonov, Sergei
    We consider optimization methods for convex minimization problems under inexact information on the objective function. We introduce an inexact model of the objective, which includes as particular cases the inexact oracle [19] and the relative smoothness condition [43]. We analyze a gradient method which uses this inexact model and obtain convergence rates for convex and strongly convex problems. To show potential applications of our general framework we consider three particular problems. The first is clustering by the electoral model introduced in [49]. The second is approximating the optimal transport distance, for which we propose a Proximal Sinkhorn algorithm. The third is approximating the optimal transport barycenter, for which we propose a Proximal Iterative Bregman Projections algorithm. We also illustrate the practical performance of our algorithms by numerical experiments. (A minimal illustrative sketch of the Proximal Sinkhorn idea appears after this list.)
  • Item
    Flexible modification of Gauss--Newton method and its stochastic extension
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2021) Yudin, Nikita; Gasnikov, Alexander
    This work presents a novel version of the recently developed Gauss--Newton method for solving systems of nonlinear equations, based on an upper bound of the solution residual and on quadratic regularization ideas. We obtain global convergence bounds for this method and, under natural non-degeneracy assumptions, present local quadratic convergence results. We develop stochastic optimization algorithms for the presented Gauss--Newton method and justify sublinear and linear convergence rates for these algorithms under the weak growth condition (WGC) and the Polyak--Lojasiewicz (PL) inequality. We show that in the stochastic setting the Gauss--Newton method can effectively find a solution under the WGC and the PL condition, matching the convergence rate of the deterministic method. The suggested method unifies most practically used modifications of the Gauss--Newton method and can easily interpolate between them, providing a flexible and convenient method that is easily implementable using standard techniques of convex optimization. (A minimal illustrative sketch of a regularized Gauss--Newton step appears after this list.)
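
For the first result above (convex optimization with inexact gradients), the following is a minimal Python sketch of the qualitative behaviour the abstract describes: plain (non-accelerated) gradient descent with an additively perturbed gradient on a strongly convex quadratic, stopped once the inexact gradient norm falls to the noise level. The quadratic test problem, the noise bound delta, and the threshold 2 * delta are illustrative assumptions, not the paper's Hilbert-space setting or its exact stopping rule.

import numpy as np

# Illustrative problem: f(x) = 0.5 x^T Q x - b^T x with SPD Hessian Q,
# minimized with a gradient that carries bounded additive noise.
rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n))
Q = A.T @ A + np.eye(n)            # SPD Hessian of the model problem
b = rng.standard_normal(n)
delta = 1e-3                       # assumed bound on the additive gradient noise

def inexact_grad(x):
    noise = rng.standard_normal(n)
    noise *= delta / np.linalg.norm(noise)   # enforce ||noise|| <= delta
    return Q @ x - b + noise

L = np.linalg.eigvalsh(Q).max()    # Lipschitz constant of the exact gradient
x = np.zeros(n)
for k in range(10_000):
    g = inexact_grad(x)
    # Illustrative stopping rule: quit once the inexact gradient norm is
    # comparable to the noise level; beyond this point the noise would
    # start to dominate and further iterations no longer help.
    if np.linalg.norm(g) <= 2 * delta:
        print(f"stopped at iteration {k}: gradient norm is at the noise level")
        break
    x -= g / L                     # standard 1/L step of gradient descent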
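
For the second result (gradient methods with an inexact model of the objective), here is a minimal Python sketch of a proximal Sinkhorn-type scheme for the discrete optimal transport distance, assuming the KL-proximal step X^{k+1} = argmin_{X in U(r,c)} <C, X> + gamma * KL(X || X^k), which reduces to Sinkhorn matrix scaling with kernel X^k * exp(-C / gamma). The histograms, cost matrix, gamma, and iteration counts are illustrative and not taken from the paper.

import numpy as np

def sinkhorn_scaling(K, r, c, n_iter=200):
    """Scale K so that diag(u) K diag(v) has row sums r and column sums c."""
    u = np.ones_like(r)
    v = np.ones_like(c)
    for _ in range(n_iter):
        u = r / (K @ v)
        v = c / (K.T @ u)
    return u[:, None] * K * v[None, :]

def proximal_sinkhorn(C, r, c, gamma=0.1, outer_iters=10):
    X = np.outer(r, c)                     # feasible initial transport plan
    for _ in range(outer_iters):
        K = X * np.exp(-C / gamma)         # KL-prox kernel around current plan
        X = sinkhorn_scaling(K, r, c)      # project back onto the marginals
    return X, float(np.sum(C * X))         # transport plan and its cost

# Toy example: two random histograms on a line with normalized distance cost.
rng = np.random.default_rng(1)
n = 10
r = rng.random(n); r /= r.sum()
c = rng.random(n); c /= c.sum()
C = np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / (n - 1)

X, cost = proximal_sinkhorn(C, r, c)
print("approximate OT cost:", cost)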
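
For the third result (flexible modification of the Gauss--Newton method), the sketch below shows a quadratically regularized (Levenberg--Marquardt-style) Gauss--Newton step for a square nonlinear system F(x) = 0, namely x_{k+1} = x_k - (J^T J + lam I)^{-1} J^T F(x_k), applied to a toy problem. The fixed damping parameter lam is a crude stand-in for the paper's adaptive choice based on an upper bound of the residual, and the test system is purely illustrative.

import numpy as np

def regularized_gauss_newton(F, J, x0, lam=1e-3, max_iter=50, tol=1e-10):
    """Gauss--Newton iteration with a fixed quadratic regularization term."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = F(x)
        if np.linalg.norm(r) < tol:        # residual small enough: stop
            break
        Jx = J(x)
        # Damped Gauss--Newton direction: (J^T J + lam I) step = J^T F(x)
        step = np.linalg.solve(Jx.T @ Jx + lam * np.eye(x.size), Jx.T @ r)
        x = x - step
    return x

# Toy system with root at (1, 1) (Rosenbrock-style residuals).
F = lambda x: np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])
J = lambda x: np.array([[-20.0 * x[0], 10.0], [-1.0, 0.0]])

x_star = regularized_gauss_newton(F, J, x0=[-1.2, 1.0])
print("solution:", x_star, "residual norm:", np.linalg.norm(F(x_star)))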