Search Results

Now showing 1 - 6 of 6
  • Item
    Flexible modification of Gauss--Newton method and its stochastic extension
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2021) Yudin, Nikita; Gasnikov, Alexander
    This work presents a novel version of the recently developed Gauss--Newton method for solving systems of nonlinear equations, based on an upper bound on the solution residual and on quadratic regularization ideas. We obtain global convergence bounds for this method and, under natural non-degeneracy assumptions, establish local quadratic convergence. We develop stochastic optimization algorithms for the presented Gauss--Newton method and justify sublinear and linear convergence rates for these algorithms using the weak growth condition (WGC) and the Polyak--Lojasiewicz (PL) inequality. We show that in the stochastic setting the Gauss--Newton method can effectively find a solution under the WGC and the PL condition, matching the convergence rate of the deterministic optimization method. The suggested method unifies the most practically used modifications of the Gauss--Newton method and can easily interpolate between them, providing a flexible and convenient method that is easily implementable using standard techniques of convex optimization.
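The quadratic-regularization idea behind this abstract can be sketched in a few lines; this is a minimal illustration, not the authors' algorithm, and the fixed regularization parameter `reg` and the test system below are assumptions for demonstration only:

```python
import numpy as np

def regularized_gauss_newton(F, J, x0, reg=1.0, tol=1e-10, max_iter=100):
    """Gauss--Newton with quadratic regularization for solving F(x) = 0.

    Each step minimizes ||F(x) + J(x) dx||^2 + reg * ||dx||^2, i.e.
    solves the regularized normal equations for the increment dx.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = F(x)
        if np.linalg.norm(r) < tol:
            break
        Jx = J(x)
        # Solve (J^T J + reg * I) dx = -J^T r
        dx = np.linalg.solve(Jx.T @ Jx + reg * np.eye(x.size), -Jx.T @ r)
        x = x + dx
    return x
```

The regularizer damps the step where the Jacobian is ill-conditioned; taking `reg` to zero recovers the classical Gauss--Newton step, which is the kind of interpolation between modifications the abstract alludes to.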
  • Item
    Stopping rules for accelerated gradient methods with additive noise in gradient
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2021) Vasin, Artem; Gasnikov, Alexander; Spokoiny, Vladimir
    In this article, we investigate an accelerated first-order method, namely the method of similar triangles, which is optimal in the class of convex (strongly convex) problems with a Lipschitz gradient. The paper considers a model of additive noise in the gradient and a Euclidean prox-structure for not necessarily bounded sets. Convergence estimates are obtained both with and without strong convexity, and a stopping criterion is proposed for problems that are not strongly convex.
  • Item
    Zeroth-order algorithms for smooth saddle-point problems
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2021) Sadiev, Abdurakhmon; Beznosikov, Aleksandr; Dvurechensky, Pavel; Gasnikov, Alexander
    Saddle-point problems have recently gained increased attention from the machine learning community, mainly due to applications in training Generative Adversarial Networks using stochastic gradients. At the same time, in some applications only a zeroth-order oracle is available. In this paper, we propose several algorithms to solve stochastic smooth (strongly) convex-concave saddle-point problems using zeroth-order oracles, and estimate their convergence rate and its dependence on the dimension n of the variable. In particular, our analysis shows that in the case when the feasible set is a direct product of two simplices, our convergence rate for the stochastic term is only a log n factor worse than that of first-order methods. We also consider a mixed setup and develop 1/2th-order methods, which use a zeroth-order oracle for the minimization part and a first-order oracle for the maximization part. Finally, we demonstrate the performance of our zeroth-order and 1/2th-order methods on practical problems.
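A standard building block behind zeroth-order methods of this kind is a two-point random-direction gradient estimator that queries only function values; a minimal sketch (the smoothing radius `h` and the quadratic test function are illustrative assumptions, not the paper's setup):

```python
import numpy as np

def zo_grad_estimate(f, x, h=1e-5, rng=None):
    """Two-point zeroth-order gradient estimator.

    Samples a direction e uniformly on the unit sphere and returns
        n * (f(x + h*e) - f(x - h*e)) / (2*h) * e,
    which, since E[e e^T] = I/n, is an (approximately) unbiased
    estimate of grad f(x) built from two function evaluations.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = x.size
    e = rng.standard_normal(n)
    e /= np.linalg.norm(e)  # uniform direction on the unit sphere
    return n * (f(x + h * e) - f(x - h * e)) / (2.0 * h) * e
```

Averaging many such estimates recovers the gradient, at the cost of variance growing with the dimension n, which is the source of the dimension-dependent factors mentioned in the abstract.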
  • Item
    Inexact tensor methods and their application to stochastic convex optimization
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2021) Agafonov, Artem; Kamzolov, Dmitry; Dvurechensky, Pavel; Gasnikov, Alexander
    We propose a general non-accelerated tensor method under inexact information on higher-order derivatives, analyze its convergence rate, and provide sufficient conditions for this method to have complexity similar to that of the exact tensor method. As a corollary, we propose the first stochastic tensor method for convex optimization and obtain sufficient mini-batch sizes for each derivative.
  • Item
    Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2021) Matyukhin, Vladislav; Kabanikhin, Sergey; Shishlenin, Maxim; Novikov, Nikita; Vasin, Artem; Gasnikov, Alexander
    In this paper we propose gradient descent type methods to solve convex optimization problems in Hilbert space. We apply them to solve the ill-posed Cauchy problem for the Poisson equation and make a comparative analysis with the Landweber iteration and the steepest descent method. The theoretical novelty of the paper consists in the development of a new stopping rule for accelerated gradient methods with an inexact gradient (additive noise). Note that up to the moment of stopping the method ``does not feel the noise''; after this moment, however, the noise starts to accumulate and the quality of the solution deteriorates with further iterations.
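The "does not feel the noise" behavior can be illustrated with plain gradient descent under bounded additive gradient noise. This is only a sketch: the paper studies an accelerated method with a more refined stopping rule, and the threshold `2 * delta` below is an assumed heuristic, not the paper's criterion:

```python
import numpy as np

def gd_with_noise_stopping(noisy_grad, x0, L, delta, max_iter=1000):
    """Gradient descent with step 1/L on gradients corrupted by additive
    noise of norm at most delta.  Stops once the observed gradient norm
    is comparable to the noise level: past that point the noise dominates
    the signal and further iterations can only accumulate error."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = noisy_grad(x)
        if np.linalg.norm(g) <= 2.0 * delta:
            break  # at the noise floor: stop before noise accumulates
        x = x - g / L
    return x, k
```

While the true gradient norm is large relative to delta, the noisy step is almost the exact step; the criterion fires precisely when that stops being true.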
  • Item
    Tensor methods for strongly convex strongly concave saddle point problems and strongly monotone variational inequalities
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2021) Ostroukhov, Petr; Kamalov, Rinat; Dvurechensky, Pavel; Gasnikov, Alexander
    In this paper we propose three tensor methods for strongly-convex-strongly-concave saddle point problems (SPP). The first method is based on the assumption of higher-order smoothness (the derivative of order higher than two is Lipschitz-continuous) and achieves a linear convergence rate. Under additional assumptions of first- and second-order smoothness of the objective, we connect the first method with a locally superlinearly convergent algorithm from the literature and develop a second method with global convergence and local superlinear convergence. The third method is a modified version of the second, with the focus on making the gradient of the objective small. Since we treat SPP as a particular case of variational inequalities, we also propose two methods for strongly monotone variational inequalities with the same complexity as the methods described above.