  • Item
    Accelerated variance-reduced methods for saddle-point problems
    (Amsterdam : Elsevier, 2022) Borodich, Ekaterina; Tominin, Vladislav; Tominin, Yaroslav; Kovalev, Dmitry; Gasnikov, Alexander; Dvurechensky, Pavel
    We consider composite minimax optimization problems where the goal is to find a saddle point of a large sum of non-bilinear objective functions augmented by simple composite regularizers for the primal and dual variables. For such problems, under the average-smoothness assumption, we propose accelerated stochastic variance-reduced algorithms whose complexity bounds are optimal up to logarithmic factors. In particular, we consider strongly-convex-strongly-concave, convex-strongly-concave, and convex-concave objectives. To the best of our knowledge, these are the first nearly-optimal algorithms for this setting.
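
The variance-reduction idea behind such methods can be illustrated on a toy finite-sum saddle-point problem. The sketch below is an assumption on my part, not the authors' algorithm: it combines an SVRG-style gradient estimator with an extragradient update on a strongly-convex-strongly-concave quadratic sum; all problem data, step sizes, and loop lengths are illustrative.

```python
import numpy as np

# Toy problem: find the saddle point of
#   f(x, y) = (1/n) * sum_i [ a_i/2 * x^2 + b_i * x * y - c_i/2 * y^2 ],
# which is strongly convex in x and strongly concave in y, with saddle
# point at (0, 0). Data and constants are illustrative assumptions.
rng = np.random.default_rng(0)
n, eta = 50, 0.05
a = 1.0 + rng.uniform(size=n)    # strong convexity in x
c = 1.0 + rng.uniform(size=n)    # strong concavity in y
b = rng.uniform(-0.5, 0.5, n)    # coupling terms

def F(z, idx):
    """Monotone operator (grad_x f_i, -grad_y f_i) averaged over idx."""
    x, y = z
    gx = np.mean(a[idx] * x + b[idx] * y)
    gy = np.mean(c[idx] * y - b[idx] * x)
    return np.array([gx, gy])

full = np.arange(n)
z = np.array([1.0, 1.0])         # initial point
for epoch in range(100):
    snap, mu = z.copy(), F(z, full)   # snapshot and its full operator value
    for _ in range(20):
        i = [rng.integers(n)]
        # SVRG-style estimator: unbiased, with variance vanishing at the snapshot
        g = F(z, i) - F(snap, i) + mu
        z_half = z - eta * g               # extrapolation (leader) step
        g_half = F(z_half, i) - F(snap, i) + mu
        z = z - eta * g_half               # update (follower) step

print(np.linalg.norm(z))  # distance to the saddle point (0, 0)
```

Because the estimator's variance shrinks as the iterates approach the snapshot, the method converges linearly on this strongly-convex-strongly-concave instance, unlike plain stochastic extragradient with a fixed step size.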