Search Results

  • Item
    Inexact model: A framework for optimization and variational inequalities
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2020) Stonyakin, Fedor; Gasnikov, Alexander; Tyurin, Alexander; Pasechnyuk, Dmitry; Agafonov, Artem; Dvurechensky, Pavel; Dvinskikh, Darina; Piskunova, Victorya
    In this paper we propose a general algorithmic framework for first-order methods in optimization in a broad sense, including minimization problems, saddle-point problems, and variational inequalities. This framework allows us to obtain many known methods as special cases, including the accelerated gradient method, composite optimization methods, level-set methods, and proximal methods. The idea of the framework is to construct an inexact model of the main problem component, i.e., the objective function in optimization or the operator in variational inequalities. Besides reproducing known results, our framework allows us to construct new methods, which we illustrate by constructing a universal method for variational inequalities with composite structure. This method works for smooth and non-smooth problems with optimal complexity and without a priori knowledge of the problem's smoothness. We also generalize our framework to strongly convex objectives and strongly monotone variational inequalities. (A minimal sketch of the inexact-model gradient step appears after this list.)
  • Item
    Inexact relative smoothness and strong convexity for optimization and variational inequalities by inexact model
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2020) Stonyakin, Fedor; Gasnikov, Alexander; Tyurin, Alexander; Pasechnyuk, Dmitry; Agafonov, Artem; Dvurechensky, Pavel; Dvinskikh, Darina; Artamonov, Sergei; Piskunova, Victorya
    In this paper we propose a general algorithmic framework for first-order methods in optimization in a broad sense, including minimization problems, saddle-point problems, and variational inequalities. This framework allows us to obtain many known methods as special cases, including the accelerated gradient method, composite optimization methods, level-set methods, and Bregman proximal methods. The idea of the framework is to construct an inexact model of the main problem component, i.e., the objective function in optimization or the operator in variational inequalities. Besides reproducing known results, our framework allows us to construct new methods, which we illustrate by constructing a universal conditional gradient method and a universal method for variational inequalities with composite structure. These methods work for smooth and non-smooth problems with optimal complexity and without a priori knowledge of the problem's smoothness. As a particular case of our general framework, we introduce relative smoothness for operators and propose an algorithm for variational inequalities with such an operator. We also generalize our framework to relatively strongly convex objectives and strongly monotone variational inequalities. (A Bregman proximal step under relative smoothness is sketched after this list.)
  • Item
    Oracle complexity separation in convex optimization
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2020) Ivanova, Anastasiya; Gasnikov, Alexander; Dvurechensky, Pavel; Dvinskikh, Darina; Tyurin, Alexander; Vorontsova, Evgeniya; Pasechnyuk, Dmitry
    Regularized empirical risk minimization problems, ubiquitous in machine learning, are often composed of several blocks that can be treated using different types of oracles, e.g., full gradient, stochastic gradient, or coordinate derivative. The optimal oracle complexity is known and achievable separately for the full-gradient case, the stochastic-gradient case, etc. We propose a generic framework that combines optimal algorithms for different types of oracles in order to achieve a separate optimal oracle complexity for each block, i.e., for each block the corresponding oracle is called the optimal number of times for a given accuracy. As a particular example, we demonstrate that for a combination of a full-gradient oracle with either a stochastic-gradient or a coordinate-descent oracle, our approach yields the optimal number of oracle calls separately for the full-gradient part and the stochastic/coordinate-descent part. (A toy oracle-separation sketch appears after this list.)
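
The following is a minimal Python sketch of the inexact-model idea from the first item, under stated assumptions: the model of the objective at x is the linear model psi(y, x) = <g(x), y - x> built from an inexactly computed gradient g, so the model step argmin_y { psi(y, x) + (L/2)||y - x||^2 } reduces to a gradient step with step size 1/L. The quadratic test problem, the noise level delta, and all names are illustrative, not taken from the paper.

    import numpy as np

    def inexact_model_gradient_method(model_grad, x0, L, n_iters=100):
        """Gradient-type method driven by an inexact model of the objective.

        For the linear model psi(y, x) = <g(x), y - x>, where g(x) is an
        inexact gradient, the step
            x_{k+1} = argmin_y { psi(y, x_k) + (L/2) ||y - x_k||^2 }
        has the closed form x_{k+1} = x_k - g(x_k) / L.
        """
        x = x0.copy()
        for _ in range(n_iters):
            x = x - model_grad(x) / L
        return x

    # Usage: minimize f(x) = 0.5 ||A x - b||^2 with a noisy (inexact) gradient.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    b = rng.standard_normal(20)
    L = np.linalg.norm(A.T @ A, 2)  # Lipschitz constant of the gradient
    delta = 1e-3                    # assumed model inexactness level

    def noisy_grad(x):
        return A.T @ (A @ x - b) + delta * rng.standard_normal(5)

    x_approx = inexact_model_gradient_method(noisy_grad, np.zeros(5), L, 500)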
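For the second item, here is a hedged sketch of one Bregman proximal step, the kind of step relative smoothness is designed for: with the negative-entropy reference function on the probability simplex, the Bregman divergence is the KL divergence and the step has a closed-form multiplicative update. The quadratic objective and the crude bound used for the relative-smoothness constant L are assumptions chosen for illustration, not the paper's setting.

    import numpy as np

    def bregman_proximal_step(x, grad, L):
        """One Bregman proximal (mirror descent) step on the probability simplex.

        With reference function h(x) = sum_i x_i log x_i (negative entropy),
        the Bregman divergence is the KL divergence, and
            x_+ = argmin_y { <grad, y> + L * KL(y, x) }
        has the closed-form multiplicative update below.
        """
        y = x * np.exp(-grad / L)
        return y / y.sum()

    # Usage: minimize 0.5 x^T Q x over the simplex; the spectral norm of Q
    # is an assumed (conservative) relative-smoothness constant here.
    rng = np.random.default_rng(1)
    Q = rng.standard_normal((4, 4))
    Q = Q.T @ Q
    L = np.linalg.norm(Q, 2)
    x = np.full(4, 0.25)  # uniform starting point on the simplex
    for _ in range(200):
        x = bregman_proximal_step(x, Q @ x, L)

The design point is that the geometry lives entirely in the reference function h: swapping negative entropy for 0.5||x||^2 turns the same step into projected gradient descent, which is why the framework can treat these methods uniformly.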
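For the third item, a toy sketch of oracle complexity separation for f = g + h: the expensive full-gradient oracle for g is called once per outer iteration and reused, while the cheap stochastic-gradient oracle for h is called on every inner step, and calls are counted per block. This is only a schematic of the separation idea, not the paper's accelerated scheme; the step size, the inner-loop length m, and the test problem are assumptions.

    import numpy as np

    def separated_oracle_method(grad_g, stoch_grad_h, x0, step, m=10, n_outer=50):
        """Sketch of oracle separation for f = g + h.

        The full-gradient oracle for g is called once per outer iteration and
        held fixed for m inner steps, each of which calls only the stochastic
        oracle for h. Oracle calls are counted separately per block.
        """
        x = x0.copy()
        calls = {"full_g": 0, "stoch_h": 0}
        for _ in range(n_outer):
            gg = grad_g(x)            # one expensive call, reused below
            calls["full_g"] += 1
            for _ in range(m):
                x = x - step * (gg + stoch_grad_h(x))
                calls["stoch_h"] += 1
        return x, calls

    # Usage: g(x) = 0.5 x^T Q x with a full-gradient oracle; h is a
    # least-squares average accessed through single sampled terms.
    rng = np.random.default_rng(2)
    Q = np.diag(np.linspace(1.0, 3.0, 5))
    A = rng.standard_normal((100, 5))
    b = rng.standard_normal(100)

    grad_g = lambda x: Q @ x

    def stoch_grad_h(x):
        i = rng.integers(100)          # sample one data point
        return A[i] * (A[i] @ x - b[i])

    x, calls = separated_oracle_method(grad_g, stoch_grad_h, np.zeros(5),
                                       step=1e-2)
    # calls shows 50 full-gradient calls vs. 500 stochastic calls.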