Search Results

  • Item
    Optimality conditions for convex stochastic optimization problems in Banach spaces with almost sure state constraint
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2020) Geiersbach, Caroline; Wollner, Winnifried
    We analyze a convex stochastic optimization problem where the state is assumed to belong to the Bochner space of essentially bounded random variables with images in a reflexive and separable Banach space. For this problem, we obtain optimality conditions that, under an appropriate model, are necessary and sufficient. Additionally, the Lagrange multipliers associated with the optimality conditions are integrable vector-valued functions, not merely measures. A model problem demonstrating the application to PDE-constrained optimization under uncertainty is given.
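    As a rough illustration of the problem class described above (the notation below is hypothetical and not taken from the paper), one may think of

        % Convex stochastic optimization with an almost sure state constraint:
        % an expected cost is minimized while the random state y(omega) must
        % lie in a closed convex set K for almost every omega.
        \begin{align*}
          \min_{u \in U_{\mathrm{ad}}} \;& \mathbb{E}\left[ J\big(y(\omega), u\big) \right] \\
          \text{s.t.}\;\;& y \in L^{\infty}(\Omega; Y), \quad y(\omega) = S(u, \omega) \ \text{a.s.}, \\
          & y(\omega) \in K \quad \text{almost surely,}
        \end{align*}

    where Y is a reflexive, separable Banach space, S is a solution operator (e.g., of a linear PDE, keeping the problem convex), and K is closed and convex. In this reading, the abstract's multiplier statement says that the Lagrange multiplier for the almost sure constraint can be represented as an integrable Y*-valued function, roughly an element of L^1(Omega; Y*), rather than only as a measure.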
  • Item
    Optimality conditions and Moreau-Yosida regularization for almost sure state constraints
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2021) Geiersbach, Caroline; Hintermüller, Michael
    We analyze a potentially risk-averse convex stochastic optimization problem, where the control is deterministic and the state is a Banach-valued essentially bounded random variable. We obtain strong forms of necessary and sufficient optimality conditions for problems subject to equality and conical constraints. We propose a Moreau-Yosida regularization for the conical constraint and show consistency of the optimality conditions for the regularized problem as the regularization parameter is taken to infinity.
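    A minimal sketch of the regularization, assuming the conical constraint is posed as g(y(omega)) in K almost surely for a closed convex cone K (the notation is hypothetical, not from the paper):

        % Moreau-Yosida regularization: the hard conical constraint is
        % replaced by a quadratic penalty on the distance to the cone K;
        % consistency is studied as gamma -> infinity.
        \min_{u \in U_{\mathrm{ad}}} \; \mathbb{E}\left[ J\big(y(\omega), u\big) \right]
          + \frac{\gamma}{2}\, \mathbb{E}\left[ \operatorname{dist}\big(g(y(\omega)), K\big)^{2} \right]

    Consistency here means that, as gamma tends to infinity, the optimality conditions of the penalized problems approach those of the original constrained problem.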
  • Item
    Stochastic proximal gradient methods for nonconvex problems in Hilbert spaces
    (New York, NY [u.a.] : Springer Science + Business Media B.V., 2021) Geiersbach, Caroline; Scarinci, Teresa
    For finite-dimensional problems, stochastic approximation methods have long been used to solve stochastic optimization problems. Their application to infinite-dimensional problems is less understood, particularly for nonconvex objectives. This paper presents convergence results for the stochastic proximal gradient method in Hilbert spaces, motivated by optimization problems with partial differential equation (PDE) constraints involving random inputs and coefficients. We study stochastic algorithms for nonconvex and nonsmooth problems, where the nonsmooth part is convex and the nonconvex part is an expectation, which is assumed to have a Lipschitz continuous gradient. The optimization variable is an element of a Hilbert space. We show almost sure convergence of strong limit points of the random sequence generated by the algorithm to stationary points. We demonstrate the stochastic proximal gradient algorithm on a tracking-type functional with an L1-penalty term, subject to a semilinear PDE and box constraints, where input terms and coefficients are subject to uncertainty. We verify the conditions ensuring convergence of the algorithm and present a numerical simulation.
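    A minimal sketch of the iteration on a finite-dimensional discretization, with placeholder problem data (the gradient oracle grad_f, the penalty weight lam, and the box bounds below are illustrative assumptions, not taken from the paper):

        # Stochastic proximal gradient iteration
        #   x_{k+1} = prox_{t_k h}(x_k - t_k * grad_f(x_k, xi_k)),
        # specialized to h = lam*||x||_1 plus a box constraint.
        import numpy as np

        rng = np.random.default_rng(0)

        def grad_f(x, xi):
            # Stochastic gradient of the smooth (possibly nonconvex) part.
            # Placeholder: gradient of 0.5*||x - xi||^2 for a random sample xi.
            return x - xi

        def prox_h(x, t, lam=0.1, lo=-1.0, hi=1.0):
            # Prox of t*lam*||.||_1 plus the indicator of the box [lo, hi]:
            # soft-thresholding followed by clipping.
            s = np.sign(x) * np.maximum(np.abs(x) - t * lam, 0.0)
            return np.clip(s, lo, hi)

        x = np.zeros(5)
        for k in range(1, 1001):
            xi = rng.normal(0.5, 0.2, size=x.shape)  # sample of the random data
            t = 1.0 / k                              # diminishing step size
            x = prox_h(x - t * grad_f(x, xi), t)

        print(x)  # approximate stationary point of E[f] + h

    Because the L1 term and the box indicator are both separable, the proximal step splits into soft-thresholding followed by clipping; the diminishing step size is a typical stochastic approximation choice.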
  • Item
    Optimality Conditions and Moreau-Yosida Regularization for Almost Sure State Constraints
    (Paris : EDP Sciences, 2022) Geiersbach, Caroline; Hintermüller, Michael
    We analyze a potentially risk-averse convex stochastic optimization problem, where the control is deterministic and the state is a Banach-valued essentially bounded random variable. We obtain strong forms of necessary and sufficient optimality conditions for problems subject to equality and conical constraints. We propose a Moreau-Yosida regularization for the conical constraint and show consistency of the optimality conditions for the regularized problem as the regularization parameter is taken to infinity.