Search Results

Now showing 1 - 2 of 2

Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems

2021, Matyukhin, Vladislav, Kabanikhin, Sergey, Shishlenin, Maxim, Novikov, Nikita, Vasin, Artem, Gasnikov, Alexander

In this paper we propose gradient descent type methods for solving convex optimization problems in Hilbert space. We apply them to the ill-posed Cauchy problem for the Poisson equation and give a comparative analysis with the Landweber iteration and the steepest descent method. The theoretical novelty of the paper lies in the development of a new stopping rule for accelerated gradient methods with an inexact gradient (additive noise). Up to the stopping moment the method ``doesn't feel the noise''; after this moment the noise starts to accumulate and the quality of the solution degrades with further iterations.
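The early-stopping idea described in the abstract can be illustrated with a minimal finite-dimensional sketch: gradient descent on a strongly convex quadratic, where each gradient evaluation carries additive noise of norm delta, and iteration stops once the observed gradient norm falls to the noise level. This is not the authors' actual accelerated method or stopping rule; the problem, threshold `2 * delta`, and all parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy strongly convex problem f(x) = 0.5 x^T A x - b^T x, a
# finite-dimensional stand-in for the Hilbert-space setting.
n = 50
M = rng.standard_normal((n, n)) / np.sqrt(n)
A = M.T @ M + 0.1 * np.eye(n)          # SPD, smallest eigenvalue >= 0.1
b = rng.standard_normal(n)

L = np.linalg.eigvalsh(A).max()        # Lipschitz constant of the gradient
delta = 1e-3                           # additive noise level
x = np.zeros(n)

for k in range(10000):
    noise = rng.standard_normal(n)
    noise *= delta / np.linalg.norm(noise)   # ||noise|| = delta exactly
    g = A @ x - b + noise                    # inexact gradient
    # Illustrative stopping rule: halt once the observed gradient norm
    # reaches the noise level; beyond this point iterations mostly
    # accumulate noise rather than reduce the objective.
    if np.linalg.norm(g) <= 2 * delta:
        break
    x = x - g / L                            # plain gradient step

x_star = np.linalg.solve(A, b)
err = np.linalg.norm(x - x_star)
```

At the stopping moment the exact gradient norm is at most `3 * delta` by the triangle inequality, so strong convexity bounds the distance to the minimizer by `3 * delta / 0.1`; running past this point would not improve the iterate below the noise floor.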


Reproducing kernel Hilbert spaces and variable metric algorithms in PDE constrained shape optimisation

2016, Eigel, Martin, Sturm, Kevin

In this paper we investigate and compare different gradient algorithms designed for the domain expression of the shape derivative. Our main focus is to examine the usefulness of reproducing kernel Hilbert spaces for PDE constrained shape optimisation problems. We show that radial kernels provide convenient formulas for the shape gradient that can be used efficiently in numerical simulations. The shape gradients associated with radial kernels depend on a so-called smoothing parameter that allows the smoothness of the shape to be adjusted during the optimisation process. Moreover, this smoothing parameter can be used to modify the movement of the shape. The theoretical findings are verified in a number of numerical experiments.
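The role of the smoothing parameter can be sketched with a small example: for a functional acting on boundary velocity fields as a weighted sum of point evaluations, the RKHS gradient of a radial (here Gaussian) kernel is a kernel combination of those point values, and the kernel width acts as the smoothing parameter. This is a generic RKHS computation, not the paper's specific algorithm; the circle geometry, data, and widths below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Nodes on the unit circle standing in for a discretised shape boundary.
m = 200
theta = np.linspace(0.0, 2.0 * np.pi, m, endpoint=False)
nodes = np.stack([np.cos(theta), np.sin(theta)], axis=1)

# A smooth "shape derivative" signal contaminated with node-wise noise.
d = np.sin(2 * theta) + 0.3 * rng.standard_normal(m)

def rkhs_gradient(d, nodes, sigma):
    """Riesz representative of V -> sum_i d_i V(x_i) in the RKHS of a
    Gaussian kernel, evaluated at the nodes: g(x_j) = sum_i d_i k(x_j, x_i).
    sigma is the kernel width, i.e. the smoothing parameter."""
    diff = nodes[:, None, :] - nodes[None, :, :]
    K = np.exp(-np.sum(diff**2, axis=2) / (2.0 * sigma**2))
    return K @ d

g_rough = rkhs_gradient(d, nodes, sigma=0.05)   # narrow kernel: noisy field
g_smooth = rkhs_gradient(d, nodes, sigma=0.5)   # wide kernel: smooth field
```

Increasing `sigma` damps the high-frequency content of the gradient field while preserving the slowly varying component, which is how the smoothing parameter lets one trade fidelity of the descent direction against smoothness of the resulting shape update.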