Search Results

  • Item
    Analysis of profile functions for general linear regularization methods
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2006) Mathé, Peter; Hofmann, Bernd
    The stable approximate solution of ill-posed linear operator equations in Hilbert spaces requires regularization. Tight bounds for the noise-free part of the regularization error are constitutive for bounding the overall error. Norm bounds of the noise-free part which decrease to zero along with the regularization parameter are called profile functions and are the subject of our analysis. The interplay between properties of the regularization and certain smoothness properties of solution sets, which we shall describe in terms of source-wise representations, is crucial for the decay of the associated profile functions. On the one hand, we show that a given decay rate is possible only if the underlying true solution has appropriate smoothness. On the other hand, if the smoothness fits the regularization, then decay rates are easily obtained. If the smoothness does not fit, then we measure the mismatch in terms of a distance function. Tight bounds for this distance function allow us to obtain profile functions. Finally, we study the most realistic case, in which smoothness is measured with respect to some operator that is related to the one governing the original equation only through a link condition. In many parts, the analysis is carried out on a geometric basis, extending classical concepts of linear regularization theory in Hilbert spaces ...
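As a concrete illustration of the profile-function idea for the item above, the following is a minimal numerical sketch: Tikhonov regularization applied to a toy diagonal operator, with true solutions described by a power-type source condition. The singular values, the smoothness index nu, and the choice of Tikhonov regularization are all assumptions made for this example, not the general framework of the preprint.

```python
import numpy as np

# Toy profile function for Tikhonov regularization of a diagonal operator A with
# singular values sigma_k (an assumption for this sketch).  For true solutions with
# the source-wise representation x = (A*A)^(nu/2) w, ||w|| <= 1, the noise-free error
# ||x_alpha - x|| is bounded by f(alpha) = max_k alpha * sigma_k^nu / (sigma_k^2 + alpha),
# a norm bound that decreases to zero along with the regularization parameter alpha.
sigma = 1.0 / np.arange(1, 100_001)   # assumed singular values of the toy operator
nu = 1.0                              # assumed smoothness index of the source condition

for alpha in np.logspace(-8, -2, 7):
    profile = np.max(alpha * sigma**nu / (sigma**2 + alpha))  # profile function at alpha
    print(f"alpha = {alpha:.0e}   profile = {profile:.3e}   alpha^(nu/2) = {alpha ** (nu / 2):.3e}")
```

In this toy setting the printed profile values decay proportionally to alpha^(nu/2), the classical rate one expects when the smoothness expressed by the source condition fits the regularization.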
  • Item
    Convergence bounds for empirical nonlinear least-squares
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2020) Eigel, Martin; Trunschke, Philipp; Schneider, Reinhold
    We consider best approximation problems in a nonlinear subset of a Banach space of functions. The norm is assumed to be a generalization of the L2 norm for which only a weighted Monte Carlo estimate can be computed. The objective is to obtain an approximation of an unknown target function by minimizing the empirical norm. In the case of linear subspaces, it is well known that such least-squares approximations can become inaccurate and unstable when the number of samples is too close to the number of parameters. We review this statement for general nonlinear subsets and establish error bounds for the empirical best approximation error. Our results are based on a restricted isometry property (RIP) which holds in probability, and we give sufficient conditions for the RIP to be satisfied with high probability. Several model classes are examined where analytical statements can be made about the RIP. Numerical experiments illustrate some of the obtained stability bounds.
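To illustrate the stability issue mentioned in the item above, here is a small sketch that minimizes the empirical (Monte Carlo) norm over a linear polynomial subspace; the nonlinear model classes and the RIP analysis of the preprint are not reproduced, and the target function, basis size, and sample sizes are assumptions chosen for the example.

```python
import numpy as np

# Least-squares fit by minimizing the empirical norm over a *linear* polynomial
# subspace -- a simplified stand-in for the nonlinear model classes of the preprint.
# The target function, the basis size m, and the sample sizes n are assumptions.
rng = np.random.default_rng(0)
target = lambda x: np.exp(x)       # "unknown" target function, assumed for the sketch
m = 10                             # number of parameters (monomial basis functions)

def empirical_fit_error(n):
    x = rng.uniform(-1.0, 1.0, n)                    # i.i.d. samples, uniform weight
    V = np.vander(x, m, increasing=True)
    coef, *_ = np.linalg.lstsq(V, target(x), rcond=None)   # minimize the empirical norm
    xx = np.linspace(-1.0, 1.0, 10_000)              # fine grid to estimate the true L2 error
    return np.sqrt(np.mean((np.vander(xx, m, increasing=True) @ coef - target(xx)) ** 2))

for n in (m, 2 * m, 10 * m, 100 * m):
    errs = [empirical_fit_error(n) for _ in range(50)]
    print(f"n = {n:4d}   median L2 error = {np.median(errs):.2e}   worst of 50 = {np.max(errs):.2e}")
```

Runs with n close to the number of parameters m typically exhibit occasional very large errors, while n well above m gives stable accuracy; this is the regime distinction that the error bounds address.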
  • Item
    Randomized optimal stopping algorithms and their convergence analysis
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2020) Bayer, Christian; Belomestny, Denis; Hager, Paul; Pigato, Paolo; Schoenmakers, John G. M.
    In this paper, we study randomized optimal stopping problems and consider corresponding forward and backward Monte Carlo-based optimization algorithms. In particular, we prove the convergence of the proposed algorithms and derive the corresponding convergence rates.
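To make the notion of a randomized stopping rule concrete, here is a small forward-style Monte Carlo sketch in which a Bermudan put is exercised with a sigmoid probability of the current discounted payoff and the two policy parameters are tuned on simulated paths. The model, payoff, and parametrization are assumptions chosen for illustration; the sketch does not reproduce the forward and backward algorithms or the convergence analysis of the preprint.

```python
import numpy as np

# Randomized stopping of a Bermudan put under geometric Brownian motion (assumed model).
# At each exercise date the option is stopped with probability sigmoid(a * (payoff - b)),
# and (a, b) are tuned by a crude grid search on simulated paths.
rng = np.random.default_rng(1)
S0, K, r, sigma, T, N, n_paths = 100.0, 100.0, 0.05, 0.2, 1.0, 10, 20_000
dt = T / N

# Simulate paths and the discounted exercise values Z_t at the N exercise dates
W = rng.standard_normal((n_paths, N)).cumsum(axis=1) * np.sqrt(dt)
t = dt * np.arange(1, N + 1)
S = S0 * np.exp((r - 0.5 * sigma**2) * t + sigma * W)
Z = np.exp(-r * t) * np.maximum(K - S, 0.0)

def policy_value(a, b):
    """Monte Carlo estimate of the expected reward of the randomized stopping rule."""
    p = 1.0 / (1.0 + np.exp(-a * (Z - b)))            # probability of stopping at each date
    p[:, -1] = 1.0                                    # always stop at the last exercise date
    surv = np.cumprod(1.0 - p, axis=1)                # probability of not having stopped yet
    before = np.hstack([np.ones((n_paths, 1)), surv[:, :-1]])  # not stopped before date t
    return np.mean(np.sum(before * p * Z, axis=1))

# Crude tuning of the two policy parameters by grid search
best = max((policy_value(a, b), a, b)
           for a in (0.5, 1.0, 2.0, 5.0)
           for b in np.linspace(1.0, 10.0, 10))
print(f"estimated value {best[0]:.3f} at a = {best[1]:.1f}, b = {best[2]:.1f}")
```

Because the stopping decision is randomized, the expected reward of the policy can be evaluated path-wise by weighting each exercise date with the probability of stopping exactly there, which is what policy_value computes before averaging over paths.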