Search Results

Now showing 1 - 4 of 4
  • Item
    Patch-Wise Adaptive Weights Smoothing in R
    (Los Angeles, Calif. : UCLA, Dept. of Statistics, 2020) Polzehl, Jörg; Papafitsoros, Kostas; Tabelow, Karsten
    Image reconstruction from noisy data has a long history of methodological development and is based on a variety of ideas. In this paper we introduce a new method, called patch-wise adaptive smoothing, that extends the propagation-separation approach by comparing local patches of image intensities to define locally adaptive weighting schemes, yielding an improved balance between reduced variability and bias in the reconstruction result. We present the implementation of the new method in the R package aws and demonstrate its properties on a number of examples, in comparison with other state-of-the-art image reconstruction methods. © 2020, American Statistical Association. All rights reserved. A hedged NumPy sketch of the patch-wise weighting idea is given after this list.
  • Item
    A function space framework for structural total variation regularization with applications in inverse problems
    (Bristol [et al.] : Inst., 2018) Hintermüller, Michael; Holler, Martin; Papafitsoros, Kostas
    In this work, we introduce a function space setting for a wide class of structural/weighted total variation (TV) regularization methods, motivated by their applications in inverse problems. In particular, we consider a regularizer that is the appropriate lower semi-continuous envelope (relaxation) of a suitable TV-type functional initially defined for sufficiently smooth functions. We study examples where this relaxation can be expressed explicitly, and we also provide refinements for weighted TV for a wide range of weights. Since an integral characterization of the relaxation in function space is in general not available, we show that, for a rather general linear inverse problem setting, one can equivalently solve a saddle-point problem in place of the classical Tikhonov regularization problem, without any a priori knowledge of an explicit formulation of the structural TV functional. In particular, motivated by concrete applications, we deduce corresponding results for linear inverse problems with norm and Poisson log-likelihood data discrepancy terms. Finally, we provide proof-of-concept numerical examples in which we solve the saddle-point problem for weighted TV denoising as well as for MR-guided PET image reconstruction. A schematic form of the weighted-TV saddle-point reformulation is given after this list.
  • Item
    Variable Step Mollifiers and Applications
    (Berlin ; Heidelberg : Springer, 2020) Hintermüller, Michael; Papafitsoros, Kostas; Rautenberg, Carlos N.
    We consider a mollifying operator with variable step that, in contrast to standard mollification, preserves the boundary values of functions. We prove boundedness of the operator in all basic Lebesgue, Sobolev, and BV spaces, as well as corresponding approximation results. The results are then applied to extend recently developed theory concerning the density of convex intersections. © 2020, The Author(s). A schematic form of such a variable-step operator is given after this list.
  • Item
    Optimization with learning-informed differential equation constraints and its applications
    (Les Ulis : EDP Sciences, 2022) Dong, Guozhi; Hintermüller, Michael; Papafitsoros, Kostas
    Inspired by applications in optimal control of semilinear elliptic partial differential equations and physics-integrated imaging, differential equation constrained optimization problems with constituents that are only accessible through data-driven techniques are studied. A particular focus is on the analysis and on numerical methods for problems with machine-learned components. For a rather general context, an error analysis is provided, and particular properties resulting from artificial neural network based approximations are addressed. Moreover, for each of the two inspiring applications analytical details are presented and numerical results are provided.