Inverse learning in Hilbert scales

Date
2023
Volume
112
Issue
Publisher
Dordrecht [et al.] : Springer Science + Business Media B.V.
Abstract

We study linear ill-posed inverse problems with noisy data in the framework of statistical learning. The corresponding linear operator equation is assumed to fit a given Hilbert scale, generated by some unbounded self-adjoint operator. Approximate reconstructions from random noisy data are obtained with general regularization schemes in such a way that these belong to the domain of the generator. The analysis thus has to distinguish two cases: the regular one, when the true solution also belongs to the domain of the generator, and the 'oversmoothing' one, when it does not. Rates of convergence for the regularized solutions are expressed in terms of certain distance functions. For solutions whose smoothness is given by source conditions with respect to the scale-generating operator, the error bounds can then be made explicit in terms of the sample size.
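The abstract's setting can be illustrated with a small numerical sketch. The following is a hypothetical finite-dimensional toy example, not the paper's algorithm: Tikhonov regularization with a Hilbert-scale penalty, where a diagonal positive operator L stands in for the scale-generating operator and the penalty forces the reconstruction into the domain of L. All operators, parameter choices, and the noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Mildly ill-posed forward operator (hypothetical): diagonal, decaying values.
T = np.diag(1.0 / np.arange(1.0, n + 1))
# Stand-in for the scale-generating operator: diagonal, positive, growing.
L = np.diag(np.arange(1.0, n + 1))
# A smooth "true solution" and noisy data y = T x + noise.
x_true = 1.0 / np.arange(1.0, n + 1) ** 2
y = T @ x_true + 1e-3 * rng.standard_normal(n)

# Regularized reconstruction: minimize ||T x - y||^2 + alpha * ||L x||^2,
# i.e. solve the normal equations (T^T T + alpha L^T L) x = T^T y.
alpha = 1e-6
x_reg = np.linalg.solve(T.T @ T + alpha * (L.T @ L), T.T @ y)

# Unregularized inversion amplifies the noise on high-index components.
x_naive = np.linalg.solve(T, y)

err_reg = np.linalg.norm(x_reg - x_true)
err_naive = np.linalg.norm(x_naive - x_true)
```

Even in this crude example, the Hilbert-scale penalty damps the noise-amplified components, so the regularized error is well below the naive-inversion error; the rate results in the paper quantify this trade-off in terms of smoothness and sample size.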

Keywords
Hilbert Scales, Minimax convergence rates, Reproducing kernel Hilbert space, Spectral regularization, Statistical inverse problem
Citation
Rastogi, A., & Mathé, P. (2023). Inverse learning in Hilbert scales. 112. https://doi.org/10.1007/s10994-022-06284-8
License
CC BY 4.0