Convergence bounds for empirical nonlinear least-squares

Date
2022
Volume
56
Issue
1
Journal
Mathematical Modelling and Numerical Analysis: An International Journal on Applied Mathematics
Series Title
Book Title
Publisher
Les Ulis : EDP Sciences
Link to publisher's version
Abstract

We consider best approximation problems in a nonlinear subset ℳ of a Banach space of functions (𝒱, ∥•∥). The norm is assumed to be a generalization of the L2-norm for which only a weighted Monte Carlo estimate ∥•∥_n can be computed. The objective is to obtain an approximation v ∈ ℳ of an unknown function u ∈ 𝒱 by minimizing the empirical norm ∥u − v∥_n. We consider this problem for general nonlinear subsets and establish error bounds for the empirical best approximation error. Our results are based on a restricted isometry property (RIP) which holds in probability and is independent of the specified nonlinear least squares setting. Several model classes are examined and the analytical statements about the RIP are compared to existing sample complexity bounds from the literature. We find that for well-studied model classes our general bound is weaker but exhibits many of the same properties as these specialized bounds. Notably, we demonstrate the advantage of an optimal sampling density (as known for linear spaces) for sets of functions with sparse representations.
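The weighted Monte Carlo estimate of the norm and the resulting empirical minimization can be sketched as follows. This is a minimal illustrative example, not code from the paper: the target function u, the sampling density w on [0, 1], and the use of a low-degree polynomial space as a stand-in for the paper's nonlinear model classes ℳ are all assumptions made for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target function u ∈ 𝒱 (unknown in the paper's setting).
u = lambda x: np.sin(2 * np.pi * x)

# Assumed sampling density w(x) = 1/2 + x on [0, 1] (integrates to 1).
w = lambda x: 0.5 + x

# Draw n samples from w by inverting its CDF F(x) = x/2 + x^2/2:
# solving x^2 + x - 2U = 0 gives x = (-1 + sqrt(1 + 8U)) / 2.
n = 2000
U = rng.random(n)
xs = (-1.0 + np.sqrt(1.0 + 8.0 * U)) / 2.0

def empirical_norm_sq(f, xs):
    """Weighted Monte Carlo estimate of ∥f∥^2: mean of f(x_i)^2 / w(x_i).

    The weight 1/w(x_i) makes the estimator unbiased for the L2 norm.
    """
    return np.mean(f(xs) ** 2 / w(xs))

# Minimize ∥u - v∥_n over a simple model class (polynomials of degree < 5,
# a linear stand-in; the paper treats general nonlinear subsets ℳ).
# Rescaling rows by 1/sqrt(w) turns the weighted problem into ordinary
# least squares.
A = np.vander(xs, 5) / np.sqrt(w(xs))[:, None]
b = u(xs) / np.sqrt(w(xs))
coef, *_ = np.linalg.lstsq(A, b, rcond=None)
v = lambda x: np.polyval(coef, x)

# Empirical best approximation error ∥u - v∥_n^2.
err = empirical_norm_sq(lambda x: u(x) - v(x), xs)
```

The same estimator with the identity map recovers ∥u∥_n; since ∫ sin²(2πx) dx = 1/2 on [0, 1], `empirical_norm_sq(u, xs)` should be close to 0.5 for this choice of u.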

Description
Keywords
Citation
Eigel, M., Schneider, R., & Trunschke, P. (2022). Convergence bounds for empirical nonlinear least-squares. Mathematical Modelling and Numerical Analysis, 56(1). Les Ulis: EDP Sciences. https://doi.org/10.1051/m2an/2021070
Collections
License
CC BY 4.0 International