Search Results

Now showing 1 - 4 of 4
  • Item
    Gaussian processes with multidimensional distribution inputs via optimal transport and Hilbertian embedding
    (Ithaca, NY : Cornell University Library, 2020) Bachoc, François; Suvorikova, Alexandra; Ginsbourger, David; Loubes, Jean-Michel; Spokoiny, Vladimir
    In this work, we propose a way to construct Gaussian processes indexed by multidimensional distributions. More precisely, we tackle the problem of defining positive definite kernels between multivariate distributions via notions of optimal transport and appealing to Hilbert space embeddings. Besides presenting a characterization of radial positive definite and strictly positive definite kernels on general Hilbert spaces, we investigate the statistical properties of our theoretical and empirical kernels, focusing in particular on consistency as well as the special case of Gaussian distributions. A wide set of applications is presented, using both simulations and real data.
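    A minimal numerical sketch of the kind of construction described above, assuming the Gaussian special case mentioned in the abstract: the closed-form 2-Wasserstein distance between two Gaussian inputs is plugged into a radial (Gaussian-type) kernel. The function names, the bandwidth parameter and the exponential form are illustrative choices, not the paper's definitions; whether such radial forms are (strictly) positive definite is precisely the question the paper characterizes.

      # Illustrative sketch: radial kernel built from the closed-form
      # 2-Wasserstein distance between Gaussian distribution inputs.
      import numpy as np
      from scipy.linalg import sqrtm

      def w2_gaussian(m1, C1, m2, C2):
          """Squared 2-Wasserstein distance between N(m1, C1) and N(m2, C2)."""
          S1 = sqrtm(C1)
          bures = np.trace(C1 + C2 - 2 * np.real(sqrtm(S1 @ C2 @ S1)))
          return float(np.sum((m1 - m2) ** 2) + bures)

      def radial_kernel(m1, C1, m2, C2, bandwidth=1.0):
          """Gaussian-type radial kernel evaluated on two Gaussian inputs."""
          return np.exp(-w2_gaussian(m1, C1, m2, C2) / bandwidth ** 2)

      # Example: two bivariate Gaussian inputs.
      m1, C1 = np.zeros(2), np.eye(2)
      m2, C2 = np.ones(2), 2.0 * np.eye(2)
      print(radial_kernel(m1, C1, m2, C2))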
  • Item
    Bayesian inference for spectral projectors of the covariance matrix
    (Ithaca, NY : Cornell University Library, 2018) Silin, Igor; Spokoiny, Vladimir
    Let X_1, …, X_n be an i.i.d. sample in R^p with zero mean and covariance matrix Σ∗. The classical PCA approach recovers the projector P∗_J onto the principal eigenspace of Σ∗ by its empirical counterpart ˆP_J. A recent paper [24] investigated the asymptotic distribution of the Frobenius distance between the projectors, ∥ˆP_J − P∗_J∥_2, while [27] offered a bootstrap procedure to measure uncertainty in recovering this subspace P∗_J even in a finite-sample setup. The present paper considers this problem from a Bayesian perspective and suggests using the credible sets of the pseudo-posterior distribution on the space of covariance matrices, induced by the conjugate Inverse Wishart prior, as sharp confidence sets. This yields a numerically efficient procedure. Moreover, we theoretically justify this method and derive finite-sample bounds on the corresponding coverage probability. Contrary to [24, 27], the obtained results are valid for non-Gaussian data: the main assumption that we impose is the concentration of the sample covariance ˆΣ in a vicinity of Σ∗. Numerical simulations illustrate good performance of the proposed procedure even on non-Gaussian data in a rather challenging regime.
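    A minimal sketch of the credible-set idea described in the abstract, assuming the standard conjugate Inverse Wishart update IW(ν + n, Ψ + n·ˆΣ) as the pseudo-posterior; the prior parameters, sample sizes and the 95% level below are placeholder choices for illustration, not the paper's.

      # Sketch: draws from an Inverse Wishart pseudo-posterior, projectors onto the
      # top-J eigenspace of each draw, and the empirical quantile of the Frobenius
      # distance to the plug-in projector as a credible-set radius.
      import numpy as np
      from scipy.stats import invwishart

      def top_projector(Sigma, J):
          """Projector onto the span of the top-J eigenvectors of Sigma."""
          _, vecs = np.linalg.eigh(Sigma)      # eigenvalues in ascending order
          U = vecs[:, -J:]
          return U @ U.T

      rng = np.random.default_rng(0)
      p, n, J = 5, 200, 2
      X = rng.standard_normal((n, p))          # toy zero-mean data
      Sigma_hat = X.T @ X / n
      P_hat = top_projector(Sigma_hat, J)

      nu, Psi = p + 2, np.eye(p)               # weakly informative prior (placeholder)
      draws = invwishart.rvs(df=nu + n, scale=Psi + n * Sigma_hat,
                             size=2000, random_state=0)
      dists = [np.linalg.norm(top_projector(S, J) - P_hat, "fro") for S in draws]

      # Radius of a 95% credible set for the projector, centered at P_hat.
      print(np.quantile(dists, 0.95))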
  • Item
    Change-point detection in high-dimensional covariance structure
    (Ithaca, NY : Cornell University Library, 2018) Avanesov, Valeriy; Buzun, Nazar
    In this paper we introduce a novel approach to the important problem of break detection. Specifically, we are interested in detecting an abrupt change in the covariance structure of a high-dimensional random process, a problem which has applications in many areas, e.g., neuroimaging and finance. The developed approach is essentially a testing procedure involving the choice of a critical level. To that end, a non-standard bootstrap scheme is proposed and theoretically justified under mild assumptions. The theoretical study features a result providing guarantees for break detection. All the theoretical results are established in a high-dimensional setting (dimensionality p ≫ n). The multiscale nature of the approach allows for a trade-off between the sensitivity of break detection and its localization. The approach can be naturally employed in an online setting. A simulation study demonstrates that the approach matches the nominal false alarm probability and exhibits high power, outperforming a recent approach.
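    A simplified stand-in for the windowed testing idea described above: sample covariances in adjacent windows are compared through a sup-norm statistic, and the critical level is calibrated by permutation under the no-break null. This is only an illustration; it is not the paper's non-standard bootstrap scheme, and the window size, step, level and number of permutations are placeholder choices.

      # Windowed covariance-change detector with a permutation-calibrated critical level.
      import numpy as np

      def cov_diff_stat(X, t, window):
          """Max-entry difference between covariances left and right of candidate point t."""
          left, right = X[t - window:t], X[t:t + window]
          return np.abs(np.cov(left, rowvar=False) - np.cov(right, rowvar=False)).max()

      def detect_break(X, window=100, step=25, alpha=0.05, n_perm=100, seed=0):
          rng = np.random.default_rng(seed)
          n = len(X)
          ts = list(range(window, n - window + 1, step))
          stats = np.array([cov_diff_stat(X, t, window) for t in ts])
          # Permutation calibration: shuffling rows destroys any break in time.
          null_sup = []
          for _ in range(n_perm):
              Xp = X[rng.permutation(n)]
              null_sup.append(max(cov_diff_stat(Xp, t, window) for t in ts))
          crit = np.quantile(null_sup, 1 - alpha)
          return stats.max() > crit, ts[int(stats.argmax())]

      # Example: a break in the correlation structure halfway through the sample.
      rng = np.random.default_rng(1)
      A = rng.standard_normal((1000, 10))
      A[500:, 0] = A[500:, 1]                  # strong correlation appears after t = 500
      print(detect_break(A))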
  • Item
    Convergence analysis of Tikhonov regularization for non-linear statistical inverse problems
    (Ithaca, NY : Cornell University Library, 2020) Rastogi, Abhishake; Blanchard, Gilles; Mathé, Peter
    We study a non-linear statistical inverse problem in which we observe the noisy image of a quantity through a non-linear operator at some random design points. We consider the widely used Tikhonov regularization (or method of regularization) approach to estimate the quantity in this non-linear ill-posed inverse problem. The estimator is defined as the minimizer of a Tikhonov functional, which is the sum of a data misfit term and a quadratic penalty term. We develop a theoretical analysis of the minimizer of the Tikhonov regularization scheme using the concept of reproducing kernel Hilbert spaces. We discuss optimal rates of convergence for the proposed scheme, uniformly over classes of admissible solutions defined through appropriate source conditions.
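    A minimal sketch of the Tikhonov scheme described above, assuming a kernel (RKHS) parametrization f = Σ_j α_j k(·, x_j), so that the quadratic penalty λ‖f‖²_H becomes λ αᵀKα, and a simple pointwise non-linear forward map g(f(x)) = exp(f(x)). The kernel, the link function and the regularization parameter are illustrative placeholders, not the paper's setting.

      # Tikhonov functional = data misfit + quadratic RKHS penalty, minimized over
      # the kernel-expansion coefficients alpha.
      import numpy as np
      from scipy.optimize import minimize

      def gauss_kernel(x, z, ell=0.3):
          return np.exp(-(x[:, None] - z[None, :]) ** 2 / (2 * ell ** 2))

      rng = np.random.default_rng(0)
      n = 80
      x = rng.uniform(0, 1, n)                          # random design points
      f_true = np.sin(2 * np.pi * x)
      y = np.exp(f_true) + 0.1 * rng.standard_normal(n) # noisy image through g = exp

      K = gauss_kernel(x, x)
      lam = 1e-3

      def tikhonov(alpha):
          f = K @ alpha
          misfit = np.mean((np.exp(f) - y) ** 2)        # data misfit term
          penalty = lam * alpha @ K @ alpha             # squared RKHS norm of f
          return misfit + penalty

      alpha_hat = minimize(tikhonov, np.zeros(n), method="L-BFGS-B").x
      f_hat = K @ alpha_hat
      print(np.max(np.abs(f_hat - f_true)))             # sup-norm error of the reconstruction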