Search Results

Now showing 1 - 10 of 20
  • Item
    Bootstrap confidence sets under a model misspecification
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2014) Spokoiny, Vladimir; Zhilova, Mayya
    A multiplier bootstrap procedure for the construction of likelihood-based confidence sets is considered for finite samples and possible model misspecification. Theoretical results justify the bootstrap consistency for small or moderate sample sizes and make it possible to control the impact of the parameter dimension: the bootstrap approximation works if the ratio of the cube of the parameter dimension to the sample size is small. The main result about bootstrap consistency continues to apply even if the underlying parametric model is misspecified, under the so-called Small Modeling Bias condition. If the true model deviates significantly from the considered parametric family, the bootstrap procedure is still applicable, but it becomes somewhat conservative: the size of the constructed confidence sets is increased by the modeling bias. We illustrate the results with numerical examples of misspecified constant and logistic regressions.
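    A minimal sketch of the multiplier-bootstrap idea in the simplest possible setting (a confidence interval for a mean): each observation's contribution is reweighted with i.i.d. multipliers of unit mean and unit variance, and the bootstrap quantile of the recentered estimates gives the radius of the set. This is an illustration of the general principle only, not the paper's likelihood-based construction; all data and parameters here are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: i.i.d. sample, parameter of interest = mean.
x = rng.normal(loc=2.0, scale=1.0, size=200)
theta_hat = x.mean()

# Multiplier bootstrap: reweight observations with i.i.d. multipliers
# having E[w] = 1 and Var[w] = 1 (exponential weights satisfy both).
B = 2000
boot = np.empty(B)
for b in range(B):
    w = rng.exponential(scale=1.0, size=x.size)
    boot[b] = np.average(x, weights=w)  # weighted estimator

# Bootstrap 95% quantile of |theta_b - theta_hat| sets the radius.
r = np.quantile(np.abs(boot - theta_hat), 0.95)
ci = (theta_hat - r, theta_hat + r)
```

The same recipe extends to likelihood-based sets by reweighting log-likelihood contributions instead of observations.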
  • Item
    Spatially adaptive estimation via fitted local likelihood techniques
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2006) Katkovnik, Vladimir; Spokoiny, Vladimir
    This paper offers a new technique for spatially adaptive estimation. The local likelihood is exploited for nonparametric modelling of observations and estimated signals. The approach is based on the assumption of local homogeneity of the signal: for every point there exists a neighborhood in which the signal can be well approximated by a constant. The fitted local likelihood statistic is used to select an adaptive size of this neighborhood. The algorithm is developed for a quite general class of observations following an exponential family distribution. The estimated signal can be univariate or multivariate. We demonstrate the good performance of the new algorithm for Poissonian image denoising and compare the new method with the intersection of confidence intervals (ICI) technique, which also exploits selection of an adaptive neighborhood for estimation.
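    A rough 1-D sketch of the ICI comparison method mentioned above, to illustrate what "selecting an adaptive neighborhood" means: windows grow until the confidence intervals of the local-constant estimates stop intersecting. The window grid, threshold `gamma`, and Gaussian-noise setup are illustrative assumptions; this is not the fitted-local-likelihood rule proposed in the paper.

```python
import numpy as np

def ici_denoise(y, sigma, gamma=2.0, windows=(1, 2, 4, 8, 16)):
    # Intersection-of-confidence-intervals (ICI) window selection for a
    # local-constant estimate of a 1-D signal with known noise level.
    n = len(y)
    out = np.empty(n)
    for i in range(n):
        lo, hi = -np.inf, np.inf
        est = y[i]
        for h in windows:
            seg = y[max(0, i - h):min(n, i + h + 1)]
            m = seg.mean()
            half = gamma * sigma / np.sqrt(len(seg))
            lo, hi = max(lo, m - half), min(hi, m + half)
            if lo > hi:
                break      # intervals no longer intersect: stop growing
            est = m        # accept the larger window's estimate
        out[i] = est
    return out
```

On locally constant signals the accepted window grows large and the noise averages out; near jumps the intersection empties early and a small window is kept.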
  • Item
    Statistical inference for Bures--Wasserstein barycenters
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2020) Kroshnin, Alexey; Spokoiny, Vladimir; Suvorikova, Alexandra
    In this work we introduce the concept of the Bures--Wasserstein barycenter $Q_*$, which is essentially a Fréchet mean of some distribution $P$ supported on a subspace of positive semi-definite $d$-dimensional Hermitian operators $H_+(d)$. We allow a barycenter to be constrained to some affine subspace of $H_+(d)$, and we provide conditions ensuring its existence and uniqueness. We also investigate convergence and concentration properties of an empirical counterpart of $Q_*$ in both the Frobenius norm and the Bures--Wasserstein distance, and explain how the obtained results are connected to optimal transportation theory and can be applied to statistical inference in quantum mechanics.
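    For concreteness, a standard fixed-point iteration for the unconstrained, uniformly weighted Bures--Wasserstein barycenter of positive definite matrices can be sketched as below. This is a textbook scheme given here only to make the object tangible; the paper's constrained estimator and its analysis are more general.

```python
import numpy as np

def psd_sqrt(a):
    # Symmetric PSD square root via eigendecomposition.
    w, v = np.linalg.eigh(a)
    return (v * np.sqrt(np.clip(w, 0.0, None))) @ v.T

def bw_barycenter(mats, iters=50):
    # Fixed-point map Q <- Q^{-1/2} M^2 Q^{-1/2} with
    # M = mean of (Q^{1/2} S_i Q^{1/2})^{1/2}; assumes positive
    # definite inputs so the inverse square root exists.
    q = np.mean(mats, axis=0)
    for _ in range(iters):
        r = psd_sqrt(q)
        m = np.mean([psd_sqrt(r @ s @ r) for s in mats], axis=0)
        r_inv = np.linalg.inv(r)
        q = r_inv @ m @ m @ r_inv
    return q
```

For commuting inputs the barycenter has a closed form: for $aI$ and $bI$ it is $((\sqrt{a}+\sqrt{b})/2)^2 I$, which the iteration reproduces.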
  • Item
    Adaptive gradient descent for convex and non-convex stochastic optimization
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2019) Ogaltsov, Aleksandr; Dvinskikh, Darina; Dvurechensky, Pavel; Gasnikov, Alexander; Spokoiny, Vladimir
    In this paper we propose several adaptive gradient methods for stochastic optimization. Our methods are based on an Armijo-type line search, and they simultaneously adapt to the unknown Lipschitz constant of the gradient and to the variance of the stochastic approximation of the gradient. We consider an accelerated gradient descent for convex problems and gradient descent for non-convex problems. In the experiments we demonstrate the superiority of our methods over existing adaptive methods, e.g. AdaGrad and Adam.
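    The core Armijo-type adaptation can be sketched in the deterministic case: the step size is optimistically doubled, then halved until a sufficient-decrease condition holds, so the method tracks the locally observed Lipschitz constant. The paper's methods additionally handle stochastic-gradient variance and acceleration; this minimal sketch omits both, and the constants here are illustrative.

```python
import numpy as np

def gd_armijo(f, grad, x0, lr0=1.0, c=0.5, tol=1e-10, max_iter=500):
    # Gradient descent with Armijo backtracking line search.
    x, lr = np.asarray(x0, dtype=float), lr0
    for _ in range(max_iter):
        g = grad(x)
        if g @ g < tol:
            break
        lr *= 2.0                                   # try a larger step first
        while f(x - lr * g) > f(x) - c * lr * (g @ g):
            lr *= 0.5                               # backtrack until decrease
        x = x - lr * g
    return x
```

On a smooth function the inner loop always terminates, since the condition holds once `lr` drops below the reciprocal of the local Lipschitz constant.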
  • Item
    Diffusion tensor imaging : structural adaptive smoothing
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2007) Tabelow, Karsten; Polzehl, Jörg; Spokoiny, Vladimir; Voss, Henning U.
    Diffusion Tensor Imaging (DTI) data is characterized by a high noise level. Thus, estimation errors of quantities like anisotropy indices or the main diffusion direction used for fiber tracking are relatively large and may significantly confound the accuracy of DTI in clinical or neuroscience applications. Besides pulse sequence optimization, noise reduction by smoothing the data can be pursued as a complementary approach to increase the accuracy of DTI. Here, we suggest an anisotropic structural adaptive smoothing procedure, which is based on the Propagation-Separation method and preserves the structures seen in DTI and their different sizes and shapes. It is applied to artificial phantom data and a brain scan. We show that this method significantly improves the quality of the estimate of the diffusion tensor and hence enables one either to reduce the number of scans or to enhance the input for subsequent analysis such as fiber tracking.
  • Item
    Adaptive manifold clustering
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2020) Besold, Franz; Spokoiny, Vladimir
    Clustering methods seek to partition data such that elements are more similar to elements in the same cluster than to elements in different clusters. The main challenge in this task is the lack of a unified definition of a cluster, especially for high-dimensional data. Different methods and approaches have been proposed to address this problem. This paper continues the study originated by [6], where a novel approach to adaptive nonparametric clustering called Adaptive Weights Clustering (AWC) was offered. The method allows analyzing high-dimensional data with an unknown number of unbalanced clusters of arbitrary shape under very weak modeling assumptions. The procedure demonstrates state-of-the-art performance and is very efficient even for large data dimension D. However, the theoretical study in [6] is very limited and did not really address the question of efficiency. This paper makes a significant step towards understanding the remarkable performance of the AWC procedure, particularly in high dimension. The approach is based on combining the ideas of adaptive clustering and manifold learning. The manifold hypothesis means that high-dimensional data can be well approximated by a d-dimensional manifold for small d, helping to overcome the curse of dimensionality and to obtain sharp bounds on the cluster separation which depend only on the intrinsic dimension d. We also address the problem of parameter tuning. Our general theoretical results are illustrated by some numerical experiments.
  • Item
    Locally time homogeneous time series modelling
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2008) Elagin, Mstislav; Spokoiny, Vladimir
    In this paper, three locally adaptive estimation methods are applied to the problems of variance forecasting, value-at-risk analysis and volatility estimation in the context of nonstationary financial time series. A general procedure for the computation of critical values is given. Numerical results show very reasonable performance of the methods.
  • Item
    Stopping rules for accelerated gradient methods with additive noise in gradient
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2021) Vasin, Artem; Gasnikov, Alexander; Spokoiny, Vladimir
    In this article, we investigate an accelerated first-order method, namely the method of similar triangles, which is optimal for the class of convex (strongly convex) problems with a Lipschitz gradient. The paper considers a model of additive noise in the gradient and a Euclidean prox-structure for not necessarily bounded sets. Convergence estimates are obtained both with and without strong convexity, and a stopping criterion is proposed for problems that are not strongly convex.
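    To make the setting concrete, a generic accelerated gradient scheme with a simple gradient-norm stopping rule can be sketched as follows. This is a Nesterov-style momentum iteration with the illustrative criterion $\|\nabla f(y_k)\| \le \varepsilon$; it is not the exact method of similar triangles, and the noise-aware stopping rule analyzed in the paper is more refined.

```python
import numpy as np

def agd_stop(grad, x0, L, eps, max_iter=10000):
    # Accelerated gradient descent (Nesterov momentum) for an
    # L-smooth convex objective, stopping when the gradient at the
    # extrapolated point falls below eps.
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(max_iter):
        g = grad(y)
        if np.linalg.norm(g) <= eps:
            return y                       # stopping criterion met
        x_new = y - g / L                  # gradient step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum step
        x, t = x_new, t_new
    return x
```

With noisy gradients the point of the paper is that such a rule must be calibrated against the noise level, since the gradient norm cannot be driven below the noise magnitude.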
  • Item
    Parameter estimation in time series analysis
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2009) Spokoiny, Vladimir
    The paper offers a novel unified approach to studying the accuracy of parameter estimation for a time series. Important features of the approach are: (1) The underlying model is not assumed to be parametric. (2) The conditions imposed on the model are very mild and can be easily checked in specific applications. (3) The considered time series need not be ergodic or stationary. The approach is equally applicable to ergodic, unit-root and explosive cases. (4) The parameter set can be unbounded and non-compact. (5) No conditions on parameter identifiability are required. (6) The established risk bounds are nonasymptotic and valid for large, moderate and small samples. (7) The results describe confidence and concentration sets rather than the accuracy of point estimation. The whole approach can be viewed as complementary to the classical one based on the asymptotic expansion of the log-likelihood. In particular, it establishes consistency of the considered estimate in a rather general sense, which is usually assumed to hold in asymptotic analysis. In standard situations under ergodicity conditions, the usual rate results can be easily obtained as corollaries of the established risk bounds. The approach and the results are illustrated on a number of popular time series models, including autoregressive, generalized linear time series, ARCH and GARCH models, and median/quantile regression.
  • Item
    Robust risk management : accounting for nonstationarity and heavy tails
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2007) Chen, Ying; Spokoiny, Vladimir
    In the ideal Black-Scholes world, financial time series are assumed to be 1) stationary (time homogeneous), or at least globally modellable by a stationary process, and 2) conditionally normally distributed given the past. These two assumptions are widely used in many methods, such as RiskMetrics, a risk management method regarded as an industry standard. However, these assumptions are unrealistic. The primary aim of the paper is to account for nonstationarity and heavy tails in time series by presenting a local exponential smoothing approach, in which the smoothing parameter is adaptively selected at every time point and the heavy-tailedness of the process is taken into account. A complete theory addresses both issues. In our study, we demonstrate the implementation of the proposed method for volatility estimation and risk management on simulated and real data. Numerical results show that the proposed method delivers accurate and sensitive estimates.
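    As a baseline for comparison, the classic RiskMetrics-style exponentially weighted volatility estimate with a fixed smoothing parameter can be sketched as below. The paper's local approach instead selects the effective smoothing parameter adaptively at each time point; the value `lam=0.94` is the conventional RiskMetrics choice, used here only for illustration.

```python
import numpy as np

def ewma_vol(returns, lam=0.94):
    # Exponentially weighted moving-average volatility:
    # v_t = lam * v_{t-1} + (1 - lam) * r_t^2, reported as sqrt(v_t).
    v = np.empty(len(returns))
    v[0] = returns[0] ** 2
    for t in range(1, len(returns)):
        v[t] = lam * v[t - 1] + (1.0 - lam) * returns[t] ** 2
    return np.sqrt(v)
```

A fixed `lam` trades off reactivity against stability globally; the adaptive procedure described above makes this trade-off locally in time.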