Search Results

Now showing 1 - 4 of 4
  • Item
    A hybrid FETI-DP method for non-smooth random partial differential equations
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2018) Eigel, Martin; Gruhlke, Robert
    A domain decomposition approach exploiting the localization of random parameters in high-dimensional random PDEs is presented. For high efficiency, surrogate models in multi-element representations are computed locally when possible. This makes use of a stochastic Galerkin FETI-DP formulation of the underlying problem with localized representations of the involved input random fields. The local parameter space associated with a subdomain is subdivided into regions where the parametric surrogate accuracy can be trusted and regions where Monte Carlo sampling has to be employed instead. A heuristic adaptive algorithm carries out a problem-dependent hp refinement in a stochastic multi-element sense, enlarging the trusted surrogate region in the local parametric space as far as possible. This results in an efficient global parameter-to-solution sampling scheme that exploits local parametric smoothness in the surrogate construction. Adequately structured problems for this scheme occur naturally when uncertainties are defined on subdomains, e.g. in a multi-physics setting, or when the Karhunen-Loève expansion of a random field can be localized. The efficiency of this hybrid technique is demonstrated with numerical benchmark problems illustrating the identification of trusted (possibly higher-order) surrogate regions and non-trusted sampling regions. A toy sketch of this trusted-surrogate/sampling split is given below, after the result list.
  • Item
    An adaptive stochastic Galerkin tensor train discretization for randomly perturbed domains
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2018) Eigel, Martin; Marschall, Manuel; Multerer, Michael
    A linear PDE problem for randomly perturbed domains is considered in an adaptive Galerkin framework. The perturbation of the domain's boundary is described by a vector-valued random field depending on a countable number of random variables in an affine way. The corresponding Karhunen-Loève expansion is approximated by the pivoted Cholesky decomposition based on a prescribed covariance function. The examined high-dimensional Galerkin system follows from the domain mapping approach, which transfers the randomness from the domain to the diffusion coefficient and the forcing. To make this computationally feasible, the representation makes use of the modern tensor train format for the implicit compression of the problem. Moreover, an a posteriori error estimator is presented, which allows for the problem-dependent iterative refinement of all discretization parameters and the assessment of the achieved error reduction. The proposed approach is demonstrated in numerical benchmark problems. A generic sketch of the pivoted Cholesky step is given below, after the result list.
  • Item
    Variational Monte Carlo - Bridging concepts of machine learning and high dimensional partial differential equations
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2018) Eigel, Martin; Trunschke, Philipp; Schneider, Reinhold; Wolf, Sebastian
    A statistical learning approach for parametric PDEs related to Uncertainty Quantification is derived. The method is based on the minimization of an empirical risk on a selected model class and is shown to be applicable to a broad range of problems. A general unified convergence analysis is derived, which takes into account both the approximation and the statistical errors. In this way, theoretical results from numerical analysis and statistics are combined. Numerical experiments illustrate the performance of the method with the model class of hierarchical tensors. A toy empirical risk minimization example is sketched below, after the result list.
  • Item
    Adaptive stochastic Galerkin FEM for lognormal coefficients in hierarchical tensor representations
    (Berlin : Weierstraß-Institut für Angewandte Analysis und Stochastik, 2018) Eigel, Martin; Marschall, Manuel; Pfeffer, Max; Schneider, Reinhold
    Stochastic Galerkin methods for non-affine coefficient representations are known to cause major difficulties from theoretical and numerical points of view. In this work, an adaptive Galerkin FE method for linear parametric PDEs with lognormal coefficients discretized in Hermite chaos polynomials is derived. It employs problem-adapted function spaces to ensure solvability of the variational formulation. The inherently high computational complexity of the parametric operator is made tractable by using hierarchical tensor representations. For this, a new tensor train format of the lognormal coefficient is derived and verified numerically. The central novelty is the derivation of a reliable residual-based a posteriori error estimator. This can be regarded as a unique feature of stochastic Galerkin methods. It allows for an adaptive algorithm to steer the refinements of the physical mesh and the anisotropic Wiener chaos polynomial degrees. For the evaluation of the error estimator to become feasible, a numerically efficient tensor format discretization is developed. Benchmark examples with unbounded lognormal coefficient fields illustrate the performance of the proposed Galerkin discretization and the fully adaptive algorithm. A quadrature check of the Hermite chaos expansion of a single lognormal variable is sketched below, after the result list.
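
The trusted-surrogate versus sampling split described in the first item can be illustrated with a deliberately small toy example: a non-smooth scalar parameter-to-solution map, a polynomial surrogate fitted only on a region where the map is smooth, and a sampler that falls back to direct evaluation outside that region. This is a minimal sketch under toy assumptions (the names full_solve, trusted and hybrid_sample are hypothetical), not the stochastic Galerkin FETI-DP construction of the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def full_solve(y):
        # Stand-in for an expensive PDE solve; non-smooth in the parameter y.
        return abs(y)

    # Hypothetical trusted region on which the response is smooth enough
    # for a low-order polynomial surrogate.
    trusted = (0.2, 1.0)

    # Fit the local surrogate from a few direct solves inside the trusted region.
    y_train = np.linspace(trusted[0], trusted[1], 20)
    surrogate = np.polynomial.Polynomial.fit(
        y_train, [full_solve(y) for y in y_train], deg=3)

    def hybrid_sample(n):
        # Monte Carlo over the whole parameter range: cheap surrogate
        # evaluations where trusted, direct solves everywhere else.
        ys = rng.uniform(-1.0, 1.0, size=n)
        vals = np.empty(n)
        mask = (ys >= trusted[0]) & (ys <= trusted[1])
        vals[mask] = surrogate(ys[mask])
        vals[~mask] = [full_solve(y) for y in ys[~mask]]
        return vals.mean()

    print(hybrid_sample(10_000))  # approx. E[|Y|] = 0.5 for Y ~ U(-1, 1)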
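
The pivoted Cholesky decomposition mentioned in the second item is a standard low-rank technique for a prescribed covariance matrix; the adaptively chosen rank then feeds a truncated Karhunen-Loève expansion. The following is a generic sketch of that decomposition with a made-up squared-exponential covariance on a 1D grid, not the paper's implementation.

    import numpy as np

    def pivoted_cholesky(C, tol=1e-6, max_rank=None):
        # Low-rank factorization C ~= L @ L.T; stops once the trace of the
        # residual falls below tol (diagonally pivoted, greedy rank growth).
        n = C.shape[0]
        max_rank = n if max_rank is None else max_rank
        d = np.diag(C).astype(float).copy()          # diagonal of the residual
        L = np.zeros((n, max_rank))
        for m in range(max_rank):
            i = int(np.argmax(d))                    # pivot with largest residual variance
            if d.sum() <= tol or d[i] <= 0.0:
                return L[:, :m]
            L[:, m] = (C[:, i] - L[:, :m] @ L[i, :m]) / np.sqrt(d[i])
            d -= L[:, m] ** 2
            d[i] = 0.0
        return L

    # Hypothetical squared-exponential covariance on a uniform 1D grid.
    x = np.linspace(0.0, 1.0, 200)
    C = np.exp(-((x[:, None] - x[None, :]) / 0.3) ** 2)

    L = pivoted_cholesky(C)
    print(L.shape[1], np.linalg.norm(C - L @ L.T))   # modest rank, small residual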
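
The core of the third item, minimizing an empirical risk over a chosen model class using random samples of the parameter-to-solution map, is easy to illustrate in one parameter dimension. The sketch below uses a toy map u and plain polynomials as the model class instead of hierarchical tensors; all names and parameters are made up for the illustration.

    import numpy as np

    rng = np.random.default_rng(1)

    def u(y):
        # Toy parameter-to-QoI map standing in for a parametric PDE solution.
        return np.exp(-y ** 2) * np.sin(3.0 * y)

    # Random training samples of the parameter and the (notionally expensive) model.
    n, deg = 200, 6
    ys = rng.uniform(-1.0, 1.0, size=n)
    us = u(ys)

    # Model class: polynomials up to degree deg; the empirical risk is the
    # mean squared error over the samples, minimized by a least squares fit.
    V = np.vander(ys, deg + 1, increasing=True)
    coef, *_ = np.linalg.lstsq(V, us, rcond=None)

    # Estimate the generalization error on fresh Monte Carlo samples.
    yt = rng.uniform(-1.0, 1.0, size=10_000)
    pred = np.vander(yt, deg + 1, increasing=True) @ coef
    print("RMS error:", np.sqrt(np.mean((pred - u(yt)) ** 2)))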
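
The fourth item expands a lognormal coefficient in Hermite chaos polynomials. For a single Gaussian variable xi and a = exp(mu + sigma*xi), the coefficients with respect to the probabilists' Hermite polynomials He_k have the closed form exp(mu + sigma^2/2) * sigma^k / k!. The sketch below (generic, not the paper's tensor train construction) verifies this by Gauss-Hermite quadrature.

    import numpy as np
    from math import factorial
    from numpy.polynomial.hermite_e import hermegauss, hermeval

    mu, sigma = 0.0, 0.5                  # parameters of the lognormal coefficient
    K = 8                                 # highest chaos degree

    x, w = hermegauss(60)                 # nodes/weights for weight exp(-x^2/2)
    w = w / np.sqrt(2.0 * np.pi)          # renormalize to the standard Gaussian density

    a = np.exp(mu + sigma * x)            # lognormal coefficient a(xi) = exp(mu + sigma*xi)

    coeffs = []
    for k in range(K + 1):
        basis = np.zeros(k + 1)
        basis[k] = 1.0
        He_k = hermeval(x, basis)                           # He_k at the quadrature nodes
        coeffs.append(np.sum(w * a * He_k) / factorial(k))  # E[a He_k] / E[He_k^2]

    closed_form = [np.exp(mu + sigma ** 2 / 2) * sigma ** k / factorial(k)
                   for k in range(K + 1)]
    print(np.allclose(coeffs, closed_form))                 # expected: True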