Abstract: Projection-based model reduction is among the most widely adopted methods for constructing parametric Reduced-Order Models (ROMs). Utilizing the snapshot data
from solving full-order governing equations, the Proper Orthogonal
Decomposition (POD) computes the optimal basis modes that represent the data,
and a ROM can be constructed in the low-dimensional vector subspace spanned by
the POD basis. For parametric governing equations, a potential challenge arises when the POD basis needs to be updated so that the adapted ROM accurately captures the variation of a system's behavior over its parameter space (in design, control, uncertainty quantification, digital-twin applications, etc.).
In this paper, we propose a Projected Gaussian Process (pGP) and formulate the
problem of adapting the POD basis as a supervised statistical learning problem,
for which the goal is to learn a mapping from the parameter space to the
Grassmann manifold that contains the optimal subspaces. A mapping is first
established between the Euclidean space and the horizontal space of an
orthogonal matrix that spans a reference subspace in the Grassmann manifold. A
second mapping from the horizontal space to the Grassmann manifold is
established through the Exponential/Logarithm maps between the manifold and its
tangent space. Finally, given a new parameter, the conditional distribution of
a vector can be found in the Euclidean space using Gaussian Process (GP) regression, and this distribution is then projected onto the Grassmann manifold, enabling us to predict the optimal subspace for the new parameter. As a
statistical learning approach, the proposed pGP allows us to optimally estimate
(or tune) the model parameters from data and quantify the statistical
uncertainty associated with the prediction. The advantages of the proposed pGP
are demonstrated by numerical experiments.
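As context for the POD step described in the abstract, here is a minimal sketch in Python, assuming the snapshots are stacked column-wise in a matrix; the function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def pod_basis(snapshots, r):
    """Rank-r POD basis of a snapshot matrix whose columns are snapshots."""
    # Thin SVD: the leading left singular vectors are the POD modes,
    # ordered by decreasing singular value (energy content).
    U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :r]
```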
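The Exponential/Logarithm maps mentioned in the abstract have standard closed forms on the Grassmann manifold (via thin SVDs of the tangent vector). A sketch continuing the example above, assuming the base point Y0 has orthonormal columns and that Y0.T @ Y is invertible (i.e., the two subspaces are not too far apart):

```python
def grassmann_exp(Y0, Delta):
    """Exponential map on the Grassmann manifold at base point Y0."""
    U, s, Vt = np.linalg.svd(Delta, full_matrices=False)
    Y = Y0 @ Vt.T @ np.diag(np.cos(s)) @ Vt + U @ np.diag(np.sin(s)) @ Vt
    # Re-orthonormalize to suppress floating-point drift.
    Q, _ = np.linalg.qr(Y)
    return Q

def grassmann_log(Y0, Y):
    """Logarithm map: tangent vector at Y0 pointing toward the subspace [Y]."""
    M = Y0.T @ Y  # assumed invertible (subspaces sufficiently close)
    L = (Y - Y0 @ M) @ np.linalg.inv(M)
    Q, s, Rt = np.linalg.svd(L, full_matrices=False)
    return Q @ np.diag(np.arctan(s)) @ Rt
```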
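Putting the pieces together, the predict-then-project idea could look like the following. This is only an illustrative sketch with an RBF kernel and a componentwise GP posterior mean, not the paper's actual pGP formulation (which also involves the horizontal-space mapping and full predictive distributions); all names here are hypothetical.

```python
def rbf(a, b, ell=1.0):
    """Squared-exponential kernel between rows of a and rows of b."""
    d = a[:, None, :] - b[None, :, :]
    return np.exp(-0.5 * np.sum(d**2, axis=-1) / ell**2)

def predict_subspace(params_train, bases_train, Y0, param_new,
                     ell=1.0, noise=1e-8):
    """GP regression on tangent coordinates at Y0, projected back
    to the Grassmann manifold via the exponential map."""
    # Map each training POD basis to the tangent space at the reference point.
    T = np.stack([grassmann_log(Y0, Y).ravel() for Y in bases_train])
    K = rbf(params_train, params_train, ell) + noise * np.eye(len(params_train))
    k = rbf(param_new[None, :], params_train, ell)
    # Componentwise GP posterior mean of the tangent vector.
    t_mean = k @ np.linalg.solve(K, T)
    return grassmann_exp(Y0, t_mean.reshape(Y0.shape))
```

In the same spirit, the statistical uncertainty mentioned in the abstract could be visualized by drawing samples from the GP posterior over tangent vectors and projecting each through the exponential map.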