Semester project in the ANCHP chair at EPFL supervised by Prof. Daniel Kressner and Margherita Guido
- Install Julia (for example from here)
- Open the repository in your terminal and type `julia` to start the REPL
- In the REPL, press `]` to enter package mode
- In package mode, type `activate .` to activate the project's local environment
- Still in package mode, type `instantiate`; all packages required by this project will now be installed into that environment

While this environment is active you can run code from the project within it. `Examples/Full_example.jl` provides an example with all the methods and row sampling.
You can also run a file as a script by writing:

```shell
julia --project=<path-to-repo> <location-of-file-to-run>
```
To simplify building a problem we have provided an example. You can specify the problem dimension and the time discretization:

```julia
N = 100       # problem dimension
Δt = 1e-5     # timestep size
t₀ = 0.1      # initial time
prob = LSMOD.Example1.make_prob(N)
```

An order reduction strategy can be specified as:
```julia
RandNYS = LSMOD.Nystrom(prob.internal^2, 35, 14, 6);
```

Here we use the Nyström method as an example; the available reduction strategies are implemented in `src/ReductionStrategies.jl`.
A sampling strategy can be specified as:
```julia
rows = 50
LS_strat(A, rhs, args...) = LSMOD.UniformRowSampledLS(A, rhs, rows, args...)
```

We can finally run the solver using these reduction and sampling strategies:

```julia
sol = LSMOD.solve(t₀, Δt, N, deepcopy(prob), RandNYS, LS_strat);
```

The folder `src` contains the main components of the code:
- `Example_problems.jl`: contains an example problem taken from here
- `LinearSystem.jl`: contains the `updateLinearSystem` function, which updates the linear system related to the elliptic PDE problem we solve
- `Problem.jl`: contains the `EllipticPDE` and `DifferentialOperators2D` structs
  - The `EllipticPDE` struct holds information about the PDE to be solved
  - `DifferentialOperators2D` manages the discrete differential operators
- `RandomizedLeastSquares.jl`: contains methods that allow for subsampling the rows of the least squares problem
  - The most important such method is `UniformLS`, which uniformly samples the rows of the matrix and the right-hand side of the equation
- `ReductionStrategies.jl`: contains the various order reduction (initial guess) strategies we implement (Nyström, Randomized Range Finder, Randomized SVD, and POD)
- `Solver.jl`: contains the `solve` function, which takes a `Problem` and solves it iteratively with a given timestep
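The row-subsampling idea behind `UniformLS` can be sketched in plain Julia (a hypothetical standalone version for illustration, not the LSMOD function itself):

```julia
using LinearAlgebra, Random

# Uniformly sample `rows` rows of A and the rhs, then solve the reduced
# least-squares problem — a cheap surrogate for the full problem.
function uniform_row_ls(A, rhs, rows)
    idx = randperm(size(A, 1))[1:rows]  # uniform sample of row indices
    return A[idx, :] \ rhs[idx]         # solve the subsampled problem
end

A = randn(1000, 20)
x_true = randn(20)
rhs = A * x_true                        # consistent system: exact recovery expected
x = uniform_row_ls(A, rhs, 50)
```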
We consider a linear problem that evolves in time: $$ \mathbf{A}(t_i)\mathbf{x} = \mathbf{b}(t_i), \quad \mathbf{A}(t_i) \in \mathbb{R}^{n\times n}, \mathbf{b}(t_i)\in \mathbb{R}^{n} $$
In particular, we are interested in the case where the system evolves slowly in time, so that the previous solutions carry useful information about the next one.
The initial guess approach involves three stages:
- Generate a reduced basis $\mathbf{Q}\in\mathbb{R}^{n\times m}$ from a history matrix $\mathbf{X}$ of the $M$ previous solutions. Assumption: $m \leq M \ll n$.
- Solve the reduced problem $\mathbf{s}^* = \text{argmin}_{\mathbf{s}\in \mathbb{R}^{m}} \|\mathbf{A}(t_i)\mathbf{Q}\mathbf{s} - \mathbf{b}(t_i)\|_2$
- Run GMRES with starting vector $\mathbf{x}_0 = \mathbf{Q}\mathbf{s}^*$
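The three stages can be sketched in plain Julia (a hypothetical helper, not the LSMOD API; the basis is built here with a plain QR, and any of the strategies described below could be substituted):

```julia
using LinearAlgebra

# Sketch of the three-stage initial-guess generation.
# X is the n×M history matrix; A, b the current system.
function initial_guess(A, b, X, m)
    Q = Matrix(qr(X).Q)[:, 1:m]  # stage 1: reduced basis (plain QR here)
    s = (A * Q) \ b              # stage 2: argmin_s ‖A Q s − b‖₂
    return Q * s                 # stage 3: starting vector x₀ = Q s*
end

n, M = 100, 10
A = I + 0.1 * randn(n, n)
X = randn(n, M)
b = A * (X * randn(M))           # b constructed so the guess is (near) exact
x₀ = initial_guess(A, b, X, M)
```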
Simply put: we use the previous solutions to create a smaller, friendlier space in which to solve our problem (order reduction). We then solve this much cheaper reduced problem and hope that its solution lands very close to the GMRES solution.
There are two typical baselines for initial guess generation:
- Use the previous state. This is a "free" initial guess, but it may be far away from the next state.
- Use the Proper Orthogonal Decomposition (POD). This involves two stages:
  - Perform an SVD of $\mathbf{X}$: $\text{SVD}(\mathbf{X}) = [\mathbf{\Psi},\mathbf{\Sigma}, \mathbf{\Phi}]$
  - Truncate $\mathbf{\Psi}$ to its first $m$ columns
The POD yields the optimal rank-$m$ basis for the history matrix in the least-squares sense, but computing a full SVD at every timestep is expensive.
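A minimal POD sketch in plain Julia (for illustration, not the LSMOD implementation):

```julia
using LinearAlgebra

# POD basis: truncated left singular vectors of the history matrix.
function pod_basis(X, m)
    F = svd(X)          # X = Ψ Σ Φᵀ
    return F.U[:, 1:m]  # keep the first m columns of Ψ
end

X = randn(200, 20)
Q = pod_basis(X, 5)
```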
Goal: Find a matrix $\mathbf{Q}$ with orthonormal columns whose range approximates the range of the history matrix $\mathbf{X}$.
This method involves three steps:
- Sample a sketching matrix $\Omega \in \mathbb{R}^{M\times m}$
- Compute $\mathbf{X}\Omega$
- Compute the QR decomposition $[\mathbf{Q}, R] = \text{QR}(\mathbf{X}\Omega)$

where $\mathbf{Q} \in \mathbb{R}^{n \times m}$ is the orthonormal factor, which we use as the reduced basis.
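The steps above can be sketched in plain Julia (for illustration, not the LSMOD implementation):

```julia
using LinearAlgebra

# Randomized Range Finder: orthonormal Q approximately spanning range(X).
function range_finder(X, m)
    Ω = randn(size(X, 2), m)  # Gaussian sketching matrix
    Y = X * Ω                 # sample the range of X
    return Matrix(qr(Y).Q)    # thin QR → orthonormal basis
end

# If rank(X) ≤ m, the range is captured exactly (almost surely):
X = randn(200, 4) * randn(4, 20)  # rank-4 history matrix
Q = range_finder(X, 6)
```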
Goal: Compute a cheap approximate SVD of the history matrix $\mathbf{X}$.
The Randomized Range Finder is the first step of the Randomized SVD
Full method:
- Obtain $\hat{Q}$ from the Randomized Range Finder
- Compute $\hat{Q}^\top \mathbf{X}$
- Compute the SVD of $\hat{Q}^\top \mathbf{X}$: $[\hat{U}, \Sigma, V] = \text{SVD}(\hat{Q}^\top \mathbf{X})$
- Set $\mathbf{Q}=\hat{Q}\hat{U}$
This method requires extra steps compared with the Randomized Range Finder, but may result in a better basis. In practice, it turns out that the Randomized Range Finder is generally sufficient.
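The full method can be sketched in plain Julia (for illustration, not the LSMOD implementation):

```julia
using LinearAlgebra

# Randomized SVD built on the Randomized Range Finder.
function randomized_svd(X, m)
    Qhat = Matrix(qr(X * randn(size(X, 2), m)).Q)  # Randomized Range Finder
    F = svd(Qhat' * X)                             # SVD of the small m×M matrix
    return Qhat * F.U, F.S, F.V                    # left factor Q = Q̂ Û
end

X = randn(200, 8) * randn(8, 20)  # rank-8 history matrix
U, S, V = randomized_svd(X, 10)
```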
Goal: Obtain a left-side projection matrix from the Generalized Nyström approximation.

The Generalized Nyström approximation is given by:

$$ \mathbf{X} \approx \mathbf{X}\Omega_1 (\Omega_2^\top\mathbf{X}\Omega_1)^\dagger \Omega_2^\top\mathbf{X} $$

where $\Omega_1 \in \mathbb{R}^{M \times k}$ and $\Omega_2 \in \mathbb{R}^{n \times (k+p)}$ are random sketching matrices and $p$ is an oversampling parameter.

We will use the left factor $\mathbf{Q} = \mathbf{X}\Omega_1 (\Omega_2^\top\mathbf{X}\Omega_1)^\dagger$ as the projection matrix.

We implement the method in the following steps:
- Sample $\Omega_1 \in \mathbb{R}^{M \times k}$ and $\Omega_2 \in \mathbb{R}^{n \times (k+p)}$
- Compute $\mathbf{X}\Omega_1$
- Compute $\Omega_2^\top\mathbf{X}\Omega_1$
- Compute $\mathbf{Q} = \mathbf{X}\Omega_1 (\Omega_2^\top\mathbf{X}\Omega_1)^\dagger$
This method will be cheaper than the Randomized Range Finder and Randomized SVD, but the error guarantees are not as strong.
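The steps above can be sketched in plain Julia (for illustration, not the LSMOD implementation):

```julia
using LinearAlgebra

# Generalized Nyström left factor Q = XΩ₁(Ω₂ᵀXΩ₁)⁺.
function nystrom_basis(X, k, p)
    n, M = size(X)
    Ω₁ = randn(M, k)          # column sketch
    Ω₂ = randn(n, k + p)      # oversampled row sketch
    XΩ₁ = X * Ω₁
    return XΩ₁ * pinv(Ω₂' * XΩ₁)
end

X = randn(200, 6) * randn(6, 20)  # rank-6 history matrix
Q = nystrom_basis(X, 6, 4)        # k = 6, oversampling p = 4
```

Note that, unlike the QR-based strategies, the resulting $\mathbf{Q}$ is not orthonormal.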
As an example problem we use the one from this paper.