
How can I run Underworld at a desired resolution and apply a periodic boundary condition? #1

jmansour opened this issue Sep 28, 2016 · 3 comments



Hi all,
I have two problems when using Underworld (version 1.7.0).
(1) I find it strange that when I run a 3D subduction model (similar to the configuration of Capitanio et al. (2011)) with 96x64x96 cells (or 64x32x64, 96x48x96, 128x64x128, etc.) in the x-, y-, and z-directions, at multigrid levels of both 4 and 5, using 120 CPU cores on the Tianhe-2 supercomputer, Underworld runs extremely slowly: not even one step completes in an hour. When I try the same model at higher resolution (e.g., 192x64x192), Underworld runs well and takes about 5-10 minutes per step, which looks reasonable. I have no idea why a model at high resolution (e.g., 192x64x192 cells) takes far less CPU time than one with 1/8 or 1/4 as many cells (e.g., 96x32x96). This is quite strange for a multigrid solver. I simply want to use 96x32x96 cells, not 192x64x192, to get what I want without having to deal with much more output data and consume many more CPU hours.

(2) A no-slip boundary condition is not the best choice in some cases: when the redistribution of materials leads to space problems, it yields unrealistic results. So how can I apply a periodic velocity boundary condition on the sidewalls of the box to allow inflow/outflow of material? I tried, but failed, to find an example of this type of boundary condition in the Underworld documentation.

Any instructions or hints will be appreciated.

Best regards,
Qingwen

Reference
Capitanio, F. A., et al., 2011. Subduction dynamics and the origin of Andean orogeny and the Bolivian orocline. Nature.

@jmansour

  1. It's hard to know exactly what is going on with your model. Does the standard output show any errors? I'd suggest trying fewer multigrid levels (try 2 or 3) and seeing if that makes a difference, and perhaps even removing multigrid altogether for these smaller simulations.

  2. I'm not 100% sure I understand what you're after here. Periodic conditions will result in material moving out one side of the box and transferring back in through the other side (Nightmare.xml gives an example of usage). If you want open boundary conditions, where material can leave or enter the domain freely, it is more difficult at the sidewalls, as you would need some stress condition to prevent material from simply flowing straight out. I don't think this is really an option in UW1 at present, though it could probably be managed with UW2.
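One thing worth sanity-checking on point 1 (this is an assumption on my part, not something your output confirms): geometric multigrid halves the element grid at each level, so at small resolutions the coarsest grid can end up with fewer elements than you have cores, which tends to hurt parallel performance badly. A quick sketch of the coarse-grid sizes, assuming plain halving per level:

```python
def coarse_grids(res, levels):
    """Element resolution at each multigrid level, assuming every
    coarsening step halves each dimension (standard geometric MG)."""
    grids = [tuple(res)]
    for _ in range(levels - 1):
        res = tuple(n // 2 for n in res)
        grids.append(res)
    return grids

# 96x32x96 with 5 levels: the coarsest grid is only 6x2x6 = 72 elements,
# fewer elements than the 120 cores mentioned above.
for lvl, g in enumerate(coarse_grids((96, 32, 96), 5), start=1):
    print(f"level {lvl}: {g[0]}x{g[1]}x{g[2]}")
```

With 2 or 3 levels the coarsest grid for 96x32x96 stays comfortably larger than the core count, which is another reason to try reducing mgLevels.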


QwZhang commented Sep 30, 2016

Hi @jmansour,
Thank you for your reply.
I have run some new 2D and 3D subduction models to highlight my issues.

  • The graphs below show how UW1 becomes inefficient, or even stalls entirely, at certain cell counts (e.g., 128x32x128, 128x64x128):
    [attached graphs: run-time comparison across resolutions]
    I have no idea why this happens, and I have no choice but to use multigrid for these 3D models. The following is recorded in the log of run 2:
Cartesian generator: 'linearMeshGenerator'
    Target mesh: 'linearMesh'
    Global element size: 128x32x128
    Local offset of rank 0: 0x0x0
    Local range of rank 0: 22x16x13
    Generating vertices...
        ... done.
    Generating elements...
        ... done.
    Generating edges...
        ... done.
    Generating faces...
        ... done.
    Generating element-vertex incidence...
        ... done.
    Generating edge-vertex incidence...
        ... done.
    Generating face-vertex incidence...
        ... done.
    Generating vertex neighbours...
        ... done.
    Generating geometry...
        ... done.
    Generating element types...
        ... element types are 'Mesh_HexType',
        ... mesh algorithm type is 'Mesh_RegularAlgorithms',
        ... done.
    Assigning FeMesh element types...
        ... FE element types are 'TrilinearElementType',
        ... done.
Stiffness matrix: 'k_matrix'
    Calculating number of nonzero entries...
        Found 124679139 nonzero entries.
        Done.
    Assigning FeMesh element types...
        ... FE element types are 'ConstantElementType',
        ... done.
FeEquationNumber: 'defaultFeVariableFeEqName'
    Generating equation numbers...
        BCs set to be removed.
        Assigned 524288 global equation numbers.
        [0] Assigned 4576 local equation numbers, within range 0 to 4576.
    ... Completed in 0.00749898 [min] / 0.0123 [max] seconds.
Stiffness matrix: 'g_matrix'
    Calculating number of nonzero entries...
        Found 12370560 nonzero entries.
        Done.
Stiffness matrix: 'preconditioner'
    Calculating number of nonzero entries...
        Found 524288 nonzero entries.
        Done.
Linking materials and ppc... 
done

Initialising Stg_Components from the live-component register

Laying out material 'UM' within PolygonShape 'Shape_UM':
Laying out material 'LM' within PolygonShape 'Shape_LM':
Laying out material 'Slab01_L01' within PolygonShape 'Shape_Slab01_L01':
Laying out material 'Slab01_L02' within PolygonShape 'Shape_Slab01_L02':
Laying out material 'Slab01_L03' within PolygonShape 'Shape_Slab01_L03':
Laying out material 'CBL' within PolygonShape 'Shape_CBL':
Laying out material 'WZL' within PolygonShape 'Shape_WZL':
In func Materials_Register_AssignParticleProperties(): for swarm "gaussMaterialSwarm"
    Assigning initial particle properties to the 120 global particles
        done 100% (1 particles)...
Materials_Register_AssignParticleProperties(): finished setup of material properties for swarm "gaussMaterialSwarm"
    took 0.00106311 [min] / 0.00409889 [max] secs
Laying out material 'UM' within PolygonShape 'Shape_UM':
Laying out material 'LM' within PolygonShape 'Shape_LM':
Laying out material 'Slab01_L01' within PolygonShape 'Shape_Slab01_L01':
Laying out material 'Slab01_L02' within PolygonShape 'Shape_Slab01_L02':
Laying out material 'Slab01_L03' within PolygonShape 'Shape_Slab01_L03':
Laying out material 'CBL' within PolygonShape 'Shape_CBL':
Laying out material 'WZL' within PolygonShape 'Shape_WZL':
In func Materials_Register_AssignParticleProperties(): for swarm "materialSwarm"
    Assigning initial particle properties to the 15728640 global particles
        done 10% (13728 particles)...
[...]
        done 100% (137279 particles)...
Materials_Register_AssignParticleProperties(): finished setup of material properties for swarm "materialSwarm"
    took 0.353129 [min] / 0.476072 [max] secs
In func WeightsCalculator_CalculateAll(): for swarm "picIntegrationPoints"
    Calculating weights for the particles in the 524288 global cells
        done 10% (458 cells)...
        done 20% (916 cells)...
        done 30% (1373 cells)...
        done 40% (1831 cells)...
        done 50% (2288 cells)...
        done 60% (2746 cells)...
        done 70% (3204 cells)...
        done 80% (3661 cells)...
        done 90% (4119 cells)...
        done 100% (4576 cells)...
WeightsCalculator_CalculateAll(): finished update of weights for swarm "picIntegrationPoints"
Run until simulation has stepped forward 5 timeSteps
or until simulation time passes 3.15569e+15.
TimeStep = 0, Start time = 0, prev timeStep dt = 0
In SystemLinearEquations_NonLinearExecute

Non linear solver - iteration 0
Linear solver (stokesEqn-execute) 
Configuring MG level 1 
Configuring MG level 1 
[...]
Configuring MG level 2 
Configuring MG level 3 
PETScMGSolver_UpdateSolvers 0.000546932
SROpGenerator_SimpleFinestLevel: time = 2.45669e-02 
  [2] SROpGenerator_SimpleCoarserLevel: time = 5.64981e-03 
  [1] SROpGenerator_SimpleCoarserLevel: time = 2.13408e-03 
PETScMGSolver_UpdateOps 0.0327489
Updating MG matrices ...
done
PETScMGSolver_UpdateMats-WorkVecs 0.234946
Summary:
  Uzawa its. = 0001 , Uzawa residual = 1.5450354754574e-01
  |G^T u|/|u|               = 8.72883506e+00
  |f - K u - G p|/|f|       = 1.38289770e+01
  |f - K u - G p|_w/|f|_w   = 1.00000000e+00
  |u|_{\infty} = 5.74134029e-10 , u_rms = 1.72097143e-10
  |p|_{\infty} = 2.07212005e+08 , p_rms = 6.73794037e+06
  min/max(u) = -5.74134029e-10 [169584] / 2.41672187e-11 [74393]
  min/max(p) = -2.07212005e+08 [209597] / 4.13650311e+07 [210602]
  \sum_i p_i = 1.84390235e+10 
Linear solver (stokesEqn-execute), solution time 1.764060e+03 (secs)
Non linear solver - iteration 1
Linear solver (stokesEqn-execute) 
_MatrixSolver_Setup 9.53674e-07
Updating MG matrices ...
done
PETScMGSolver_UpdateMats-WorkVecs 0.112212
Summary:
  Uzawa its. = 0001 , Uzawa residual = 3.4040609463190e+02
  |G^T u|/|u|               = 8.41182285e+00
  |f - K u - G p|/|f|       = 2.25635689e+04
  |f - K u - G p|_w/|f|_w   = 1.00000000e+00
  |u|_{\infty} = 1.50186472e-06 , u_rms = 4.32265132e-07
  |p|_{\infty} = 3.79365707e+11 , p_rms = 1.29879288e+10
  min/max(u) = -1.50186472e-06 [1126752] / 2.18645426e-08 [964853]
  min/max(p) = -3.79365707e+11 [106125] / 7.59074396e+10 [210602]
  \sum_i p_i = -3.10180084e+13 
Linear solver (stokesEqn-execute), solution time 1.773123e+03 (secs)
In func SystemLinearEquations_NonLinearExecute: Iteration 1 of 100 - Residual 0.9996 - Tolerance = 0.01
Non linear solver - Residual 9.99604513e-01; Tolerance 1.0000e-02 - Not converged - 3.544570e+03 (secs)

Non linear solver - iteration 2
Linear solver (stokesEqn-execute) 
_MatrixSolver_Setup 0
Updating MG matrices ...
done
PETScMGSolver_UpdateMats-WorkVecs 0.111623
[...]

It seems that UW1 simply hangs here at non-linear solver iteration 2, making no progress for over 8 hours.

  • As for the second issue, the periodic conditions demonstrated in Nightmare.xml and velocityBCs.freeslip.periodicSideWalls.xml are exactly what I want. Here is how it looks with periodic conditions applied, allowing the mantle to move freely in the horizontal direction across the sidewalls:
    [attached figure]
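The effect of a periodic sidewall on material positions can be sketched generically (plain NumPy, not Underworld's internals): after an advection step, any coordinate that has left the box is mapped back in through the opposite wall.

```python
import numpy as np

def wrap_periodic(x, xmin, xmax):
    """Map x-coordinates back into [xmin, xmax): particles leaving one
    sidewall re-enter through the opposite sidewall."""
    width = xmax - xmin
    return (x - xmin) % width + xmin

# Three particles after an advection step; two have left the unit box.
x = np.array([-0.1, 0.5, 1.2])
print(wrap_periodic(x, 0.0, 1.0))  # approximately [0.9, 0.5, 0.2]
```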

Thanks for your time.

Best,
Qw

@jmansour

That does seem a bit strange. Unfortunately it's very difficult for us to assist you with this. Have you tried using a different number of mgLevels? What happens when you run without MG? Do things improve when you simplify your rheologies?
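Those three experiments can be run as a small sweep. A minimal sketch that only prints candidate command lines (the --mgLevels=N command-line override and the file name subduction3D.xml are assumptions here, so adapt them to how your model actually sets its multigrid levels):

```python
def sweep_commands(input_xml, mg_levels):
    """Build Underworld 1 command lines for a multigrid sweep.
    A level of None means 'run without multigrid'."""
    cmds = []
    for lvl in mg_levels:
        if lvl is None:
            cmds.append(f"./Underworld {input_xml}")
        else:
            cmds.append(f"./Underworld {input_xml} --mgLevels={lvl}")
    return cmds

for cmd in sweep_commands("subduction3D.xml", [None, 2, 3, 4]):
    print(cmd)
```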
