
be smart about number of layers in models #16

Open · wdwatkins opened this issue Mar 6, 2018 · 8 comments
@wdwatkins
Collaborator

Code is already there in R/get_base_lake_nml.R

@jordansread
Member

Does this issue need more context? It might be written in a way that only makes sense to you.

@wdwatkins
Collaborator Author

wdwatkins commented Mar 6, 2018

referring to this, in order to minimize unneeded layers and keep the output files small:

min_thick <- get_nml_value(nml, arg_name = "min_layer_thick")
max_depth <- get_nml_value(nml, arg_name = "lake_depth")
max_layers <- ceiling(max_depth / min_thick * 1.1)
nml <- set_nml(glm_nml = nml, arg_name = "max_layers", arg_val = max_layers)

@jread-usgs this was working fine for Mendota, but I'm getting errors from many lakes similar to:

Array bounds error - too many layers. NumLayers = 96, M = 5
i = 95 V = 371165.207014160230756 VMax = 4133007.668881853111088 D =    0.505414803317936 DMax =    0.30000000000000

I'm guessing this has to do with the lake being shallow? Max depth is 1.2 m for this lake. Should there maybe be an alternative minimum of a hundred layers?
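The failure mode can be illustrated with the formula above. This is a sketch only; the min_layer_thick value here is hypothetical, since the thread only reports the 1.2 m depth:

```r
# Illustrative only: for a shallow lake, the 1.1 multiplier yields very few layers.
min_thick <- 0.25   # hypothetical min_layer_thick (m); not from the thread
max_depth <- 1.2    # lake_depth reported above (m)
max_layers <- ceiling(max_depth / min_thick * 1.1)
# max_layers comes out to only 6 here, so GLM's "Array bounds error -
# too many layers" fires as soon as the lake resolves into more layers than that.
```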

@jordansread
Member

Yes, good idea. We hadn't run into this before because we always had way too many layers.

@wdwatkins
Collaborator Author

200 seems to be a safe number so far.
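That floor could be folded directly into the calculation; a minimal sketch, using the 200-layer value suggested here:

```r
# Sketch: keep the thickness-based estimate, but never go below 200 layers.
max_layers <- max(200, ceiling(max_depth / min_thick * 1.1))
nml <- set_nml(glm_nml = nml, arg_name = "max_layers", arg_val = max_layers)
```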

@wdwatkins
Collaborator Author

I suppose this should theoretically work now that the precipitation units are correct. However, Luke pointed out we could just convert the netCDF files to netCDF-4 so they use compression, which probably saves more space than messing with layers. If we wanted to maximize disk-space efficiency we could use both.

nccopy -knc7 -d9 output.nc output_4.nc adds maximum compression through the -d (deflate level) option. It converted a 1.7 GB file to 48 MB. Reading might be a bit slower, but it doesn't seem particularly noticeable.
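A minimal sketch of that conversion step, with a check of the resulting format appended (filenames are the ones from the command above; `ncdump -k` just prints the file's kind):

```shell
# Convert classic netCDF to netCDF-4 classic model (-k nc7) with
# maximum deflate compression (-d 9).
nccopy -k nc7 -d 9 output.nc output_4.nc

# Confirm the new file's format; prints "netCDF-4 classic model" for -k nc7.
ncdump -k output_4.nc
```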

@jordansread
Member

Nice! That is a huge gain.

@wdwatkins
Collaborator Author

After fixing the precip data units, there were still a few lakes that failed (shallow ones again). I was able to get them to run by increasing the layer threshold to ceiling(max_depth/min_thick * 1.5).
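One way to combine this 1.5 multiplier with the 200-layer floor suggested earlier in the thread; a sketch, not the committed implementation:

```r
# Sketch of the revised calculation: a wider safety margin for shallow lakes
# (1.5 instead of 1.1), plus the 200-layer floor from earlier in the thread.
min_thick <- get_nml_value(nml, arg_name = "min_layer_thick")
max_depth <- get_nml_value(nml, arg_name = "lake_depth")
max_layers <- max(200, ceiling(max_depth / min_thick * 1.5))
nml <- set_nml(glm_nml = nml, arg_name = "max_layers", arg_val = max_layers)
```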

@jordansread
Member

We might want to propagate that change back into the functions that create the base nmls for each lake.
