be smart about number of layers in models #16
Comments
Does this issue need more context? It might be written in a way that only makes sense to you.
referring to this, in order to minimize unneeded layers and keep the output files small:
@jread-usgs this was working fine for Mendota, but I'm getting errors from many lakes similar to:
I'm guessing this has to do with the lake being shallow? Max depth is 1.2 m for this lake. Should there be an alternative minimum number of layers, maybe?
Yes, good idea. We hadn't run into this before because we always had way too many layers.
200 seems to be a safe number so far.
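The fix discussed above can be sketched as a small helper. This is not the project's actual code — the function name is hypothetical, and the 0.05 m minimum layer thickness is an assumed default — but it shows the idea: size the layer count from lake depth, with the 200-layer floor from this thread so shallow lakes don't fail.

```r
# Hypothetical sketch: pick a safe value for the model's maximum layer count.
# 'min_layer_thick' (assumed 0.05 m here) is the thinnest layer allowed;
# the 200-layer floor comes from the discussion above.
choose_max_layers <- function(max_depth, min_layer_thick = 0.05, floor_layers = 200) {
  # layers needed if every layer sat at its minimum thickness
  n_needed <- ceiling(max_depth / min_layer_thick)
  max(floor_layers, n_needed)
}

choose_max_layers(1.2)   # the shallow lake from this thread -> floor of 200 applies
choose_max_layers(25.3)  # a deeper lake -> depth-driven count (506)
```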
I suppose this should theoretically work now that the precipitation units are correct, although Luke pointed out we could just convert the NetCDF files to NetCDF4 so they use compression — this probably saves more space than messing with layers. Although I suppose if we wanted to maximize disk space efficiency we could use both.
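One way to do the NetCDF-to-NetCDF4 conversion from R is with the `ncdf4` package, which lets you re-declare each variable with a deflate `compression` level when writing a NetCDF-4 file. This is a generic sketch, not the project's code — file names are placeholders:

```r
# Sketch: rewrite a classic NetCDF file as NetCDF-4 with deflate compression.
library(ncdf4)

compress_nc <- function(infile, outfile, level = 4) {
  nc_in <- nc_open(infile)
  # re-declare each variable with a deflate level (only supported in NetCDF-4)
  vars <- lapply(nc_in$var, function(v) {
    ncvar_def(v$name, v$units, v$dim, missval = v$missval, compression = level)
  })
  nc_out <- nc_create(outfile, vars, force_v4 = TRUE)  # force NetCDF-4 format
  for (v in nc_in$var) {
    ncvar_put(nc_out, v$name, ncvar_get(nc_in, v$name))
  }
  nc_close(nc_out)
  nc_close(nc_in)
}
```

Outside R, `nccopy -k nc4 -d 4 in.nc out.nc` from the netCDF utilities does the same conversion.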
Nice! That is a huge gainer.
After fixing the precip data units, there were still a few lakes that failed (shallow ones again). I was able to get them to run by increasing the layer threshold to
We might want to propagate that change back into the functions that create the base nmls for each lake. |
Code is already there in R/get_base_lake_nml.R
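Propagating the change could be as simple as enforcing a floor on the layer parameter when the base nml text is written. This plain-R sketch is purely illustrative — the parameter name `max_layers` and the regex approach are assumptions, and `R/get_base_lake_nml.R` presumably builds the nml its own way:

```r
# Illustrative sketch: bump an assumed 'max_layers' entry in nml text up to a
# floor of 200 before writing the base nml for a lake.
bump_max_layers <- function(nml_text, floor_layers = 200) {
  # extract the current integer value of max_layers
  current <- as.integer(sub(".*max_layers\\s*=\\s*(\\d+).*", "\\1", nml_text))
  # rewrite it, keeping whichever of current/floor is larger
  gsub("(max_layers\\s*=\\s*)\\d+",
       paste0("\\1", max(current, floor_layers)),
       nml_text)
}

bump_max_layers("   max_layers = 60")   # raised to the 200 floor
bump_max_layers("   max_layers = 500")  # left alone
```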