Some models are going to require data at much higher temporal resolution than the wider model update tick – for example, sub-daily or daily inputs to the Abiotic model.
The input data files for this use case can be very large – not something we really want to ingest into the Data object at model startup and try to hold in RAM.
So, where do we store this kind of data, and is there a way to lazily load it as required? This might be something that dask is well suited to, as it handles lazy loading of chunked data.
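To illustrate the idea, here is a minimal sketch of how dask defers loading and computation on chunked data. The array sizes, chunking, and the daily-mean reduction are hypothetical, purely for illustration; in practice the data would come from a file opened lazily (e.g. `xarray.open_dataset(path, chunks=...)`, which wraps dask arrays) rather than from random numbers.

```python
import dask.array as da

# Hypothetical example: a year of hourly data on a 100 x 100 grid,
# split into one chunk per day. Nothing is computed or held fully
# in RAM yet - dask only records the task graph.
hourly = da.random.random((365 * 24, 100, 100), chunks=(24, 100, 100))

# Deriving a daily mean is also lazy: chunk boundaries (24 steps)
# align with the reshape, so this just extends the task graph.
daily_mean = hourly.reshape(365, 24, 100, 100).mean(axis=1)

# Only an explicit compute (or a slice of one) actually loads and
# processes chunks, and dask streams through them as needed.
first_day = daily_mean[0].compute()
print(first_day.shape)  # (100, 100)
```

For file-backed inputs the same pattern applies: opening with a `chunks` argument keeps the dataset on disk, and each model tick pulls in only the chunks covering the time slice it needs.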
@vgro, we will need an example simulation with at least one BIG file and some indication of where it is used, so we can explore how best to handle it memory-wise.
@vgro Do you happen to have a big file like this lying around? No pressure -- I've got lots to be getting on with elsewhere -- but I won't be able to start on this until there's some data for me to work with, so if you do have a chance to look at it over the next few weeks, that'd be great.