Unable to generate SARAH cutouts #369
Comments
Hey there! I haven't seen this issue before. I think the underlying cutout specification matters, see https://atlite.readthedocs.io/en/latest/examples/create_cutout_SARAH.html#Specifying-the-cutout
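For reference, the linked example builds a combined SARAH + ERA5 cutout roughly like this (a sketch only; the path, extent, time window, and SARAH directory below are placeholders, not values from this issue):

```python
import atlite

# Sketch of a combined SARAH + ERA5 cutout, following the linked atlite example.
# "sarah_dir" should point at the directory holding the downloaded SARAH NetCDF files.
cutout = atlite.Cutout(
    path="western-europe-2013-01-sarah.nc",
    module=["sarah", "era5"],          # SARAH for irradiation, ERA5 for the rest
    sarah_dir="/path/to/sarah/files",  # placeholder path
    x=slice(-13.7, 1.8),               # longitude extent (placeholder)
    y=slice(49.9, 60.9),               # latitude extent (placeholder)
    time="2013-01",
)
cutout.prepare()
```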
Hi, I am having the same issue as you, @rdcarr2, and I have not been able to solve it yet. Did you find a way around it?
Hey Marta, I never managed to figure it out, no matter what I tried, including reinstalling everything in a new environment from scratch. Luckily, the SARAH cutouts are no longer required for building capacity factors in PyPSA; everything that is needed can be extracted from the ERA5 cutout instead.
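For anyone landing here later, an ERA5-only cutout looks roughly like this (a sketch; path, extent, and time are placeholders, and working CDS API credentials are assumed):

```python
import atlite

# ERA5-only cutout; the data is fetched from the Copernicus Climate Data Store,
# so configured CDS API credentials are assumed.
cutout = atlite.Cutout(
    path="europe-2013-era5.nc",
    module="era5",
    x=slice(-12.0, 35.0),  # longitude extent (placeholder)
    y=slice(33.0, 72.0),   # latitude extent (placeholder)
    time="2013",
)
cutout.prepare()
```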
Hi both, have you tried my comment above? @martabresco Do you use the identical code, or a different one?
Hey, thanks for helping out! Worth noting that I am still quite new to atlite. I didn't want to rechunk anything at first (I don't even know what rechunking is), but I kept getting this error:

"ValueError: dimension lat on 0th function argument to apply_ufunc with dask='parallelized' consists of multiple chunks, but is also a core dimension. To fix, either rechunk into a single array chunk along this dimension, i.e., .chunk(dict(lat=-1)), or pass allow_rechunk=True in dask_gufunc_kwargs but beware that this may significantly increase memory usage."

Then I started with the rechunking you can see in the screenshot above, which didn't work. It's been a while and I got a new computer, so I'll download the SARAH data overnight and try building a cutout again tomorrow. Will update this thread if it works!
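For what it's worth, the .chunk(dict(lat=-1)) hint in the error refers to an xarray/dask operation on the underlying dataset, roughly like this (the file name and chunk sizes are illustrative only):

```python
import xarray as xr

# Open a SARAH file lazily with dask chunks (file name is a placeholder).
ds = xr.open_dataset("sarah_irradiance.nc", chunks={"time": 100})

# apply_ufunc requires a core dimension to live in a single dask chunk;
# chunk size -1 merges all chunks along that dimension.
ds = ds.chunk({"lat": -1, "lon": -1})
```

Note, however, that cutout.prepare() opens and processes the SARAH files internally, so rechunking a dataset you open yourself may never reach the arrays where the error is actually raised.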
Hi, in my case rechunking helps because I am working on an HPC with large files. However, even without the chunk command, I am getting the same error as @rdcarr2, see below.
Hi @martabresco, I unfortunately don't have the files for the SARAH dataset, so I can't test anything easily at the moment.
Thanks,
Version Checks (indicate both or one)
- I have confirmed this bug exists on the latest release of Atlite.
- I have confirmed this bug exists on the current master branch of Atlite.

Issue Description
Hey everyone, I'm trying to create SARAH cutouts for later use in producing PyPSA-Eur networks for different climate years, but I keep getting the same error message after running cutout.prepare():

"ValueError: dimension lat on 0th function argument to apply_ufunc with dask='parallelized' consists of multiple chunks, but is also a core dimension. To fix, either rechunk into a single array chunk along this dimension, i.e., .chunk(dict(lat=-1)), or pass allow_rechunk=True in dask_gufunc_kwargs but beware that this may significantly increase memory usage."

I've tried the rechunking method suggested, but I keep getting the same error regardless. I'm stuck now and not sure what to do, and ChatGPT is sending me in circles. Has anyone experienced this issue and managed to find a solution?
Reproducible Example
Expected Behavior
Installed Versions
name: pypsa-eur-RDC
channels:
dependencies:
prefix: /Users/robertcarr/miniforge3/envs/pypsa-eur-RDC