
Unable to generate SARAH cutouts #369

Open
1 of 2 tasks
rdcarr2 opened this issue Aug 9, 2024 · 7 comments
Comments

rdcarr2 commented Aug 9, 2024

Version Checks (indicate both or one)

  • I have confirmed this bug exists on the latest release of Atlite.

  • I have confirmed this bug exists on the current master branch of Atlite.

Issue Description

Hey everyone, I'm trying to create SARAH cutouts for later use in producing PyPSA-Eur networks for different climate years, but keep getting the same error message after running cutout.prepare():

"ValueError: dimension lat on 0th function argument to apply_ufunc with dask='parallelized' consists of multiple chunks, but is also a core dimension. To fix, either rechunk into a single array chunk along this dimension, i.e., .chunk(dict(lat=-1)), or pass allow_rechunk=True in dask_gufunc_kwargs but beware that this may significantly increase memory usage."

I've tried the suggested rechunking, but keep getting the same error regardless. I'm stuck now and not sure what to do, and ChatGPT is sending me in circles. Has anyone experienced this issue and managed to find a solution?
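For context, the error is not atlite-specific: it comes from `xarray.apply_ufunc` with `dask="parallelized"`, which refuses to treat a dimension as a core dimension while it spans multiple dask chunks. A minimal toy sketch (plain xarray/dask, not SARAH data) that reproduces the error and shows the fix:

```python
import numpy as np
import xarray as xr

# Toy field chunked into two pieces along 'lat', mimicking how the
# SARAH files end up chunked when opened.
da = xr.DataArray(np.random.rand(4, 6), dims=("lat", "lon")).chunk({"lat": 2})

# 'lat' as a core dimension fails while it spans multiple chunks.
failed = False
try:
    xr.apply_ufunc(
        lambda a: a.sum(axis=-1),
        da,
        input_core_dims=[["lat"]],
        dask="parallelized",
        output_dtypes=[float],
    )
except ValueError:
    failed = True  # "dimension lat ... consists of multiple chunks ..."

# The same call succeeds once 'lat' is a single chunk.
result = xr.apply_ufunc(
    lambda a: a.sum(axis=-1),
    da.chunk({"lat": -1}),
    input_core_dims=[["lat"]],
    dask="parallelized",
    output_dtypes=[float],
).compute()
print(failed, result.shape)
```

The catch in the snippet from this issue is that the rechunk must reach the array the `apply_ufunc` call actually operates on (the SARAH data as it is opened), not just the cutout object after the fact.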

Reproducible Example

import atlite
import logging

logging.basicConfig(level=logging.INFO)

cutout_2015_sarah = atlite.Cutout(
    path="europe-2015-sarah.nc",
    module=["sarah", "era5"],
    sarah_dir="/Users/robertcarr/Documents/HZB/Modelling/SARAH data/2014_2016",
    x=slice(-16.5, 40.5),
    y=slice(32.70, 75),
    time=slice("2015-01-01", "2015-02-01")
)


# Rechunk along the 'y' dimension
cutout_2015_sarah.data = cutout_2015_sarah.data.chunk(dict(y=-1))
cutout_2015_sarah.prepare()

Expected Behavior

(Screenshot attached: 2024-08-09 at 16:54.)

Installed Versions

name: pypsa-eur-RDC
channels:

  • bioconda
  • defaults
  • conda-forge
    dependencies:
  • affine=2.4.0=pyhd8ed1ab_0
  • ampl-mp=3.1.0=hbec66e7_1006
  • amply=0.1.6=pyhd8ed1ab_0
  • appdirs=1.4.4=pyh9f0ad1d_0
  • appnope=0.1.4=pyhd8ed1ab_0
  • argparse-dataclass=2.0.0=pyhd8ed1ab_0
  • asttokens=2.4.1=pyhd8ed1ab_0
  • atk-1.0=2.38.0=hd03087b_2
  • atlite=0.2.13=pyhd8ed1ab_0
  • attrs=23.2.0=pyh71513ae_0
  • aws-c-auth=0.7.22=h8a62e84_10
  • aws-c-cal=0.7.1=h94d0942_1
  • aws-c-common=0.9.23=h99b78c6_0
  • aws-c-compression=0.2.18=h94d0942_7
  • aws-c-event-stream=0.4.2=hb74cd8f_15
  • aws-c-http=0.8.2=had1507a_6
  • aws-c-io=0.14.10=hcdb10ff_1
  • aws-c-mqtt=0.10.4=h856d8ab_8
  • aws-c-s3=0.6.0=ha9fd6de_2
  • aws-c-sdkutils=0.1.16=h94d0942_3
  • aws-checksums=0.1.18=h94d0942_7
  • aws-crt-cpp=0.27.3=h9d3339c_2
  • aws-sdk-cpp=1.11.329=he6360a2_9
  • azure-core-cpp=1.12.0=hd01fc5c_0
  • azure-identity-cpp=1.8.0=h0a11218_1
  • azure-storage-blobs-cpp=12.11.0=h77cc766_1
  • azure-storage-common-cpp=12.6.0=h7024f69_1
  • azure-storage-files-datalake-cpp=12.10.0=h64d02d0_1
  • beautifulsoup4=4.12.3=pyha770c72_0
  • blosc=1.21.6=h5499902_0
  • bokeh=3.5.0=pyhd8ed1ab_0
  • bottleneck=1.4.0=py311h5d790af_1
  • branca=0.7.2=pyhd8ed1ab_0
  • brotli=1.1.0=hb547adb_1
  • brotli-bin=1.1.0=hb547adb_1
  • brotli-python=1.1.0=py311ha891d26_1
  • bzip2=1.0.8=h99b78c6_7
  • c-ares=1.32.3=h99b78c6_0
  • c-blosc2=2.15.0=h5063078_1
  • ca-certificates=2024.7.4=hf0a4a13_0
  • cads-api-client=1.1.0=pyhd8ed1ab_0
  • cairo=1.18.0=hc6c324b_2
  • cartopy=0.23.0=py311h4b4568b_1
  • cdsapi=0.7.0=pyhd8ed1ab_0
  • certifi=2024.7.4=pyhd8ed1ab_0
  • cffi=1.16.0=py311h4a08483_0
  • cfgv=3.3.1=pyhd8ed1ab_0
  • cfitsio=4.4.1=h793ed5c_0
  • cftime=1.6.4=py311h5d790af_0
  • charset-normalizer=3.3.2=pyhd8ed1ab_0
  • click=8.1.7=unix_pyh707e725_0
  • click-plugins=1.1.1=py_0
  • cligj=0.7.2=pyhd8ed1ab_1
  • cloudpickle=3.0.0=pyhd8ed1ab_0
  • coin-or-cbc=2.10.11=h700c273_0
  • coin-or-cgl=0.60.7=hf050ae7_0
  • coin-or-clp=1.17.8=h65c2c7c_0
  • coin-or-osi=0.108.10=h0dc0bf9_0
  • coin-or-utils=2.11.11=hafd1a81_1
  • coincbc=2.10.11=0_metapackage
  • colorama=0.4.6=pyhd8ed1ab_0
  • comm=0.2.2=pyhd8ed1ab_0
  • conda-inject=1.3.2=pyhd8ed1ab_0
  • configargparse=1.7=pyhd8ed1ab_0
  • connection_pool=0.0.3=pyhd3deb0d_0
  • contourpy=1.2.1=py311hcc98501_0
  • country_converter=1.2=pyhd8ed1ab_0
  • cppad=20240000.5=h00cdb27_0
  • cycler=0.12.1=pyhd8ed1ab_0
  • cytoolz=0.12.3=py311h05b510d_0
  • dask=2024.7.1=pyhd8ed1ab_0
  • dask-core=2024.7.1=pyhd8ed1ab_0
  • dask-expr=1.1.9=pyhd8ed1ab_0
  • datrie=0.8.2=py311heffc1b2_7
  • debugpy=1.8.2=py311hb9542d7_0
  • decorator=5.1.1=pyhd8ed1ab_0
  • deprecation=2.1.0=pyh9f0ad1d_0
  • descartes=1.1.0=py_4
  • distlib=0.3.8=pyhd8ed1ab_0
  • distributed=2024.7.1=pyhd8ed1ab_0
  • docutils=0.21.2=pyhd8ed1ab_0
  • dpath=2.2.0=pyha770c72_0
  • entsoe-py=0.6.8=pyhd8ed1ab_0
  • et_xmlfile=1.1.0=pyhd8ed1ab_0
  • exceptiongroup=1.2.2=pyhd8ed1ab_0
  • executing=2.0.1=pyhd8ed1ab_0
  • expat=2.6.2=hebf3989_0
  • filelock=3.15.4=pyhd8ed1ab_0
  • fiona=1.9.6=py311hf75b9fa_3
  • fmt=10.2.1=h2ffa867_0
  • folium=0.17.0=pyhd8ed1ab_0
  • font-ttf-dejavu-sans-mono=2.37=hab24e00_0
  • font-ttf-inconsolata=3.000=h77eed37_0
  • font-ttf-source-code-pro=2.038=h77eed37_0
  • font-ttf-ubuntu=0.83=h77eed37_2
  • fontconfig=2.14.2=h82840c6_0
  • fonts-conda-ecosystem=1=0
  • fonts-conda-forge=1=0
  • fonttools=4.53.1=py311hd3f4193_0
  • freetype=2.12.1=hadb7bae_2
  • freexl=2.0.0=hfbad9fb_0
  • fribidi=1.0.10=h27ca646_0
  • fsspec=2024.6.1=pyhff2d567_0
  • gdal=3.9.1=py311h08b0975_8
  • gdk-pixbuf=2.42.12=h7ddc832_0
  • geographiclib=2.0=pyhd8ed1ab_0
  • geojson-rewind=1.1.0=pyhd8ed1ab_0
  • geopandas=1.0.1=pyhd8ed1ab_0
  • geopandas-base=1.0.1=pyha770c72_0
  • geopy=2.4.1=pyhd8ed1ab_1
  • geos=3.12.2=h00cdb27_1
  • geotiff=1.7.3=h7e5fb84_1
  • gflags=2.2.2=hc88da5d_1004
  • giflib=5.2.2=h93a5062_0
  • gitdb=4.0.11=pyhd8ed1ab_0
  • gitpython=3.1.43=pyhd8ed1ab_0
  • glog=0.7.1=heb240a5_0
  • glpk=5.0=h6d7a090_0
  • gmp=6.3.0=h7bae524_2
  • graphite2=1.3.13=hebf3989_1003
  • graphviz=11.0.0=h9bb9bc9_0
  • gtk2=2.24.33=h91d5085_5
  • gts=0.7.6=he42f4ea_4
  • h2=4.1.0=pyhd8ed1ab_0
  • harfbuzz=9.0.0=h1836168_0
  • hdf4=4.2.15=h2ee6834_7
  • hdf5=1.14.3=nompi_hec07895_105
  • hpack=4.0.0=pyh9f0ad1d_0
  • humanfriendly=10.0=pyhd8ed1ab_6
  • hyperframe=6.0.1=pyhd8ed1ab_0
  • icu=73.2=hc8870d7_0
  • identify=2.6.0=pyhd8ed1ab_0
  • idna=3.7=pyhd8ed1ab_0
  • immutables=0.20=py311heffc1b2_1
  • importlib-metadata=8.1.0=pyha770c72_0
  • importlib_metadata=8.1.0=hd8ed1ab_0
  • importlib_resources=6.4.0=pyhd8ed1ab_0
  • iniconfig=2.0.0=pyhd8ed1ab_0
  • ipopt=3.14.16=h387674d_4
  • ipykernel=6.29.5=pyh57ce528_0
  • ipython=8.26.0=pyh707e725_0
  • jedi=0.19.1=pyhd8ed1ab_0
  • jinja2=3.1.4=pyhd8ed1ab_0
  • joblib=1.4.2=pyhd8ed1ab_0
  • json-c=0.17=he54c16a_1
  • jsonschema=4.23.0=pyhd8ed1ab_0
  • jsonschema-specifications=2023.12.1=pyhd8ed1ab_0
  • jupyter_client=8.6.2=pyhd8ed1ab_0
  • jupyter_core=5.7.2=py311h267d04e_0
  • kealib=1.5.3=h848a2d4_1
  • kiwisolver=1.4.5=py311he4fd1f5_1
  • krb5=1.21.3=h237132a_0
  • lcms2=2.16=ha0e7c42_0
  • lerc=4.0.0=h9a09cb3_0
  • libabseil=20240116.2=cxx17_h00cdb27_1
  • libaec=1.1.3=hebf3989_0
  • libarchive=3.7.4=h83d404f_0
  • libarrow=17.0.0=h2a00445_0_cpu
  • libarrow-acero=17.0.0=h00cdb27_0_cpu
  • libarrow-dataset=17.0.0=h00cdb27_0_cpu
  • libarrow-substrait=17.0.0=hc68f6b8_0_cpu
  • libblas=3.9.0=23_osxarm64_openblas
  • libbrotlicommon=1.1.0=hb547adb_1
  • libbrotlidec=1.1.0=hb547adb_1
  • libbrotlienc=1.1.0=hb547adb_1
  • libcblas=3.9.0=23_osxarm64_openblas
  • libcrc32c=1.1.2=hbdafb3b_0
  • libcurl=8.9.0=hfd8ffcc_0
  • libcxx=18.1.8=h167917d_0
  • libdeflate=1.20=h93a5062_0
  • libedit=3.1.20191231=hc8eb9b7_2
  • libev=4.33=h93a5062_2
  • libevent=2.1.12=h2757513_1
  • libexpat=2.6.2=hebf3989_0
  • libffi=3.4.2=h3422bc3_5
  • libgd=2.3.3=hfdf3952_9
  • libgdal=3.9.1=hce30654_8
  • libgdal-core=3.9.1=hf00468f_8
  • libgdal-fits=3.9.1=h7a7a030_8
  • libgdal-grib=3.9.1=hdd4b840_8
  • libgdal-hdf4=3.9.1=h94124bd_8
  • libgdal-hdf5=3.9.1=hf90b89a_8
  • libgdal-jp2openjpeg=3.9.1=h54bcb16_8
  • libgdal-kea=3.9.1=hacb1b3e_8
  • libgdal-netcdf=3.9.1=h1723b65_8
  • libgdal-pdf=3.9.1=h4cf08c4_8
  • libgdal-pg=3.9.1=h7d28298_8
  • libgdal-postgisraster=3.9.1=h7d28298_8
  • libgdal-tiledb=3.9.1=hbb20944_8
  • libgdal-xls=3.9.1=hb39617b_8
  • libgfortran=5.0.0=13_2_0_hd922786_3
  • libgfortran5=13.2.0=hf226fd6_3
  • libglib=2.80.3=h59d46d9_1
  • libgoogle-cloud=2.26.0=hfe08963_0
  • libgoogle-cloud-storage=2.26.0=h1466eeb_0
  • libgrpc=1.62.2=h9c18a4f_0
  • libhwloc=2.11.1=default_h7685b71_1000
  • libiconv=1.17=h0d3ecfb_2
  • libintl=0.22.5=h8fbad5d_2
  • libjpeg-turbo=3.0.0=hb547adb_1
  • libkml=1.3.0=h00ed6cc_1020
  • liblapack=3.9.0=23_osxarm64_openblas
  • liblapacke=3.9.0=23_osxarm64_openblas
  • libnetcdf=4.9.2=nompi_he469be0_114
  • libnghttp2=1.58.0=ha4dd798_1
  • libopenblas=0.3.27=openmp_h517c56d_1
  • libparquet=17.0.0=hcf52c46_0_cpu
  • libpng=1.6.43=h091b4b1_0
  • libpq=16.3=h7afe498_0
  • libprotobuf=4.25.3=hbfab5d5_0
  • libre2-11=2023.09.01=h7b2c953_2
  • librsvg=2.58.2=h1db61d3_1
  • librttopo=1.1.0=h31fb324_16
  • libscotch=7.0.4=h7c38b86_5
  • libsodium=1.0.18=h27ca646_1
  • libspatialite=5.1.0=hf7a34df_8
  • libsqlite=3.46.0=hfb93653_0
  • libssh2=1.11.0=h7a5bd25_0
  • libthrift=0.19.0=h026a170_1
  • libtiff=4.6.0=h07db509_3
  • libutf8proc=2.8.0=h1a8c8d9_0
  • libwebp=1.4.0=h54798ee_0
  • libwebp-base=1.4.0=h93a5062_0
  • libxcb=1.16=hf2054a2_0
  • libxml2=2.12.7=h9a80f22_3
  • libxslt=1.1.39=h223e5b9_0
  • libzip=1.10.1=ha0bc3c6_3
  • libzlib=1.3.1=hfb2fe0b_1
  • linopy=0.3.13=pyhd8ed1ab_0
  • llvm-openmp=18.1.8=hde57baf_0
  • locket=1.0.0=pyhd8ed1ab_0
  • lxml=5.2.2=py311hf9a6a72_0
  • lz4=4.3.3=py311hd44b8e9_0
  • lz4-c=1.9.4=hb7217d7_0
  • lzo=2.10=h93a5062_1001
  • mapclassify=2.6.1=pyhd8ed1ab_0
  • markupsafe=2.1.5=py311h05b510d_0
  • matplotlib=3.9.1=py311ha1ab1f8_0
  • matplotlib-base=3.9.1=py311hba6b155_0
  • matplotlib-inline=0.1.7=pyhd8ed1ab_0
  • memory_profiler=0.61.0=pyhd8ed1ab_0
  • metis=5.1.0=h13dd4ca_1007
  • minizip=4.0.7=h27ee973_0
  • mpfr=4.2.1=h41d338b_1
  • msgpack-python=1.0.8=py311h6bde47b_0
  • multiurl=0.3.1=pyhd8ed1ab_0
  • mumps-include=5.7.2=hce30654_0
  • mumps-seq=5.7.2=hab9b160_0
  • munkres=1.1.4=pyh9f0ad1d_0
  • nbformat=5.10.4=pyhd8ed1ab_0
  • ncurses=6.5=hb89a1cb_0
  • nest-asyncio=1.6.0=pyhd8ed1ab_0
  • netcdf4=1.7.1=nompi_py311h42682c7_101
  • networkx=3.3=pyhd8ed1ab_1
  • nodeenv=1.9.1=pyhd8ed1ab_0
  • nspr=4.35=hb7217d7_0
  • nss=3.102=hc42bcbf_0
  • numexpr=2.10.0=py311h4b4568b_0
  • numpy=1.26.4=py311h7125741_0
  • openjdk=22.0.1=h363fedd_1
  • openjpeg=2.5.2=h9f1df11_0
  • openpyxl=3.1.4=py311h1fc4b72_0
  • openssl=3.3.1=hfb2fe0b_2
  • orc=2.0.1=h47ade37_1
  • packaging=24.1=pyhd8ed1ab_0
  • pandas=2.2.2=py311h4b4568b_1
  • pango=1.54.0=h9ee27a3_1
  • parso=0.8.4=pyhd8ed1ab_0
  • partd=1.4.2=pyhd8ed1ab_0
  • patsy=0.5.6=pyhd8ed1ab_0
  • pcre2=10.44=h297a79d_0
  • pexpect=4.9.0=pyhd8ed1ab_0
  • pickleshare=0.7.5=py_1003
  • pillow=10.4.0=py311hd7951ec_0
  • pip=24.0=pyhd8ed1ab_0
  • pixman=0.43.4=hebf3989_0
  • pkgutil-resolve-name=1.3.10=pyhd8ed1ab_1
  • plac=1.4.3=pyhd8ed1ab_0
  • platformdirs=4.2.2=pyhd8ed1ab_0
  • pluggy=1.5.0=pyhd8ed1ab_0
  • ply=3.11=pyhd8ed1ab_2
  • polars=1.2.1=py311h9e175c1_0
  • poppler=24.07.0=h9787579_0
  • poppler-data=0.4.12=hd8ed1ab_0
  • postgresql=16.3=hdfa2ec6_0
  • powerplantmatching=0.5.15=pyhd8ed1ab_0
  • pre-commit=3.7.1=pyha770c72_0
  • progressbar2=4.4.2=pyhd8ed1ab_0
  • proj=9.4.1=hfb94cee_0
  • prompt-toolkit=3.0.47=pyha770c72_0
  • psutil=6.0.0=py311hd3f4193_0
  • pthread-stubs=0.4=h27ca646_1001
  • ptyprocess=0.7.0=pyhd3deb0d_0
  • pulp=2.8.0=py311h267d04e_0
  • pure_eval=0.2.3=pyhd8ed1ab_0
  • py-cpuinfo=9.0.0=pyhd8ed1ab_0
  • pyarrow=17.0.0=py311h35c05fe_0
  • pyarrow-core=17.0.0=py311hf5072a7_0_cpu
  • pyarrow-hotfix=0.6=pyhd8ed1ab_0
  • pycountry=24.6.1=pyhd8ed1ab_0
  • pycparser=2.22=pyhd8ed1ab_0
  • pygments=2.18.0=pyhd8ed1ab_0
  • pyogrio=0.8.0=py311he661659_2
  • pyomo=6.6.1=py311ha891d26_0
  • pyparsing=3.1.2=pyhd8ed1ab_0
  • pyproj=3.6.1=py311h5e0e26b_7
  • pypsa=0.28.0=pyhd8ed1ab_0
  • pyscipopt=5.1.1=py311hb9542d7_0
  • pyshp=2.3.1=pyhd8ed1ab_0
  • pysocks=1.7.1=pyha2e5f31_6
  • pytables=3.9.2=py311h5d6d252_3
  • pytest=8.3.1=pyhd8ed1ab_0
  • python=3.11.9=h932a869_0_cpython
  • python-dateutil=2.9.0=pyhd8ed1ab_0
  • python-fastjsonschema=2.20.0=pyhd8ed1ab_0
  • python-tzdata=2024.1=pyhd8ed1ab_0
  • python-utils=3.8.2=pyhd8ed1ab_0
  • python_abi=3.11=4_cp311
  • pytz=2024.1=pyhd8ed1ab_0
  • pyxlsb=1.0.10=pyhd8ed1ab_0
  • pyyaml=6.0.1=py311heffc1b2_1
  • pyzmq=26.0.3=py311h9bed540_0
  • qhull=2020.2=h420ef59_5
  • rasterio=1.3.10=py311he66545a_4
  • re2=2023.09.01=h4cba328_2
  • readline=8.2=h92ec313_1
  • referencing=0.35.1=pyhd8ed1ab_0
  • requests=2.32.3=pyhd8ed1ab_0
  • reretry=0.11.8=pyhd8ed1ab_0
  • rioxarray=0.17.0=pyhd8ed1ab_0
  • rpds-py=0.19.0=py311h98c6a39_0
  • scikit-learn=1.5.1=py311hbfb48bc_0
  • scip=9.1.0=h55df89c_0
  • scipy=1.14.0=py311hceeca8c_1
  • seaborn=0.13.2=hd8ed1ab_2
  • seaborn-base=0.13.2=pyhd8ed1ab_2
  • setuptools=71.0.4=pyhd8ed1ab_0
  • setuptools-scm=8.1.0=pyhd8ed1ab_0
  • setuptools_scm=8.1.0=hd8ed1ab_0
  • shapely=2.0.5=py311h0f19114_0
  • six=1.16.0=pyh6c4a22f_0
  • smart_open=7.0.4=pyhd8ed1ab_0
  • smmap=5.0.0=pyhd8ed1ab_0
  • snakemake-interface-common=1.17.2=pyhdfd78af_0
  • snakemake-interface-executor-plugins=9.2.0=pyhdfd78af_0
  • snakemake-interface-report-plugins=1.0.0=pyhdfd78af_0
  • snakemake-interface-storage-plugins=3.2.3=pyhdfd78af_0
  • snakemake-minimal=8.16.0=pyhdfd78af_0
  • snappy=1.2.1=hd02b534_0
  • snuggs=1.4.7=py_0
  • sortedcontainers=2.4.0=pyhd8ed1ab_0
  • soupsieve=2.5=pyhd8ed1ab_1
  • spdlog=1.13.0=h5fcca99_0
  • sqlite=3.46.0=h5838104_0
  • stack_data=0.6.2=pyhd8ed1ab_0
  • statsmodels=0.14.2=py311h5d790af_0
  • stopit=1.1.2=py_0
  • tabula-py=2.7.0=py311h267d04e_1
  • tabulate=0.9.0=pyhd8ed1ab_1
  • tbb=2021.12.0=h420ef59_3
  • tblib=3.0.0=pyhd8ed1ab_0
  • threadpoolctl=3.5.0=pyhc1e730c_0
  • throttler=1.2.2=pyhd8ed1ab_0
  • tiledb=2.24.2=h5def871_2
  • tk=8.6.13=h5083fa2_1
  • tomli=2.0.1=pyhd8ed1ab_0
  • toolz=0.12.1=pyhd8ed1ab_0
  • toposort=1.10=pyhd8ed1ab_0
  • tornado=6.4.1=py311hd3f4193_0
  • tqdm=4.66.4=pyhd8ed1ab_0
  • traitlets=5.14.3=pyhd8ed1ab_0
  • typing-extensions=4.12.2=hd8ed1ab_0
  • typing_extensions=4.12.2=pyha770c72_0
  • tzcode=2024a=h93a5062_0
  • tzdata=2024a=h0c530f3_0
  • ukkonen=1.0.1=py311he4fd1f5_4
  • unidecode=1.3.8=pyhd8ed1ab_0
  • unixodbc=2.3.12=h0e2417a_0
  • uriparser=0.9.8=h00cdb27_0
  • urllib3=2.2.2=pyhd8ed1ab_1
  • validators=0.33.0=pyhd8ed1ab_0
  • virtualenv=20.26.3=pyhd8ed1ab_0
  • wcwidth=0.2.13=pyhd8ed1ab_0
  • wheel=0.43.0=pyhd8ed1ab_1
  • wrapt=1.16.0=py311h05b510d_0
  • xarray=2024.6.0=pyhd8ed1ab_1
  • xerces-c=3.2.5=hf393695_0
  • xlrd=2.0.1=pyhd8ed1ab_3
  • xorg-libxau=1.0.11=hb547adb_0
  • xorg-libxdmcp=1.1.3=h27ca646_0
  • xyzservices=2024.6.0=pyhd8ed1ab_0
  • xz=5.2.6=h57fd34a_0
  • yaml=0.2.5=h3422bc3_2
  • yte=1.5.4=pyha770c72_0
  • zeromq=4.3.5=hcc0f68c_4
  • zict=3.0.0=pyhd8ed1ab_0
  • zipp=3.19.2=pyhd8ed1ab_0
  • zlib=1.3.1=hfb2fe0b_1
  • zlib-ng=2.2.1=h00cdb27_0
  • zstandard=0.23.0=py311h4a6b76e_0
  • zstd=1.5.6=hb46c0d2_0
  • pip:
    • gurobipy==11.0.3
    • highspy==1.7.2
    • oauthlib==3.2.2
    • requests-oauthlib==1.3.1
    • snakemake-executor-plugin-cluster-generic==1.0.9
    • snakemake-executor-plugin-slurm==0.8.0
    • snakemake-executor-plugin-slurm-jobstep==0.2.1
    • snakemake-storage-plugin-http==0.2.3
    • tsam==2.3.1
      prefix: /Users/robertcarr/miniforge3/envs/pypsa-eur-RDC
euronion (Collaborator) commented

Hey there!

I haven't seen the issue before. I think the underlying rechunk would need to happen on the SARAH data, not on the cutout. A wild guess: you could try passing the chunk argument to the prepare(...) function, similar to the example shown here:

https://atlite.readthedocs.io/en/latest/examples/create_cutout_SARAH.html#Specifying-the-cutout

martabresco commented

Hi, I am having the same issue as you are, @rdcarr2, and I have not been able to solve it yet. Did you find a way around it?
Thanks.


rdcarr2 commented Mar 12, 2025

Hey Marta,

I never managed to figure it out no matter what I tried, including reinstalling everything in a new environment from scratch. Luckily, the SARAH cutouts are no longer required for building capacity factors in PyPSA; everything that is needed can be extracted from the ERA5 cutout instead.

euronion (Collaborator) commented

Hi both,

Have you tried my comment above?
I don't see why you would want to rechunk the data at all, but doing the chunking after the cutout is prepared should still work.

@martabresco Do you use the identical code, or a different one?


rdcarr2 commented Mar 12, 2025

Hey,

Thanks for helping out! Worth noting that I am still quite new to atlite.

I didn't want to rechunk anything at first (I don't even know what rechunking is), but I kept getting this error:

"ValueError: dimension lat on 0th function argument to apply_ufunc with dask='parallelized' consists of multiple chunks, but is also a core dimension. To fix, either rechunk into a single array chunk along this dimension, i.e., .chunk(dict(lat=-1)), or pass allow_rechunk=True in dask_gufunc_kwargs but beware that this may significantly increase memory usage."

Then I started with the rechunking approach you can see in the snippet above, which didn't work.

It's been a while and I got a new computer, so I'll download the SARAH data overnight and try building a cutout again tomorrow. Will update this thread if it works!

martabresco commented

Hi,

In my case, rechunking helps because I am working on an HPC with large files. However, even without the chunk command I get the same error as @rdcarr2, see below.
I saw your previous comment @euronion, but when I try to pass the chunk argument to prepare() I get that it is an unexpected argument.
This issue is odd: I built a cutout with two days of SARAH-3 V004 data from 2025, with exactly the same cutout definition, and had no issues. But when trying the same for two days of SARAH-3 V004 data from 2014, I get this error, which I have not managed to fix. And I am using the same environment for both.

(Screenshot of the error attached.)

coroa (Member) commented Mar 12, 2025

Hi @martabresco, I unfortunately don't have the files for the SARAH dataset, so I can't test anything easily at the moment.

  1. Please try adding chunks={"time": 100, "lat": -1, "lon": -1} to the atlite.Cutout(..., chunks=...) definition call. Those are passed through to the place where the SARAH files are opened.
  2. If you are still receiving an error, please get the full stack trace (i.e. after clicking on "open in a text editor") and copy it here as text (ideally within code fences). Screenshots are not searchable.
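Applied to the cutout definition from this issue, suggestion 1 would look roughly like the sketch below. This is untested here (it requires locally downloaded SARAH files; the path is the original reporter's and must be adjusted):

```python
import atlite

# Sketch: pass chunks at cutout definition time so the SARAH files are
# opened with 'lat'/'lon' as single chunks ('-1' means one chunk spanning
# the whole dimension), as suggested in the comment above.
cutout = atlite.Cutout(
    path="europe-2015-sarah.nc",
    module=["sarah", "era5"],
    sarah_dir="/Users/robertcarr/Documents/HZB/Modelling/SARAH data/2014_2016",
    x=slice(-16.5, 40.5),
    y=slice(32.70, 75),
    time=slice("2015-01-01", "2015-02-01"),
    chunks={"time": 100, "lat": -1, "lon": -1},
)
cutout.prepare()
```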

Thanks,
