Update docstring in api.py for open_mfdataset(), clarifying "chunks" argument (#9121)

Per this discussion: #9119
arthur-e authored Jun 14, 2024
1 parent 211d313 commit 1265310
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion xarray/backends/api.py
@@ -863,7 +863,8 @@ def open_mfdataset(
         In general, these should divide the dimensions of each dataset. If int, chunk
         each dimension by ``chunks``. By default, chunks will be chosen to load entire
         input files into memory at once. This has a major impact on performance: please
-        see the full documentation for more details [2]_.
+        see the full documentation for more details [2]_. This argument is evaluated
+        on a per-file basis, so chunk sizes that span multiple files will be ignored.
     concat_dim : str, DataArray, Index or a Sequence of these or None, optional
         Dimensions to concatenate files along. You only need to provide this argument
         if ``combine='nested'``, and if any of the dimensions along which you want to
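The added sentence means that when ``chunks`` is evaluated per file, a requested chunk size larger than a single file's extent along a dimension is effectively clamped to that file's size rather than spanning into the next file. A minimal sketch of that behavior (the ``resolve_chunks`` helper below is hypothetical, for illustration only, and is not xarray's actual implementation):

```python
# Hypothetical illustration of per-file chunk resolution: each file is
# chunked independently, so a requested chunk size is clamped to that
# file's own dimension length. This mimics the docstring's point; it is
# not code from xarray itself.

def resolve_chunks(requested, dim_sizes):
    """Clamp each requested chunk size to one file's dimension sizes."""
    return {dim: min(requested.get(dim, size), size)
            for dim, size in dim_sizes.items()}

# Three files, each holding 10 time steps along the concatenation dimension.
files = [{"time": 10, "lat": 180, "lon": 360} for _ in range(3)]

# A chunk of 25 along "time" spans multiple files; evaluated per file it
# is clamped to 10, so the larger requested size is effectively ignored.
per_file = [resolve_chunks({"time": 25, "lat": 90}, f) for f in files]
print(per_file[0])  # {'time': 10, 'lat': 90, 'lon': 360}
```

In a real call such as ``xr.open_mfdataset("data*.nc", chunks={"time": 25})`` (filename pattern hypothetical), the same logic implies you cannot obtain dask chunks along ``time`` larger than any single input file provides; rechunk the combined dataset afterwards if larger chunks are needed.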
