
Error attempting to transform xenium transcript coordinate using transform() #1105

@rached-97

Description


Describe the bug

I am getting "ValueError: The number of items in 'lengths' does not match the number of partitions. 4 != 13" when attempting to transform Xenium transcript coordinates using spatialdata's (sd) sd.transform().

I noticed a similar error reported previously (issue #1064), but I am under the impression that it was resolved and closed.

Thank you for your help.

>>> sdata = xenium(xenDataDir, cells_as_circles=False)

>>> get_transformation(sdata["transcripts"])
Scale (x, y)
    [4.70588235 4.70588235]

>>> sd.transform(sdata["transcripts"], to_coordinate_system = "global")
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[23], line 1
----> 1 sd.transform(sdata["transcripts"], to_coordinate_system = "global")

File /opt/homebrew/Cellar/python@3.14/3.14.3_1/Frameworks/Python.framework/Versions/3.14/lib/python3.14/functools.py:982, in singledispatch.<locals>.wrapper(*args, **kw)
    979 if not args:
    980     raise TypeError(f'{funcname} requires at least '
    981                     '1 positional argument')
--> 982 return dispatch(args[0].__class__)(*args, **kw)

File .venv/lib/python3.14/site-packages/spatialdata/_core/operations/transform.py:443, in _(data, transformation, maintain_positioning, to_coordinate_system)
    441 arrays = []
    442 for ax in axes:
--> 443     arrays.append(data[ax].to_dask_array(lengths=True).reshape(-1, 1))
    444 xdata = DataArray(da.concatenate(arrays, axis=1), coords={"points": range(len(data)), "dim": list(axes)})
    445 xtransformed = transformation._transform_coordinates(xdata)

File .venv/lib/python3.14/site-packages/dask/dataframe/dask_expr/_collection.py:1422, in FrameBase.to_dask_array(self, lengths, meta, optimize, **optimize_kwargs)
   1418     lengths = tuple(self.map_partitions(len).compute())
   1420 arr = self.values
-> 1422 chunks = self._validate_chunks(arr, lengths)
   1423 arr._chunks = chunks
   1425 if meta is not None:

File .venv/lib/python3.14/site-packages/dask/dataframe/dask_expr/_collection.py:2471, in FrameBase._validate_chunks(self, arr, lengths)
   2468 lengths = tuple(lengths)
   2470 if len(lengths) != self.npartitions:
-> 2471     raise ValueError(
   2472         "The number of items in 'lengths' does not match the number of "
   2473         f"partitions. {len(lengths)} != {self.npartitions}"
   2474     )
   2476 if self.ndim == 1:
   2477     chunks = normalize_chunks((lengths,))

ValueError: The number of items in 'lengths' does not match the number of partitions. 4 != 13
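For context on where the ValueError originates: the traceback ends in dask's partition-length validation, which requires one length per DataFrame partition. A simplified stand-in for that check (illustrative only, not the actual dask implementation) shows how 4 computed lengths against 13 reported partitions produces exactly this error:

```python
# Simplified sketch of the lengths-vs-partitions check that raises in
# dask's FrameBase._validate_chunks (illustrative, not the real code).
def validate_lengths(lengths, npartitions):
    lengths = tuple(lengths)
    if len(lengths) != npartitions:
        raise ValueError(
            "The number of items in 'lengths' does not match the number of "
            f"partitions. {len(lengths)} != {npartitions}"
        )
    return lengths

# In this issue, 4 per-partition lengths are computed while the points
# element reports 13 partitions, so the check fails:
try:
    validate_lengths((1, 2, 3, 4), 13)
except ValueError as e:
    print(e)
```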

To Reproduce

pip freeze | grep spatialdata
napari-spatialdata==0.7.0
spatialdata==0.7.2
spatialdata-io==0.6.0
spatialdata-plot==0.3.2

pip freeze | grep dask       
dask==2026.1.1
dask-image==2025.11.0

Expected behavior
I expected the transcript data to be scaled into the global coordinate system (specifically, the coordinate space of the Images in the SpatialData object) by the scale factor returned by get_transformation().
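While the dispatch path above is broken, the expected result can be approximated by applying the Scale values by hand. A minimal workaround sketch, assuming the transcripts table has plain x/y columns and using the scale returned by get_transformation() above (toy data stands in for the real dask-backed element, which would need a .compute() first; this sidesteps sd.transform() and does not update the stored transformations):

```python
import pandas as pd

# Toy stand-in for the transcripts table; the real element is a dask
# DataFrame, so it would be materialized with .compute() first.
points = pd.DataFrame({"x": [0.0, 1.0, 2.0], "y": [10.0, 11.0, 12.0]})

# Scale (x, y) reported by get_transformation(sdata["transcripts"])
scale = (4.70588235, 4.70588235)

# Apply the scale manually instead of calling sd.transform()
points_global = points.assign(x=points["x"] * scale[0],
                              y=points["y"] * scale[1])
```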

Desktop (optional):

  • OS: macOS running on Apple M1 Max
  • Version: Tahoe 26.2
