From ba32e1f714ffda9a2683c15ccf86a65c7a78ad66 Mon Sep 17 00:00:00 2001
From: Luca Marconato
Date: Sat, 4 Jan 2025 00:58:47 +0100
Subject: [PATCH 1/4] improved release process documentation

---
 CHANGELOG.md         | 149 --------------------------------
 docs/changelog.md    |   5 +-
 docs/contributing.md | 196 +------------------------------------------
 3 files changed, 4 insertions(+), 346 deletions(-)
 delete mode 100644 CHANGELOG.md

diff --git a/CHANGELOG.md b/CHANGELOG.md
deleted file mode 100644
index eb370cb4..00000000
--- a/CHANGELOG.md
+++ /dev/null
@@ -1,149 +0,0 @@
-# Changelog
-
-All notable changes to this project will be documented in this file.
-
-The format is based on [Keep a Changelog][],
-and this project adheres to [Semantic Versioning][].
-
-[keep a changelog]: https://keepachangelog.com/en/1.0.0/
-[semantic versioning]: https://semver.org/spec/v2.0.0.html
-
-## incoming release
-
-- (Visium/Visium HD) lowres and hires images now mapped also to the 'global' coordinate system #230
-- (Macsima) added support @berombau #224
-- (seqFISH) support for v2 instrument #227
-- (Visium HD) added argument `annotate_table_by_labels` to rasterize the bins as labels #211 @ArneDefauw
-
-## [0.1.6] - 2024-11-26
-
-- (MERSCOPE) added `feature_key` attribute for points (i.e., the `'gene'` column) #210
-- (Visium HD) get transformation matrices even when only images are parsed #215
-- Support for `xarray.DataTree` (which was moved from `datatree.DataTree`) #232
-
-## [0.1.5] - 2024-09-25
-
-### Added
-
-- (Xenium) added `dims` parameter for more control in `xenium_aligned_image()`
-
-### Fixed
-
-- Passing `rgb=None` to image model parser for both visium and visiumhd, leading to 3-4 channel images being
-  interpreted as RGB(A)
-- Fix header bug Visium data #200
-- (Visium HD) Fix path parsing when images are missing #204 #206
-
-## [0.1.4] - 2024-08-07
-
-### Changed
-
-- (Xenium) changed default target of table to labels; radii of circles computed from cells, not nuclei #179
-- (Visium HD) changed default geometry to squares from circles for the bins; added parameter to choose #183
-- (CosMx) dropping points element with zero-length from the cosmx reader #191
-
-## [0.1.3] - 2024-07-03
-
-### Added
-
-- (Xenium) support reading multi-polygon selection files from the Xenium Explorer
-- (ISS) An experimental loader to load elemental ISS data objects, e.g. raw.tif, label.tif and anndata.h5ad
-- (Stereo-seq) Added reader @LLehner @timtreis @florianingelfinger #70
-- (MERSCOPE) Optional rioxarray backend for MERSCOPE data (reads chunks)
-- (MERSCOPE) Can choose which elements should be loaded
-
-### Fixed
-
-- (Visium) Fixed issue with joining a SpatialElement with a table due to index values not being unique.
-  obs_names_make_unique is now called internally to enforce unique index values allowing for join operations.
-
-### Changed
-
-- (MERSCOPE) "global" coordinate system is used as a default instead of "microns"
-
-## [0.1.2] - 2024-03-30
-
-### Added
-
-- (Visium HD) added reader, coauthored by @LLehner
-
-### Fixed
-
-- (Xenium) reader for 1.0.1 (paper data) and unknown versions
-- (Xenium) fix in reading "minimalistic" Xenium datasets #132
-
-## [0.1.1] - 2024-03-24
-
-### Added
-
-- (Xenium) support for post-xenium aligned images (IF, HE)
-- (Xenium) reader for the selection coordinates file from the Xenium Explorer
-- (Xenium) support for the new Xenium 2.0.0 (multimodal segmentation)
-- (Xenium) reading multiscale labels from cells.zarr.zip
-- (MCMICRO) support for TMAs (such as the data of exemplar-002)
-- (DBiT-seq) reader
-- converter functions `experimental.to_legacy_anndata()` and `experimental.from_legacy_anndata()`
-- (Visium) support for raw reads (capture locations not under tissue)
-
-### Fixed
-
-- (Xenium) fixed index (fail on write)
-- (Xenium) renamed cells_as_shapes to cells_as_circles; set default to True
-- (MERSCOPE) don't try to load unexisting elements #87
-- (Visium) fixed axes ordering
-
-## [0.0.9] - 2023-11-06
-
-### Fixed
-
-- (Xenium) bug when converting feature_name #81, from @fbnrst
-- (Visium) visium() supports file counts without dataset_id #91
-
-## [0.0.8] - 2023-10-02
-
-### Fixed
-
-- (Xenium) coerce cell id to str #64
-- (MERSCOPE) fix coordinate transformation #68
-- (MERSCOPE) Improvements/fixes: merscope reader #73
-
-## [0.0.7] - 2023-07-23
-
-### Fixed
-
-- Bugs in Xenium and MERSCOPE
-
-## [0.0.5] - 2023-06-21
-
-### Added
-
-- MERFISH reader (from @quentinblampey)
-- CODEX reader (from @LLehner)
-
-### Fixed
-
-- Issues on Visium reader (thanks @ilia-kats) and Xenium reader
-
-## [0.0.4] - 2023-05-23
-
-### Added
-
-- Curio reader
-
-## [0.0.3] - 2023-05-22
-
-### Merged
-
-- Merge pull request #40 from scverse/fix/categories
-
-## [0.0.2] - 2023-05-04
-
-### Changed
-
-- Revert version regex (#37)
-
-## [0.0.1] - 2023-05-04
-
-### Tested
-
-- Test installation from pypi
diff --git a/docs/changelog.md b/docs/changelog.md
index d9e79ba6..c37a2194 100644
--- a/docs/changelog.md
+++ b/docs/changelog.md
@@ -1,3 +1,4 @@
-```{include} ../CHANGELOG.md
+# Changelog
-
-```
+Please refer directly to the [Releases](https://github.com/scverse/spatialdata-io/releases) section on GitHub, where you can find curated release notes for each release.
+For developers, please consult the [contributing guide](https://github.com/scverse/spatialdata/blob/main/docs/contributing.md), which explains how to keep release notes up-to-date at each release.
diff --git a/docs/contributing.md b/docs/contributing.md
index 7df41462..1ee0da05 100644
--- a/docs/contributing.md
+++ b/docs/contributing.md
@@ -1,197 +1,3 @@
 # Contributing guide
 
-Scanpy provides extensive [developer documentation][scanpy developer guide], most of which applies to this repo, too.
-This document will not reproduce the entire content from there. Instead, it aims at summarizing the most important
-information to get you started on contributing.
-
-We assume that you are already familiar with git and with making pull requests on GitHub. If not, please refer
-to the [scanpy developer guide][].
-
-## Installing dev dependencies
-
-In addition to the packages needed to _use_ this package, you need additional python packages to _run tests_ and _build
-the documentation_. It's easy to install them using `pip`:
-
-```bash
-cd spatialdata-io
-pip install -e ".[dev,test,doc]"
-```
-
-## Code-style
-
-This template uses [pre-commit][] to enforce consistent code-styles. On every commit, pre-commit checks will either
-automatically fix issues with the code, or raise an error message. See [pre-commit checks](template_usage.md#pre-commit-checks) for
-a full list of checks enabled for this repository.
-
-To enable pre-commit locally, simply run
-
-```bash
-pre-commit install
-```
-
-in the root of the repository. Pre-commit will automatically download all dependencies when it is run for the first time.
-
-Alternatively, you can rely on the [pre-commit.ci][] service enabled on GitHub. If you didn't run `pre-commit` before
-pushing changes to GitHub it will automatically commit fixes to your pull request, or show an error message.
-
-If pre-commit.ci added a commit on a branch you still have been working on locally, simply use
-
-```bash
-git pull --rebase
-```
-
-to integrate the changes into yours.
-While the [pre-commit.ci][] is useful, we strongly encourage installing and running pre-commit locally first to understand its usage.
-
-Finally, most editors have an _autoformat on save_ feature. Consider enabling this option for [black][black-editors]
-and [prettier][prettier-editors].
-
-[black-editors]: https://black.readthedocs.io/en/stable/integrations/editors.html
-[prettier-editors]: https://prettier.io/docs/en/editors.html
-
-## Writing tests
-
-```{note}
-Remember to first install the package with `pip install '-e[dev,test]'`
-```
-
-This package uses the [pytest][] for automated testing. Please [write tests][scanpy-test-docs] for every function added
-to the package.
-
-Most IDEs integrate with pytest and provide a GUI to run tests. Alternatively, you can run all tests from the
-command line by executing
-
-```bash
-pytest
-```
-
-in the root of the repository. Continuous integration will automatically run the tests on all pull requests.
-
-[scanpy-test-docs]: https://scanpy.readthedocs.io/en/latest/dev/testing.html#writing-tests
-
-## Publishing a release
-
-### Updating the version number
-
-Before making a release, you need to update the version number. Please adhere to [Semantic Versioning][semver], in brief
-
-> Given a version number MAJOR.MINOR.PATCH, increment the:
->
-> 1. MAJOR version when you make incompatible API changes,
-> 2. MINOR version when you add functionality in a backwards compatible manner, and
-> 3. PATCH version when you make backwards compatible bug fixes.
->
-> Additional labels for pre-release and build metadata are available as extensions to the MAJOR.MINOR.PATCH format.
-
-We use [bump2version][] to automatically update the version number in all places and automatically create a git tag.
-Run one of the following commands in the root of the repository
-
-```bash
-bump2version patch
-bump2version minor
-bump2version major
-```
-
-Once you are done, run
-
-```
-git push --tags
-```
-
-to publish the created tag on GitHub.
-
-[bump2version]: https://github.com/c4urself/bump2version
-
-### Building and publishing the package on PyPI
-
-Python packages are not distributed as source code, but as _distributions_. The most common distribution format is the so-called _wheel_. To build a _wheel_, run
-
-```bash
-python -m build
-```
-
-This command creates a _source archive_ and a _wheel_, which are required for publishing your package to [PyPI][]. These files are created directly in the root of the repository.
-
-Before uploading them to [PyPI][] you can check that your _distribution_ is valid by running:
-
-```bash
-twine check dist/*
-```
-
-and finally publishing it with:
-
-```bash
-twine upload dist/*
-```
-
-Provide your username and password when requested and then go check out your package on [PyPI][]!
-
-For more information, follow the [Python packaging tutorial][].
-
-It is possible to automate this with GitHub actions, see also [this feature request][pypi-feature-request]
-in the cookiecutter-scverse template.
-
-[python packaging tutorial]: https://packaging.python.org/en/latest/tutorials/packaging-projects/#generating-distribution-archives
-[pypi-feature-request]: https://github.com/scverse/cookiecutter-scverse/issues/88
-
-## Writing documentation
-
-Please write documentation for new or changed features and use-cases. This project uses [sphinx][] with the following features:
-
-- the [myst][] extension allows to write documentation in markdown/Markedly Structured Text
-- [Numpy-style docstrings][numpydoc] (through the [napoloen][numpydoc-napoleon] extension).
-- Jupyter notebooks as tutorials through [myst-nb][] (See [Tutorials with myst-nb](#tutorials-with-myst-nb-and-jupyter-notebooks))
-- [Sphinx autodoc typehints][], to automatically reference annotated input and output types
-
-See the [scanpy developer docs](https://scanpy.readthedocs.io/en/latest/dev/documentation.html) for more information
-on how to write documentation.
-
-### Tutorials with myst-nb and jupyter notebooks
-
-The documentation is set-up to render jupyter notebooks stored in the `docs/notebooks` directory using [myst-nb][].
-Currently, only notebooks in `.ipynb` format are supported that will be included with both their input and output cells.
-It is your reponsibility to update and re-run the notebook whenever necessary.
-
-If you are interested in automatically running notebooks as part of the continuous integration, please check
-out [this feature request](https://github.com/scverse/cookiecutter-scverse/issues/40) in the `cookiecutter-scverse`
-repository.
-
-#### Hints
-
-- If you refer to objects from other packages, please add an entry to `intersphinx_mapping` in `docs/conf.py`. Only
-  if you do so can sphinx automatically create a link to the external documentation.
-- If building the documentation fails because of a missing link that is outside your control, you can add an entry to
-  the `nitpick_ignore` list in `docs/conf.py`
-
-#### Building the docs locally
-
-```bash
-cd docs
-make html
-open _build/html/index.html
-```
-
-
-[scanpy developer guide]: https://scanpy.readthedocs.io/en/latest/dev/index.html
-[spatialdata-io]: https://spatialdata-io.readthedocs.io/en/latest/template_usage.html
-[github quickstart guide]: https://docs.github.com/en/get-started/quickstart/create-a-repo?tool=webui
-[codecov]: https://about.codecov.io/sign-up/
-[codecov docs]: https://docs.codecov.com/docs
-[codecov bot]: https://docs.codecov.com/docs/team-bot
-[codecov app]: https://github.com/apps/codecov
-[pre-commit.ci]: https://pre-commit.ci/
-[readthedocs.org]: https://readthedocs.org/
-[myst-nb]: https://myst-nb.readthedocs.io/en/latest/
-[jupytext]: https://jupytext.readthedocs.io/en/latest/
-[pre-commit]: https://pre-commit.com/
-[anndata]: https://github.com/scverse/anndata
-[mudata]: https://github.com/scverse/mudata
-[pytest]: https://docs.pytest.org/
-[semver]: https://semver.org/
-[sphinx]: https://www.sphinx-doc.org/en/master/
-[myst]: https://myst-parser.readthedocs.io/en/latest/intro.html
-[numpydoc-napoleon]: https://www.sphinx-doc.org/en/master/usage/extensions/napoleon.html
-[numpydoc]: https://numpydoc.readthedocs.io/en/latest/format.html
-[sphinx autodoc typehints]: https://github.com/tox-dev/sphinx-autodoc-typehints
-[pypi]: https://pypi.org/
+Please refer to the [contribution guide from the `spatialdata` repository](https://github.com/scverse/spatialdata/blob/main/docs/contributing.md).

From 4811bb4c872a05026b1c5cbc61d4df3276aedecc Mon Sep 17 00:00:00 2001
From: Luca Marconato
Date: Sat, 4 Jan 2025 01:22:04 +0100
Subject: [PATCH 2/4] removed unused docs file

---
 docs/template_usage.md | 322 -----------------------------------------
 1 file changed, 322 deletions(-)
 delete mode 100644 docs/template_usage.md

diff --git a/docs/template_usage.md b/docs/template_usage.md
deleted file mode 100644
index f8efbc16..00000000
--- a/docs/template_usage.md
+++ /dev/null
@@ -1,322 +0,0 @@
-# Developer guide
-
-Welcome to the developer guidelines! This document is split into two parts:
-
-1. The [repository setup](#setting-up-the-repository). This section is relevant primarily for the repository maintainer and shows how to connect
-   continuous integration services and documents initial set-up of the repository.
-2. The [contributor guide](contributing.md#contributing-guide). It contains information relevant to all developers who want to make a contribution.
-
-## Setting up the repository
-
-### First commit
-
-If you are reading this, you should have just completed the repository creation with :
-
-```bash
-cruft create https://github.com/scverse/cookiecutter-scverse
-```
-
-and you should have
-
-```
-cd spatialdata-io
-```
-
-into the new project directory. Now that you have created a new repository locally, the first step is to push it to github. To do this, you'd have to create a **new repository** on github.
-You can follow the instructions directly on [github quickstart guide][].
-Since `cruft` already populated the local repository of your project with all the necessary files, we suggest to _NOT_ initialize the repository with a `README.md` file or `.gitignore`, because you might encounter git conflicts on your first push.
-If you are familiar with git and knows how to handle git conflicts, you can go ahead with your preferred choice.
-
-:::{note}
-If you are looking at this document in the [spatialdata-io][] repository documentation, throughout this document the name of the project is `spatialdata-io`. Otherwise it should be replaced by your new project name: `spatialdata-io`.
-:::
-
-Now that your new project repository has been created on github at `https://github.com/scverse/spatialdata-io` you can push your first commit to github.
-To do this, simply follow the instructions on your github repository page or a more verbose walkthrough here:
-
-Assuming you are in `/your/path/to/spatialdata-io`. Add all files and commit.
-
-```bash
-# stage all files of your new repo
-git add --all
-# commit
-git commit -m "first commit"
-```
-
-You'll notice that the command `git commit` installed a bunch of packages and triggered their execution: those are pre-commit! To read more about what they are and what they do, you can go to the related section [Pre-commit checks](#pre-commit-checks) in this document.
-
-:::{note}
-There is a chance that `git commit -m "first commit"` fails due to the `prettier` pre-commit formatting the file `.cruft.json`. No problem, you have just experienced what pre-commit checks do in action. Just go ahead and re-add the modified file and try to commit again:
-
-```bash
-   git add -u # update all tracked file
-   git commit -m "first commit"
-```
-
-:::
-
-Now that all the files of the newly created project have been committed, go ahead with the remaining steps:
-
-```bash
-# update the `origin` of your local repo with the remote github link
-git remote add origin https://github.com/scverse/spatialdata-io.git
-# rename the default branch to main
-git branch -M main
-# push all your files to remote
-git push -u origin main
-```
-
-Your project should be now available at `https://github.com/scverse/spatialdata-io`. While the repository at this point can be directly used, there are few remaining steps that needs to be done in order to achieve full functionality.
-
-### Coverage tests with _Codecov_
-
-Coverage tells what fraction of the code is "covered" by unit tests, thereby encouraging contributors to
-[write tests](contributing.md#writing-tests).
-To enable coverage checks, head over to [codecov][] and sign in with your GitHub account.
-You'll find more information in "getting started" section of the [codecov docs][].
-
-In the `Actions` tab of your projects' github repository, you can see that the workflows are failing due to the **Upload coverage** step. The error message in the workflow should display something like:
-
-```
-...
-  Retrying 5/5 in 2s..
-  {'detail': ErrorDetail(string='Could not find a repository, try using repo upload token', code='not_found')}
-Error: 404 Client Error: Not Found for url:
-...
-```
-
-While [codecov docs][] has a very extensive documentation on how to get started, _if_ you are using the default settings of this template we can assume that you are using [codecov][] in a github action workflow and hence you can make use of the [codecov bot][].
-
-To set it up, simply go to the [codecov app][] page and follow the instructions to activate it for your repository.
-Once the activation is completed, go back to the `Actions` tab and re-run the failing workflows.
-
-The workflows should now succeed and you will be able to find the code coverage at this link: `https://app.codecov.io/gh/scverse/spatialdata-io`. You might have to wait couple of minutes and the coverage of this repository should be ~60%.
-
-If your repository is private, you will have to specify an additional token in the repository secrets. In brief, you need to:
-
-1. Generate a Codecov Token by clicking _setup repo_ in the codecov dashboard.
-   - If you have already set up codecov in the repository by following the previous steps, you can directly go to the codecov repo webpage.
-2. Go to _Settings_ and copy **only** the token `_______-____-...`.
-3. Go to _Settings_ of your newly created repository on GitHub.
-4. Go to _Security > Secrets > Actions_.
-5. Create new repository secret with name `CODECOV_TOKEN` and paste the token generated by codecov.
-6. Past these additional lines in `/.github/workflows.test.yaml` under the **Upload coverage** step:
-   ```bash
-   - name: Upload coverage
-     uses: codecov/codecov-action@v3
-     with:
-       token: ${{ secrets.CODECOV_TOKEN }}
-   ```
-7. Go back to github `Actions` page an re-run previously failed jobs.
-
-### Documentation on _readthedocs_
-
-We recommend using [readthedocs.org][] (RTD) to build and host the documentation for your project.
-To enable readthedocs, head over to [their website][readthedocs.org] and sign in with your GitHub account.
-On the RTD dashboard choose "Import a Project" and follow the instructions to add your repository.
-
-- Make sure to choose the correct name of the default branch. On GitHub, the name of the default branch should be `main` (it has
-  recently changed from `master` to `main`).
-- We recommend to enable documentation builds for pull requests (PRs). This ensures that a PR doesn't introduce changes
-  that break the documentation. To do so, got to `Admin -> Advanced Settings`, check the
-  `Build pull requests for this projects` option, and click `Save`. For more information, please refer to
-  the [official RTD documentation](https://docs.readthedocs.io/en/stable/pull-requests.html).
-- If you find the RTD builds are failing, you can disable the `fail_on_warning` option in `.readthedocs.yaml`.
-
-If your project is private, there are ways to enable docs rendering on [readthedocs.org][] but it is more cumbersome and requires a different subscription for read the docs. See a guide [here](https://docs.readthedocs.io/en/stable/guides/importing-private-repositories.html).
-
-### Pre-commit checks
-
-[Pre-commit][] checks are fast programs that
-check code for errors, inconsistencies and code styles, before the code
-is committed.
-
-We recommend setting up [pre-commit.ci][] to enforce consistency checks on every commit
-and pull-request.
-
-To do so, head over to [pre-commit.ci][] and click "Sign In With GitHub". Follow
-the instructions to enable pre-commit.ci for your account or your organization. You
-may choose to enable the service for an entire organization or on a per-repository basis.
-
-Once authorized, pre-commit.ci should automatically be activated.
-
-#### Overview of pre-commit hooks used by the template
-
-The following pre-commit checks are for code style and format:
-
-- [black](https://black.readthedocs.io/en/stable/): standard code
-  formatter in Python.
-- [isort](https://pycqa.github.io/isort/): sort module imports into
-  sections and types.
-- [prettier](https://prettier.io/docs/en/index.html): standard code
-  formatter for non-Python files (e.g. YAML).
-- [blacken-docs](https://github.com/asottile/blacken-docs): black on
-  python code in docs.
-
-The following pre-commit checks are for errors and inconsistencies:
-
-- [flake8](https://flake8.pycqa.org/en/latest/): standard check for errors in Python files.
-    - [flake8-tidy-imports](https://github.com/adamchainz/flake8-tidy-imports):
-      tidy module imports.
-    - [flake8-docstrings](https://github.com/PyCQA/flake8-docstrings):
-      pydocstyle extension of flake8.
-    - [flake8-rst-docstrings](https://github.com/peterjc/flake8-rst-docstrings):
-      extension of `flake8-docstrings` for `rst` docs.
-    - [flake8-comprehensions](https://github.com/adamchainz/flake8-comprehensions):
-      write better list/set/dict comprehensions.
-    - [flake8-bugbear](https://github.com/PyCQA/flake8-bugbear):
-      find possible bugs and design issues in program.
-    - [flake8-blind-except](https://github.com/elijahandrews/flake8-blind-except):
-      checks for blind, catch-all `except` statements.
-- [yesqa](https://github.com/asottile/yesqa):
-  remove unneccesary `# noqa` comments, follows additional dependencies listed above.
-- [autoflake](https://github.com/PyCQA/autoflake):
-  remove unused imports and variables.
-- [pre-commit-hooks](https://github.com/pre-commit/pre-commit-hooks): generic pre-commit hooks.
-    - **detect-private-key**: checks for the existence of private keys.
-    - **check-ast**: check whether files parse as valid python.
-    - **end-of-file-fixer**:check files end in a newline and only a newline.
-    - **mixed-line-ending**: checks mixed line ending.
-    - **trailing-whitespace**: trims trailing whitespace.
-    - **check-case-conflict**: check files that would conflict with case-insensitive file systems.
-- [pyupgrade](https://github.com/asottile/pyupgrade):
-  upgrade syntax for newer versions of the language.
-- **forbid-to-commit**: Make sure that `*.rej` files cannot be commited. These files are created by the
-  [automated template sync](#automated-template-sync) if there's a merge conflict and need to be addressed manually.
-
-### How to disable or add pre-commit checks
-
-- To ignore lint warnigs from **flake8**, see [Ignore certain lint warnings](#how-to-ignore-certain-lint-warnings).
-- You can add or remove pre-commit checks by simply deleting relevant lines in the `.pre-commit-config.yaml` file.
-  Some pre-commit checks have additional options that can be specified either in the `pyproject.toml` or tool-specific
-  config files, such as `.prettierrc.yml` for **prettier** and `.flake8` for **flake8**.
-
-### How to ignore certain lint warnings
-
-The [pre-commit checks](#pre-commit-checks) include [flake8](https://flake8.pycqa.org/en/latest/) which checks
-for errors in Python files, including stylistic errors.
-
-In some cases it might overshoot and you may have good reasons to ignore certain warnings.
-
-To ignore an specific error on a per-case basis, you can add a comment `# noqa` to the offending line. You can also
-specify the error ID to ignore, with e.g. `# noqa: E731`. Check the [flake8 guide][] for reference.
-
-Alternatively, you can disable certain error messages for the entire project. To do so, edit the `.flake8`
-file in the root of the repository. Add one line per linting code you wish to ignore and don't forget to add a comment.
-
-```toml
-...
-# line break before a binary operator -> black does not adhere to PEP8
-W503
-# line break occured after a binary operator -> black does not adhere to PEP8
-W504
-...
-```
-
-[flake8 guide]: https://flake8.pycqa.org/en/3.1.1/user/ignoring-errors.html
-
-### API design
-
-Scverse ecosystem packages should operate on [AnnData][] and/or [MuData][] data structures and typically use an API
-as originally [introduced by scanpy][scanpy-api] with the following submodules:
-
-- `pp` for preprocessing
-- `tl` for tools (that, compared to `pp` generate interpretable output, often associated with a corresponding plotting
-  function)
-- `pl` for plotting functions
-
-You may add additional submodules as appropriate. While we encourage to follow a scanpy-like API for ecosystem packages,
-there may also be good reasons to choose a different approach, e.g. using an object-oriented API.
-
-[scanpy-api]: https://scanpy.readthedocs.io/en/stable/usage-principles.html
-
-### Using VCS-based versioning
-
-By default, the template uses hard-coded version numbers that are set in `pyproject.toml` and [managed with
-bump2version](contributing.md#publishing-a-release). If you prefer to have your project automatically infer version numbers from git
-tags, it is straightforward to switch to vcs-based versioning using [hatch-vcs][].
-
-In `pyproject.toml` add the following changes, and you are good to go!
-
-```diff
---- a/pyproject.toml
-+++ b/pyproject.toml
-@@ -1,11 +1,11 @@
- [build-system]
- build-backend = "hatchling.build"
--requires = ["hatchling"]
-+requires = ["hatchling", "hatch-vcs"]
-
-
- [project]
- name = "spatialdata-io"
--version = "0.3.1dev"
-+dynamic = ["version"]
-
-@@ -60,6 +60,9 @@
-+[tool.hatch.version]
-+source = "vcs"
-+
- [tool.coverage.run]
- source = ["spatialdata-io"]
- omit = [
-```
-
-Don't forget to update the [Making a release section](contributing.md#publishing-a-release) in this document accordingly, after you are done!
-
-[hatch-vcs]: https://pypi.org/project/hatch-vcs/
-
-### Automated template sync
-
-Automated template sync is enabled by default. This means that every night, a GitHub action runs [cruft][] to check
-if a new version of the `scverse-cookiecutter` template got released. If there are any new changes, a pull request
-proposing these changes is created automatically. This helps keeping the repository up-to-date with the latest
-coding standards.
-
-It may happen that a template sync results in a merge conflict. If this is the case a `*.ref` file with the
-diff is created. You need to manually address these changes and remove the `.rej` file when you are done.
-The pull request can only be merged after all `*.rej` files have been removed.
-
-:::{tip}
-The following hints may be useful to work with the template sync:
-
-- GitHub automatically disables scheduled actions if there has been not activity to the repository for 60 days.
-  You can re-enable or manually trigger the sync by navigating to `Actions` -> `Sync Template` in your GitHub repository.
-- If you want to ignore certain files from the template update, you can add them to the `[tool.cruft]` section in the
-  `pyproject.toml` file in the root of your repository. More details are described in the
-  [cruft documentation][cruft-update-project].
-- To disable the sync entirely, simply remove the file `.github/workflows/sync.yaml`.
-
-:::
-
-[cruft]: https://cruft.github.io/cruft/
-[cruft-update-project]: https://cruft.github.io/cruft/#updating-a-project
-
-## Moving forward
-
-You have reached the end of this document. Congratulations! You have successfully set up your project and are ready to start.
-For everything else related to documentation, code style, testing and publishing your project ot pypi, please refer to the [contributing docs](contributing.md#contributing-guide).
-
-
-[scanpy developer guide]: https://scanpy.readthedocs.io/en/latest/dev/index.html
-[spatialdata-io]: https://spatialdata-io.readthedocs.io/en/latest/template_usage.html
-[github quickstart guide]: https://docs.github.com/en/get-started/quickstart/create-a-repo?tool=webui
-[codecov]: https://about.codecov.io/sign-up/
-[codecov docs]: https://docs.codecov.com/docs
-[codecov bot]: https://docs.codecov.com/docs/team-bot
-[codecov app]: https://github.com/apps/codecov
-[pre-commit.ci]: https://pre-commit.ci/
-[readthedocs.org]: https://readthedocs.org/
-[myst-nb]: https://myst-nb.readthedocs.io/en/latest/
-[jupytext]: https://jupytext.readthedocs.io/en/latest/
-[pre-commit]: https://pre-commit.com/
-[anndata]: https://github.com/scverse/anndata
-[mudata]: https://github.com/scverse/mudata
-[pytest]: https://docs.pytest.org/
-[semver]: https://semver.org/
-[sphinx]: https://www.sphinx-doc.org/en/master/
-[myst]: https://myst-parser.readthedocs.io/en/latest/intro.html
-[numpydoc-napoleon]: https://www.sphinx-doc.org/en/master/usage/extensions/napoleon.html
-[numpydoc]: https://numpydoc.readthedocs.io/en/latest/format.html
-[sphinx autodoc typehints]: https://github.com/tox-dev/sphinx-autodoc-typehints

From 918d18b56f99cd6d73102c868b6a26b832a709d2 Mon Sep 17 00:00:00 2001
From: Luca Marconato
Date: Sat, 4 Jan 2025 01:24:53 +0100
Subject: [PATCH 3/4] fix

---
 docs/index.md | 1 -
 1 file changed, 1 deletion(-)

diff --git a/docs/index.md b/docs/index.md
index 46aa14d4..53ff8f2a 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -8,7 +8,6 @@
 api.md
 changelog.md
-template_usage.md
 contributing.md
 references.md
 ```

From 10951c73460e10f2e002a0f1b40291f174d00e7c Mon Sep 17 00:00:00 2001
From: Luca Marconato
Date: Mon, 6 Jan 2025 12:44:49 +0100
Subject: [PATCH 4/4] small typo

---
 src/spatialdata_io/readers/merscope.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/spatialdata_io/readers/merscope.py b/src/spatialdata_io/readers/merscope.py
index af258857..961d7b4e 100644
--- a/src/spatialdata_io/readers/merscope.py
+++ b/src/spatialdata_io/readers/merscope.py
@@ -117,7 +117,7 @@ def merscope(
         - ``{ms.CELL_METADATA_FILE!r}``
         - ``{ms.BOUNDARIES_FILE!r}``

-        If a dictionnary, then the following keys should be provided with the desired path:
+        If a dictionary, then the following keys should be provided with the desired path:

         - ``{ms.VPT_NAME_COUNTS!r}``
         - ``{ms.VPT_NAME_OBS!r}``
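Note on the docstring touched by PATCH 4/4: the paragraph being fixed describes a reader option that accepts either a single path or a dictionary of vizgen-postprocessing-tool outputs. The sketch below is a minimal illustration, not taken from the patch itself: it assumes the parameter is named `vpt_outputs` and that the `ms.VPT_NAME_*` constants resolve to the keys `"cell_by_gene"`, `"cell_metadata"` and `"cell_boundaries"`; the file paths are hypothetical. Check `spatialdata_io.readers.merscope` for the authoritative names.

```python
from spatialdata_io import merscope

# Hypothetical vizgen-postprocessing-tool (VPT) output files; the keys are
# assumed to mirror ms.VPT_NAME_COUNTS / ms.VPT_NAME_OBS / ms.VPT_NAME_BOUNDARIES.
vpt_outputs = {
    "cell_by_gene": "region_0/analysis_outputs/cell_by_gene.csv",
    "cell_metadata": "region_0/analysis_outputs/cell_metadata.csv",
    "cell_boundaries": "region_0/analysis_outputs/cell_boundaries.parquet",
}

# Read the MERSCOPE dataset, pointing the reader at explicit VPT files
# instead of letting it discover them inside the data folder.
sdata = merscope("path/to/merscope_output", vpt_outputs=vpt_outputs)
print(sdata)
```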