Reorganize packaging and hopefully fix tests (#267)
* Reorganize packaging for project.

* Update CITATION.cff

* Reorganize the tests.

* Update config.yml

* Update config.yml

* Update config.yml

* Update config.yml

* Update config.yml

* Update Dockerfile

Update Dockerfile

Update Dockerfile

Update Dockerfile

Update Dockerfile

Update Dockerfile

Update Dockerfile

* Update Dockerfile

* Update Dockerfile

* Update.

* this is exhausting

* Try this.

* Update config.yml

* Revert "Update config.yml"

This reverts commit 4214b47.

* Revert "Try this."

This reverts commit 1415387.

* Try installing before getting version.

* Update config.yml

* Update config.yml

* Update config.yml

* Update config.yml

* I hate linux.

* Update config.yml

* Update config.yml

* Update.

* Update config.yml

* Update config.yml

* Update config.yml

* Update config.yml

* Update config.yml

Update.

Update config.yml

Update config.yml

Update config.yml

* Revert "Update config.yml"

This reverts commit b74d382.

* Update config.yml

* Update config.yml

* Streamline CircleCI config with @mattcieslak.

* Update config.yml

* Update config.yml

* Update lint.yml

* Update.

* Update config.yml

* Update config.yml

* Update config.yml

* Update.

* Update config.yml

* Update lint.yml

* Update lint.yml

* Update.

* Update lint.yml

* Fix misspellings.

* Block datalad 0.17.3.

* Update pyproject.toml

* Update tests.

* Update .readthedocs.yml
tsalo authored Jan 12, 2024
1 parent b8f5d47 commit 068e319
Showing 187 changed files with 799 additions and 6,187 deletions.
627 changes: 42 additions & 585 deletions .circleci/config.yml

Large diffs are not rendered by default.

6 changes: 3 additions & 3 deletions .github/workflows/lint.yml
@@ -7,7 +7,6 @@ on:
 jobs:
   lint:
     runs-on: ubuntu-latest
-
     steps:
       - name: Set up environment
         uses: actions/checkout@v3
@@ -16,11 +15,12 @@ jobs:
       - name: Set up Python
         uses: actions/setup-python@v4
         with:
-          python-version: '3.7'
+          python-version: '3.9'
       - name: Install dependencies
         run: |
           pip install flake8 flake8-absolute-import flake8-black flake8-docstrings \
             flake8-isort flake8-pyproject flake8-unused-arguments \
-            flake8-use-fstring pep8-naming
+            flake8-use-fstring pep8-naming \
+            codespell tomli
       - name: Run linters
         run: python -m flake8 cubids
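
For contributors who want to run the same check before pushing, a minimal local sketch (this assumes the flake8 plugins listed in the workflow step above are already installed in the active environment; the helper function is illustrative and not part of this change):

    import subprocess
    import sys


    def run_lint(package: str = "cubids") -> int:
        """Run the same flake8 invocation as the CI lint job."""
        # Mirrors the workflow step: python -m flake8 cubids
        completed = subprocess.run([sys.executable, "-m", "flake8", package])
        return completed.returncode


    if __name__ == "__main__":
        sys.exit(run_lint())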
1 change: 1 addition & 0 deletions .gitignore
@@ -1,3 +1,4 @@
+cubids/_version.py
 *.DS_Store
 
 # Byte-compiled / optimized / DLL files
8 changes: 8 additions & 0 deletions .readthedocs.yml
@@ -1,5 +1,13 @@
 version: 2
 
+build:
+  os: ubuntu-22.04
+  tools:
+    python: "3.8"
+
+sphinx:
+  configuration: docs/conf.py
+
 python:
   install:
     - method: pip
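
Since .readthedocs.yml is plain YAML, the new build and sphinx blocks can be sanity-checked locally; a small sketch (assumes PyYAML is installed and the script is run from the repository root; not part of this change):

    import yaml  # PyYAML, assumed available

    with open(".readthedocs.yml", encoding="utf-8") as f:
        rtd_config = yaml.safe_load(f)

    # The keys added in this commit:
    assert rtd_config["build"]["os"] == "ubuntu-22.04"
    assert rtd_config["build"]["tools"]["python"] == "3.8"
    assert rtd_config["sphinx"]["configuration"] == "docs/conf.py"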
5 changes: 0 additions & 5 deletions .readthedocs/environment.yaml

This file was deleted.

4 changes: 2 additions & 2 deletions AUTHORS.rst
@@ -12,6 +12,6 @@ Contributors
 
 * Tinashe Tapera <tinashe.tapera at pennmedicine dot upenn .edu>
 
-Principle Investigator
------------------------
+Principal Investigator
+----------------------
 * Theodore Satterthwaite <theodore.satterthwaite at pennmedicine dot upenn .edu>
107 changes: 107 additions & 0 deletions CITATION.cff
@@ -0,0 +1,107 @@
# This CITATION.cff file was generated with cffinit.
# Visit https://bit.ly/cffinit to generate yours today!

cff-version: 1.2.0
title: >-
  Curation of BIDS (CuBIDS)
message: If you use this software, please cite it using the metadata from this file, as well as the NeuroImage paper (doi:10.1016/j.neuroimage.2022.119609).
type: software
authors:
  - given-names: Sydney
    family-names: Covitz
    affiliation: University of Pennsylvania
    orcid: 'https://orcid.org/0000-0002-7430-4125'
  - given-names: Tinashe M.
    family-names: Tapera
  - given-names: Azeez
    family-names: Adebimpe
    affiliation: University of Pennsylvania
    orcid: 'https://orcid.org/0000-0001-9049-0135'
  - given-names: Aaron F.
    family-names: Alexander-Bloch
    affiliation: University of Pennsylvania
  - given-names: Maxwell
    family-names: Bertolero
    orcid: 'https://orcid.org/0000-0002-2691-3698'
    affiliation: University of Pennsylvania
  - given-names: Eric
    family-names: Feczko
  - given-names: Alexandre R.
    family-names: Franco
  - given-names: Raquel E.
    family-names: Gur
  - given-names: Ruben C.
    family-names: Gur
  - given-names: Timothy
    family-names: Hendrickson
  - given-names: Audrey
    family-names: Houghton
  - given-names: Kahini
    family-names: Mehta
  - given-names: Kristin
    family-names: Murtha
    affiliation: University of Pennsylvania
  - given-names: Anders J.
    family-names: Perrone
  - given-names: Tim
    family-names: Robert-Fitzgerald
  - given-names: Jenna M.
    family-names: Schabdach
  - given-names: Russell T.
    family-names: Shinohara
  - given-names: Jacob W.
    family-names: Vogel
  - given-names: Chenying
    family-names: Zhao
  - given-names: Damien A.
    family-names: Fair
  - given-names: Michael
    family-names: Milham
  - given-names: Matthew
    family-names: Cieslak
    email: [email protected]
    affiliation: University of Pennsylvania
    orcid: 'https://orcid.org/0000-0002-1931-4734'
  - given-names: Taylor
    family-names: Salo
    email: [email protected]
    affiliation: University of Pennsylvania
    orcid: 'https://orcid.org/0000-0001-9813-3167'
  - given-names: Theodore
    family-names: Satterthwaite
    affiliation: University of Pennsylvania
    orcid: 'https://orcid.org/0000-0001-7072-9399'
identifiers:
  - type: doi
    value: 10.5281/zenodo.6514881
    description: The Zenodo DOI
  - type: doi
    value: 10.1016/j.neuroimage.2022.119609
    description: The NeuroImage journal article
repository-code: 'https://github.com/PennLINC/CuBIDS'
url: 'https://cubids.readthedocs.io'
abstract: >-
  The Brain Imaging Data Structure (BIDS) is a specification accompanied by a software
  ecosystem that was designed to create reproducible and automated workflows for processing
  neuroimaging data.
  BIDS Apps flexibly build workflows based on the metadata detected in a dataset.
  However, even BIDS valid metadata can include incorrect values or omissions that result in
  inconsistent processing across sessions.
  Additionally, in large-scale, heterogeneous neuroimaging datasets,
  hidden variability in metadata is difficult to detect and classify.
  To address these challenges, we created a Python-based software package titled
  “Curation of BIDS” (CuBIDS), which provides an intuitive workflow that helps users validate
  and manage the curation of their neuroimaging datasets.
  CuBIDS includes a robust implementation of BIDS validation that scales to large samples and
  incorporates DataLad- a version control software package for data- as an optional dependency
  to ensure reproducibility and provenance tracking throughout the entire curation process.
  CuBIDS provides tools to help users perform quality control on their images' metadata and
  identify unique combinations of imaging parameters.
  Users can then execute BIDS Apps on a subset of participants that represent the full range of
  acquisition parameters that are present, accelerating pipeline testing on large datasets.
keywords:
  - BIDS
  - Neuroimaging
license: MIT
version: 1.0.2
date-released: '2023-09-07'
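
Because CITATION.cff is itself YAML, the metadata above can be consumed programmatically; a brief sketch (assumes PyYAML is installed and the script runs next to CITATION.cff; the printed fields come from the file shown above):

    import yaml  # PyYAML, assumed available

    with open("CITATION.cff", encoding="utf-8") as f:
        citation = yaml.safe_load(f)

    print(citation["title"])    # Curation of BIDS (CuBIDS)
    print(citation["version"])  # 1.0.2
    # Collect the Zenodo and NeuroImage DOIs listed under identifiers.
    dois = [entry["value"] for entry in citation.get("identifiers", []) if entry["type"] == "doi"]
    print(dois)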
29 changes: 0 additions & 29 deletions Dockerfile

This file was deleted.

34 changes: 13 additions & 21 deletions cubids/cli.py
@@ -763,32 +763,26 @@ def cubids_copy_exemplars():
         "bids_dir",
         type=Path,
         action="store",
-        help=(
-            "path to the root of a BIDS dataset. "
-            "It should contain sub-X directories and "
-            "dataset_description.json."
-        ),
+        help="path to the root of a BIDS dataset. "
+        "It should contain sub-X directories and "
+        "dataset_description.json.",
     )
     parser.add_argument(
         "exemplars_dir",
         type=Path,
         action="store",
-        help=(
-            "absolute path to the root of a BIDS dataset "
-            "containing one subject from each Acquisition Group. "
-            "It should contain sub-X directories and "
-            "dataset_description.json."
-        ),
+        help="absolute path to the root of a BIDS dataset "
+        "containing one subject from each Acquisition Group. "
+        "It should contain sub-X directories and "
+        "dataset_description.json.",
     )
     parser.add_argument(
         "exemplars_tsv",
         type=Path,
         action="store",
-        help=(
-            "absolute path to the .tsv file that lists one "
-            "subject from each Acqusition Group "
-            "(*_AcqGrouping.tsv from the cubids-group output)"
-        ),
+        help="absolute path to the .tsv file that lists one "
+        "subject from each Acquisition Group "
+        "(*_AcqGrouping.tsv from the cubids-group output)",
     )
     parser.add_argument(
         "--use-datalad", action="store_true", help="check exemplar dataset into DataLad"
@@ -797,11 +791,9 @@ def cubids_copy_exemplars():
         "--min-group-size",
         action="store",
         default=1,
-        help=(
-            "minimum number of subjects an Acquisition Group "
-            "must have in order to be included in the exemplar "
-            "dataset "
-        ),
+        help="minimum number of subjects an Acquisition Group "
+        "must have in order to be included in the exemplar "
+        "dataset ",
         required=False,
     )
     # parser.add_argument('--include-groups',
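
The help-text change above is essentially stylistic: both the old parenthesized form and the new form rely on Python's implicit concatenation of adjacent string literals, so aside from the Acqusition-to-Acquisition typo fix the rendered help is unchanged. A self-contained sketch of the pattern (the argument names echo the real ones, but this parser is illustrative, not the CuBIDS CLI):

    import argparse
    from pathlib import Path

    parser = argparse.ArgumentParser(description="illustration of the two help-string styles")

    # Old style: adjacent string literals wrapped in parentheses.
    parser.add_argument(
        "bids_dir",
        type=Path,
        help=(
            "path to the root of a BIDS dataset. "
            "It should contain sub-X directories and dataset_description.json."
        ),
    )

    # New style: the same adjacent literals without the wrapping parentheses;
    # Python concatenates them into an identical help string.
    parser.add_argument(
        "exemplars_dir",
        type=Path,
        help="absolute path to the root of a BIDS dataset "
        "containing one subject from each Acquisition Group.",
    )

    if __name__ == "__main__":
        args = parser.parse_args(["/data/bids", "/data/exemplars"])
        print(args.bids_dir, args.exemplars_dir)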
4 changes: 2 additions & 2 deletions cubids/cubids.py
@@ -219,7 +219,7 @@ def apply_tsv_changes(self, summary_tsv, files_tsv, new_prefix, raise_on_error=T
         """Apply changes documented in the edited summary tsv and generate the new tsv files.
 
         This function looks at the RenameKeyGroup and MergeInto
-        columns and modifies the bids datset according to the
+        columns and modifies the bids dataset according to the
         specified changs.
 
         Parameters
@@ -825,7 +825,7 @@ def get_param_groups_from_key_group(self, key_group):
         if ret == "erroneous sidecar found":
             return "erroneous sidecar found"
 
-        # add modality to the retun tuple
+        # add modality to the return tuple
         l_ret = list(ret)
         l_ret.append(modality)
         tup_ret = tuple(l_ret)
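
The context lines around the corrected comment show the tuple-extension pattern used in get_param_groups_from_key_group: the returned tuple is copied into a list so the modality can be appended. A standalone sketch of that pattern (the values are made up for illustration):

    def append_modality(ret: tuple, modality: str) -> tuple:
        """Append a modality to an existing result tuple."""
        l_ret = list(ret)       # tuples are immutable, so copy into a list first
        l_ret.append(modality)  # add modality to the return tuple
        return tuple(l_ret)


    print(append_modality(("param_group", 3), "func"))
    # ('param_group', 3, 'func')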
2 changes: 1 addition & 1 deletion cubids/metadata_merge.py
@@ -15,7 +15,7 @@
 def check_merging_operations(action_tsv, raise_on_error=False):
     """Check that the merges in an action tsv are possible.
 
-    To be mergable the
+    To be mergeable the
     """
     actions = pd.read_table(action_tsv)
     ok_merges = []
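
For reference, pd.read_table as used in check_merging_operations parses tab-separated values into a DataFrame; a minimal sketch with an in-memory stand-in for an action TSV (the column names here are illustrative, not the real CuBIDS schema):

    import io

    import pandas as pd

    # Two tab-separated columns; the second row requests a merge into param group 1.
    tsv_text = "ParamGroup\tMergeInto\n1\t\n2\t1\n"
    actions = pd.read_table(io.StringIO(tsv_text))
    print(actions)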
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
