Constraints demo #362
base: master
Conversation
Exploration of the test failures:
…lidation

This is a demonstration of how one existing command could adopt datalad-next's parameter constraint validation. It changes the base class to next's ValidatedInterface and defines a validator with the relevant parameter constraints. Specifically, the constraints are:
- The provided dataset exists, or a dataset can be derived from the curdir
- The path points to an existing file (ref datalad#354)
- The extractorname is a string
- The extractorargs is a mapping of key-value pairs

This makes a dedicated check whether a file exists obsolete, and it could replace the checks that check_dataset() does (provided an additional constraint option in EnsureDataset() that allows checking for valid dataset IDs - I've created an issue about this in datalad/datalad-next#272). This change would introduce a dependency on datalad-next, and since parts of this PR were only tested with yet-unreleased branches of datalad-next, it will not work right now unless you're on the right development version of datalad-next.
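A minimal sketch of what such a validator could look like, assuming datalad-next's ValidatedInterface and constraint classes; the Extract class body, the installed=/lexists= keyword arguments, and the exact import paths are illustrative assumptions and may differ from the actual diff of this PR:

```python
# Hypothetical sketch, not the actual PR code. Import paths and keyword
# arguments reflect one reading of datalad-next's constraint system and
# may vary between datalad-next versions.
from datalad_next.commands import ValidatedInterface
from datalad_next.constraints import (
    EnsureDataset,
    EnsureMapping,
    EnsurePath,
    EnsureStr,
)
from datalad_next.constraints.parameter import EnsureCommandParameterization


class Extract(ValidatedInterface):
    """meta-extract with declarative parameter validation"""

    _validator_ = EnsureCommandParameterization(dict(
        # the provided dataset must exist, or one is derived from the curdir
        dataset=EnsureDataset(installed=True),
        # the path must point to an existing file (ref datalad#354)
        path=EnsurePath(lexists=True),
        # the extractor name is a plain string
        extractorname=EnsureStr(),
        # extractor arguments are key-value pairs
        extractorargs=EnsureMapping(key=EnsureStr(), value=EnsureStr()),
    ))
```

With such a declaration, the checks run before the command body executes, which is what makes the ad-hoc file-existence and check_dataset() checks redundant.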
This adds an 'EnsureDataset()' parameter validation and an 'EnsureStr()' parameter validation for the path argument. The immediate advantage is that there are now distinct errors for NoMetaDataStoreFound and NoDatasetFound, which previously both resulted in a NoMetaDataStoreFound exception.
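A corresponding sketch for the meta-dump side, again with an assumed class body and assumed import paths, just to illustrate the two constraints named above:

```python
# Hypothetical sketch of the meta-dump validator described above; the class
# name and import paths are assumptions, not the PR's actual code.
from datalad_next.commands import ValidatedInterface
from datalad_next.constraints import EnsureDataset, EnsureStr
from datalad_next.constraints.parameter import EnsureCommandParameterization


class Dump(ValidatedInterface):
    """meta-dump with declarative parameter validation"""

    _validator_ = EnsureCommandParameterization(dict(
        # resolving the dataset during validation makes a missing dataset
        # surface as its own error, separate from a missing metadata store
        dataset=EnsureDataset(),
        # the metadata path pattern only needs to be a string
        path=EnsureStr(),
    ))
```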
Note to self:
This is a small demonstration of using datalad-next constraints for metalad commands, using meta-extract and meta-dump. It's a draft PR because it depends on changes from today that haven't even been merged, and there is still one remaining check/constraint to implement to ensure datasets have a valid dataset ID (so one test will fail no matter what). I see the following advantages: