Asdf-Support #708
@braingram @ViciousEagle03 is asdf smart enough to handle this circular reference and not save out the ndcube object twice? i.e. if I save an `NDCube` which has an `ExtraCoords` which references the same `NDCube`?
Theoretically yes, but I believe the converter will need to be updated:
https://asdf.readthedocs.io/en/latest/asdf/extending/converters.html#reference-cycles
That seems like a good test case.
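The generator pattern described in the linked asdf docs can be illustrated with a toy, asdf-free sketch. This is a hedged illustration only: the class and function names are hypothetical stand-ins, not the actual ndcube converter, and no real asdf API is used.

```python
# Toy illustration (hypothetical names, no real asdf API) of the
# generator-based two-phase construction that asdf's reference-cycle
# support relies on: yield a partially built object first, then finish
# wiring it up once child nodes (which may point back at it) exist.

class ExtraCoords:
    def __init__(self, ndcube=None):
        self._ndcube = ndcube


class NDCube:
    def __init__(self, data):
        self.data = data
        self._extra_coords = None


def from_yaml_tree(node):
    """Generator-style converter: yield the shell, then attach children."""
    cube = NDCube(node["data"])  # phase 1: create the object...
    yield cube                   # ...so back-references can resolve to it
    # phase 2: child nodes referencing `cube` can now be constructed
    ec = ExtraCoords(ndcube=cube)
    cube._extra_coords = ec


gen = from_yaml_tree({"data": [1, 2, 3]})
cube = next(gen)   # the framework would record this object for cycles
for _ in gen:      # exhausting the generator runs phase 2
    pass
assert cube._extra_coords._ndcube is cube  # the cycle is intact
```

The point of the two phases is that the object's identity is available to the deserializer before its children are built, so a child that refers back to it resolves to the same instance rather than a duplicate.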
@braingram, for the current implementation I believe it is still not storing the `NDCube` object twice, right?

Also, the current `from_yaml_tree` method doesn't have a `yield`, but it can still deserialize the extra_coords object. That is, when the extra_coords converter is called, `extra_coords.ndcube` shows `_PendingValue`, and when it gets picked up in the `NDCube` converter and `ndcube.extra_coords` is initialized, the reference to the ndcube is properly mapped into `ndcube.extra_coords._ndcube`.

We can see below that the deserialized `NDCube` object and the `NDCube` reference held by its associated `ExtraCoords` are the same.
Thanks for testing this out. This looks to be working because `__set__` overwrites `_ndcube` when `_extra_coords` is assigned:

ndcube/ndcube/ndcube.py, line 321 in 09a83f1

I am not sure what's going on with `__set__`. @Cadair, is there ever an instance where:

- an `extra_coords` attached to an `NDCube` will reference a different `NDCube`?
- an `extra_coords` will exist without an `NDCube`?
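The behaviour being discussed can be sketched with a toy descriptor. This is a hypothetical, simplified sketch, not the actual code in `ndcube/ndcube.py`; the names mirror the discussion but the real implementation may differ.

```python
# Toy sketch (hypothetical, simplified): a descriptor whose __set__
# overwrites the back-reference on assignment. This is why the
# deserialized cycle can "self-heal" even without a generator converter:
# assigning extra_coords onto a cube re-points _ndcube at that cube.

class ExtraCoordsDescriptor:
    def __set_name__(self, owner, name):
        self._attr = "_" + name

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return getattr(obj, self._attr)

    def __set__(self, obj, value):
        value._ndcube = obj              # overwrite _ndcube on assignment
        setattr(obj, self._attr, value)


class ExtraCoords:
    def __init__(self, ndcube=None):
        self._ndcube = ndcube            # may start as a pending placeholder


class NDCube:
    extra_coords = ExtraCoordsDescriptor()

    def __init__(self, extra_coords):
        self.extra_coords = extra_coords  # triggers __set__


ec = ExtraCoords(ndcube=None)             # back-reference not yet resolved
cube = NDCube(ec)
assert cube.extra_coords._ndcube is cube  # __set__ repaired the reference
```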
If I subclass `NDCube` and want to write my own schema for my subclass with additional properties, does this get in the way? @braingram
I think so. A typical way to "subclass" a schema is to `$ref` the parent schema in an `allOf`. Using `additionalProperties: false` here would prevent the sub-schema from adding properties (because any addition would cause the parent schema validation to fail, causing the `allOf` to fail).
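To make the failure mode concrete, here is a hedged sketch of the pattern. The schema names and properties are hypothetical, not the actual ndcube schemas:

```yaml
# Parent schema (hypothetical): forbids unknown properties.
ndcube-1.0.0:
  type: object
  properties:
    data: {}
    wcs: {}
  additionalProperties: false

# Subclass schema (hypothetical): the usual allOf + $ref pattern.
my_ndcube_subclass-1.0.0:
  allOf:
    - $ref: "ndcube-1.0.0"
  properties:
    my_extra_property: {}
# Any instance containing my_extra_property now fails validation:
# the $ref'd parent rejects it via additionalProperties: false,
# which makes the whole allOf fail.
```

In JSON Schema, `allOf` requires the instance to validate against every subschema, so the parent's `additionalProperties: false` applies to the full instance, including properties the subclass schema adds.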
Would you also enable schema tests for the new schemas? Here's the configuration for `asdf-astropy`:

https://github.com/astropy/asdf-astropy/blob/1b286108042c1201199e9931cd2a5b52cc57bdce/pyproject.toml#L121-L122
I am not familiar with the ndcube tox and github actions CI but would you look into modifying whatever is necessary there to get the schema tests running for the PR? Let me know if there are more details or anything I can do to help with this. Also, perhaps one of the other maintainers has a sense for how these schema tests might be integrated in the CI.
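A minimal sketch of the relevant pytest configuration, modeled on the asdf-astropy lines linked above. The resource path here is an assumption, not the actual ndcube layout:

```toml
# pyproject.toml (sketch; the schema path is hypothetical)
[tool.pytest.ini_options]
asdf_schema_root = "ndcube/asdf/resources/schemas"
asdf_schema_tests_enabled = true
```

These options are read by asdf's pytest plugin, which collects the schema files under `asdf_schema_root` and validates them.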
I have enabled the schema tests. @Cadair, could you please verify that the configuration is correct? I followed this to enable the tests. Thanks!
I don't think this has allowed the schema tests to run in the CI. I think it's related to asdf-format/asdf#1756
The core issue is that the CI run is testing files in the installed package but is running from a temporary directory. Take this test as an example:
The path to the file contains a bunch of stuff that is very specific to that test run (py312, tox, etc.), and this causes the asdf pytest plugin to ignore the schema files, since they appear at a similar path which doesn't match the `asdf_schema_root`.

As there are test configurations where we might expect the schema tests to fail (like the oldest-deps job; see this conversation for more details), we may want to consider adding a new CI job (or jobs) that tests just the schemas. Minimally they could be tested against the newest released asdf, and perhaps they could also be tested against development versions of asdf and the dependent schema packages. Ignoring the development testing for the moment, I think this would involve a job that runs pytest with a configuration that sets `asdf_schema_root` (providing a path to the schemas, and not running in a temporary directory).
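One way such a schema-only environment could look in tox, as a sketch under assumed names: the environment name, deps, and schema path are guesses, not the actual ndcube configuration.

```ini
# tox.ini (sketch; env name, deps, and paths are hypothetical)
[testenv:schemas]
description = validate the ASDF schemas against the newest released asdf
deps =
    pytest
    asdf
# Run from the package root (not a tox temporary directory) so the
# collected schema files sit under asdf_schema_root and are picked up
# by asdf's pytest plugin.
changedir = {toxinidir}
commands =
    pytest ndcube/asdf/resources
```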
I've included a separate environment for schema testing in tox and created a workflow for it. Should I add a separate job for the schema tests as well? Please let me know. (P.S. I got a bit carried away with the workflow concept.)
The schema tests (now enabled) no longer run in the CI from a `.tmp` directory:
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
The workflow and tox changes you've included so far look good to me. I don't think adding a separate job is necessary (but please let me know if I'm missing something that a separate job would provide). Thanks!