Support testing np and tnp on the same test #77
Merged
@ev-br here's how one could test both NumPy-proper and `torch_np` on the same test. Essentially, any test case just needs a parameter `np`, which should be treated as the array module we want to test (i.e. `numpy` or `torch_np`), and we need to ensure we're using that module rather than outside imports.
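As a rough sketch (not the exact `conftest.py` from this PR, which may be implemented differently), the `np` parameter could be a pytest fixture parametrized over the two modules, so any test that accepts an `np` argument runs once against each:

```python
# conftest.py -- a minimal sketch of the idea; the real fixture may differ.
import pytest

import numpy
import torch_np


@pytest.fixture(params=[numpy, torch_np], ids=["numpy", "torch_np"])
def np(request):
    """The array module under test; use this instead of a top-level import."""
    return request.param
```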
I reworked some of `test_indexing.py` to demonstrate how this could be utilised. It's a bit finicky, as you have to check `np is {numpy,torch_np}` for some module-specific behaviour, e.g. raised error classes and skips/xfails, and you have to make sure you're using the argument `np` for things like the `{numpy,torch_np}.testing` utils.

Maybe we could support `np` as a magic param but not really bother to rework existing tests with it; instead we use it as and when we feel it's useful to sanity-check that a test works the same for NumPy. So IMO it might be worth merging just the `conftest.py` changes.
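To illustrate the finicky bits, here's a hypothetical test in the style described above. The test body, the `torch_np` error class, and the assumption that both modules expose a `testing.assert_equal` helper are mine, not taken from the PR diff:

```python
# test_indexing.py -- illustrative sketch of the pattern only.
import pytest

import numpy
import torch_np


def test_getitem_out_of_bounds(np):
    a = np.arange(6).reshape(2, 3)

    # use the argument's own testing utils rather than a top-level numpy import
    np.testing.assert_equal(a[0], np.arange(3))

    # module-specific behaviour (raised error classes, skips/xfails) has to
    # branch on which module the fixture handed us
    if np is numpy:
        expected = IndexError
    else:
        assert np is torch_np
        # hypothetical: torch_np might surface a different exception class here
        expected = (IndexError, RuntimeError)

    with pytest.raises(expected):
        a[10]
```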