We check plugins only superficially, i.e. whether they look like they implement the required interface, but many properties can only be verified through testing.
We need a way to automate testing for the different plugin types as much as possible; that means the core should do the heavy lifting and provide interfaces and helpers for plugin developers. Possibly, the plugin groups must be extended to support testing machinery and define tests.
For schemas specifically:

- There is a set of inputs that should be accepted and possibly normalized (so parsing maps unnormalized inputs onto a subset of normalized valid inputs).
- Resulting normalized inputs must be loaded back as themselves, i.e. a schema can load what it dumps; specifically, `parse(dump(instance)) == instance` (bijective on normalized data).
- The exported JSON Schema must accept a (non-strict) superset of inputs compared to the pydantic schema (i.e. be more lenient).
- The parent schema and minor revisions of a schema must also accept a superset of inputs (orthogonal to the JSON Schema requirement).
- There should be helpers to check whether two schemas are compatible; this is useful both for parent/child schema tests and for comparing a schema class with and without a modification (i.e. a version-affecting change).
- Schemas should have instances and non-instances attached that are tested automatically.
- Possibly looking at hypothesis would make sense for property-based checks.
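The normalization and round-trip properties above can be sketched as a plain-Python toy, assuming hypothetical `parse`/`dump` functions standing in for a real schema plugin (the names and the lowercasing normalization are illustrative, not the core API):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Person:
    """Toy normalized instance type."""
    name: str


def parse(data: dict) -> Person:
    # Normalization step: map unnormalized inputs onto the
    # subset of normalized valid inputs (here: strip + lowercase).
    return Person(name=data["name"].strip().lower())


def dump(p: Person) -> dict:
    return {"name": p.name}


# An unnormalized input is accepted and normalized:
p = parse({"name": "  Ada "})
assert p == Person(name="ada")

# Round-trip property on normalized data: parse(dump(x)) == x
assert parse(dump(p)) == p
```

With hypothesis, the round-trip assertion would become a property over generated instances instead of the single hand-picked example.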
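One way such a compatibility helper could work is by replaying examples: every input accepted by the stricter schema must also be accepted by the more lenient one. The two parser functions below are hypothetical stand-ins for two schema revisions, not existing code:

```python
def parse_strict(data: dict) -> dict:
    # Stricter revision: requires an int id.
    if not isinstance(data.get("id"), int):
        raise ValueError(data)
    return {"id": data["id"]}


def parse_lenient(data: dict) -> dict:
    # More lenient revision: additionally coerces numeric strings.
    raw = data.get("id")
    if isinstance(raw, int):
        return {"id": raw}
    if isinstance(raw, str) and raw.isdigit():
        return {"id": int(raw)}
    raise ValueError(data)


def accepts_superset(lenient, strict, examples) -> None:
    """Check that `lenient` accepts every example `strict` accepts."""
    for ex in examples:
        try:
            strict(ex)
        except ValueError:
            continue  # rejected by strict: no constraint on lenient
        lenient(ex)  # must not raise


accepts_superset(parse_lenient, parse_strict,
                 [{"id": 1}, {"id": "2"}, {"id": None}])
```

The same helper would serve both parent/child schema tests and before/after comparisons of a modified schema class; a stronger variant could compare the exported JSON Schemas structurally instead of by examples.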
In general, for schema plugins we should check using pytest (to complement the plugin loading checks):

- that instances validate against the exported JSON Schema
- some manual tests of parsing/instance construction for all expected situations, triggering the different parsers, etc.
- also automated tests with random instances, if possible
- all of the above, but run against the parent schemas (to check "parent compatibility")
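The attached instances/non-instances could be driven by a generic helper along these lines; the attribute names `valid_examples`/`invalid_examples` and the `parse` classmethod are assumptions for illustration, not an existing core interface:

```python
class DatasetSchema:
    """Hypothetical schema plugin with attached example data."""
    valid_examples = [{"id": 1, "title": "demo"}]
    invalid_examples = [{"id": "x"}, {}]

    @classmethod
    def parse(cls, data: dict) -> dict:
        if not isinstance(data.get("id"), int) or "title" not in data:
            raise ValueError(f"invalid: {data}")
        return data


def check_attached_examples(schema_cls) -> None:
    """Generic core-provided check, usable as a parametrized pytest test."""
    for ex in schema_cls.valid_examples:
        schema_cls.parse(ex)  # instances must parse without raising
    for ex in schema_cls.invalid_examples:
        try:
            schema_cls.parse(ex)
        except ValueError:
            continue
        raise AssertionError(f"accepted non-instance: {ex}")


check_attached_examples(DatasetSchema)
```

In pytest this would naturally become one parametrized test per registered schema plugin, so plugin authors only supply the example data.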
Converting from and to partial schemas:

- `from_partial(to_partial(instance)) == instance`
- associativity of partial updates?
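Under one plausible reading, where a partial is a dict of field overrides and `from_partial` merges it onto a base, both properties can be checked mechanically. The names and the dict-merge semantics here are assumptions, not the actual partial-schema implementation:

```python
def to_partial(instance: dict) -> dict:
    # A full instance viewed as a partial carrying all of its fields.
    return dict(instance)


def from_partial(base: dict, partial: dict) -> dict:
    # Apply a partial: fields in `partial` override fields in `base`.
    merged = dict(base)
    merged.update(partial)
    return merged


instance = {"id": 1, "title": "demo"}

# Round-trip: applying to_partial(x) to an empty base recovers x.
assert from_partial({}, to_partial(instance)) == instance

# Associativity question: applying p1 then p2 should equal applying
# their composition p12 (composed by the same merge) in one step.
p1, p2 = {"title": "a"}, {"id": 2}
lhs = from_partial(from_partial(instance, p1), p2)
p12 = from_partial(p1, p2)
rhs = from_partial(instance, p12)
assert lhs == rhs
```

For this simple last-write-wins merge the associativity holds; whether it survives richer partial semantics (nested fields, deletions) is exactly what an automated test should pin down.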
Deprioritized until it is clear whether porting the schemas to LinkML would make more sense (e.g. if LinkML already does these things correctly, or if adding them there would be the better place).