
Update argument checking for new versions of decorator package #611

Closed
wence- opened this issue Apr 21, 2021 · 11 comments

@wence-
Member

wence- commented Apr 21, 2021

The argument validation in utils.validate_base doesn't play well with decorator>4.4.2. We should figure out how to fix it.

wence- mentioned this issue Apr 21, 2021
@dham
Member

dham commented Apr 21, 2021

We should probably redo the type checking stuff to use type annotations, now that they exist.

@wence-
Member Author

wence- commented Apr 21, 2021

Yeah, I don't know exactly how that works, but maybe @connorjward can figure it out.

@connorjward
Collaborator

I think using type annotations is a good idea. The only real thing to consider is that type checking would then not happen while the program is running; instead, it would be done by your IDE or by a tool such as mypy.

I think doing this would be great. It would eliminate some overhead and mean that the known_pyop2_safe decorators could be removed.

This post gives what I think is an excellent overview.
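
For illustration, a minimal sketch of what the annotation-based style might look like. The class names and signature here are hypothetical placeholders, not the real PyOP2 API:

# Hypothetical sketch only: the annotations express what the decorator-based
# checks currently enforce, and mypy (or an IDE) verifies them statically
# rather than at runtime, e.g. by running `mypy thisfile.py`.
from typing import Optional


class Dat:
    pass


class Map:
    pass


class Arg:
    def __init__(self, data: Dat, map: Optional[Map] = None) -> None:
        # No isinstance checks in the body: a call like Arg(data=42) is
        # reported as a type error by mypy before anything runs.
        self.data = data
        self.map = map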

@wence-
Member Author

wence- commented Apr 21, 2021

If I add type hints, is there a way that I can flick a switch and confirm at runtime that everything is correct? As I understand it, mypy can only statically check fully typed programs, or is that wrong?

There are a bunch of validation checks that, in debug mode, check correctness of values, not types, and I'm pretty sure that Python is not yet on the dependent types train, so we would still need some decorators.

In my vague sketch of the pyop3 API, I did a horrible hack with an import-time decision of whether or not validation would be applied, with the decorators turning into no-ops otherwise.

# In some file
from functools import wraps


def check_args(valid):
    """
    Decorator for validation of arguments.

    :arg valid: Function with same signature as decoratee that asserts
         correctness of arguments.
    """
    def validator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            valid(*args, **kwargs)
            return fn(*args, **kwargs)
        return wrapper
    return validator


if configuration["runtime_checking"]:
    debug_check_args = check_args
    """Decorator for validation of arguments at runtime."""
else:
    def debug_check_args(predicate):
        """
        No-op decorator skipping runtime validation of arguments.
        """
        def noop(fn):
            return fn
        return noop

# Now somewhere else
class Foo:
    def validator(self, obj, map_tuple):
        assert isinstance(obj, (Dat, DatView))

    @debug_check_args(validator)
    def __init__(self, obj, map_tuple):
        self._parloop_args_ = obj._parloop_args_
        self._parloop_argtypes_ = obj._parloop_argtypes_
        self.obj = obj
        self.map_tuple = map_tuple

@connorjward
Collaborator

mypy won't do any runtime checks, so it would be useless for value checks.

I think that we have 3 options:

  1. Use a library that does value and type checking for us (e.g. PyContracts).
  2. Use the 'builtin' Python functionality. This means static type checking and asserts for value checking (I didn't know about python -O, that's cool!)
  3. Use a solution that we build ourselves - this probably isn't necessary

My preference is 2, but you guys know a lot more than me so I may be wrong.
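
As an illustration of option 2 (the names here are purely illustrative, not PyOP2's actual classes):

# Types are covered by the annotations and checked statically; value checks
# are plain asserts. CPython removes assert statements entirely when run
# with `python -O` (i.e. when __debug__ is False), so they cost nothing in
# optimised runs.
class Map:
    def __init__(self, arity: int) -> None:
        assert arity > 0, "arity must be positive"  # removed under python -O
        self.arity = arity

So `python script.py` raises an AssertionError for Map(0), while `python -O script.py` skips the check entirely.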

@wence-
Member Author

wence- commented Apr 21, 2021

I guess the problem is that no one knows to run python -O, so all the users will end up on the "slow" path.

@wence-
Member Author

wence- commented Apr 21, 2021

The decorator approach is a bit nicer because it doesn't pollute the body of the function with assertions, but maybe that's OK.

@connorjward
Collaborator

A quick scan of PyOP2 shows me that this is largely only used for __init__ methods. If we are sensible about how often we instantiate objects, isn't worrying about a 'slow' path a case of premature optimisation?

@wence-
Member Author

wence- commented Apr 22, 2021

Yes, heavy data can do heavy checking. But things like Arg construction are actually expensive enough that you care.

@connorjward
Collaborator

If we were to cache Arg objects (provided they didn't reference any data), I think this problem could go away. Also, a large part of the problem here is just that Python is really slow. It may be better to discuss type/value validation as part of a bigger conversation addressing the other design issues PyOP2 has.
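
One possible shape for that caching, as a sketch only (whether the real Arg construction parameters are cheap to hash like this is an open question):

# Hypothetical sketch, not PyOP2's actual Arg: cache instances on their
# construction parameters so the expensive validation only runs on a cache
# miss; later constructions with the same key reuse the existing instance.
class Arg:
    _cache = {}

    def __new__(cls, dtype, access):
        key = (dtype, access)
        try:
            return cls._cache[key]
        except KeyError:
            obj = super().__new__(cls)
            obj._validate(dtype, access)  # run the heavy checks once per key
            obj.dtype, obj.access = dtype, access
            cls._cache[key] = obj
            return obj

    def _validate(self, dtype, access):
        assert access in {"READ", "WRITE", "RW", "INC"}, "unknown access mode"

    # No __init__ defined, so repeated construction does no further work.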

@connorjward
Collaborator

Closing as moved to Firedrake: firedrakeproject/firedrake#3821

connorjward closed this as not planned on Oct 23, 2024