
Decorated Function Should Be Considered Used #488

Open
eellison opened this issue Nov 13, 2019 · 20 comments


eellison commented Nov 13, 2019

Functions with a decorator are considered unused, leading to situations like #320 and sympy/sympy#17795.

If I run flake8 on the following:

def my_overload(func):
    print(func)
    pass

@my_overload
def utf8(value: bytes) -> bytes:
    pass

@my_overload
def utf8(value: str) -> bytes:
    pass

def utf8(value):
    pass  # actual implementation

I get:

test.py:8:1: F811 redefinition of unused 'utf8' from line 4
test.py:12:1: F811 redefinition of unused 'utf8' from line 8

But the semantically-equivalent:

def utf8(value: bytes) -> bytes:
    pass
my_overload(utf8)

def utf8(value: str) -> bytes:
    pass
my_overload(utf8)

def utf8(value):
    pass  # actual implementation

Does not produce any errors. As my example above shows, decorators applied to functions can have arbitrary side effects, so decoration should be considered a use of the function.

Alternatives:
I can either prepend an additional @overload to my own decorator:

def my_decorator(func):
    return func

@overload
@my_decorator
def utf8(value: None) -> None:
    pass

Or I can add two noqa comments per overloaded function declaration:

@my_overload  # noqa
def utf8(value: str) -> bytes:  # noqa
    pass

These both work for now but it would be nice if I didn't have to do them.

@asottile

I'm -1 on this, decorator magic is magic and I think noqa is the appropriate action here

@asottile asottile changed the title Annotated Function Should Be Considered Used Decorated Function Should Be Considered Used Nov 13, 2019

eellison commented Nov 13, 2019

How are they magic? IMO they are a commonly used pattern with a clear syntactic meaning.

@asmeurer

The problem is there's no way to generically detect this. Because you can also accidentally redefine the same function, which may use a decorator which has nothing to do with dispatching, and pyflakes should report that.

My personal opinion is that libraries like multipledispatch should use more linter friendly/non-magic syntax. For example, something like

def utf8(value):
    ...

@overload(utf8)
def utf8_str(value: str):
    ...

@overload(utf8)
def utf8_bytes(value: bytes):
    ...
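A runnable sketch of how such a registration decorator might work (a hypothetical helper dispatching on the annotated type of the first parameter, not the real multipledispatch API):

```python
def overload(target):
    """Hypothetical: register the decorated function on `target`,
    keyed by the annotated type of its first parameter."""
    def register(impl):
        key = next(iter(impl.__annotations__.values()))
        target._registry[key] = impl
        return impl
    return register

def utf8(value):
    # Generic entry point: dispatch to the registered implementation.
    for typ, impl in utf8._registry.items():
        if isinstance(value, typ):
            return impl(value)
    raise TypeError(f"no overload for {type(value).__name__}")
utf8._registry = {}

@overload(utf8)
def utf8_str(value: str):
    return value.encode("utf-8")

@overload(utf8)
def utf8_bytes(value: bytes):
    return value
```

With this spelling, each implementation has a distinct name, so a linter sees no redefinition.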

Or you could name every function _ (I can't remember if pyflakes supports that for function definitions, see #202 (comment)).

I don't generally like it when you are forced to rewrite code to make linters happy because they aren't smart enough, but when it comes to things that are effectively magic, I think it's a good thing, because the code is easier for humans to understand too.

In this case, it is "magic" because the overload decorator breaks the usual mental model of how Python works, which is that def function defines function and puts the name function in the namespace with that definition. @overload itself maybe isn't as bad as something like multipledispatch because it actually doesn't do anything (though this isn't obvious at first glance).

@overload specifically seems like it is part of the standard library now, with a recommended way of using it: https://docs.python.org/3/library/typing.html#typing.overload. So pyflakes should probably support it. I don't know if it needs to support magic from third-party libraries like multipledispatch; that seems like a rabbit hole to go down.

@eellison

I think this boils down to a question of false positives versus false negatives.

What should the base assumption be: that decorators have side effects, or that they don't?

It is certainly possible for someone to use a decorator that doesn’t have side effects, and it would be ideal if pyflakes reported that. As above, it’s also bad when pyflakes assumes something doesn’t have side effects and reports the error.

pyflakes currently special-cases typing.overload as the only side-effectful decorator (in this case, the side effect is visible to type checkers). Why not go the other direction? If there are serious concerns about accidental redefinition, pyflakes can try to prove that the redeclaration had no effect.

As for syntax: I'm trying to follow the syntax of @overload as closely as possible because this is an accepted PEP with good documentation that I can point to. Changing the name of the function to utf8_str exposes the str version, which may be undesirable. Altering the syntax also introduces a new API for declaring overloaded functions that is out of line with all of Python's other attempts, including typing.overload, but also https://github.com/mrocklin/multipledispatch, and going all the way back to Guido in 2005: https://www.artima.com/weblogs/viewpost.jsp?thread=101605.

My opinion is that linters which warn about things they can’t prove quickly become noise, and noise quickly becomes ignored.

I don’t know to what extent redeclared functions with decorators is a problem versus the problem I have above; it may be the case for general users of pyflakes that the existing behavior is desirable.

@eellison

The arguments about syntax & ignoring side effects seem out of line with pyflakes' stated design philosophy:
Pyflakes makes a simple promise: it will never complain about style, and it will try very, very hard to never emit false positives.

@asottile

implementing what you're suggesting would make this no longer detected:

@pytest.mark.parametrize('s', (1, 2, 3))
def test_foo(s):
    # looks like a test, but definitely isn't being run
    ...

def test_foo():
    ...

in my opinion, the failure to detect this (a false negative) is way worse than the special case side-effect decorator false positive you're experiencing

@asmeurer

It does emit false positives if you do magic stuff that breaks its model of how code works. It will also not work for things that inject names into globals(), for instance. Virtually everything pyflakes checks for, except for syntax errors, can be broken by some magic. It doesn't want false positives, but it also needs to be useful.

I personally am pushing back on this because pyflakes has saved me from defining a duplicate function multiple times. Probably 50% of the time it's a function that is decorated (like @pytest.mark.parametrize, @property, @numba.jit, etc.). The test one in particular happens a lot because I quite often copy-paste tests to modify them slightly, and I would forget to rename the test if it weren't for pyflakes warnings in my editor.

I also think special casing @typing.overload is not hard to do and makes sense.

@asottile

fwiw, pyflakes already supports @typing.overload
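For reference, the typing.overload pattern that pyflakes already accepts without F811 looks roughly like this (following the typing docs):

```python
from typing import Union, overload

@overload
def utf8(value: bytes) -> bytes: ...
@overload
def utf8(value: str) -> bytes: ...

def utf8(value: Union[str, bytes]) -> bytes:
    # Actual implementation; the @overload stubs above exist only
    # for type checkers and are never called at runtime.
    if isinstance(value, str):
        return value.encode("utf-8")
    return value
```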

@eellison

@decorator
def fun(): ...

is exactly the same as:

def fun(): ...
fun = decorator(fun)

These are 1-1 mappings with exactly equivalent semantics. Functions are objects in python just like everything else and can be passed around and reused and redefined within the environment. There is nothing magic or model-breaking about it. Maybe it's syntax that you personally do not like, but that doesn't change whether it's an accepted part of the python language.
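A minimal runnable check of that equivalence, using an illustrative decorator:

```python
def decorator(func):
    # Wrap func so we can observe that both spellings behave the same.
    def wrapper():
        return func() + 1
    return wrapper

@decorator
def fun_a():
    return 1

def fun_b():
    return 1
fun_b = decorator(fun_b)  # the explicit spelling of @decorator

# Both names are bound to equivalent wrapped functions.
assert fun_a() == fun_b() == 2
```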

I also don't think a decorator function having side effects is magic. There are plenty of side-effectful functions that do things like capture a reference to an input but do not do things like inject names into globals().

I understand the stance that we should accept the false positives for this syntax because the cost of ignoring the false negatives is too high. It's inconvenient for me personally, and it goes against pyflakes stated design philosophy about syntax & false positives, but sometimes there are exceptions to the rules.

pyflakes' current behavior makes an assumption it can't prove; it would be nice to at least be able to toggle this assumption.

@asottile

pyflakes has no options and I don't think that's going to change

@eellison

pyflakes has no options and I don't think that's going to change

this would just be a different category of Error that you could ignore.

@sigmavirus24

this would just be a different category of Error that you could ignore.

So from what I think I've understood from this thread, making this a different category would mean being able to discern between a few basically identical cases.

@asmeurer

What would the category split look like? Having a different error message for function redefinition than other kinds of variable redefinition?

@eellison

What would the category split look like? Having a different error message for function redefinition than other kinds of variable redefinition?

Redefinition of unused decorated function or Redefinition of decorated function that is not used locally or something along those lines.

So from what I think I've understood from this thread, making this a different category would mean being able to discern between a few basically identical cases.

The options are:

  • leave as is
  • change behavior so that decorated functions are considered used
  • raise the new warning described above.

I would prefer the second option, but option 3 would also mostly solve my problem. I'm not sure what the sentiment of "able to discern between a few basically identical cases" is exactly, lol. @sigmavirus24 what is your opinion here?

@sigmavirus24

I think you're assuming that pyflakes knows far more than it does, hence my "basically identical cases" comment. They're very nearly identical to pyflakes, if I'm understanding this thread correctly.

@eellison

I think you're assuming that pyflakes knows far more than it does, hence my "basically identical cases" comment. They're very nearly identical to pyflakes, if I'm understanding this thread correctly.

pyflakes is already able to handle this behavior for @typing.overload, so my assumption is that it would not be very difficult to generalize it to arbitrary decorators.

@asottile

pyflakes currently special cases typing.overload because it knows that very specific special case -- it can't realistically discern between the decorators where the function is intentionally unused afterwards and those which are not

@eellison

pyflakes currently special cases typing.overload because it knows that very specific special case -- it can't realistically discern between the decorators where the function is intentionally unused afterwards and those which are not

I'm suggesting just distinguishing whether the existing unused function was decorated or not. If it was decorated, then raise a different warning that can be ignored. That logic already exists as far as I can tell, just with an is_typing_overload_decorator check added.

@sigmavirus24

pyflakes already is able to cover this behavior with @typing.overload so my assumption is it would not be very difficult to generalize this to arbitrary decorators.

We special case typing.overload but we don't introspect the decorator to look at it.

i'm suggesting just distinguishing between whether the existing unused function was decorated or not. If it was decorated then raise a different warning that can be ignored.

What about code that looks like:

class C:
    @property
    def an_attr(self):
        return 0

    @property
    def an_attr(self):
        return 1

Under your proposal that would raise the different error code, but it should not be silenced.
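For what it's worth, running that snippet confirms the first definition is silently discarded, which is the kind of bug F811 exists to catch:

```python
class C:
    @property
    def an_attr(self):
        return 0

    @property  # silently shadows the property defined above
    def an_attr(self):
        return 1

# Only the second definition is reachable; the first is dead code.
assert C().an_attr == 1
```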

My point here is that you're making assumptions and focusing very narrowly on one specific case of something that's potentially much larger, you don't seem to acknowledge anyone's explanations as to why this isn't as easy as you're imagining. It's not as easy as flipping a switch, and at this point your constant insistence that this is something easy we should just do isn't coming across as productive, cooperative, or collaborative and thus is becoming unwelcome.

I suspect that in the collective time spent discussing this issue we could have tried to implement something that would confuse and upset people for a use-case that seems to fall into the upper 2% while potentially causing regressions in other detections. But you might have also just accepted what the people familiar with the project have told you, repeatedly and respectfully.


eellison commented Nov 16, 2019

class C:
    @property
    def an_attr(self):
        return 0

    @property
    def an_attr(self):
        return 1

Would have the error message: Redefinition of unused decorated function, and be in a different category of error than F811.

As I said above, I understand the other viewpoints in the thread that the cost of ignoring false negatives is too high. I've proposed my middle-ground solution. I am not insisting that this be accepted, I am trying to clarify what is being proposed exactly. If the proposed solution isn't accepted that's fine. Edit: I can do the PR if the solution is agreed upon.
