
Resolve pylint and other warnings #146

Open

dmnapolitano wants to merge 6 commits into develop
Conversation

dmnapolitano (Contributor):
Description

Hi! The changes in this PR resolve the warnings produced by pylint during tox, deprecation warnings from pandas, and other warnings generated by expected edge cases. Since warnings often indicate a potential problem, addressing them makes it easier to spot new or unexpected warnings during future development or at runtime. Thanks! 🎉

This work was previously part of #145 👍🏻

Jira Ticket

Test Steps

tox, testbed, elexmodel commands, etc. 🎉

@dmnapolitano dmnapolitano requested a review from a team as a code owner December 13, 2024 16:34
batch_margin = (
np.diff(results_dem, append=results_dem[-1]) - np.diff(results_gop, append=results_gop[-1])
) / np.diff(results_weights, append=results_weights[-1])
with warnings.catch_warnings():
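One way to scope the suppression more tightly than a `warnings.catch_warnings()` filter is NumPy's own `errstate` context manager, which silences only the floating-point warnings raised inside its block. The sketch below is a hypothetical helper mirroring the `batch_margin` computation above; the function name and signature are assumptions for illustration, not the repo's actual API.

```python
import warnings

import numpy as np


def safe_batch_margin(results_dem, results_gop, results_weights):
    # Hypothetical helper mirroring the batch_margin expression from the diff.
    # np.errstate suppresses divide-by-zero and invalid-value warnings only
    # for this block, instead of installing a broader warnings filter.
    with np.errstate(divide="ignore", invalid="ignore"):
        num = np.diff(results_dem, append=results_dem[-1]) - np.diff(
            results_gop, append=results_gop[-1]
        )
        den = np.diff(results_weights, append=results_weights[-1])
        # Zero denominators yield inf (nonzero / 0) or nan (0 / 0),
        # which callers can detect afterwards with np.isfinite.
        return num / den
```

Because `np.errstate` is restored on exit, warnings elsewhere in the module are unaffected, which keeps the signal from genuinely unexpected divisions.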
Collaborator:
I vaguely remember us doing something previously to avoid this already. At least stopping it turning up in the logs. Am I misremembering?

dmnapolitano (Author):

Interesting 🤔 Maybe it was another spot where we can occasionally get divisions by zero? Without this, right now I get warnings here when running the unit tests 🤔 Curious 🤔

Collaborator:

If you re-run the 2024 in the testbed (without writing anything to s3), do you see this warning?

dmnapolitano (Author):

Alright so I ran this:

python run.py 2024-11-05_USA_G redo --office_id S --estimands="['margin']" --features "['baseline_normalized_margin']" --end_timestamp 2024-11-06T12:00:00-05:00 --model_parameters='{"extrapolation" : True, "versioned_start_date" : "2024-11-05T21:00:00-05:00", "versioned_end_date" : "2024-11-05T21:30:00-05:00"}'

And the answer right now is no, BUT, I noticed that even though we're capturing warnings, they're not currently coming through in the logs. Having changed that, this actually generates a lot of warnings 😬

PR incoming 👀

dmnapolitano (Author):

Hi @jjcherian ! Hope you're doing well 😄 👋🏻

We noticed that these few lines can generate quite a few "invalid value encountered in divide" and "divide by zero encountered in divide" warnings. If you clone the develop branch and run, for example,

elexmodel 2024-11-05_USA_G --estimands=margin --features=baseline_normalized_margin --office_id=S --geographic_unit_type=county --pi_method bootstrap --national_summary --model_parameters '{"extrapolation" : True, "versioned_start_date" : "2024-11-05T21:00:00-05:00", "versioned_end_date" : "2024-11-05T21:30:00-05:00"}'

you should see them. We were wondering what the best way to handle these warnings is. Right now, in this PR, I've added code to ignore (and therefore silence) them, but do you think there's a better way to resolve these? 🤔

Thanks in advance 😄
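An alternative to silencing the warnings is to resolve them at the source: compute the ratio only where the denominator is nonzero and substitute a default elsewhere. The helper below is a generic sketch of that pattern (the function name and the choice of default are assumptions, not anything from this PR).

```python
import numpy as np


def divide_with_default(num, den, default=0.0):
    # Hypothetical alternative to suppressing divide warnings: np.divide's
    # `where` mask skips zero denominators entirely, so no warning is ever
    # raised, and those positions keep the pre-filled default value.
    num = np.asarray(num, dtype=float)
    den = np.asarray(den, dtype=float)
    out = np.full_like(num, default)
    np.divide(num, den, out=out, where=den != 0)
    return out
```

Whether a default like 0.0 is statistically appropriate for a batch margin is a modeling question, so this is only worth doing if there is a sensible value for the zero-weight case.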

@@ -128,7 +132,7 @@ def wait_for_versions(self, q):
try:
future.result()
yield version, data
except Exception as e:
except Exception as e: # pylint: disable=broad-exception-caught
Collaborator:

This is just to remove the tox error? We haven't really cared about those in the past, since we use our pre-commit hooks to make sure we're PEP 8 compliant.

dmnapolitano (Author):

In that case maybe we should disable pylint altogether. If we're just going to ignore its complaints, we might as well speed up our tox runs 😄

Another option here would be to add more entries to the list of pylint complaints we ignore in setup.cfg. I'm hesitant to do that too broadly, though, because many of these checks are useful; there are just specific cases where we want to ignore them. This one in particular is here because boto3 and its related libraries can raise several different types of exceptions, and we want to handle them all the same way. Ideally we'd break that out or specify the types, but honestly, I find it hard to know up front which exception types boto3 can throw 😬
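For what it's worth, botocore does expose two base classes that cover most boto3 failures, so the broad `except` could in principle be narrowed without enumerating every concrete type. The sketch below is a hypothetical illustration of that idea, not the repo's code: `fetch_version` and the `OSError` fallback (for socket-level failures, and for environments without botocore) are assumptions.

```python
# Hypothetical narrowing of the broad except clause. BotoCoreError and
# ClientError are botocore's documented base exception classes; OSError is
# included as an assumed catch-all for low-level network failures.
try:
    from botocore.exceptions import BotoCoreError, ClientError
    S3_ERRORS = (BotoCoreError, ClientError, OSError)
except ImportError:  # botocore not installed in this environment
    S3_ERRORS = (OSError,)


def fetch_version(future):
    # Hypothetical wrapper around the future.result() call from the diff:
    # re-raise known S3/network failures with context, let anything truly
    # unexpected propagate unchanged.
    try:
        return future.result()
    except S3_ERRORS as e:
        raise RuntimeError(f"S3 fetch failed: {e}") from e
```

This keeps pylint happy without a disable comment, at the cost of letting any exception outside the tuple crash loudly, which may or may not be the desired behavior here.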

Collaborator:

Is there any downside to disabling pylint? Do we use it in other live team repos? If we do, I think we just ignore the complaints.

dmnapolitano (Author):

I'll ask, but if they're using pylint, they're probably not ignoring the complaints. In which case, I'm not sure why we would either. Sure, some of these are annoying, but we can put in rules to have pylint not issue annoying complaints about benign things (e.g. "too many arguments") and keep the ones that are useful and point towards potential problems 👀

Collaborator:

I think the answer is that we should stop using tox and run pytest directly for the unit tests, since there's no need to use two different linters.

dmnapolitano (Author):

Hmmmmmmm...I'd argue in favor of using tox but removing pylint from it. tox is great because we can specify multiple Python versions and it'll run whatever suite of tests etc. for each version in isolated environments. We can also have it run pre-commit for us, e.g. https://github.com/jugmac00/flask-reuploaded/blob/master/tox.ini 🤔 But yeah it looks pretty easy to remove pylint from tox, which is good 😄
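A minimal tox layout along those lines might look like the following sketch: pytest across several interpreters plus a flake8-only lint env, with pylint dropped. The env names, directories, and Python versions are assumptions for illustration, not this repo's actual configuration.

```ini
# Hypothetical tox.ini: isolated test runs per Python version,
# flake8 as the sole linter (no pylint).
[tox]
envlist = py{39,310,311}, lint

[testenv]
deps = pytest
commands = pytest tests

[testenv:lint]
deps = flake8
commands = flake8 src tests
```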

dmnapolitano (Author):

@lennybronner alright, so I did some reading here, and it seems like pylint is designed with the intention of being overly cautious. The best description I saw is that pylint's checks are a superset of flake8's. We probably don't need the extra complaints pylint generates on top of flake8 🤔

So yeah, let me know if you want me to remove pylint in this PR or open another one 😄

tests/models/test_bootstrap_election_model.py (thread resolved)