
Catch and verify an expected UserWarning in the test suite #8041

Open · wants to merge 1 commit into master
Conversation

@kurtmckee (Contributor) commented Aug 27, 2024

The test suite validates that an SQS FIFO queue with content-based deduplication disabled does not receive messages from EventBridge, but it doesn't catch and verify the UserWarning that the code raises in this situation (recent example in CI). As a result, pytest reports that UserWarning at the end of the test run.

This PR uses pytest.warns() to catch and verify the expected UserWarning.
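The pattern the PR applies looks roughly like this (a sketch: `put_events_to_fifo_queue` is a hypothetical stand-in for the moto code path under test, not the actual function name):

```python
import warnings

import pytest


def put_events_to_fifo_queue():
    # Hypothetical stand-in for the code under test: delivering an
    # EventBridge event to a FIFO queue with content-based deduplication
    # disabled raises a UserWarning.
    warnings.warn(
        "The queue is FIFO, but content-based deduplication is disabled.",
        UserWarning,
    )


def test_send_to_sqs_fifo_queue():
    # pytest.warns() both silences the warning and fails the test
    # if no matching UserWarning is emitted.
    with pytest.warns(UserWarning, match="content-based deduplication"):
        put_events_to_fifo_queue()
```

Because the warning is consumed inside the `with` block, pytest's end-of-run warnings summary stays clean.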

@kurtmckee kurtmckee changed the title Verify an expected UserWarning from one SQS FIFO queue Catch and verify an expected UserWarning in the test suite Aug 28, 2024
@bblommers (Collaborator) left a comment


Hi @kurtmckee! Thank you for improving our CI logs.

The tests are run twice: once in-process ("regular" mode), and once in ServerMode.
In ServerMode, Moto runs in a separate Docker container, so the test process can't intercept any user warnings raised there.

We could handle this with an if/else (if running in-process: intercept the warning and call put_events; else: call put_events without interception). I'm not sure it's worth the effort, though, since it doesn't improve the readability of the test.
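That if/else could be sketched with `contextlib.nullcontext()` so the body of the test stays shared (the `TEST_SERVER_MODE` flag and `put_events` below are hypothetical stand-ins, not moto's actual names):

```python
import contextlib
import warnings

import pytest

# Hypothetical flag; the real test suite would consult moto's own
# ServerMode detection instead.
TEST_SERVER_MODE = False


def put_events():
    # Stand-in for the EventBridge put_events call. In-process, the FIFO
    # queue misconfiguration raises a UserWarning; in ServerMode the warning
    # is raised inside the Docker container and never reaches this process.
    if not TEST_SERVER_MODE:
        warnings.warn("content-based deduplication is disabled", UserWarning)


def test_send_to_sqs_fifo_queue():
    # Intercept the warning only when it can actually be observed;
    # otherwise use a no-op context manager.
    interceptor = (
        pytest.warns(UserWarning, match="deduplication")
        if not TEST_SERVER_MODE
        else contextlib.nullcontext()
    )
    with interceptor:
        put_events()
```

Whether this reads better than a plain if/else is a judgment call; it does avoid duplicating the `put_events()` call.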

If you can think of a different non-invasive way to achieve this, I'm all ears, but the warnings in the CI don't bother me enough to warrant big changes to the test.

@kurtmckee (Contributor, Author) commented Aug 30, 2024

Hello @bblommers, and thanks for the quick review!

The benefit of reducing these warnings is to move toward escalating warnings to errors. It is possible to filter out warnings caused by project dependencies, but it's good hygiene to address (or filter out) existing warnings and then -- from a position of zero warnings -- escalate all warnings to errors. This allows CI to catch new Python DeprecationWarnings, or warnings issued by dependencies, early.
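That workflow is typically wired up through pytest's `filterwarnings` configuration -- a sketch (the dependency filter is illustrative, not something from this PR):

```ini
# pytest.ini -- escalate all warnings to errors, with a targeted exception
# for a hypothetical noisy dependency.
[pytest]
filterwarnings =
    error
    ignore::DeprecationWarning:some_dependency.*
```

Filters are applied bottom-up, so the `ignore` entry takes precedence over the blanket `error` for warnings originating in that module.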

It looks like the CI failures are unrelated to this change; were you mentioning the two modes of execution to suggest that this change caused some of the CI failures seen here? Here are the failing tests in CI (Python 3.12, Ubuntu):

  • tests/test_core/test_proxy.py::test_http_get_request_can_be_passed_through
  • tests/test_core/test_proxy.py::test_http_post_request_can_be_passed_through
  • tests/test_core/test_proxy.py::test_https_request_can_be_passed_through

but this PR only touched this test:

  • tests/test_events/test_events_integration.py::test_send_to_sqs_fifo_queue

Looking at the Python 3.12 / Ubuntu CI logs, I'm not seeing that this is related, so please let me know if I'm misunderstanding what you're saying!

@bblommers (Collaborator) commented:

@kurtmckee Those tests all reach out to a website that was temporarily unavailable; that should be fine after restarting the build. I was talking about these logs:
https://github.com/getmoto/moto/actions/runs/10580339607/job/29315346057

E Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted.
E Emitted warnings: [].

@kurtmckee (Contributor, Author) commented Aug 30, 2024

Aha, I see now. Thanks for pointing that out!

If you're game for it, lemme see if there's something simple that can be done for this scenario (it'll likely be Tuesday morning due to an upcoming holiday), otherwise you can close this PR. 👍
