24.3 Add config for marking broken integration tests #472
Conversation
This is an automated comment for commit a77f135 with a description of existing statuses. It is updated for the latest CI run. ❌ A full report is available on a separate page.
Successful checks
Please fix the mentioned issues.
Also, have you considered using `@pytest.mark.xfail`? First of all, it is the standard pytest mechanism; second, it allows marking individual tests as expected failures, and more:
https://docs.pytest.org/en/stable/how-to/skipping.html#xfail
https://docs.pytest.org/en/stable/reference/reference.html#pytest-mark-xfail-ref
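For reference, a minimal sketch of the suggested alternative. The test name and reason string are made up for illustration; `strict=False` (the default) lets an unexpectedly passing test report XPASS instead of failing the run.

```python
import pytest

# Mark a known-broken test so pytest reports it as XFAIL instead of FAILED.
# The test name and reason below are hypothetical examples.
@pytest.mark.xfail(reason="integration backend currently broken", strict=False)
def test_broken_integration():
    assert False  # known failure, reported as XFAIL
```

The trade-off discussed below is that this lives in the test code itself, rather than in one central config file.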
Yes, pytest marks were considered; however, that would require modifying the test code and would make it less convenient to see a list of all broken tests.
I haven't added a comment field yet because the error message seems clear enough. It will be trivial to add one when we need it.
No, let's add support for a comment field now. We don't want to revisit this functionality for such a small thing. I would call that field `reason`.
Done. I don't know how to propagate the reason field upwards at this time, so for now it just helps anyone looking at the config file.
LGTM
CI Fix or Improvement
Add tests/integration/broken_tests.json, which is used to selectively replace FAILED and ERROR statuses with BROKEN.
Supersedes #469
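A minimal sketch of how such a status remap could work. The exact shape of `broken_tests.json` is not shown in this thread, so the file contents, test names, and function name below are assumptions, not the PR's actual implementation:

```python
import json

# Hypothetical contents of tests/integration/broken_tests.json:
# keys are test names, values carry the "reason" field discussed above.
BROKEN_TESTS_JSON = """
{
  "test_kafka_produce": {"reason": "flaky broker setup"},
  "test_s3_backup": {"reason": "credentials expired in CI"}
}
"""


def remap_statuses(results, broken_tests):
    """Replace FAILED/ERROR with BROKEN for tests listed in the config."""
    remapped = {}
    for name, status in results.items():
        if status in ("FAILED", "ERROR") and name in broken_tests:
            remapped[name] = "BROKEN"
        else:
            remapped[name] = status
    return remapped


broken = json.loads(BROKEN_TESTS_JSON)
results = {"test_kafka_produce": "FAILED", "test_s3_backup": "OK"}
print(remap_statuses(results, broken))
# → {'test_kafka_produce': 'BROKEN', 'test_s3_backup': 'OK'}
```

Note that only FAILED and ERROR statuses are remapped: a listed test that passes keeps its original status, which makes stale entries easy to spot.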