
Clean up check.sh, move stuff to pre-commit #3157

Merged Dec 24, 2024 (10 commits)
27 changes: 25 additions & 2 deletions in .pre-commit-config.yaml

@@ -2,7 +2,7 @@ ci:
autofix_prs: true
autoupdate_schedule: weekly
submodules: false
skip: [regenerate-files]
skip: ["pip-compile"]

repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
@@ -40,7 +40,30 @@ repos:
hooks:
- id: regenerate-files
name: regenerate generated files
language: system
language: python
entry: python src/trio/_tools/gen_exports.py
pass_filenames: false
additional_dependencies: ["astor", "attrs", "black", "ruff"]
files: ^src\/trio\/_core\/(_run|(_i(o_(common|epoll|kqueue|windows)|nstrumentation)))\.py$
- repo: https://github.com/astral-sh/uv-pre-commit
rev: 0.5.9
hooks:
# Compile requirements
- id: pip-compile
name: uv pip-compile test-requirements.in
args: [
"--universal",
"--python-version=3.9",
"test-requirements.in",
"-o",
"test-requirements.txt"]
files: ^test-requirements\.(in|txt)$
- id: pip-compile
name: uv pip-compile docs-requirements.in
args: [
"--universal",
"--python-version=3.11",
"docs-requirements.in",
"-o",
"docs-requirements.txt"]
files: ^docs-requirements\.(in|txt)$
40 changes: 0 additions & 40 deletions in check.sh

@@ -11,44 +11,6 @@ if [ -z "${GITHUB_STEP_SUMMARY+x}" ]; then
ON_GITHUB_CI=false
fi

# Test if the generated code is still up to date
echo "::group::Generate Exports"
python ./src/trio/_tools/gen_exports.py --test \
|| EXIT_STATUS=$?
echo "::endgroup::"
Contributor:

I probably should do this in another PR, but I think we should move gen_exports out of pre-commit and back here. Unfortunately, as things currently stand, if you modify a file that the pre-commit hook says it uses and then commit with a git client (i.e. not just git commit -m "..." in the venv you have for trio), gen_exports doesn't get the dependencies it needs.

So far I've been getting that error and then going to my terminal to run git there, but that's a bit much.

Member:

Pre-commit with an external git client doesn't install required dependencies?

Contributor:

Nope, because we don't specify them. And we don't specify them because pre-commit doesn't let you install from a requirements file :(

Member Author:

I specified them in the latest commit:

additional_dependencies: ["astor", "attrs", "black", "ruff"]

but that doesn't let us pin the versions (in a non-awful way), so if we don't want that, then it has to go outside of pre-commit.

Member:

> additional_dependencies: ["astor", "attrs", "black", "ruff"]

Keep in mind that the pre-commit cache is virtually immortal. You can clean it locally, but not in the pre-commit.ci service; there, changing the deps list or the hook version is required. This may be problematic sometimes.

So I've found that the best thing to do here is to keep version pins in the list and bump them periodically, just so that the cache doesn't grow too old.
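A pinned variant of that hook entry might look like the sketch below. The version numbers are illustrative placeholders, not pins this PR actually adds:

```yaml
- id: regenerate-files
  name: regenerate generated files
  language: python
  entry: python src/trio/_tools/gen_exports.py
  pass_filenames: false
  # Pinning here both makes the hook environment reproducible and
  # busts the otherwise-immortal pre-commit.ci cache whenever a pin
  # is bumped (hypothetical versions shown).
  additional_dependencies:
    - "astor==0.8.1"
    - "attrs==24.3.0"
    - "black==24.10.0"
    - "ruff==0.8.4"
```

The trade-off discussed above still applies: these pins live outside any requirements file, so they have to be bumped by hand or by a bot.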

Member Author:

Yeah, I'm personally a big fan of tox. #2790 discussed hatchling too, but we ended up bikeshedding too much to actually do anything in the end.

Contributor:

I'd be happy to switch to nox, or potentially tox. I haven't used either, and attrs uses tox so it can't be that bad, but https://hynek.me/articles/why-i-like-nox/ is convincing to me...

Member Author:

I've yet to use nox, but would definitely be open to it.

Member:

I've contributed to projects with nox, and they're slightly more inconvenient. I'd also want to run the nox Python files under coverage.

One thing to bear in mind with tox is that constraints are a bit of a pain; tox-uv and a lock file seem the way to go:

https://github.com/tox-dev/tox-uv?tab=readme-ov-file#uvlock-support
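If the project went that route, the tox-uv README linked above describes opting in via a per-environment runner setting; a minimal sketch (the environment names and commands here are hypothetical, not part of this PR):

```ini
; tox.ini sketch, assuming a uv.lock exists at the project root
[tox]
envlist = test, lint

[testenv]
; tox-uv's lock-file runner installs each environment from uv.lock
; instead of re-resolving dependencies on every run.
runner = uv-venv-lock-runner

[testenv:test]
commands = pytest {posargs}

[testenv:lint]
commands = pre-commit run --all-files
```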

Member (@webknjaz, Dec 26, 2024):

I've been taking DIY lock files to extremes with tox for a while. I haven't really tried uv for that yet; I know that they were implementing the poetry-style format, not real lock files. I'm looking forward to seeing Brett's PEP accepted, honestly.

I also have a lot more experience with tox and haven't contributed to nox, although I have contributed to projects using nox. There are limitations around parallelism, for example, and some scripting would be more convenient in Python. However, tox 4 has a toxfile.py that should be able to cover a lot of that; I've yet to play with it properly.


# Autoformatter *first*, to avoid double-reporting errors
# (we'd like to run further autoformatters but *after* merging;
# see https://forum.bors.tech/t/pre-test-and-pre-merge-hooks/322)
# autoflake --recursive --in-place .
# pyupgrade --py3-plus $(find . -name "*.py")
echo "::group::Black"
if ! black --check src/trio; then
echo "* Black found issues" >> "$GITHUB_STEP_SUMMARY"
EXIT_STATUS=1
black --diff src/trio
echo "::endgroup::"
echo "::error:: Black found issues"
else
echo "::endgroup::"
fi

# Run ruff, configured in pyproject.toml
echo "::group::Ruff"
if ! ruff check .; then
echo "* ruff found issues." >> "$GITHUB_STEP_SUMMARY"
EXIT_STATUS=1
if $ON_GITHUB_CI; then
ruff check --output-format github --diff .
else
ruff check --diff .
fi
echo "::endgroup::"
echo "::error:: ruff found issues"
else
echo "::endgroup::"
fi

# Run mypy on all supported platforms
# MYPY is set if any of them fail.
MYPY=0
@@ -94,8 +56,6 @@ if git status --porcelain | grep -q "requirements.txt"; then
echo "::endgroup::"
fi

codespell || EXIT_STATUS=$?
Contributor (@A5rocks, Dec 17, 2024):

(Not the right line, because GitHub isn't letting me comment there.)

Should you remove the uv pip compile lines above? And also update the error message and add a pre-commit run -a line in check.sh, so people can run check.sh locally? (I'm not sure people do run check.sh locally, but...)

Member Author (@jakkdl, Dec 18, 2024):

I realized we can't do uv pip compile in pre-commit on CI, since it needs internet access, so it needs to stay here (or somewhere).

My goal is not to require or expect people to run check.sh locally, as it's unintuitive and clunky, but given the limitations of pre-commit we do need to stash these commands somewhere and make it easy for users to run them locally if they want. I'd personally vote for tox (due to familiarity), with environments set up for the different tests.

Member:

I usually wrap the pre-commit invocation in tox for local runs anyway, and run it in CI too, which covers the case of the service not having internet access during testing.

Member:

We could also add a local system hook to pre-commit that runs check.sh, if we were worried about people forgetting to run it.
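For reference, such a hook would live under a local repo entry in .pre-commit-config.yaml. A minimal sketch (the hook id and the pre-push stage are illustrative choices, not anything this PR adds):

```yaml
repos:
  - repo: local
    hooks:
      - id: check-sh
        name: run check.sh
        # "system" means pre-commit runs the script in the developer's
        # own environment rather than building an isolated one for it.
        language: system
        entry: ./check.sh
        pass_filenames: false
        # Run once per invocation, not once per changed file.
        always_run: true
        # Hooking pre-push instead of pre-commit keeps slow checks off
        # every commit (an assumption; adjust to taste).
        stages: [pre-push]
```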

Member Author:

> We could also add a local system hook to pre-commit that runs check.sh if we were worried about people forgetting to run it

There are several problems with doing this currently:

  1. mypy is actually failing in my workdir, probably due to something dirty in my env. tox/nox would fix this.
  2. Running check.sh takes a full minute on my current system, so I don't want to run it on every commit (pre-commit run --all-files takes ~8 seconds).
  3. I catch most type errors from running mypy through pylsp in my editor anyway.

So I don't ~ever run check.sh locally, and don't expect others to either. (Nor do I run ci.sh; I prefer manually running pytest.)

This definitely isn't super obvious for new contributors, though; tox/nox should help with that. E.g. in tox you can create a label for a collection of environments that is considered "standard" to run locally.
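The label feature mentioned above is tox 4's [tox] labels table, which lets "tox run -m <label>" run a named group of environments. A sketch of what a standard local-checks label could look like (all environment names, label names, and commands below are hypothetical):

```ini
[tox]
envlist = py39, py310, py311
; "tox run -m check" runs every environment listed under the label,
; so contributors get one obvious command to run before pushing.
labels =
    check = lint, typecheck

[testenv:lint]
deps = pre-commit
commands = pre-commit run --all-files

[testenv:typecheck]
deps = mypy
commands = mypy src/trio
```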


echo "::group::Pyright interface tests"
python src/trio/_tests/check_type_completeness.py || EXIT_STATUS=$?
