
[WIP] Add RAPIDS Nightly to GPU CI #436

Draft
wants to merge 5 commits into main

Conversation

praateekmahajan
Collaborator

Description

Usage

# Add snippet demonstrating usage

Checklist

  • I am familiar with the Contributing Guide.
  • New or Existing tests cover these changes.
  • The documentation is up to date with these changes.

Signed-off-by: Praateek <[email protected]>
@praateekmahajan praateekmahajan added the gpuci Run GPU CI/CD on PR label Dec 17, 2024
Signed-off-by: Praateek <[email protected]>
Signed-off-by: Praateek <[email protected]>
@praateekmahajan praateekmahajan added gpuci Run GPU CI/CD on PR and removed gpuci Run GPU CI/CD on PR labels Dec 17, 2024
Comment on lines +54 to +58
if [ "$BUILD_TYPE" = "nightly" ]; then \
pip install ".[all_nightly]"; \
else \
pip install ".[all]"; \
fi
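For context, a minimal sketch of how BUILD_TYPE might be wired through from the workflow, assuming the build job uses docker/build-push-action and the Dockerfile declares ARG BUILD_TYPE (the step name, file path, and matrix value below are illustrative, not taken from this PR):

```yaml
# Hypothetical workflow step passing BUILD_TYPE into the image build.
# `context`, `file`, and `build-args` are standard docker/build-push-action inputs;
# everything else here is an assumption about how this repo wires things up.
- name: Build Curator container
  uses: docker/build-push-action@v5
  with:
    context: .
    file: Dockerfile
    build-args: |
      BUILD_TYPE=${{ matrix.build_type }}  # "nightly" or "stable"
```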
Collaborator

We might need to check if the nightly can use the same caching logic as stable, or if nightly should just always build a fresh non-cached container.

Collaborator Author

+1, wanted to flag this explicitly since we do some curator-update optimization above, and I want to make sure the branching doesn't mess with it.
cc @ko3n1g

Collaborator

Do these optimizations come from the base-image?

Collaborator

I mean, if they do, then even if the base-image tag is unchanged (because they overwrite it), we should extend this build workflow so that we can force-pull the base image. In all other cases I don't believe we need to worry about caching.
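If that turns out to be the case, here is a rough sketch of what forcing a fresh base image could look like, assuming the build step uses docker/build-push-action (the step layout is an assumption; `pull` and `no-cache` are standard inputs of that action):

```yaml
# Hedged sketch: force-pull the base image so an overwritten tag is picked up,
# and optionally drop the layer cache entirely for nightly builds.
- name: Build nightly container
  uses: docker/build-push-action@v5
  with:
    context: .
    pull: true       # always re-pull referenced images, even if the tag name is unchanged
    no-cache: true   # skip the layer cache for the nightly build
```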

@sarahyurick sarahyurick changed the title [WIP] Add GPU CI Nightly [WIP] Add RAPIDS Nightly to GPU CI Dec 17, 2024
@praateekmahajan praateekmahajan added gpuci Run GPU CI/CD on PR and removed gpuci Run GPU CI/CD on PR labels Dec 20, 2024
@ko3n1g
Collaborator

ko3n1g commented Dec 20, 2024

Ok, so here's my suggestion:

  • We run the gpuci workflow on a nightly schedule (via the GitHub workflows cron trigger).
  • We use use-cache: false to make sure we build from scratch with the latest dependencies (merge ci: Bump _build_container workflow #448 first).
  • It doesn't matter whether we build two containers or one container with two venvs. Initially I thought a single container would be easier for debugging, but two containers (like you already prepared) have the benefit that each container is smaller, which will come in handy next year.
  • Tip: if: ${{ github.event.label.name == 'gpuci' || github.ref == 'refs/heads/main' }} needs to be changed to if: ${{ github.event.label.name == 'gpuci' || github.ref == 'refs/heads/main' || github.event_name == 'schedule' }} (see the sketch below).
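
A rough sketch of how these pieces could fit together in the workflow file; the cron time, the trigger layout, and the _build_container.yml path are assumptions here, while the `if:` expression and use-cache come from the suggestion above:

```yaml
# Hypothetical trigger + job wiring; only the `if:` expression and the
# use-cache input come from the suggestion above, the rest is illustrative.
on:
  pull_request:
    types: [labeled]
  push:
    branches: [main]
  schedule:
    - cron: "0 6 * * *"  # nightly run; exact time is an assumption

jobs:
  gpuci:
    if: ${{ github.event.label.name == 'gpuci' || github.ref == 'refs/heads/main' || github.event_name == 'schedule' }}
    uses: ./.github/workflows/_build_container.yml
    with:
      # build from scratch with the latest dependencies on the scheduled run
      use-cache: ${{ github.event_name != 'schedule' }}
```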

Labels
gpuci Run GPU CI/CD on PR

3 participants