Add is best model to training servicer #229

Merged

Conversation

@thodkatz (Collaborator) commented Dec 13, 2024

It builds upon #228.

Whenever we have a new best model, the IsBestModel stream will yield a response. The client can use this to perform certain actions, e.g. ilastik can propagateDirty any predictions computed with previous models.
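
A rough client-side sketch of consuming this stream; the module path, stub name, and request type below are assumptions rather than the exact generated identifiers:

```python
import grpc

from tiktorch.proto import training_pb2_grpc  # assumed module path


def watch_best_model(address: str, request) -> None:
    """Consume the server-streaming IsBestModel RPC and react to each new best model."""
    with grpc.insecure_channel(address) as channel:
        stub = training_pb2_grpc.TrainingStub(channel)  # assumed stub name
        # Server-streaming call: each yielded response signals a new best model.
        for response in stub.IsBestModel(request):
            # A client such as ilastik could propagateDirty any predictions
            # computed with a previous model at this point.
            print("new best model available:", response)
```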

@thodkatz force-pushed the add-is-best-model-to-training-servicer branch 3 times, most recently from b60fb5f to 90b4a5e on December 19, 2024 11:15
@thodkatz force-pushed the add-is-best-model-to-training-servicer branch from 90b4a5e to 811ddd3 on December 20, 2024 22:04

codecov bot commented Dec 20, 2024

Codecov Report

Attention: Patch coverage is 62.19512% with 372 lines in your changes missing coverage. Please review.

Project coverage is 62.83%. Comparing base (5ea5d3a) to head (14f81af).

| Files with missing lines | Patch % | Lines |
|---|---|---|
| tiktorch/trainer.py | 42.53% | 127 Missing ⚠️ |
| tiktorch/proto/training_pb2_grpc.py | 54.78% | 52 Missing ⚠️ |
| tiktorch/server/session/backend/supervisor.py | 68.86% | 52 Missing ⚠️ |
| tiktorch/proto/training_pb2.py | 27.02% | 27 Missing ⚠️ |
| tiktorch/proto/utils_pb2.py | 30.00% | 21 Missing ⚠️ |
| tiktorch/proto/inference_pb2.py | 20.00% | 20 Missing ⚠️ |
| tiktorch/server/session/backend/commands.py | 78.40% | 19 Missing ⚠️ |
| tiktorch/server/session/backend/base.py | 66.66% | 17 Missing ⚠️ |
| tiktorch/server/session/process.py | 72.34% | 13 Missing ⚠️ |
| tiktorch/server/session/rpc_interface.py | 71.42% | 10 Missing ⚠️ |
| ... and 5 more | | |
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #229      +/-   ##
==========================================
- Coverage   64.60%   62.83%   -1.78%     
==========================================
  Files          40       47       +7     
  Lines        2195     2876     +681     
==========================================
+ Hits         1418     1807     +389     
- Misses        777     1069     +292     


@thodkatz force-pushed the add-is-best-model-to-training-servicer branch 2 times, most recently from b10d4f9 to 14f81af on December 21, 2024 11:15
@thodkatz force-pushed the add-is-best-model-to-training-servicer branch from 14f81af to 3d62b13 on January 20, 2025 13:28
Whenever we have a new model, a stream will yield a response. The client
can use this to perform certain actions, e.g. ilastik can propagateDirty
any predictions computed with previous models.
@thodkatz force-pushed the add-is-best-model-to-training-servicer branch from 3d62b13 to c5b8bf6 on January 20, 2025 13:38
The response of the best model stream returns an id, incremented by one
each time a new best model is available. A client can tell whether an
action was performed with an outdated model by comparing ids: if the
current id is greater, a newer best model exists.
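
A minimal sketch of this id comparison; the `id` field name on the streamed response is an assumption:

```python
# Track the id of the newest best model as reported by the stream.
latest_best_model_id = 0


def on_new_best_model(response) -> None:
    """Record the id of the newest best model (ids increase by one per new model)."""
    global latest_best_model_id
    latest_best_model_id = response.id  # assumed field name


def is_outdated(model_id_used: int) -> bool:
    # If the current best-model id is greater than the id an action used,
    # a newer best model exists and that action's result is stale.
    return latest_best_model_id > model_id_used
```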
@thodkatz force-pushed the add-is-best-model-to-training-servicer branch from c5b8bf6 to 44bc634 on January 20, 2025 13:41

@k-dominik (Collaborator) left a comment

Hello @thodkatz,
this is great, I don't think it needs any changes 👍

Edit: the test failure on macOS is a bit surprising (but then maybe not) and needs investigation.

@thodkatz merged commit 15a0225 into ilastik:main on Jan 20, 2025
6 checks passed