
Allow for using the same metrics with different parameters (e.g. Recall@10 + Recall@20) #2760

Open
aurelien-clu opened this issue Feb 1, 2025 · 0 comments
Labels
enhancement Enhance existing features

Comments

@aurelien-clu

Note: this could be considered a bug if the behavior is already expected to be supported; otherwise it can be seen as a new feature (though raising an exception until it is implemented could be useful).

Feature description

Allow using the same metric several times with different parameters, e.g. Recall@10 + Recall@20.

Right now this does not work because the metric name does not take the parameters into account.

Metrics that could be updated:

  • FBetaScoreMetric
  • HammingScore
  • PrecisionMetric
  • RecallMetric
  • TopKAccuracyMetric
  • ... (I may have missed some others)
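
To make the collision concrete, here is a minimal self-contained sketch (with a hypothetical simplified RecallMetric type, not Burn's actual metric API): the displayed name comes from a static NAME constant, so two instances configured with different top_k values report under the same label.

// Minimal sketch of the problem, using a hypothetical simplified metric type.
struct RecallMetric {
    top_k: usize,
}

impl RecallMetric {
    const NAME: &'static str = "Recall";

    fn new(top_k: usize) -> Self {
        Self { top_k }
    }

    fn name(&self) -> String {
        // The configured parameters are ignored here, which is the root of the issue.
        Self::NAME.to_string()
    }
}

fn main() {
    let recall_10 = RecallMetric::new(10);
    let recall_20 = RecallMetric::new(20);
    assert_ne!(recall_10.top_k, recall_20.top_k);
    // Both names are "Recall", so the two entries collide when logged side by side.
    assert_eq!(recall_10.name(), recall_20.name());
}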

Feature motivation

For simple use cases, judging a model's performance on a single metric (or on several different metrics) is feasible. In my case, I can accept a worse Recall@5 if Recall@10 is much better, so I need to decide based on both metrics.

(Optional) Suggest a Solution

Current:

FormatOptions::new(Self::NAME).unit("%").precision(2)

Potential solution:

FormatOptions::new(format!("{} ({:?})", Self::NAME, self.config)).unit("%").precision(2)

The config does not implement Debug, so deriving it would probably be needed unless there are reasons not to.
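
As a hedged sketch of that direction (again with hypothetical RecallMetric and RecallMetricConfig types, not Burn's actual code), deriving Debug on the config and folding it into the formatted name keeps differently parameterized instances distinct:

// Sketch of the suggested fix: the config's Debug representation becomes part of the name.
#[derive(Debug)]
struct RecallMetricConfig {
    top_k: usize,
}

struct RecallMetric {
    config: RecallMetricConfig,
}

impl RecallMetric {
    const NAME: &'static str = "Recall";

    fn new(top_k: usize) -> Self {
        Self {
            config: RecallMetricConfig { top_k },
        }
    }

    fn name(&self) -> String {
        // Mirrors the `format!("{} ({:?})", Self::NAME, self.config)` suggestion above.
        format!("{} ({:?})", Self::NAME, self.config)
    }
}

fn main() {
    let recall_10 = RecallMetric::new(10);
    let recall_20 = RecallMetric::new(20);
    assert_ne!(recall_10.config.top_k, recall_20.config.top_k);
    // "Recall (RecallMetricConfig { top_k: 10 })" vs "Recall (RecallMetricConfig { top_k: 20 })"
    assert_ne!(recall_10.name(), recall_20.name());
}

The {:?} output is fairly verbose; the same approach could format only the distinguishing parameter (e.g. "Recall@10") if a shorter label is preferred.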

@laggui added the enhancement label on Feb 1, 2025