Incorrect evaluation metrics #1

Open
mohayl opened this issue Jan 19, 2023 · 0 comments
mohayl commented Jan 19, 2023

In the evaluation of the binary metrics, `precision_score`, `recall_score`, and `f1_score` are called with the labels and predictions in the wrong order:

```python
def compute_binary_metrics(anomaly_pred, anomaly_label, adjustment=False):
    if not adjustment:
        eval_anomaly_pred = anomaly_pred
        metrics = {
            "f1": f1_score(eval_anomaly_pred, anomaly_label),
            "pc": precision_score(eval_anomaly_pred, anomaly_label),
            "rc": recall_score(eval_anomaly_pred, anomaly_label),
        }
```

It should be:
```python
def compute_binary_metrics(anomaly_pred, anomaly_label, adjustment=False):
    if not adjustment:
        eval_anomaly_pred = anomaly_pred
        metrics = {
            "f1": f1_score(anomaly_label, eval_anomaly_pred),
            "pc": precision_score(anomaly_label, eval_anomaly_pred),
            "rc": recall_score(anomaly_label, eval_anomaly_pred),
        }
```
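
For context, scikit-learn's metric functions expect `(y_true, y_pred)`. Swapping the arguments exchanges the reported precision and recall (F1 happens to be unaffected, since it is symmetric in the two). A minimal sketch, using made-up labels and predictions, illustrates the effect:

```python
from sklearn.metrics import f1_score, precision_score, recall_score

# Illustrative data only (not from the repository).
y_true = [0, 0, 0, 1, 1, 1, 1, 1]
y_pred = [0, 1, 1, 1, 1, 0, 0, 0]

# Correct order: (y_true, y_pred)
print(precision_score(y_true, y_pred))  # 0.5   (TP=2, FP=2)
print(recall_score(y_true, y_pred))     # 0.4   (TP=2, FN=3)
print(f1_score(y_true, y_pred))         # ~0.444

# Swapped order: precision and recall trade places
print(precision_score(y_pred, y_true))  # 0.4
print(recall_score(y_pred, y_true))     # 0.5
print(f1_score(y_pred, y_true))         # ~0.444 (F1 is symmetric under the swap)
```

So the reported precision ("pc") and recall ("rc") values from the current code are effectively interchanged.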
