[chore] BERTScore averaging
karter-liner committed Jan 1, 2024
1 parent 665a20e commit 9ac7210
Showing 2 changed files with 5 additions and 10 deletions.
10 changes: 0 additions & 10 deletions README.md
@@ -96,16 +96,6 @@ Answers based on SUMMARY (Questions are generated from Summary)

 QAGS Score: 0.3333333333333333
-
-SOURCE Triples
-('Messi', 'is', 'Argentine')
-('Messi', 'is', 'professional')
-
-SUMMARY Triples
-('Messi', 'is', 'Spanish')
-('Messi', 'is', 'professional')
-
-Triple Score: 0.5
 
 Avg. ROUGE-1: 0.4415584415584415
 Avg. ROUGE-2: 0.3287671232876712
 Avg. ROUGE-L: 0.4415584415584415
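The Triple Score shown in the removed README excerpt (0.5, with one of the two summary triples matching the source) is consistent with a simple overlap ratio. The snippet below is a sketch of that idea, not the commit's own code; the set-based matching and variable names are assumptions for illustration.

```python
# Hypothetical triple-overlap score: the fraction of summary triples that
# also appear among the source triples. With the triples shown above,
# only ('Messi', 'is', 'professional') matches, giving 1 / 2 = 0.5.
source_triples = {
    ("Messi", "is", "Argentine"),
    ("Messi", "is", "professional"),
}
summary_triples = {
    ("Messi", "is", "Spanish"),
    ("Messi", "is", "professional"),
}

triple_score = len(source_triples & summary_triples) / len(summary_triples)
print(triple_score)  # 0.5
```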
5 changes: 5 additions & 0 deletions factsumm/factsumm.py
@@ -362,6 +362,11 @@ def calculate_bert_score(
             scores["recall"] += recall.item()
             scores["f1"] += f1.item()
 
+        if len(summary_lines) > 1:
+            scores["precision"] /= len(summary_lines)
+            scores["recall"] /= len(summary_lines)
+            scores["f1"] /= len(summary_lines)
+
         logging.info("<BERTScore Score>\nPrecision: %s\nRecall: %s\nF1: %s", scores["precision"], scores["recall"], scores["f1"])
 
         return scores
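The fix above turns the accumulated per-line BERTScore sums into averages by dividing by the number of summary lines (the `len(...) > 1` guard is harmless, since dividing by 1 is a no-op). The following is a self-contained sketch of that averaging step; the `line_scores` tuples stand in for the per-line precision/recall/F1 values that `bert-score` would produce, and are made-up numbers for illustration.

```python
# Hypothetical per-line (precision, recall, f1) values; in factsumm these
# come from evaluating BERTScore on each line of the summary.
line_scores = [(0.9, 0.8, 0.85), (0.7, 0.6, 0.65)]

# Accumulate sums, mirroring the loop in calculate_bert_score.
scores = {"precision": 0.0, "recall": 0.0, "f1": 0.0}
for precision, recall, f1 in line_scores:
    scores["precision"] += precision
    scores["recall"] += recall
    scores["f1"] += f1

# Averaging as introduced by this commit: divide each sum by the
# number of summary lines so multi-line summaries report a mean score.
if len(line_scores) > 1:
    for key in scores:
        scores[key] /= len(line_scores)

print(scores)  # averaged precision/recall/f1 over the two lines
```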
