RUCBM
Repositories
- weak-to-strong-deception (Public)
  [ICLR 2025] Code & data for the paper "Super(ficial)-alignment: Strong Models May Deceive Weak Models in Weak-to-Strong Generalization"
- rule-distillation (Public)
  Code & data for the paper "Distilling Rule-based Knowledge into Large Language Models" [COLING 2025]
People
This organization has no public members.