Hello, I have tried to use the distillation script you provided (examples/llm_distill/main.py) to distill an LLM. However, I found that only the soft-label loss is used in the script.
I would like to ask how to modify the code so that the soft-label loss can be combined with the student model's original loss on the hard labels.
It would be of great help to me if you could reply! Thank you.
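(For reference, a common way to combine the two losses is a weighted sum of the temperature-scaled KL divergence against the teacher and the standard cross-entropy against the ground-truth labels. The sketch below is illustrative, assuming PyTorch logits; the function name and the `temperature`/`alpha` hyperparameters are my own, not from the repository's script.)

```python
import torch
import torch.nn.functional as F

def combined_distill_loss(student_logits, teacher_logits, labels,
                          temperature=2.0, alpha=0.5):
    """Weighted sum of the soft-label (KD) loss and the hard-label (CE) loss.

    `temperature` and `alpha` are illustrative hyperparameters; tune them
    for your setup.
    """
    # Soft-label loss: KL divergence between temperature-scaled distributions,
    # rescaled by T^2 so its gradient magnitude matches the hard-label loss.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    # Hard-label loss: standard cross-entropy against the ground-truth tokens.
    hard_loss = F.cross_entropy(student_logits, labels)

    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

With `alpha=1.0` this reduces to the pure soft-label loss the script already computes, so you could wire it in where the script's current distillation loss is returned.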