Are you using the C4 calibration dataset or RedPajama? In my experience, using the RedPajama calibration dataset can give numbers as high as around 7.10.
I ran `bash scripts/llama_7b.sh`. The dense source model gets a wikitext perplexity of 5.67702. After pruning this model to 50% sparsity, I get a wikitext perplexity of 7.09153509, but the paper reports 7.26 for 50% sparsity. Why the difference?
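For reference, the wikitext perplexity both numbers refer to is just the exponential of the mean per-token negative log-likelihood over the evaluation set, so small differences in the evaluation or calibration setup shift it directly. A minimal sketch of the metric (the helper name is illustrative, not from this repo):

```python
import math

def perplexity(token_nlls):
    """Perplexity = exp of the mean per-token negative log-likelihood."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# If every token had NLL = ln(7.09153509), the perplexity would be
# exactly the pruned-model number quoted above.
nlls = [math.log(7.09153509)] * 100
print(perplexity(nlls))
```

Because the exponential amplifies mean-NLL differences, even a different set of 128 calibration samples (C4 vs. RedPajama) can move the final number by a tenth of a point or more, which is consistent with the 7.09 vs. 7.26 gap here.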