[COLING 2025] SKIntern: Internalizing Symbolic Knowledge for Distilling Better CoT Capabilities into Small Language Models

SKIntern

🛠 Requirements

Install LLaMA-Factory by following the instructions in the LLaMA-Factory repository:

cd SKIntern
pip install -e ".[torch,metrics]"

💡 Data

  • Download the datasets from official websites.

  • Alternatively, download from Google Drive (we unified the formats of the above datasets): Link

🤝 Referencing and Citing

If you find our work useful in your research, please consider giving this repository a star and citing our paper as follows:

@inproceedings{liao2025skintern,
  title={SKIntern: Internalizing Symbolic Knowledge for Distilling Better CoT Capabilities into Small Language Models},
  author={Liao, Huanxuan and He, Shizhu and Hao, Yupu and Li, Xiang and Zhang, Yuanzhe and Zhao, Jun and Liu, Kang},
  booktitle={Proceedings of the 31st International Conference on Computational Linguistics},
  pages={3203--3221},
  year={2025}
}
