P-tuning

❗ News

🌟 [2022-10-06] Thrilled to present GLM-130B: An Open Bilingual Pre-trained Model, an open-source LLM that outperforms GPT-3 175B on various benchmarks. Get the model weights and run inference and P-Tuning with only 4 × RTX 3090 or 8 × RTX 2080 Ti, for free!

🌟 [2022-07-14] Parameter-Efficient Prompt Tuning Makes Generalized and Calibrated Neural Text Retrievers is out! Check out our code.

🌟 [2021-10-15] P-tuning v2 is out! Check out our GitHub repo.

A novel method to tune language models. Code and datasets for the paper "GPT Understands, Too".

Xiao Liu*, Yanan Zheng*, Zhengxiao Du, Ming Ding, Yujie Qian, Zhilin Yang, Jie Tang

You may also be interested in our other work, GLM: All NLP Tasks Are Generation Tasks: A General Pretraining Framework.

How to use our code

We have released the code and datasets for LAMA and few-shot SuperGLUE (32-dev) experiments. Please check README.md and requirement.txt in the corresponding subdirectories for details.

The LAMA and FewGLUE_32dev datasets are available. Place the LAMA dataset in the ./data directory and the FewGLUE_32dev (few-shot SuperGLUE) dataset in ./ (the project root), as sketched below.
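For quick reference, here is a minimal sketch of the expected layout. The inner structure (e.g. whether the LAMA files sit directly in ./data or in a subfolder) depends on how the downloaded archives extract, so treat the folder names below as assumptions:

    P-tuning/              # project root (./)
    ├── FewGLUE_32dev/     # few-shot SuperGLUE (32-dev) data, placed in the project root
    └── data/              # LAMA data, placed under ./data
        └── ...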

Citation

If you find our work useful, please cite the following paper:

    @article{liu2021gpt,
      title={GPT Understands, Too},
      author={Liu, Xiao and Zheng, Yanan and Du, Zhengxiao and Ding, Ming and Qian, Yujie and Yang, Zhilin and Tang, Jie},
      journal={arXiv:2103.10385},
      year={2021}
    }
