
id4thomas/prefix-tuning-kor


prefix-tuning-kor

Prefix Tuning for Korean LMs

Environment

  • transformers==4.19.2
  • torch==1.14.0.dev20221029
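The pins above can be captured in a requirements file for reproducibility. Note that `torch==1.14.0.dev20221029` is a nightly build and is not on PyPI; the nightly index URL shown in the comment is an illustration, not taken from this repo:

```
# requirements.txt (illustrative)
transformers==4.19.2
# nightly build; typically installed from the PyTorch nightly index, e.g.:
#   pip install --pre torch==1.14.0.dev20221029 \
#     --extra-index-url https://download.pytorch.org/whl/nightly/cu117
torch==1.14.0.dev20221029
```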

Model Code

Guide

git clone https://github.com/HKUNLP/UnifiedSKG.git
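In prefix tuning, the base LM stays frozen and only a small set of per-layer key/value prefix vectors is trained; these are fed to the model through the `past_key_values` interface, where GPT-NeoX expects one `(key, value)` pair per layer shaped `[batch, num_heads, prefix_len, head_dim]`. A minimal shape sketch (pure Python; the 24-layer / 16-head / 128-dim numbers are illustrative values for a polyglot-ko-1.3b-sized model, not read from its config):

```python
def prefix_past_shapes(num_layers, num_heads, head_dim, prefix_len, batch_size):
    """Shapes of the trainable prefix passed as past_key_values.

    Each layer receives a (key, value) pair shaped
    [batch, num_heads, prefix_len, head_dim]; only these prefix
    tensors are trained -- the LM weights stay frozen.
    """
    shape = (batch_size, num_heads, prefix_len, head_dim)
    return [(shape, shape) for _ in range(num_layers)]

# Illustrative 1.3B-scale dimensions with a prefix of length 10
shapes = prefix_past_shapes(num_layers=24, num_heads=16, head_dim=128,
                            prefix_len=10, batch_size=1)
print(len(shapes), shapes[0][0])  # → 24 (1, 16, 10, 128)
```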

Training Example

  • Trained a 1.3B GPT-NeoX model ("EleutherAI/polyglot-ko-1.3b") on open chatbot data
  • 0_data_preprocess.ipynb
    • Splits the data into train/validation/test sets
  • 1_train_gpt_neox.ipynb
    • Trains the prefix weights with the Hugging Face Trainer
  • 2_generate_gptneox.ipynb
    • Generation example (comparison against the non-prefix baseline)
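The split step in 0_data_preprocess.ipynb can be sketched as a shuffled ratio split (the 80/10/10 ratios and fixed seed below are assumptions for illustration, not the notebook's actual values):

```python
import random

def train_val_test_split(examples, val_ratio=0.1, test_ratio=0.1, seed=42):
    """Shuffle a list of examples and split it into train/val/test."""
    items = list(examples)
    random.Random(seed).shuffle(items)  # deterministic shuffle for reproducibility
    n = len(items)
    n_test = int(n * test_ratio)
    n_val = int(n * val_ratio)
    test = items[:n_test]
    val = items[n_test:n_test + n_val]
    train = items[n_test + n_val:]
    return train, val, test

train, val, test = train_val_test_split(range(100))
print(len(train), len(val), len(test))  # → 80 10 10
```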
