From 25b8b8574c62667bd3ee19ec983c6e2a667b5dc4 Mon Sep 17 00:00:00 2001
From: Lxinyang <68582965+littleSunlxy@users.noreply.github.com>
Date: Mon, 19 Feb 2024 11:44:15 +0800
Subject: [PATCH] Update MobileLLaMA_SFT.md

update the detail description of MobileLLaMA SFT
---
 mobilellama/sft/MobileLLaMA_SFT.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/mobilellama/sft/MobileLLaMA_SFT.md b/mobilellama/sft/MobileLLaMA_SFT.md
index 13206c2..b55fded 100644
--- a/mobilellama/sft/MobileLLaMA_SFT.md
+++ b/mobilellama/sft/MobileLLaMA_SFT.md
@@ -1,7 +1,7 @@
 # MobileLLaMA SFT
 
 ## 🛠️ Installation
-Our MobileLLaMA SFT based on [FastChat](https://github.com/lm-sys/FastChat) (commit id: 81785d7ed1d6afb966b464a8ee4689b7413e6313)
+Our MobileLLaMA SFT training code is based on [FastChat](https://github.com/lm-sys/FastChat) (commit id: 81785d7ed1d6afb966b464a8ee4689b7413e6313)
 
 ### Install From Source.
 1. Clone the [FastChat](https://github.com/lm-sys/FastChat) repository and navigate to the FastChat folder
@@ -27,7 +27,7 @@ You can download MobileLLaMA-1.4B-Base / MobileLLaMA-2.7B-Base model from huggin
 ## Dataset
 We use the sft dataset in Vicuna fromat can be download from link: [ShareGPT_Vicuna_dataset](https://huggingface.co/datasets/Aeala/ShareGPT_Vicuna_unfiltered), and follow the steps:
 1. download the [json](https://huggingface.co/datasets/Aeala/ShareGPT_Vicuna_unfiltered/blob/main/ShareGPT_V4.3_unfiltered_cleaned_split.json) file to local data path.
-2. white the correct "--data_path" in the training script.
+2. write the correct "--data_path" in your SFT training scripts.
 
 ## 💎 Training
 Our training process can be reproduced by runing the scrips:
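
For context on the `--data_path` flag the updated README mentions: it is passed to FastChat's fine-tuning entry point. The sketch below is illustrative only, not part of this patch and not MobileLLaMA's released training script; it assumes FastChat's `fastchat/train/train_mem.py` entry point at the pinned commit, and all paths and hyperparameter values are placeholders.

```bash
# Illustrative FastChat-style SFT launch; model/data/output paths and
# hyperparameters are placeholders, not MobileLLaMA's released settings.
torchrun --nproc_per_node=8 --master_port=20001 fastchat/train/train_mem.py \
    --model_name_or_path /path/to/MobileLLaMA-1.4B-Base \
    --data_path /path/to/ShareGPT_V4.3_unfiltered_cleaned_split.json \
    --bf16 True \
    --output_dir ./checkpoints/mobilellama-1.4b-sft \
    --num_train_epochs 3 \
    --per_device_train_batch_size 2 \
    --gradient_accumulation_steps 16 \
    --learning_rate 2e-5 \
    --lr_scheduler_type cosine \
    --warmup_ratio 0.03 \
    --model_max_length 2048 \
    --gradient_checkpointing True \
    --lazy_preprocess True
```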