Update README.md
zml-ai committed Jun 14, 2024
1 parent 6edee23 commit 661ca38
Showing 1 changed file with 9 additions and 2 deletions.
README.md
@@ -374,17 +374,24 @@ All models will be automatically downloaded. For more information about the mode
To leverage DeepSpeed in training, you have the flexibility to control **single-node** / **multi-node** training by adjusting parameters such as `--hostfile` and `--master_addr`. For more details, see [link](https://www.deepspeed.ai/getting-started/#resource-configuration-multi-node).

```shell
# Single Resolution Training
PYTHONPATH=./ sh hydit/train.sh --index-file dataset/porcelain/jsons/porcelain.json
# Multi Resolution Training
PYTHONPATH=./ sh hydit/train.sh --index-file dataset/porcelain/jsons/porcelain.json --multireso --reso-step 64
```
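For multi-node runs, DeepSpeed reads the participating machines from a hostfile. The sketch below is a minimal, untested example: it assumes `hydit/train.sh` forwards `--hostfile` and `--master_addr` to the DeepSpeed launcher (as described in the DeepSpeed resource-configuration docs linked above), and the hostnames, slot counts, and IP address are placeholders.

```shell
# Hypothetical hostfile listing two 8-GPU workers (DeepSpeed hostfile format: "<hostname> slots=<num_gpus>").
cat > hostfile <<'EOF'
worker-1 slots=8
worker-2 slots=8
EOF

# Multi-node, multi-resolution training; assumes train.sh passes --hostfile and
# --master_addr through to the DeepSpeed launcher.
# Replace 10.0.0.1 with the reachable address of the rank-0 node.
PYTHONPATH=./ sh hydit/train.sh --index-file dataset/porcelain/jsons/porcelain.json \
    --multireso --reso-step 64 \
    --hostfile ./hostfile --master_addr 10.0.0.1
```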

### LoRA

We provide training and inference scripts for LoRA; see the [guidance](./lora/README.md) for details.

```shell
# Training for porcelain LoRA.
PYTHONPATH=./ sh lora/train_lora.sh --index-file dataset/porcelain/jsons/porcelain.json
# Inference with the trained LoRA weights. The prompt means "blue-and-white porcelain style, a puppy".
python sample_t2i.py --prompt "青花瓷风格,一只小狗" --no-enhance --lora_ckpt log_EXP/001-lora_porcelain_ema_rank64/checkpoints/0001000.pt
```

## 🔑 Inference

