rishab-partha (Contributor)
This PR adds a Load Planner that renames the target modules of LoRA, so that a Composer checkpoint saved without LoRA can be loaded and used to train a LoRA model.

It also supports autoresume out of the box by leaving the checkpoint unmodified when LoRA is already in use.
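The core of such a Load Planner is a key-remapping step: when LoRA wraps a targeted module, the original weight typically moves under a `base_layer` submodule (as in PEFT-style wrapping), so non-LoRA checkpoint keys must be renamed to match the wrapped model's state dict. A minimal sketch of that mapping logic, assuming PEFT-style `base_layer` naming and a hypothetical `remap_lora_keys` helper (neither is taken from this PR's actual implementation):

```python
def remap_lora_keys(keys, target_modules):
    """Map non-LoRA checkpoint keys to their LoRA-wrapped equivalents.

    For each key whose parent module is a LoRA target, insert `base_layer`
    before the final parameter name (e.g. `weight`), mimicking PEFT-style
    wrapping. Keys that already contain `base_layer` are left untouched,
    which is what makes autoresume from a LoRA checkpoint work unchanged.
    """
    remapped = {}
    for key in keys:
        parts = key.split('.')
        # Already a LoRA-wrapped key: leave as-is (autoresume path).
        if 'base_layer' in parts:
            remapped[key] = key
        # Parent module is a LoRA target: insert `base_layer` before
        # the parameter name, e.g. `...q_proj.weight` ->
        # `...q_proj.base_layer.weight`.
        elif len(parts) >= 2 and parts[-2] in target_modules:
            remapped[key] = '.'.join(parts[:-1] + ['base_layer', parts[-1]])
        else:
            remapped[key] = key
    return remapped
```

In the actual Load Planner, a mapping like this would be applied while building the load plan, so the renamed checkpoint tensors resolve against the LoRA model's parameters without rewriting the checkpoint on disk.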
