vicuna #13
Conversation
Based on the vicuna format from https://huggingface.co/junelee/wizard-vicuna-13b/discussions/1
@aglick24 thanks for the contribution! It looks like there is some dispute about the precise format, though: https://huggingface.co/junelee/wizard-vicuna-13b/discussions/1#6453c515b8c58783d662bc05 Also, I think you will need to import it in the `__init__.py`, like so: https://huggingface.co/junelee/wizard-vicuna-13b/discussions/1#6453c515b8c58783d662bc05
Thanks @anjor! What do you think about the format on lines 393-404 of https://github.com/lm-sys/FastChat/blob/main/fastchat/conversation.py? And if so, should I include the separator?
This seems to me to be an example of what that conversation template should look like (using (SEP) in place of the actual separator, since it disappears in the comment thread): "A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions. USER: What is ChatGPT? ASSISTANT: ChatGPT is an advanced language model developed by OpenAI. It's designed to understand and generate natural language, so it can converse with users, answer questions, and even assist with various tasks.(SEP)USER: So, it's like a super-advanced chatbot? ASSISTANT: Yes, but it can engage in broader conversations and more complex reasoning.(SEP)"
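To make the template concrete, here is a rough sketch (not the PR's actual implementation) of parsing a vicuna-format prompt like the one above into ChatML-style messages. The separator is assumed to be the EOS token "</s>", matching FastChat's vicuna_v1.1 template, and the system prompt and role labels are taken from the example in this thread.

```python
# Sketch only: SEPARATOR and the role labels are assumptions based on
# the example format discussed in this thread, not the merged code.
SEPARATOR = "</s>"  # written as (SEP) in the comment above

SYSTEM_PROMPT = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the "
    "user's questions."
)

def to_chatml(prompt: str) -> list:
    """Convert a vicuna-format prompt string to a list of ChatML messages."""
    messages = []
    body = prompt
    if body.startswith(SYSTEM_PROMPT):
        messages.append({"role": "system", "content": SYSTEM_PROMPT})
        body = body[len(SYSTEM_PROMPT):].lstrip()
    # Each exchange is "USER: ... ASSISTANT: ..." terminated by SEPARATOR.
    for exchange in filter(None, body.split(SEPARATOR)):
        user_part, _, assistant_part = exchange.partition("ASSISTANT:")
        user_text = user_part.replace("USER:", "", 1).strip()
        if user_text:
            messages.append({"role": "user", "content": user_text})
        if assistant_part.strip():
            messages.append({"role": "assistant", "content": assistant_part.strip()})
    return messages
```

Running this over the example prompt in the comment above would yield one system message followed by alternating user/assistant messages.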
Following up, @anjor
pychatml/vicuna_v1_1/vicuna_v1_1.py (Outdated)
return SEPARATOR.join(chat_history)

if __name__ == "__main__":
Let's remove this. The test is good enough.
Yep, sorry, I forgot about it!
@aglick24 let us know if there is anything we can help with to get this over the line and merged!
Thanks @koogle for merging main! Sorry about the delay; I assumed we were waiting on those changes to land in main before I kept going. Looking through it now to update it to the new format.
tests/test_llama2.py (Outdated)
@@ -8,7 +8,7 @@
If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
<</SYS>>

Hi, how are you? [/INST] Good thanks!
Is that change intentional?
Nope, thanks for catching that! The autoformatter strikes again. I just reverted all of the Llama2 changes.
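For context, the Llama-2 prompt layout exercised by that test fixture can be sketched as follows. This is a simplified sketch using the commonly documented [INST] and <<SYS>> tag strings; BOS/EOS token handling is omitted, and the function name is just for illustration.

```python
from typing import Optional

# Tag strings from the widely documented Llama-2 chat format; this is a
# simplified illustration, not the pychatml implementation.
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def build_llama2_prompt(system: str, user: str, assistant: Optional[str] = None) -> str:
    # The system prompt is wrapped in <<SYS>> tags inside the first [INST] block.
    prompt = f"{B_INST} {B_SYS}{system}{E_SYS}{user} {E_INST}"
    if assistant is not None:
        # Completed turns append the assistant reply after [/INST].
        prompt += f" {assistant}"
    return prompt
```

Note how the fixture line above ("Hi, how are you? [/INST] Good thanks!") follows this shape: the user message closes with [/INST] and the assistant reply comes after it.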
Thanks so much for the approval @koogle! Unfortunately, I need write access to merge.
Okay, in that case I will merge it. Thank you for the contribution!
Added a to_chatml function for vicuna and tested the fork; will next do a from_chatml function.
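The from_chatml counterpart mentioned here had not been written at this point. A hypothetical sketch of that inverse direction (ChatML-style messages back into a single vicuna prompt string), using the same assumed separator and role labels as the example earlier in this thread, might look like:

```python
# Hypothetical sketch of the planned from_chatml for vicuna; SEPARATOR
# and the role labels are assumptions from this thread, not merged code.
SEPARATOR = "</s>"

def from_chatml(messages: list) -> str:
    """Render ChatML-style messages as a vicuna-format prompt string."""
    parts = []
    for msg in messages:
        if msg["role"] == "system":
            parts.append(msg["content"] + " ")
        elif msg["role"] == "user":
            parts.append(f"USER: {msg['content']} ")
        elif msg["role"] == "assistant":
            # Assistant turns are terminated by the separator token.
            parts.append(f"ASSISTANT: {msg['content']}{SEPARATOR}")
    return "".join(parts)
```

A round trip through to_chatml and from_chatml should reproduce the original prompt, which is a natural property to assert in the tests.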