
Sending system_prompt to LLM even if it's empty #1263

Open
maziyarpanahi opened this issue Jun 10, 2024 · 0 comments

@maziyarpanahi

Hi,

I have access to a vLLM endpoint that is serving a Mixtral model. Similar to the discussion in #1127, I am seeing this error:

ERROR 06-10 20:48:19 serving_chat.py:158] Error in applying chat template from request: Conversation roles must alternate user/assistant/user/assistant/...

However, that only seems to be a workaround: my system prompt is empty. If there is nothing in the system prompt, it should not be sent to vLLM at all.

Steps to reproduce:

  • Launch vLLM with a Mixtral or Mistral model
  • Set the endpoint type to openai
  • Try to have a chat

This works without any issue in TGI, however.
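For reference, here is a minimal sketch (not the project's actual code) of the behavior being requested: only include a system message in the request when the prompt is non-empty, instead of sending `{"role": "system", "content": ""}`. The base URL and model name below are placeholders for a local vLLM OpenAI-compatible server; older Mistral/Mixtral chat templates do not accept a system turn, which can surface as the "Conversation roles must alternate user/assistant/..." error shown above.

```python
from openai import OpenAI

# Placeholder values: adjust base_url and model to match your vLLM deployment.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")


def build_messages(system_prompt: str, user_prompt: str) -> list[dict]:
    """Only include a system message when the system prompt is non-empty."""
    messages = []
    if system_prompt and system_prompt.strip():
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": user_prompt})
    return messages


# With an empty system prompt, no system message is sent at all.
resp = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=build_messages(system_prompt="", user_prompt="Hello!"),
)
print(resp.choices[0].message.content)
```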
