Closed
Labels: models, [Component] Issues related to model support
Description
Describe the Bug:
When sending a message with the string <script> anywhere in the content, LiteLLM returns a 403 Forbidden response. This happens consistently regardless of client: the Python OpenAI SDK, Cline, and OpenWebUI.
Steps to Reproduce:
Python OpenAI Example:
import openai

client = openai.OpenAI(
    api_key="sk-xxxxxxxxxxxx",
    base_url="https://my-litellm.com"
)

# Example with text only
response = client.chat.completions.create(
    model="gpt-5-mini",
    messages=[
        {"role": "user", "content": "hi"},
        {"role": "assistant", "content": "Hello! How can I help you today?"},
        {"role": "user", "content": "<script> test </script>"}
    ]
)
print(response)
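A possible temporary workaround, assuming (unconfirmed) that the block is triggered by the literal tag appearing in the request body, is to HTML-escape user content before sending. Note that this changes the exact text the model sees:

```python
import html

# Assumption: a filter in front of LiteLLM matches the literal "<script>" tag.
# Escaping keeps the raw tag out of the request body.
content = "<script> test </script>"
escaped = html.escape(content)
print(escaped)  # -> &lt;script&gt; test &lt;/script&gt;
```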
Expected Behavior:
A normal completion response, continuing the previously working conversation.
Observed Behavior:
From Python example above:
Traceback (most recent call last):
File "/Users/user/dev/python-test/test.py", line 16, in <module>
response = client.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/user/dev/python-test/.venv/lib/python3.12/site-packages/openai/_utils/_utils.py", line 286, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/Users/user/dev/python-test/.venv/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 1192, in create
return self._post(
^^^^^^^^^^^
File "/Users/user/dev/python-test/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1294, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/user/dev/python-test/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1067, in request
raise self._make_status_error_from_response(err.response) from None
openai.PermissionDeniedError: <html>
<head><title>403 Forbidden</title></head>
<body>
<center><h1>403 Forbidden</h1></center>
</body>
</html>
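The response body is server-style HTML rather than a JSON error from LiteLLM itself, which suggests (an assumption, not confirmed) that a reverse proxy or WAF in front of LiteLLM is rejecting the request. A minimal sketch showing that the literal tag survives JSON serialization, so any filter inspecting the request body would see it:

```python
import json

# The payload the SDK serializes; the raw "<script>" tag is present verbatim
# in the JSON body, where a body-inspecting WAF/XSS rule could match it.
payload = {
    "model": "gpt-5-mini",
    "messages": [{"role": "user", "content": "<script> test </script>"}],
}
body = json.dumps(payload)
print("<script>" in body)  # -> True
```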
The same error occurs in OpenWebUI, using LiteLLM as the OpenAI API connection (screenshot not recovered).
Environment Details:
- LiteLLM Version: v1.81.3
- OpenAI Python SDK: 2.16.0
- Python Version (python -V): 3.12.11
Model Information:
- Are you using LiteLLM: Yes
- Which model is being used: All models (GPT 5.x, Claude, Gemini)
How often has this issue occurred?:
- Always (100%)