
[Bug]: Azure File Attachment Missing When Adding to Thread using Threads Proxy Endpoint #7896

Open
Vinnie-Singleton-NN opened this issue Jan 21, 2025 · 0 comments
Labels: bug, mlops user request

What happened?

When using Azure to add a message to a thread via the /threads or /v1/threads endpoints and referencing an already uploaded file, the attachments section of the message comes back empty, but no error is thrown. For example, the following code successfully adds the message "What is my favorite color?" to the thread but does not attach the file that contains the information.

import os
import requests

def add_message_to_thread(thread_id,
                          message,
                          file_id=None):
    # OpenAI-compatible threads endpoint exposed by the LiteLLM proxy
    url = f'http://0.0.0.0:4000/v1/threads/{thread_id}/messages'
    headers = {
        'Content-Type': 'application/json',
        'Authorization': f'Bearer {os.getenv("LITELLM_API_KEY")}'
    }

    # Message payload referencing the already uploaded file via "attachments"
    data = {
        "role": "user",
        "content": message,
        "attachments": [
            {
                "file_id": file_id,
                "tools": [{"type": "file_search"}]
            }
        ],
    }
    print("Data to send:", data)

    response = requests.post(url, headers=headers, json=data)
    print(response)

add_message_to_thread(thread_id="my-thread-id",
                      message="What is my favorite color?",
                      file_id="my-file-id")

Result:

{
  "id": "my-id",
  "assistant_id": null,
  "attachments": [], # <- Empty
  "completed_at": null,
  "content": [
    {
      "text": {
        "annotations": [],
        "value": "What is my favorite color?"
      },
      "type": "text"
    }
  ],
  "created_at": creation-time-stamp,
  "incomplete_at": null,
  "incomplete_details": null,
  "metadata": {},
  "object": "thread.message",
  "role": "user",
  "run_id": null,
  "status": "completed",
  "thread_id": "my-thread-id"
}
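
For context, the file_id referenced above comes from a file that was uploaded ahead of time. A minimal sketch of such an upload, assuming the proxy's OpenAI-compatible /v1/files endpoint was used, could look like the following; the filename and the "assistants" purpose are assumptions for illustration:

import os
import requests

# Upload a file through the proxy so it can later be attached to thread messages.
# "my_colors.txt" and purpose="assistants" are placeholders, not values from the report.
upload_url = 'http://0.0.0.0:4000/v1/files'
headers = {'Authorization': f'Bearer {os.getenv("LITELLM_API_KEY")}'}

with open('my_colors.txt', 'rb') as f:
    response = requests.post(
        upload_url,
        headers=headers,
        files={'file': f},
        data={'purpose': 'assistants'},
    )

file_id = response.json()['id']  # this id is what gets passed as file_id above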

The debug information in the logs references the file, which makes it even stranger that it isn't being attached.

If I use the same code against the Azure passthrough endpoint instead, the file is attached.

http://0.0.0.0:4000/azure/openai/threads/{thread_id}/messages?api-version={version_id}
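
For reference, a minimal sketch of that same request against the passthrough route (the function name and the api_version parameter are mine; the headers and payload mirror the code above):

def add_message_via_passthrough(thread_id, message, file_id, api_version):
    # Azure passthrough route exposed by the LiteLLM proxy; api_version is the
    # Azure OpenAI api-version string, left as a caller-supplied placeholder here.
    url = (f'http://0.0.0.0:4000/azure/openai/threads/{thread_id}/messages'
           f'?api-version={api_version}')
    headers = {
        'Content-Type': 'application/json',
        'Authorization': f'Bearer {os.getenv("LITELLM_API_KEY")}'
    }
    data = {
        "role": "user",
        "content": message,
        "attachments": [{"file_id": file_id, "tools": [{"type": "file_search"}]}],
    }
    return requests.post(url, headers=headers, json=data)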

Result:

{
  "id": "my-id",
  "object": "thread.message",
  "created_at": creation-time-stamp,
  "assistant_id": null,
  "thread_id": "my-thread-id",
  "run_id": null,
  "role": "user",
  "content": [
    {
      "type": "text",
      "text": {
        "value": "What is my favorite color?",
        "annotations": []
      }
    }
  ],
  "attachments": [
    {
      "file_id": "my-file-id", # <- Exists
      "tools": [
        {
          "type": "file_search"
        }
      ]
    }
  ],
  "metadata": {}
}

Can someone look into why this endpoint is not attaching files to messages in threads when using LiteLLM as a proxy?

Relevant log output

16:09:45 - LiteLLM Proxy:DEBUG: litellm_pre_call_utils.py:604 - [PROXY] returned data from litellm_pre_call_utils: {'role': 'user', 'content': 'What is my favorite color?', 'attachments': [{'file_id': 'my-file-id', 'tools': [{'type': 'file_search'}]}], 'proxy_server_request': {'url': 'http://0.0.0.0:4000/v1/threads/my-thread-id/messages', 'method': 'POST', 'headers': {'host': '0.0.0.0:4000', 'user-agent': 'python-requests/2.31.0', 'accept-encoding': 'gzip, deflate', 'accept': '*/*', 'connection': 'keep-alive', 'content-type': 'application/json', 'content-length': '161'}, 'body': {'role': 'user', 'content': 'What is my favorite color?', 'attachments': [{'file_id': 'my-file-id', 'tools': [{'type': 'file_search'}]}]}}, 'litellm_metadata': {'user_api_key_hash': '[REDACTED]', 'user_api_key_alias': None, 'user_api_key_team_id': None, 'user_api_key_user_id': 'default_user_id', 'user_api_key_org_id': None, 'user_api_key_team_alias': None, 'user_api_key_end_user_id': None, 'user_api_key': '[REDACTED]', 'user_api_end_user_max_budget': None, 'litellm_api_version': '1.58.0', 'global_max_parallel_requests': None, 'user_api_key_team_max_budget': None, 'user_api_key_team_spend': None, 'user_api_key_spend': 0.0, 'user_api_key_max_budget': None, 'user_api_key_model_max_budget': {}, 'user_api_key_metadata': {}, 'headers': {'host': '0.0.0.0:4000', 'user-agent': 'python-requests/2.31.0', 'accept-encoding': 'gzip, deflate', 'accept': '*/*', 'connection': 'keep-alive', 'content-type': 'application/json', 'content-length': '161'}, 'endpoint': 'http://0.0.0.0:4000/v1/threads/my-thread-id/messages', 'litellm_parent_otel_span': None, 'requester_ip_address': ''}}

Are you a ML Ops Team?

Yes

What LiteLLM version are you on ?

v1.58.0

Twitter / LinkedIn details

No response
