import litellm
import base64
import imghdr
from importlib.metadata import version

## loading groq api key using .env file
from dotenv import load_dotenv
load_dotenv(override=True)

print("Litellm version: ", version('litellm'))

# Function to encode the image
def encode_image(image_path):
    with open(image_path, "rb") as image_file:
        return imghdr.what(image_path), base64.b64encode(image_file.read()).decode('utf-8')

image_path = "test_image.jpg"

# Getting the image type and base64 string
image_type, base64_image = encode_image(image_path)

model = "groq/llama-3.2-11b-vision-preview"
messages = [
    {
        "role": "system",
        "content": "extract the text as markdown, and don't miss any details. Only respond with the markdown without root ``` tag without any explanation"
    },
    {
        "role": "user",
        "content": [
            {
                "type": "image_url",
                "image_url": {
                    "url": f"data:image/{image_type};base64,{base64_image}",
                },
            },
        ],
    }
]

response = litellm.completion(model=model, messages=messages)
print(response.choices[0].message.content)
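One caveat with the snippet above: `imghdr` was deprecated in Python 3.11 and removed in 3.13, so on newer interpreters the type detection has to come from elsewhere. Below is a minimal sketch of the same helper using `mimetypes` from the standard library; it guesses the type from the file extension rather than the file contents, so it assumes the extension is accurate.

```python
import base64
import mimetypes

def encode_image(image_path):
    """Return (image subtype, base64 payload) without the removed imghdr module."""
    # Guess the MIME type from the file extension, e.g. "image/jpeg" for .jpg
    mime_type, _ = mimetypes.guess_type(image_path)
    if mime_type is None or not mime_type.startswith("image/"):
        raise ValueError(f"Could not determine an image MIME type for {image_path}")
    with open(image_path, "rb") as image_file:
        payload = base64.b64encode(image_file.read()).decode("utf-8")
    # Keep the same return shape as the original helper: ("jpeg", "<base64 string>")
    return mime_type.split("/", 1)[1], payload
```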
I don't see any problems with either litellm or Groq. The point is to be flexible when using litellm: providing the proper argument values works, at least for me.
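If a provider rejects a system message for vision requests (which appears to be what the linked zerox issue runs into), the same extraction prompt can be sent as a text part inside the user message instead. This is a minimal sketch, assuming the same `test_image.jpg` and a Groq API key in the environment:

```python
import base64
import litellm

# Re-encode the image (same as the helper above, minus the type detection).
with open("test_image.jpg", "rb") as f:
    base64_image = base64.b64encode(f.read()).decode("utf-8")

# The instruction travels in the user message as a text part next to the image,
# so no system role is involved at all.
messages = [
    {
        "role": "user",
        "content": [
            {
                "type": "text",
                "text": "Extract the text as markdown, and don't miss any details. "
                        "Only respond with the markdown, without any explanation.",
            },
            {
                "type": "image_url",
                "image_url": {"url": f"data:image/jpeg;base64,{base64_image}"},
            },
        ],
    }
]

response = litellm.completion(model="groq/llama-3.2-11b-vision-preview", messages=messages)
print(response.choices[0].message.content)
```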
The py-zerox code currently has a hardcoded system role, using the same structure, prompt, and role for all LLMs. However, some LLMs need a bit of customization. The open PR introduces a new `custom_role` parameter to address this, so passing `custom_role="user"` should resolve the issue.
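For reference, here is a rough sketch of how that might look from the py-zerox side. The `custom_role` keyword is the parameter proposed in the open PR, not a released API, and the file path is a placeholder; the rest mirrors the documented `zerox()` usage.

```python
import asyncio
from pyzerox import zerox  # py-zerox's async entry point

async def main():
    # custom_role is the parameter proposed in the open PR (not yet released):
    # it routes the extraction prompt through the "user" role instead of the
    # hardcoded "system" role that some providers reject.
    result = await zerox(
        file_path="document.pdf",
        model="groq/llama-3.2-11b-vision-preview",
        custom_role="user",
    )
    print(result)

asyncio.run(main())
```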
Since this doesn't point to a problem with litellm, we can close this issue and continue the discussion in the open PR or this other open issue. Thanks!
What happened?
I think this needs to be handled in the litellm backend.
related: getomni-ai/zerox#65
Relevant log output
Twitter / LinkedIn details
https://www.linkedin.com/in/pradyumnasingh/