### `mellea/backends/litellm.py` (+2 −2)
```diff
@@ -54,12 +54,12 @@ def __init__(
         base_url: str | None = "http://localhost:11434",
         model_options: dict | None = None,
     ):
-        """Initialize and OpenAI compatible backend. For any additional kwargs that you need to pass the the client, pass them as a part of **kwargs.
+        """Initialize an OpenAI compatible backend using the [LiteLLM Python SDK](https://docs.litellm.ai/docs/#litellm-python-sdk).

         Note: If getting `Unclosed client session`, set `export DISABLE_AIOHTTP_TRANSPORT=True` in your environment. See: https://github.com/BerriAI/litellm/issues/13251.

         Args:
-            model_id : The LiteLLM model identifier. Make sure that all necessary credentials are in OS environment variables.
+            model_id : The LiteLLM model identifier; in most cases requires some combination of `<provider>/<model_creator>/<model_name>`. Make sure that all necessary credentials are in OS environment variables.
             formatter: A custom formatter based on backend. If None, defaults to TemplateFormatter
             base_url : Base url for LLM API. Defaults to None.
             model_options : Generation options to pass to the LLM. Defaults to None.
```
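The updated docstring documents two things worth seeing in practice: the `<provider>/<model_creator>/<model_name>` shape of a LiteLLM model identifier, and the `DISABLE_AIOHTTP_TRANSPORT` workaround for `Unclosed client session` warnings. The sketch below illustrates both; the specific model identifier and option values are hypothetical examples, not defaults from the diff.

```python
import os

# Workaround from the docstring note: avoids "Unclosed client session"
# warnings from LiteLLM's aiohttp transport (BerriAI/litellm#13251).
# Must be set before LiteLLM issues any requests.
os.environ["DISABLE_AIOHTTP_TRANSPORT"] = "True"

# A LiteLLM model identifier: some combination of
# <provider>/<model_creator>/<model_name>. This one is a hypothetical
# example; credentials for the provider must already be in the environment.
model_id = "ollama/granite3.3:8b"

# Generation options passed through via model_options.
model_options = {"temperature": 0.0, "max_tokens": 256}

# LiteLLM routes on the leading provider segment of the identifier.
provider = model_id.split("/", 1)[0]
print(provider)
```

With these in place, the backend would be constructed with `model_id` and `model_options` as in the signature above; any remaining client kwargs go through `**kwargs`.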