Using Snowflake LLM via LangChain API in spacy-llm #13697
andrebittar-ko asked this question in Help: Coding & Implementations (Unanswered)
[APOLOGIES - I realise I should maybe have posted this in spacy-llm discussions]
Taking inspiration from this blog post, I would like to use LLMs hosted by Snowflake with spaCy (including annotation tasks in Prodigy).
I have slightly adapted the code in the blog to create a wrapper class that inherits from the LangChain `LLM` class and implements the required methods (`_call` and the `_identifying_params` property).
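(The class itself didn't survive the paste, so here is a minimal sketch of that kind of wrapper. It assumes a Snowpark `Session` from snowflake-snowpark-python and `snowflake.cortex.Complete` from snowflake-ml-python; the connection parameters and the `mistral-large` model name are placeholders.)

```python
from typing import Any, List, Mapping, Optional

from langchain_core.language_models.llms import LLM
from snowflake.cortex import Complete
from snowflake.snowpark import Session


class SnowflakeCortexLLM(LLM):
    """LangChain wrapper around the Snowflake Cortex COMPLETE function."""

    session: Any = None           # Snowpark Session, created lazily on first call
    model: str = "mistral-large"  # placeholder: any Cortex-hosted model name

    @property
    def _llm_type(self) -> str:
        return "snowflake_cortex"

    def _get_session(self) -> Session:
        # Connection parameters are placeholders; fill in your account details.
        if self.session is None:
            self.session = Session.builder.configs(
                {
                    "account": "<account>",
                    "user": "<user>",
                    "password": "<password>",
                    "warehouse": "<warehouse>",
                    "database": "<database>",
                    "schema": "<schema>",
                }
            ).create()
        return self.session

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        run_manager: Optional[Any] = None,
        **kwargs: Any,
    ) -> str:
        # Delegate the completion call to Snowflake Cortex.
        return Complete(self.model, prompt, session=self._get_session())

    @property
    def _identifying_params(self) -> Mapping[str, Any]:
        return {"model": self.model}
```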
I instantiate the class with `llm = SnowflakeCortexLLM()`, which connects to the Snowflake LLM service and allows me to make calls, e.g. `llm.invoke('Hello!')`, which returns a response:

'Hello! How can I assist you today?'
So this part is working. However, I am a little stuck on how to integrate this with the LangChain API in spacy-llm and the config file.
I have registered the above class as follows:
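(Again, the code didn't come through; it was along these lines, i.e. a factory in spacy-llm's `llm_models` registry that returns an instance of the wrapper. Whether returning the instance itself is the right thing is part of what I'm unsure about.)

```python
from spacy_llm.registry import registry


@registry.llm_models("langchain.SnowflakeCortexLLM.v1")
def snowflake_cortex_llm() -> SnowflakeCortexLLM:
    # SnowflakeCortexLLM is the wrapper class defined above.
    return SnowflakeCortexLLM()
```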
I have defined the following config.cfg for a span categorisation task:
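(Roughly like this; the labels are placeholders for my actual span labels:)

```ini
[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

[components.llm.task]
@llm_tasks = "spacy.SpanCat.v3"
labels = ["PERSON", "ORG"]

[components.llm.model]
@llm_models = "langchain.SnowflakeCortexLLM.v1"
```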
I load the config as follows:
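(Using spacy-llm's assemble helper:)

```python
from spacy_llm.util import assemble

nlp = assemble("config.cfg")
doc = nlp("Some text to annotate.")
```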
But this results in an error:

ValueError: The 'model' Callable should have one input argument and one return value.
I have also tried registering the model without the langchain namespace (e.g. as SnowflakeCortexLLM.v1, using the corresponding reference in the config file), but I get the same error.
Which I guess means there is an extra piece of code needed to fit this into spaCy's LangChain API?
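Reading the error literally, it sounds like spacy-llm wants the registered factory to return a plain Callable with one argument (the prompts) and one return value (the responses), rather than the LLM instance itself. An untested sketch of that idea, based only on the error message:

```python
from typing import Callable, Iterable

from spacy_llm.registry import registry


@registry.llm_models("SnowflakeCortexLLM.v1")
def snowflake_cortex_callable() -> Callable[[Iterable[str]], Iterable[str]]:
    llm = SnowflakeCortexLLM()  # the wrapper class defined above

    def _call(prompts: Iterable[str]) -> Iterable[str]:
        # One input argument in, one return value out, as the ValueError asks.
        return [llm.invoke(p) for p in prompts]

    return _call
```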
Is this possible? Is there a better way to go about this?
[edit]: I have now also tried using the langchain Snowflake API to do this (sketched at the end of this post). I get the same error:
ValueError: The 'model' Callable should have one input argument and one return value.
I have posted my full code for that on the spacy-llm forum.
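(By "the langchain Snowflake API" I mean, I believe, langchain_community's `ChatSnowflakeCortex`; the parameter names and placeholders below are from memory and may need adjusting:)

```python
from langchain_community.chat_models import ChatSnowflakeCortex

# Credentials can also be supplied via environment variables; the exact
# parameter names here are assumptions and may differ by version.
chat = ChatSnowflakeCortex(
    model="mistral-large",
    snowflake_account="<account>",
    snowflake_username="<user>",
    snowflake_password="<password>",
)
print(chat.invoke("Hello!"))
```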