It would be beneficial if the embeddings could be exposed as an interface that is passed in... something like this:

```ts
import { VectorStorage, OpenAIEmbeddings } from "vector-storage";

// Create an embeddings instance
const embedder = new OpenAIEmbeddings("your-openai-api-key");

// Create an instance of VectorStorage
const vectorStore = new VectorStorage(embedder);

// Add a text document to the store
await vectorStore.addText("The quick brown fox jumps over the lazy dog.", {
  category: "example",
});
```

This would keep vector-storage's complexity to a minimum by removing `embedTexts`, `embedTextsFn`, and all associated logic, putting them behind an interface so it's trivial to extend and add custom embedding APIs.
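As a rough sketch of what such an interface could look like (the `EmbeddingProvider` name and method shape here are illustrative, not part of the current vector-storage API):

```ts
// Hypothetical interface -- names are placeholders, not the library's current API.
interface EmbeddingProvider {
  // Return one embedding vector per input text.
  embedTexts(texts: string[]): Promise<number[][]>;
}

// The existing OpenAI support could then become just one implementation.
class OpenAIEmbeddings implements EmbeddingProvider {
  constructor(private apiKey: string) {}

  async embedTexts(texts: string[]): Promise<number[][]> {
    const response = await fetch("https://api.openai.com/v1/embeddings", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${this.apiKey}`,
      },
      body: JSON.stringify({ model: "text-embedding-ada-002", input: texts }),
    });
    const json = await response.json();
    return json.data.map((d: { embedding: number[] }) => d.embedding);
  }
}
```

`VectorStorage` would then accept any `EmbeddingProvider` rather than knowing about OpenAI directly.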
I am looking for this as well. My goal is to be able to run a HuggingFace transformer in the background script of a browser plugin with Transformers.js. I would be willing to contribute if pull requests are welcome.
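For what it's worth, a local Transformers.js embedder could plug into an interface like the one sketched above. This is an untested sketch; the model name is just an example and the `EmbeddingProvider` shape is the hypothetical one from the proposal:

```ts
import { pipeline } from "@xenova/transformers";

// Hypothetical interface from the proposal above.
interface EmbeddingProvider {
  embedTexts(texts: string[]): Promise<number[][]>;
}

// Runs a HuggingFace model fully in the browser (or a plugin's background script).
class TransformersJsEmbeddings implements EmbeddingProvider {
  // Lazily load the feature-extraction pipeline; the model is just an example.
  private extractor = pipeline("feature-extraction", "Xenova/all-MiniLM-L6-v2");

  async embedTexts(texts: string[]): Promise<number[][]> {
    const extract = await this.extractor;
    const output = await extract(texts, { pooling: "mean", normalize: true });
    // output is a Tensor of shape [texts.length, hiddenSize]
    return output.tolist() as number[][];
  }
}
```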
It might also be worth looking into interfacing with the likes of LangChain so it can be used there? I know I'd personally get a lot out of that, but it may be out of scope for what you're planning.
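If LangChain support is in scope, a thin adapter over its embeddings classes might be all that's needed, since they already expose a batch `embedDocuments` method. A rough, untested sketch (the import path varies between LangChain versions):

```ts
import type { Embeddings } from "langchain/embeddings/base";

// Hypothetical interface from the proposal above.
interface EmbeddingProvider {
  embedTexts(texts: string[]): Promise<number[][]>;
}

// Wrap any LangChain embeddings implementation behind the proposed interface.
class LangChainEmbeddingsAdapter implements EmbeddingProvider {
  constructor(private embeddings: Embeddings) {}

  embedTexts(texts: string[]): Promise<number[][]> {
    return this.embeddings.embedDocuments(texts);
  }
}
```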