First of all, thank you @developersdigest for this excellent YouTube tutorial and repo.
Please take the following as an idea and a potential feature request.
I feel it is not ideal to mix all of the AI logic into the web app. I would like to see a stable example of separating concerns between a "complex AI backend" and a consumer NextJS web app.
Imagine the following setup for an Answer Engine:
Backend:
Python - as it is the lingua franca of LLMs and home to the better part of the LangChain libraries
LangServe (with LCEL streaming) as the API (see the sketch below this list)
LangGraph for complex state & process management and tool invocation
Persistence of history (graph) e.g. in EdgeDB or Zep
Rate limiting
Semantic Caching
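A minimal sketch of what such a backend could look like, assuming langserve, langgraph, fastapi and langchain-openai are installed. The model, the state shape and the /answer-engine route are illustrative assumptions, and persistence, rate limiting and semantic caching are left out:

```python
# server.py - a rough sketch only; model, state shape and route are illustrative
from typing import TypedDict

from fastapi import FastAPI
from langchain_openai import ChatOpenAI
from langgraph.graph import END, StateGraph
from langserve import add_routes


class AnswerState(TypedDict):
    question: str
    answer: str


llm = ChatOpenAI(model="gpt-4o-mini", streaming=True)


def answer_node(state: AnswerState) -> dict:
    # Single node for brevity; a real graph would add retrieval/tool nodes,
    # a checkpointer for history persistence, rate limiting and semantic caching.
    response = llm.invoke(state["question"])
    return {"answer": response.content}


graph = StateGraph(AnswerState)
graph.add_node("answer", answer_node)
graph.set_entry_point("answer")
graph.add_edge("answer", END)
engine = graph.compile()  # the compiled graph is a plain LCEL Runnable

app = FastAPI(title="Answer Engine backend")
# Exposes /answer-engine/invoke, /stream and /stream_log for remote clients.
add_routes(app, engine, path="/answer-engine")

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```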
Frontend:
NextJS app
The streaming AI backend consumed as a LangChain RemoteRunnable (see the consumption sketch below this list)
Vercel AI SDK with AI/RSC for streaming React components from the AI backend
Able to stream intermediate agentic results / graph state from the remote backend
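On the consumer side, the backend is pulled in as a RemoteRunnable. The snippet below shows the consumption pattern in Python for brevity, against the hypothetical /answer-engine route from the sketch above; in the actual NextJS app the equivalent would be the RemoteRunnable from LangChain.js plus the Vercel AI SDK for rendering:

```python
# Client-side consumption pattern, shown in Python; the NextJS app would use
# the LangChain.js RemoteRunnable and the Vercel AI SDK instead.
from langserve import RemoteRunnable

engine = RemoteRunnable("http://localhost:8000/answer-engine")

# Streams incremental graph-state updates over HTTP as the remote graph runs;
# LangServe's /stream_log endpoint additionally exposes intermediate steps.
for chunk in engine.stream({"question": "What is an answer engine?"}):
    print(chunk)
```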
There are multiple requests for such a setup, and nobody has come up with a stable solution yet.
I have filed a similar request in the Vercel AI SDK repo: vercel/ai#1506
Hi @elvenking - thanks for your thoughtful feature request here.
I am working on a feature to be able to scope and interact with third-party tools. My aim is for it to be framework- and programming-language-agnostic. I am starting by building out '@' commands, which you can use in the UI to pull up your configured workflows/agents/tools/specific LLM providers + models, e.g.:
'@langserve - tool to do xyz'
'@Langgraph - agent do xyz'
'@CrewAI - do xyz'
etc.
In terms of changing the entire backend to be Python, I have no plans to do that currently.
In terms of database support, that will hopefully be coming in early summer - but my plan for most features moving forward is to build them in a way where most things are optional.
I hope to be able to support most of what you are asking for here - it might not be quite what you were looking for, but I hope it is in the direction that you are interested in seeing a project implement.