
Streamed MultiPromptChain with websocket: undesired output #49

Open
bassameter63 opened this issue Jun 24, 2023 · 2 comments

@bassameter63

Hello,
I'm setting up a MultiPromptChain that I call asynchronously as follows:
resp = await multichain.arun(input=standalone_question, include_run_info=False, return_only_outputs=True)
or:
resp = await multichain.acall(inputs={"input": standalone_question}, include_run_info=False, return_only_outputs=True)

The MultiPromptChain is built on a streaming LLM as follows:

stream_handler = StreamingLLMCallbackHandler(websocket)
stream_manager = BaseCallbackManager([stream_handler])
llm = ChatOpenAI(
    model_name=model,
    temperature=0.9,
    streaming=True,
    callback_manager=stream_manager,
    verbose=False,
    openai_api_key=openai_apik,
)
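
The chain itself is then assembled along these lines (a simplified sketch of my setup; the prompt names, descriptions, and templates below are placeholders, and I'm on a mid-2023 version of langchain):

from langchain.chains import ConversationChain, LLMChain
from langchain.chains.router import MultiPromptChain
from langchain.chains.router.llm_router import LLMRouterChain, RouterOutputParser
from langchain.chains.router.multi_prompt_prompt import MULTI_PROMPT_ROUTER_TEMPLATE
from langchain.prompts import PromptTemplate

# Placeholder destinations; the real prompts are longer.
prompt_infos = [
    {"name": "physics", "description": "Good for physics questions",
     "prompt_template": "You are a physicist. Answer concisely.\n\n{input}"},
    {"name": "history", "description": "Good for history questions",
     "prompt_template": "You are a historian. Answer concisely.\n\n{input}"},
]

destination_chains = {
    info["name"]: LLMChain(
        llm=llm,
        prompt=PromptTemplate(template=info["prompt_template"], input_variables=["input"]),
    )
    for info in prompt_infos
}
default_chain = ConversationChain(llm=llm, output_key="text")

destinations = "\n".join(f"{info['name']}: {info['description']}" for info in prompt_infos)
router_prompt = PromptTemplate(
    template=MULTI_PROMPT_ROUTER_TEMPLATE.format(destinations=destinations),
    input_variables=["input"],
    output_parser=RouterOutputParser(),
)
# The router chain reuses the same streaming llm, which I suspect is why its
# {"destination": ..., "next_inputs": ...} JSON goes through the same callback handler.
router_chain = LLMRouterChain.from_llm(llm, router_prompt)

multichain = MultiPromptChain(
    router_chain=router_chain,
    destination_chains=destination_chains,
    default_chain=default_chain,
    verbose=False,
)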

My problem is that an undesired markdown code snippet containing a JSON object is streamed to the websocket along with the response. For example:
{
    "destination": "......................................",
    "next_inputs": "....................................................."
}
followed by the answer to the question.
I don't want that JSON object streamed to the websocket; just the answer to the question.
Is there a way to do this?

Thanks for answering.
Bassam.

@liyangwd

same issue

@liyangwd

@bassameter63 FinalStreamingStdOutCallbackHandler may help
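
For a websocket, that handler's idea (only forward tokens once you know they belong to the final answer) would need a custom async handler, since FinalStreamingStdOutCallbackHandler itself writes to stdout. Below is a rough, untested sketch that suppresses any LLM call whose output starts like the router's JSON or a fenced block and forwards everything else; swap the send_text call for whatever message format your StreamingLLMCallbackHandler uses:

from typing import Any, Dict, List

from langchain.callbacks.base import AsyncCallbackHandler

class FinalAnswerWebsocketHandler(AsyncCallbackHandler):
    """Stream only LLM calls that don't look like router JSON output."""

    def __init__(self, websocket):
        self.websocket = websocket
        self._buffer = ""
        self._decided = False   # decided yet whether to stream this LLM call?
        self._suppress = False  # True while the router's JSON is being generated

    async def on_llm_start(self, serialized: Dict[str, Any], prompts: List[str], **kwargs: Any) -> None:
        # Each LLM call (router or destination chain) is judged on its own output.
        self._buffer = ""
        self._decided = False
        self._suppress = False

    async def on_llm_new_token(self, token: str, **kwargs: Any) -> None:
        if not self._decided:
            self._buffer += token
            stripped = self._buffer.lstrip()
            if not stripped:
                return  # only whitespace so far, keep buffering
            # Router output begins with a markdown fence or a JSON object.
            self._suppress = stripped.startswith(("{", "`"))
            self._decided = True
            if self._suppress:
                return
            token = self._buffer  # flush the tokens held back while deciding
        if not self._suppress:
            await self.websocket.send_text(token)

Another option might be to give the router chain its own non-streaming ChatOpenAI (without the callback_manager) and keep the streaming llm only for the destination chains, so the routing step never reaches the websocket at all.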
