v0.1.14 breaks streaming response rendering from Claude 3.5 Sonnet #134
Comments
Hey @radu-malliu ! Any chance you have a piece of code that reproduces this? I'm running the following which is streaming successfully:
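The snippet referenced above was not preserved in this export. As a stand-in, here is a minimal sketch of the kind of streaming loop under discussion, using a hypothetical fake model class so the example is self-contained (a real reproduction would construct `ChatBedrock` from `langchain_aws` and iterate its `.stream()` the same way):

```python
class FakeStreamingModel:
    """Hypothetical stand-in for a chat model's .stream() interface."""

    def __init__(self, text: str, chunk_size: int = 8):
        self.text = text
        self.chunk_size = chunk_size

    def stream(self, prompt: str):
        # Yield the response in small pieces, the way a streaming chat
        # model yields message chunks as they arrive from the API.
        for i in range(0, len(self.text), self.chunk_size):
            yield self.text[i : i + self.chunk_size]


llm = FakeStreamingModel("A pound of bricks and a pound of feathers weigh the same.")
chunks = []
for chunk in llm.stream("What weighs more, a pound of bricks or a pound of feathers?"):
    chunks.append(chunk)
    print(chunk, end="", flush=True)  # render incrementally as each chunk arrives
```

The bug described in this issue is that with v0.1.14 the loop effectively receives everything at once, so nothing renders until the full reply is available.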
Hey @efriis, thanks for taking a look! I suspect it's related to the star rating we add to replies. Trying to whip something up to reproduce it.
Hi there @efriis, I am on 0.1.15 and I am facing a similar issue. Just by taking this sample code from the documentation I am able to reproduce it: the chunks do not stream, and to the user it is as if I had used `.invoke` instead of `.stream`.

```python
from typing_extensions import Annotated, TypedDict
from langchain_aws.chat_models.bedrock import ChatBedrock
import logging

logging.basicConfig(level=logging.INFO)


class AnswerWithJustification(TypedDict):
    '''An answer to the user question along with justification for the answer.'''

    answer: Annotated[str, ...]
    justification: Annotated[str, ...]


llm = ChatBedrock(
    model_id="anthropic.claude-3-5-sonnet-20240620-v1:0",
    model_kwargs={"temperature": 0.001},
    region_name="us-east-1",
    streaming=True,
)  # type: ignore[call-arg]

structured_llm = llm.with_structured_output(AnswerWithJustification)

for chunk in structured_llm.stream("What weighs more a pound of bricks or a pound of feathers"):
    print(chunk)
```
Similar issue: the streaming callback handler is not working properly with ChatBedrock. This code should reproduce the problem:
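The reproduction code was lost in this export. As a self-contained illustration of the callback pattern being described, here is a hypothetical minimal handler and a fake streaming driver (in langchain the real handler would subclass `BaseCallbackHandler` and be passed to `ChatBedrock(callbacks=[handler], streaming=True)`):

```python
class CollectingCallbackHandler:
    """Hypothetical minimal streaming callback handler."""

    def __init__(self):
        self.tokens = []

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        # Called once per streamed token. The report above says this
        # stops firing incrementally with ChatBedrock on v0.1.14.
        self.tokens.append(token)


def fake_stream(handler, tokens):
    # Stand-in for the model invoking the handler as tokens arrive.
    for t in tokens:
        handler.on_llm_new_token(t)


handler = CollectingCallbackHandler()
fake_stream(handler, ["Hello", ", ", "world", "!"])
print("".join(handler.tokens))
```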
Hi all, I was facing the same issue and I shared a mitigation here: #144 (comment)
Thanks @hourliert, but we've been using LCEL in our codebase, so an LLMChain-based workaround doesn't solve the streaming issue for us. Hoping that someone can get a chance to look at this soon.
…154) As pointed out in #134 (comment), some Bedrock models that support streaming tool calls do not properly stream structured output. This is due to our implementation of `with_structured_output`. Here we update the output parsing for models that support streaming tool calls.
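A rough sketch of what correctly streamed structured output looks like from the caller's side: the parser accumulates raw tool-call fragments and yields a parsed dict whenever the buffer becomes valid JSON. The function below is a simplified stand-in, not the actual langchain parser (real parsers also repair incomplete JSON prefixes; this sketch only emits on fully valid buffers):

```python
import json


def stream_partial_json(fragments):
    # Accumulate raw JSON fragments from a streamed tool call and
    # yield a parsed dict each time the buffer parses successfully.
    buffer = ""
    for fragment in fragments:
        buffer += fragment
        try:
            yield json.loads(buffer)
        except json.JSONDecodeError:
            continue  # not yet parseable; wait for more chunks


fragments = [
    '{"answer": "They weigh the same"',
    ', "justification": "A pound is a pound"}',
]
results = list(stream_partial_json(fragments))
```

With parsing like this, `structured_llm.stream(...)` can surface partial results as chunks arrive instead of blocking until the full response is complete.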
Answered in #217. Also, streaming support works best with LCEL chains; see that page for an LCEL conversation chain with memory support. Closing as duplicate.
We picked up v0.1.14 in our project today and noticed streaming responses from Claude via Bedrock are no longer rendered as chunks are delivered. The message from the model remains empty until a new reply is entered, then the message is rendered.
Reverting to v0.1.13 fixes the issue.
[Screenshot: after entering a reply]
[Screenshot: after entering one more reply]
The message not being rendered would be a response to: