
ChatBedrockConverse response not streaming #3

Closed
gituser022 opened this issue Oct 24, 2024 · 1 comment
Comments

@gituser022

Hi Shivanshu,

I tried the simple streaming use case and just swapped in a Bedrock FM instead of OpenAI. When I run the Streamlit app, I see the following issues:

  1. It doesn't stream; the whole response is collected and shown at once.
  2. The assistant's response is saved to Streamlit session state but isn't rendered until the next user prompt triggers a rerun. It's easy to fix, so not really an issue, but fixing it in the repo would help others (a minimal sketch of what I mean follows this list).
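Not the repo's actual code, just a placeholder sketch of the pattern I mean for (2), assuming a standard st.chat_message / st.session_state chat loop (model ID is a placeholder): render the assistant reply in the same run before saving it, instead of only saving it and letting the next rerun display it.

```python
import streamlit as st
from langchain_aws import ChatBedrockConverse  # any LangChain chat model works the same way

# Placeholder model ID -- adjust to your own setup.
llm = ChatBedrockConverse(model="anthropic.claude-3-5-sonnet-20240620-v1:0")

if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the saved conversation on every rerun.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

if prompt := st.chat_input("Say something"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    # Render the assistant reply in the same run, instead of only saving it
    # to session state and waiting for the next rerun to display it.
    with st.chat_message("assistant"):
        reply = llm.invoke(prompt)
        st.write(reply.content)
    st.session_state.messages.append({"role": "assistant", "content": reply.content})
```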

I have attached files.

Thanks

Archive.zip

@shiv248 (Owner) commented Oct 24, 2024

Ah @gituser022, as we predicted, it's an issue with ChatBedrockConverse, not with the Streamlit components; I just tested with ChatFireworks and ChatOpenAI.

From a quick search, try adding streaming=True to the params; the issue was brought up in the langchain-aws repo.
I'd recommend forwarding any further issues to the threads below; they are better suited to help you.
langchain-ai/langchain-aws#217
langchain-ai/langchain-aws#241
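
Not tested on my end, but roughly what the linked threads suggest (the model ID is a placeholder, and exact support for the flag may depend on your langchain-aws version):

```python
from langchain_aws import ChatBedrockConverse

# Workaround suggested in the linked langchain-aws threads: pass streaming=True
# when constructing the model.
llm = ChatBedrockConverse(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",  # placeholder model ID
    streaming=True,
)

# With streaming working, .stream() yields chunks instead of one full message.
for chunk in llm.stream("Tell me a short joke"):
    print(chunk.content, end="", flush=True)
```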

@shiv248 closed this as completed on Oct 24, 2024
@shiv248 changed the title from "Assistant response not streaming" to "ChatBedrockConverse response not streaming" on Oct 24, 2024