Stream responses from LLM instead of waiting for the complete response #86

Merged

2 commits merged into main from vu-response-stream on Oct 31, 2023

Conversation

@vivekuppal (Owner) commented on Oct 30, 2023

  • Stream responses from the LLM instead of fetching the complete response in one go (see the sketch after the list below).
  • Update the UI with streamed responses as they are received.

Advantages

  • Time to first response at the API level is much shorter with streaming, especially for longer responses.
  • The user experience is better, since text begins to appear as soon as the first tokens arrive rather than after the full generation.
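
The core of the change is the stream=True flag on the chat-completion call. A minimal sketch of the pattern, assuming the pre-1.0 openai Python package that was current when this PR was merged and a hypothetical update_ui callback standing in for the UI refresh (this illustrates the technique, not the PR's exact diff):

```python
import openai  # pre-1.0 API; assumes openai.api_key is already configured


def stream_completion(messages, update_ui):
    """Stream a chat completion, pushing each partial response to the UI."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",   # hypothetical model choice for this sketch
        messages=messages,
        stream=True,             # yield chunks as they arrive, not one final payload
    )
    collected = []
    for chunk in response:
        # Each chunk carries an incremental "delta"; "content" is absent in the
        # initial role-only chunk and the final finish chunk, hence the default.
        token = chunk["choices"][0]["delta"].get("content", "")
        if token:
            collected.append(token)
            update_ui("".join(collected))  # repaint with the text so far
    return "".join(collected)
```

Compared with a blocking create() call, the first update_ui fires after the first token arrives rather than after the whole generation completes, which is where the shorter time to first response comes from.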

@vivekuppal added the enhancement (New feature or request) label on Oct 30, 2023
@vivekuppal self-assigned this on Oct 30, 2023
@vivekuppal changed the title from "DRAFT: Stream responses from LLM instead of waiting for the complete response" to "Stream responses from LLM instead of waiting for the complete response" on Oct 30, 2023
@vivekuppal merged commit 5dbc610 into main on Oct 31, 2023 (2 checks passed)
@vivekuppal deleted the vu-response-stream branch on Oct 31, 2023 at 14:07