[Feature Request]: TransformQueryEngine does not support async streaming #17371

Open
YeonwooSung opened this issue Dec 26, 2024 · 0 comments
Labels: enhancement (New feature or request), triage (Issue needs to be triaged/prioritized)

Comments

@YeonwooSung (Contributor)

Feature Description

As you know, the query engines do not fully support async yet, so llama-index only allows the chat engine to use async streaming with a RetrieverQueryEngine instance.

This forces users to stick to 'sync mode' for TransformQueryEngine, which is typically used for HyDE, as in the sketch below.
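For context, here is a minimal sketch of the current HyDE setup (assuming llama-index 0.10-style imports and documents in a local `data/` directory): sync streaming works, but there is no async streaming counterpart for TransformQueryEngine.

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.indices.query.query_transform import HyDEQueryTransform
from llama_index.core.query_engine import TransformQueryEngine

# Build a simple index (assumes documents in ./data).
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Wrap the base query engine with a HyDE transform.
base_query_engine = index.as_query_engine(streaming=True)
hyde = HyDEQueryTransform(include_original=True)
query_engine = TransformQueryEngine(base_query_engine, query_transform=hyde)

# Sync streaming works today: tokens arrive as they are generated.
response = query_engine.query("What did the author do growing up?")
for token in response.response_gen:
    print(token, end="", flush=True)

# There is no async streaming equivalent, e.g. something like:
#   response = await query_engine.aquery("...")
#   async for token in response.async_response_gen():
#       ...
# which is what this feature request asks for.
```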

Reason

I believe this happens because TransformQueryEngine does not implement the async streaming path yet.

Value of Feature

Most people who serve llama-index based applications in Python use FastAPI, and FastAPI performs much better when async is used correctly.
Adding async streaming support to TransformQueryEngine (and the other query engines) would be a clear benefit for using llama-index in production apps.
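To illustrate the intended production use, here is a hypothetical FastAPI endpoint reusing the `query_engine` built in the sketch above; the `aquery`-based async streaming path for TransformQueryEngine shown in the inner generator is exactly what does not exist yet.

```python
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

# `query_engine` is the HyDE TransformQueryEngine from the sketch above.

@app.get("/query")
async def query_endpoint(q: str) -> StreamingResponse:
    async def token_stream():
        # Desired (not yet supported) async streaming API for TransformQueryEngine:
        response = await query_engine.aquery(q)
        async for token in response.async_response_gen():
            yield token

    # Stream tokens to the client as they are produced, without blocking the event loop.
    return StreamingResponse(token_stream(), media_type="text/plain")
```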

@YeonwooSung added the enhancement (New feature or request) and triage (Issue needs to be triaged/prioritized) labels on Dec 26, 2024