
[Feature Request and bug to report] Streaming response in the Editor itself. #9

Open
venkatmidhunmareedu opened this issue Nov 15, 2024 · 1 comment

Comments


venkatmidhunmareedu commented Nov 15, 2024

Hey, I love this extension; it's very useful for my university thesis writing. When I use the "Extend Selection" feature (I use Ollama, by the way), the application stops responding and sits in a waiting state until a response is sent back to LibreOffice.

Take a look: [Screenshot 2024-11-15 164311]

What I'm suggesting is to add a new setting under the Settings option:

Name: Streaming
Type: Bool
Function: when set to True, the output should stream into the document, similar to what you see in the Ollama console. When set to False, the complete output should be displayed all at once after the usual wait, as it is now.

This would help users on low-end PCs keep working in LibreOffice conveniently.
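For context, here is a minimal sketch of what consuming Ollama's streaming output looks like on the client side. It assumes the extension talks to a local Ollama server over its HTTP API at http://localhost:11434/api/generate; the function and callback names here are purely illustrative, not localwriter's actual code:

```python
# Illustrative sketch only, not localwriter's implementation.
import json
import requests

def stream_completion(prompt, model="llama3", on_token=print):
    """Request a completion from a local Ollama server and hand each
    partial chunk to `on_token` as it arrives, instead of blocking
    until the full response is ready."""
    response = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": True},
        stream=True,
    )
    response.raise_for_status()
    # Ollama streams newline-delimited JSON objects; each carries a
    # "response" fragment, and the final object has "done": true.
    for line in response.iter_lines():
        if not line:
            continue
        chunk = json.loads(line)
        on_token(chunk.get("response", ""))
        if chunk.get("done"):
            break

# Example: insert each fragment into the document as it arrives, so the
# editor stays responsive instead of freezing until the whole reply is back.
# (insert_at_cursor is a hypothetical callback for illustration.)
# stream_completion("Extend this paragraph...", on_token=insert_at_cursor)
```

With a setting like the one proposed, the Streaming=False path could simply collect the fragments and insert them once at the end, matching the current behavior.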


balisujohn (Owner) commented Nov 17, 2024

It's nice to hear you are finding localwriter useful! I will look into the best way to implement this.
