Replies: 5 comments
-
Use `viewport`.
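A minimal sketch of what that can look like, assuming the `viewport` bubble from `github.com/charmbracelet/bubbles/viewport` (the `model`, its field names, and the key bindings are just illustrative):

```go
package main

import (
	"fmt"
	"os"
	"strings"

	"github.com/charmbracelet/bubbles/viewport"
	tea "github.com/charmbracelet/bubbletea"
)

// model keeps the whole chat history itself and renders it through a
// viewport, so scrolling back is handled by the program rather than by
// the terminal emulator's scrollback (which the alt screen bypasses).
type model struct {
	history []string
	vp      viewport.Model
	ready   bool
}

func (m model) Init() tea.Cmd { return nil }

func (m model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
	switch msg := msg.(type) {
	case tea.WindowSizeMsg:
		if !m.ready {
			m.vp = viewport.New(msg.Width, msg.Height)
			m.ready = true
		} else {
			m.vp.Width = msg.Width
			m.vp.Height = msg.Height
		}
		m.vp.SetContent(strings.Join(m.history, "\n"))
	case tea.KeyMsg:
		if s := msg.String(); s == "q" || s == "ctrl+c" {
			return m, tea.Quit
		}
	}

	// The viewport handles its own scrolling keys (up/down, pgup/pgdn, etc.).
	var cmd tea.Cmd
	m.vp, cmd = m.vp.Update(msg)
	return m, cmd
}

func (m model) View() string {
	if !m.ready {
		return "loading…"
	}
	return m.vp.View()
}

func main() {
	m := model{history: []string{"you: hello", "assistant: hi!"}}
	if _, err := tea.NewProgram(m, tea.WithAltScreen()).Run(); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```

The point is that scrollback lives inside the program: the viewport handles the scrolling keys itself, so the history no longer depends on the terminal's own scroll buffer.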
-
@zyriab Thank you for your suggestion, but I encountered a problem when trying it out. The responses from my AI assistant are streamed, and no matter how I use `viewport`, I feel that my code doesn't strictly follow Bubble Tea's semantics when implementing it.
-
You have to append each received token into an area that is rendered through `viewport`. It's tricky and there are a lot of edge cases, but it's doable ;)

As for …

EDIT: I'm actually working on a similar project. I've done what I describe above but have met some issues (markdown rendering with Glamour, which thinks tokens are whole documents; the token stream area's height being limited by the viewport's; etc.). I'm learning Bubble Tea as I go, so take my advice with a pinch of salt! 😅

EDIT 2: I've tried rewriting my implementation with a … I'm also having issues with the token streaming area adding artifacts from the rendered TUI parts to stdout when I flush it. A workaround might be to do like Claude Code and actually print full sentences/blocks to stdout without streaming each token in.

Good luck!
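To make the first part concrete, here's a rough sketch of the token-append approach. It extends the model above with two hypothetical fields (`raw`, the markdown accumulated so far, and `stream`, a channel the API client pushes tokens into), adds the `github.com/charmbracelet/glamour` import, and re-renders the whole accumulated buffer on every token, since Glamour expects complete documents rather than isolated tokens. `tokenMsg` and `waitForToken` are made-up names:

```go
// Assumes the model above gains two extra (hypothetical) fields:
//   raw    string        // markdown accumulated from the assistant so far
//   stream <-chan string // channel the API client pushes tokens into
// and one extra import: "github.com/charmbracelet/glamour".

// tokenMsg carries a single streamed token.
type tokenMsg string

// waitForToken blocks until the next token arrives, then hands it to
// Update as a tokenMsg. Returning nil when the channel closes ends the loop.
func waitForToken(stream <-chan string) tea.Cmd {
	return func() tea.Msg {
		tok, ok := <-stream
		if !ok {
			return nil
		}
		return tokenMsg(tok)
	}
}

func (m model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
	switch msg := msg.(type) {
	case tokenMsg:
		// Append the token, then re-render the *whole* buffer: Glamour
		// expects a complete document, not individual tokens.
		m.raw += string(msg)
		out, err := glamour.Render(m.raw, "dark")
		if err != nil {
			out = m.raw // fall back to plain text if rendering fails
		}
		m.vp.SetContent(out)
		m.vp.GotoBottom()                // keep following the stream
		return m, waitForToken(m.stream) // queue up the next token
	}

	var cmd tea.Cmd
	m.vp, cmd = m.vp.Update(msg)
	return m, cmd
}
```

You'd kick the loop off by returning `waitForToken(m.stream)` from `Init`, or right after the request that opens the stream.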
-
@zyriab I agree with your thoughts, and you've gone further than I have, already trying several of the solutions I wanted to pursue, which is great 👍. Maybe you're right and I should give up on streaming output for now. Or maybe I won't be able to resist the temptation and will still try to solve it 😂.
-
You should, it's a good challenge! I'm just having issues with artifacts from the rendered parts of the app (the text input, mostly) ending up on stdout, but I figured out that they come from the terminal emulator missing a refresh from Bubble Tea while scrolling very fast (when restoring conversations).
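In case it helps, the "print full sentences/blocks" workaround I mentioned above can be sketched with `tea.Println`, which queues a line to be printed above the running program; it only ends up in the terminal's native scrollback if the program is started without the alternate screen. `blockDoneMsg` is a hypothetical message emitted once a block has been assembled from the token stream:

```go
// blockDoneMsg signals that a complete sentence/paragraph has been
// assembled from the token stream (hypothetical message type).
type blockDoneMsg string

func (m model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
	switch msg := msg.(type) {
	case blockDoneMsg:
		// Ask Bubble Tea to print the finished block above the TUI so it
		// lands in the terminal's own scrollback. Start the program
		// without tea.WithAltScreen() for this to behave as intended.
		return m, tea.Println(string(msg))
	case tea.KeyMsg:
		if msg.String() == "ctrl+c" {
			return m, tea.Quit
		}
	}
	return m, nil
}
```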
-
My question
When the text output by my UI exceeds one screen, scrolling up in the terminal causes the earlier text to disappear.
I am using this framework to build a terminal-based AI application, and I would like to retain the previous chat history when the user scrolls the screen. What should I do?