
Recover from errors #42

Open
endolith opened this issue Sep 13, 2023 · 0 comments
Labels
enhancement New feature or request

Comments

@endolith

I get errors like "Interpreter has stopped" or "Error: This model's maximum context length is 8192 tokens. However, your messages resulted in 9667 tokens (9514 in the messages, 153 in the functions). Please reduce the length of the messages or functions." and then it gives me no option but to reset the entire chat and lose everything. It should just continue the conversation and work around the error.
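One possible way to recover from the context-length error, rather than resetting the chat, is to drop the oldest non-system messages until the history fits the model's token budget and then retry. The sketch below is a hypothetical illustration, not the project's actual code: the `estimate_tokens` heuristic (roughly 4 characters per token) and the message dict format are assumptions.

```python
# Hypothetical recovery sketch: trim the oldest non-system messages until the
# estimated token count fits the budget, instead of forcing a full chat reset.
# The 4-chars-per-token estimate and message layout are assumptions.

def estimate_tokens(text):
    # Rough heuristic: ~4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_history(messages, max_tokens):
    """Drop the oldest non-system messages until the total fits max_tokens."""
    kept = list(messages)

    def total(msgs):
        return sum(estimate_tokens(m["content"]) for m in msgs)

    while total(kept) > max_tokens:
        # Drop the first message that is not the system prompt.
        for i, m in enumerate(kept):
            if m["role"] != "system":
                del kept[i]
                break
        else:
            break  # only system messages remain; nothing more to drop
    return kept

messages = [
    {"role": "system", "content": "You are a code interpreter."},
    {"role": "user", "content": "x" * 400},
    {"role": "assistant", "content": "y" * 400},
    {"role": "user", "content": "z" * 40},
]
trimmed = trim_history(messages, max_tokens=120)
```

In a real implementation the exact token count would come from the model's tokenizer (and the API error message itself reports the overflow), but the principle is the same: shrink the history and retry instead of discarding the session.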

@silvanmelchior silvanmelchior changed the title Fails too easily Recover from errors Sep 14, 2023
@silvanmelchior silvanmelchior added the enhancement New feature or request label Sep 14, 2023
Projects: None yet
Development: No branches or pull requests
2 participants