v0.0.30
What's Changed
🚀 Main Features
- Adopted the CodeGemmy model for enhanced code completion.
- Switched to a local tokenizer (@xenova/transformers) for improved performance and more accurate prediction timing.
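The local tokenizer comes from the @xenova/transformers package (Transformers.js), which runs tokenization in-process instead of asking a server. Below is a minimal sketch of counting prompt tokens with that package; the model repository name and the helper function are illustrative assumptions, not FireCoder's actual code:

```typescript
import { AutoTokenizer } from "@xenova/transformers";

async function countPromptTokens(prompt: string): Promise<number> {
  // Load the tokenizer once from the Hugging Face Hub (cached locally afterwards).
  // "Xenova/gpt2" is a placeholder; the tokenizer FireCoder actually uses may differ.
  const tokenizer = await AutoTokenizer.from_pretrained("Xenova/gpt2");

  // Tokenization runs entirely in-process, so no server round trip is needed
  // to know how many tokens the completion prompt will occupy.
  const ids = tokenizer.encode(prompt);
  return ids.length;
}

countPromptTokens("function add(a: number, b: number) {")
  .then((n) => console.log(`prompt tokens: ${n}`));
```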
🛠️ Updates and Enhancements
- Update server builds to latest llama.cpp (b2665) by @gespispace in #28
- feat(configuration): add configuration option to use pre-release by @gespispace in #27
- bugfix(configuration): fix type in default config for cloud use by @gespispace in #29
- refactor(ci): move s3 url for server builds to new firecoder domain by @gespispace in #30
- bugfix(chat): fix references for highlighted text by @gespispace in #31
- feat(completion): move to codegemmy completion by @gespispace in #32
- feat(configuration): auto restart servers after changing configuration by @gespispace in #33 (see the sketch after this list)
- feat(chat): add cancel button in chat to stop generation by @gespispace in #36
- feat(prompt): use tokenizer from @xenova/transformers by @gespispace in #38
- feat(cloud): auto completion by @gespispace in #39
- feat(chat): add copy button in chat by @gespispace in #41
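For the configuration auto-restart (#33), the usual approach in a VS Code extension is to listen for configuration changes and relaunch the local servers when the relevant settings are affected. A minimal sketch follows, assuming a "firecoder" settings section and a hypothetical restartServers() helper; it illustrates the pattern rather than the extension's actual implementation:

```typescript
import * as vscode from "vscode";

// Hypothetical helper: stop and relaunch the bundled llama.cpp server processes.
async function restartServers(): Promise<void> {
  // ...implementation elided...
}

export function activate(context: vscode.ExtensionContext) {
  // Re-run server startup whenever any setting under the extension's
  // configuration section changes ("firecoder" is an assumed section name).
  context.subscriptions.push(
    vscode.workspace.onDidChangeConfiguration((event) => {
      if (event.affectsConfiguration("firecoder")) {
        void restartServers();
      }
    })
  );
}
```

Registering the listener via context.subscriptions ensures it is disposed when the extension deactivates.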
Full Changelog: v0.0.29...v0.0.30