v0.5 Amazon SageMaker & custom system prompts
We now support Amazon SageMaker as a backend! 🥳 Check out the updated README for more info.
We also added support for custom system prompts, letting you customize the behavior you expect from your assistants. You can try it out over at hf.co/chat!
A big thanks to our new contributors, and especially to @AndreasMadsen, who contributed several great PRs in this release, including handlebars support so you can define your own custom prompt structure more easily.
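To illustrate the new handlebars support, a model's prompt structure can now be described as a template string in the model config. The snippet below is only a sketch in the style of a chat-ui `.env.local` entry; the exact key names (`chatPromptTemplate`, `endpoints`) and the special tokens shown are assumptions to verify against the updated README for your model.

```env
# Hypothetical .env.local fragment: one model with a custom handlebars prompt template.
# `{{#each messages}}`, `{{#ifUser}}` and `{{#ifAssistant}}` iterate over the chat
# history and wrap each turn in model-specific tokens (shown here for illustration).
MODELS=`[
  {
    "name": "my-org/my-model",
    "endpoints": [{ "url": "http://127.0.0.1:8080" }],
    "chatPromptTemplate": "{{preprompt}}{{#each messages}}{{#ifUser}}<|prompter|>{{content}}<|endoftext|><|assistant|>{{/ifUser}}{{#ifAssistant}}{{content}}<|endoftext|>{{/ifAssistant}}{{/each}}"
  }
]`
```

With a template like this, the custom system prompt a user sets in the UI can flow into `{{preprompt}}`, so a single config adapts the same history to whatever token layout a model expects.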
What's Changed
- Added access token note by @merveenoyan in #360
- Update /privacy and other content following Llama v2 release by @julien-c in #374
- Clarify that model 'tokens' are not actual tokens by @AndreasMadsen in #367
- Attempt to clarify how hosted API ≠ local endpoint by @julien-c in #373
- Make model branding customizable based on env var by @flozi00 in #345
- Trim and remove stop-suffixes from summary by @AndreasMadsen in #369
- Add a login button by @nsarrazin in #381
- Allow different user and assistant end-token by @AndreasMadsen in #375
- Link the model name to modelUrl when provided by @airibarne in #385
- Add support for endpoints requiring client authentication using PKI by @cambriancoder in #393
- Add SageMaker support by @nsarrazin in #401
- Fix docs regarding TGI endpoint URL by @AndreasMadsen in #408
- Bump @antfu/utils from 0.7.2 to 0.7.6 by @dependabot in #407
- Bump word-wrap from 1.2.3 to 1.2.5 by @dependabot in #406
- Bump tough-cookie from 4.1.2 to 4.1.3 by @dependabot in #405
- Bump vite from 4.3.5 to 4.3.9 by @dependabot in #404
- Make all prompt templates configurable by @AndreasMadsen in #400
- Support custom system prompts from the user by @nsarrazin in #399
New Contributors
- @AndreasMadsen made their first contribution in #367
- @flozi00 made their first contribution in #345
- @airibarne made their first contribution in #385
- @cambriancoder made their first contribution in #393
- @dependabot made their first contribution in #407
Full Changelog: v0.4...v0.5