Releases: DefTruth/Awesome-LLM-Inference
v1.2
v1.1
Full Changelog: v1.0...v1.1
v1.0
Full Changelog: v0.9...v1.0
v0.9
v0.8
Full Changelog: v0.7...v0.8
v0.7
What's Changed
- LLMLingua-2 by @liyucheng09 in #11
- add SnapKV by @liyucheng09 in #12
- Add Microbenchmark by @Miroier in #14
- [KVCache] add the "Gear" paper and code for "Keyformer" by @HarryWu-CHN in #13
- Update README.md by @preminstrel in #15
New Contributors
- @Miroier made their first contribution in #14
- @HarryWu-CHN made their first contribution in #13
- @preminstrel made their first contribution in #15
Full Changelog: v0.6...v0.7
Awesome-LLM-Inference v0.6
What's Changed
- Add an ICLR paper for KV cache compression by @Janghyun1230 in #8
- Add GitHub link for paper FP8 Quantization [2208.09225] by @Mr-Philo in #9
New Contributors
- @Janghyun1230 made their first contribution in #8
- @Mr-Philo made their first contribution in #9
Full Changelog: v0.5...v0.6
Awesome-LLM-Inference v0.5
What's Changed
- Correct affiliation error by @liyucheng09 in #3
- Fix typo by @lkm2835 in #6
- Add context compression & new KV cache compression papers by @liyucheng09 in #7
New Contributors
- @liyucheng09 made their first contribution in #3
- @lkm2835 made their first contribution in #6
Full Changelog: v0.4...v0.5
Awesome-LLM-Inference v0.4
Update README.md