Commit

update readme

lvhan028 committed Jul 24, 2024
1 parent ace1f43 commit 202ef08
Showing 4 changed files with 8 additions and 0 deletions.
2 changes: 2 additions & 0 deletions README.md
@@ -26,6 +26,7 @@ ______________________________________________________________________
<details open>
<summary><b>2024</b></summary>

- \[2024/07\] Support Llama3.1
- \[2024/07\] Support [InternVL2](https://huggingface.co/collections/OpenGVLab/internvl-20-667d3961ab5eb12c7ed1463e) full-series models, [InternLM-XComposer2.5](docs/en/multi_modal/xcomposer2d5.md) and [function call](docs/en/serving/api_server_tools.md) of InternLM2.5
- \[2024/06\] The PyTorch engine supports DeepSeek-V2 and several VLMs, such as CogVLM2, Mini-InternVL, LlaVA-Next
- \[2024/05\] Balance vision model when deploying VLMs with multiple GPUs
@@ -110,6 +111,7 @@ For detailed inference benchmarks in more devices and more settings, please refe
<li>Llama (7B - 65B)</li>
<li>Llama2 (7B - 70B)</li>
<li>Llama3 (8B, 70B)</li>
<li>Llama3.1 (8B)</li>
<li>InternLM (7B - 20B)</li>
<li>InternLM2 (7B - 20B)</li>
<li>InternLM2.5 (7B)</li>
2 changes: 2 additions & 0 deletions README_zh-CN.md
@@ -26,6 +26,7 @@ ______________________________________________________________________
<details open>
<summary><b>2024</b></summary>

- \[2024/07\] Support Llama3.1
- \[2024/07\] Support the full [InternVL2](https://huggingface.co/collections/OpenGVLab/internvl-20-667d3961ab5eb12c7ed1463e) model series, [InternLM-XComposer2.5](docs/zh_cn/multi_modal/xcomposer2d5.md), and the [function call feature](docs/zh_cn/serving/api_server_tools.md) of InternLM2.5
- \[2024/06\] The PyTorch engine supports inference for DeepSeek-V2 and several VLMs, such as CogVLM2, Mini-InternVL, LlaVA-Next
- \[2024/05\] When deploying VLMs on multiple GPUs, balance the vision part of the model evenly across the cards
@@ -111,6 +112,7 @@ The LMDeploy TurboMind engine has excellent inference capabilities; on models of various scales
<li>Llama (7B - 65B)</li>
<li>Llama2 (7B - 70B)</li>
<li>Llama3 (8B, 70B)</li>
<li>Llama3.1 (8B)</li>
<li>InternLM (7B - 20B)</li>
<li>InternLM2 (7B - 20B)</li>
<li>InternLM2.5 (7B)</li>
2 changes: 2 additions & 0 deletions docs/en/supported_models/supported_models.md
@@ -7,6 +7,7 @@
| Llama | 7B - 65B | Yes | Yes | Yes | Yes |
| Llama2 | 7B - 70B | Yes | Yes | Yes | Yes |
| Llama3 | 8B, 70B | Yes | Yes | Yes | Yes |
| Llama3.1 | 8B | Yes | Yes | Yes | Yes |
| InternLM | 7B - 20B | Yes | Yes | Yes | Yes |
| InternLM2 | 7B - 20B | Yes | Yes | Yes | Yes |
| InternLM2.5 | 7B | Yes | Yes | Yes | Yes |
@@ -44,6 +45,7 @@ The TurboMind engine doesn't support window attention. Therefore, for models tha
| Llama | 7B - 65B | Yes | No | Yes |
| Llama2 | 7B - 70B | Yes | No | Yes |
| Llama3 | 8B, 70B | Yes | No | Yes |
| Llama3.1 | 8B | Yes | No | - |
| InternLM | 7B - 20B | Yes | No | Yes |
| InternLM2 | 7B - 20B | Yes | No | - |
| InternLM2.5 | 7B | Yes | No | - |
2 changes: 2 additions & 0 deletions docs/zh_cn/supported_models/supported_models.md
@@ -7,6 +7,7 @@
| Llama | 7B - 65B | Yes | Yes | Yes | Yes |
| Llama2 | 7B - 70B | Yes | Yes | Yes | Yes |
| Llama3 | 8B, 70B | Yes | Yes | Yes | Yes |
| Llama3.1 | 8B | Yes | Yes | Yes | Yes |
| InternLM | 7B - 20B | Yes | Yes | Yes | Yes |
| InternLM2 | 7B - 20B | Yes | Yes | Yes | Yes |
| InternLM2.5 | 7B | Yes | Yes | Yes | Yes |
@@ -44,6 +45,7 @@ The turbomind engine does not support window attention. Therefore, for models that use window att
| Llama | 7B - 65B | Yes | No | Yes |
| Llama2 | 7B - 70B | Yes | No | Yes |
| Llama3 | 8B, 70B | Yes | No | Yes |
| Llama3.1 | 8B | Yes | No | - |
| InternLM | 7B - 20B | Yes | No | Yes |
| InternLM2 | 7B - 20B | Yes | No | - |
| InternLM2.5 | 7B | Yes | No | - |
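The rows this commit adds to `supported_models.md` can be read as a simple lookup table. A minimal sketch of that idea, using only the entries visible in the second table of the diff (the column meanings are assumed from LMDeploy's supported-models docs, and the `supports_w8a8` helper is hypothetical, not part of LMDeploy):

```python
# Entries copied from the second support table in the diff above.
# Assumed column order: size range, FP16/BF16, KV INT8, W8A8.
# None stands for the "-" cells, i.e. support not stated in the table.
ENGINE_SUPPORT = {
    "Llama":       ("7B - 65B", True, False, True),
    "Llama2":      ("7B - 70B", True, False, True),
    "Llama3":      ("8B, 70B",  True, False, True),
    "Llama3.1":    ("8B",       True, False, None),  # added by this commit
    "InternLM":    ("7B - 20B", True, False, True),
    "InternLM2":   ("7B - 20B", True, False, None),
    "InternLM2.5": ("7B",       True, False, None),
}

def supports_w8a8(model: str) -> bool:
    """Return True only when the table explicitly lists 'Yes' for W8A8."""
    entry = ENGINE_SUPPORT.get(model)
    return bool(entry and entry[3])

print(supports_w8a8("Llama3"))    # True
print(supports_w8a8("Llama3.1"))  # False: the table shows "-"
```

Treating `-` as "not stated" rather than "no" mirrors the table: the helper only reports support that the docs explicitly confirm.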
