Commit: Improve docs

lucasavila00 committed Apr 7, 2024
1 parent c25d381 commit 7fcf9f5
Showing 11 changed files with 78 additions and 14 deletions.
README.md (2 changes: 1 addition & 1 deletion)
@@ -36,5 +36,5 @@ There are more complete examples in the
| [Typescript Client](https://github.com/lucasavila00/LmScript/tree/main/packages/client) | Dependency-free client that creates and executes LmScript programs with different backends. |
| [GUI](https://github.com/lucasavila00/LmScript/tree/main/apps/egui) | Desktop Application that creates and executes LmScript programs. |
| [Runpod Serverless SGLang](https://github.com/lucasavila00/LmScript/tree/main/docker/runpod-serverless-sglang) | Docker image that runs [SGLang](https://github.com/sgl-project/sglang/) on [Runpod Serverless](https://www.runpod.io/serverless-gpu). |
-| [Docker SGLang](https://github.com/lucasavila00/LmScript/tree/main/docker/sglang-docker) | Docker image that runs [SGLang](https://github.com/sgl-project/sglang/) |
+| [Docker SGLang](https://github.com/lucasavila00/LmScript/tree/main/docker/sglang) | Docker image that runs [SGLang](https://github.com/sgl-project/sglang/) |
| [Runpod Serverless vLLM](https://github.com/lucasavila00/LmScript/tree/main/docker/runpod-serverless-vllm) | Docker image that runs [vLLM](https://github.com/vllm-project/vllm) ([Outlines](https://github.com/outlines-dev/outlines)) on [Runpod Serverless](https://www.runpod.io/serverless-gpu). |
apps/lmscript-docs/docs/docker/runpod-serverless-vllm.md (24 changes: 22 additions & 2 deletions)
@@ -2,6 +2,26 @@
sidebar_position: 1
---

-# Runpod Serverless vLLM
+# Runpod Serverless vLLM Docker Image

-TODO
+Pre-built Docker image that runs on
+[Runpod Serverless](https://www.runpod.io/serverless-gpu).
+
+## Installation
+
+The image is published to
+https://hub.docker.com/r/degroote22/lmscript-runpod-serverless-vllm
+
+## Usage
+
+The DockerHub image can be deployed to a machine with a GPU that has 24 GB of
+VRAM without any configuration changes.
+
+## Docker-Compose
+
+There is an example Docker-compose file in the repository.
+
+Clone the [LmScript repository](https://github.com/lucasavila00/LmScript/) and:
+
+- `cd docker/runpod-serverless-vllm`
+- `docker-compose up`
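The steps above can be sketched as a compose file. What follows is a hypothetical illustration only, not the repository's actual file: the image name comes from the DockerHub link above, while the tag, port mapping, and GPU reservation are assumptions.

```yaml
# Hypothetical sketch of a compose file for this image.
# The image name is from DockerHub; the tag, port, and GPU settings are assumptions.
services:
  vllm:
    image: degroote22/lmscript-runpod-serverless-vllm:latest  # tag assumed
    ports:
      - "8000:8000"  # exposed port assumed
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia      # requires the NVIDIA container toolkit
              count: 1
              capabilities: [gpu]
```

With a file like this in place, `docker-compose up` starts the service; the file in `docker/runpod-serverless-vllm` is the authoritative version.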
apps/lmscript-docs/docs/docker/sglang.md (18 changes: 16 additions & 2 deletions)
@@ -2,6 +2,20 @@
sidebar_position: 1
---

-# Docker SGLang
+# SGLang Docker Image

-TODO
+Pre-built Docker image that runs [TheBloke/Mistral-7B-Instruct-v0.2-AWQ](https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.2-AWQ) on SGLang.
+
+## Installation
+
+The image is published to
+https://hub.docker.com/r/degroote22/lmscript-sglang
+
+## Docker-Compose
+
+There is an example Docker-compose file in the repository.
+
+Clone the [LmScript repository](https://github.com/lucasavila00/LmScript/) and:
+
+- `cd docker/sglang`
+- `docker-compose up`
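The steps above rely on the compose file already in `docker/sglang`. As a rough sketch of what such a file could look like (hypothetical: the image name matches the DockerHub link above, but the tag and port are guesses; SGLang servers commonly listen on port 30000):

```yaml
# Hypothetical sketch only; see docker/sglang/docker-compose.yml for the real file.
services:
  sglang:
    image: degroote22/lmscript-sglang:latest  # tag assumed
    ports:
      - "30000:30000"  # SGLang's usual default port; assumed for this image
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia      # requires the NVIDIA container toolkit
              count: 1
              capabilities: [gpu]
```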
apps/lmscript-docs/docs/docker/vllm.md (13 changes: 11 additions & 2 deletions)
@@ -2,6 +2,15 @@
sidebar_position: 1
---

-# Docker vLLM
+# vLLM Docker Image

-TODO
+Docker-compose file that runs vLLM locally.
+
+## Docker-Compose
+
+There is an example Docker-compose file in the repository.
+
+Clone the [LmScript repository](https://github.com/lucasavila00/LmScript/) and:
+
+- `cd docker/vllm`
+- `docker-compose up`
docker/runpod-serverless-sglang/README.md (2 changes: 1 addition & 1 deletion)
@@ -10,7 +10,7 @@ https://hub.docker.com/r/degroote22/lmscript-runpod-serverless

## Usage

-Documentation is available in the [LmScript Docs](/docs/category/lmscriptclient).
+Documentation is available in the [LmScript Docs](/docs/docker/runpod-serverless-sglang).

## License

docker/runpod-serverless-vllm/README.md (3 changes: 1 addition & 2 deletions)
@@ -10,8 +10,7 @@ https://hub.docker.com/r/degroote22/lmscript-runpod-serverless-vllm

## Usage

-The DockerHub image can be deployed to a machine with a 24gb RAM GPU without any
-configuration changes.
+Documentation is available in the [LmScript Docs](/docs/docker/runpod-serverless-vllm).

## Source Code

File renamed without changes.
docker/sglang-docker/README.md → docker/sglang/README.md (5 changes: 2 additions & 3 deletions)
@@ -1,4 +1,4 @@
-# Runpod Serverless SGLang Docker Image
+# SGLang Docker Image

Pre-built Docker image that runs [TheBloke/Mistral-7B-Instruct-v0.2-AWQ](https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.2-AWQ) on SGLang

@@ -9,8 +9,7 @@ https://hub.docker.com/r/degroote22/lmscript-sglang

## Usage

-Use the `docker-compose.yml` file in this folder to run it locally with
-`docker-compose up`.
+Documentation is available in the [LmScript Docs](/docs/docker/sglang).

## License

File renamed without changes.
File renamed without changes.
docker/vllm/README.md (25 changes: 24 additions & 1 deletion)
@@ -1 +1,24 @@
-TODO
+# vLLM Docker
+
+Docker-compose file that runs vLLM locally.
+
+## Usage
+
+Documentation is available in the [LmScript Docs](/docs/docker/vllm).
+
+## License
+
+[MIT](https://choosealicense.com/licenses/mit/)
+
+## Contributing
+
+Use the `docker-compose.yml` file in this folder to run it locally with
+`docker-compose up`.
+
+### Building
+
+```
+docker build -t degroote22/lmscript-sglang:0.0.9 .
+docker push docker.io/degroote22/lmscript-sglang:0.0.9
+```
