
Commit f943b68

Tweak new concurrency docs (#2205)
* add example for concurrency.max
* fix typo
* cog.File is deprecated
1 parent 994b2f3 · commit f943b68

2 files changed (+9 −2 lines changed)

docs/python.md

Lines changed: 2 additions & 2 deletions

```diff
@@ -220,9 +220,9 @@ class Predictor(BasePredictor):
 
 ### Streaming output
 
-Cog models can stream output as the `predict()` method is running. For example, a language model can output tokens as they're being generated and an image generation model can output a images they are being generated.
+Cog models can stream output as the `predict()` method is running. For example, a language model can output tokens as they're being generated and an image generation model can output images as they are being generated.
 
-To support streaming output in your Cog model, add `from typing import Iterator` to your predict.py file. The `typing` package is a part of Python's standard library so it doesn't need to be installed. Then add a return type annotation to the `predict()` method in the form `-> Iterator[<type>]` where `<type>` can be one of `str`, `int`, `float`, `bool`, `cog.File`, or `cog.Path`.
+To support streaming output in your Cog model, add `from typing import Iterator` to your predict.py file. The `typing` package is a part of Python's standard library so it doesn't need to be installed. Then add a return type annotation to the `predict()` method in the form `-> Iterator[<type>]` where `<type>` can be one of `str`, `int`, `float`, `bool`, or `cog.Path`.
 
 ```py
 from cog import BasePredictor, Path
```
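For reference, a minimal streaming predictor along the lines the updated paragraph describes might look like the sketch below. This is not the example that continues in docs/python.md past the diff context; the class name, the `prompt` input, and the whitespace tokenization are placeholders.

```py
from typing import Iterator

from cog import BasePredictor


class Predictor(BasePredictor):
    def predict(self, prompt: str) -> Iterator[str]:
        # Yield output incrementally instead of returning it all at once;
        # a real model would yield tokens as they are generated.
        for token in prompt.split():
            yield token
```

Each yielded value is surfaced as it arrives, which is what lets clients show partial output before the prediction finishes.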

docs/yaml.md

Lines changed: 7 additions & 0 deletions

```diff
@@ -169,6 +169,13 @@ This stanza describes the concurrency capabilities of the model. It has one opti
 
 The maximum number of concurrent predictions the model can process. If this is set, the model must specify an [async `predict()` method](python.md#async-predictors-and-concurrency).
 
+For example:
+
+```yaml
+concurrency:
+  max: 10
+```
+
 ## `image`
 
 The name given to built Docker images. If you want to push to a registry, this should also include the registry name.
```
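As a companion to the new YAML example, a predictor that satisfies the async requirement mentioned in the changed paragraph might be sketched as follows. The `prompt` input and the `asyncio.sleep` call are placeholders for real model work; see python.md for the documented interface.

```py
import asyncio

from cog import BasePredictor


class Predictor(BasePredictor):
    # With `concurrency.max` set, Cog can run up to that many of these
    # coroutines concurrently on a single model instance.
    async def predict(self, prompt: str) -> str:
        # Placeholder for genuinely asynchronous work, e.g. awaiting an
        # inference server or an external API.
        await asyncio.sleep(0.1)
        return prompt.upper()
```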
