
Add filelock and flash-attn to vllm extra (#529)
Add `filelock` to avoid `BaseFileLock.__init__() got an unexpected keyword argument 'mode'`, and `flash-attn` since it is recommended over the default `xformers` attention backend.
alvarobartt committed Apr 15, 2024
1 parent e50b559 commit 34a5ba2
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -65,7 +65,7 @@ mistralai = ["mistralai >= 0.1.0"]
 ollama = ["ollama >= 0.1.7"]
 openai = ["openai >= 1.0.0"]
 vertexai = ["google-cloud-aiplatform >= 1.38.0"]
-vllm = ["vllm >= 0.2.1"]
+vllm = ["vllm >= 0.2.1", "filelock >= 3.13.4", "flash-attn >= 2.5.7"]
 
 [project.urls]
 Documentation = "https://distilabel.argilla.io/"
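The change pins `filelock >= 3.13.4` so that the release installed is one whose `FileLock` constructor accepts the `mode` keyword. A minimal stand-alone sketch of the kind of version-floor comparison such a specifier enforces (the `meets_floor` helper is hypothetical, for illustration only; real tooling uses `packaging.version`):

```python
def meets_floor(version: str, floor: str) -> bool:
    """Return True if a dotted version string satisfies a '>=' floor.

    Naive numeric comparison of dot-separated components, e.g.
    '3.13.4' -> (3, 13, 4); sufficient for simple release versions,
    not for pre-releases or local version labels.
    """
    to_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return to_tuple(version) >= to_tuple(floor)


# The floor added by this commit: filelock >= 3.13.4
print(meets_floor("3.13.4", "3.13.4"))  # True  (exact floor satisfies '>=')
print(meets_floor("3.9.0", "3.13.4"))   # False (3.9 < 3.13 numerically)
```

Note that tuple comparison correctly treats `3.9.0` as older than `3.13.4`, which a plain string comparison would get wrong (`"3.9" > "3.13"` lexicographically).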
