What is a good server configuration for running a large-v3 model with the Batched Faster Whisper? #1203
Unanswered
toanhuynhnguyen asked this question in Q&A
Replies: 1 comment 3 replies
- Depends on your needs
3 replies
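For context on what the question involves, here is a minimal sketch (not from the thread) of running large-v3 through faster-whisper's batched pipeline. The audio path, batch_size, and compute_type are placeholder choices that depend on the GPU and available VRAM; "depends on your needs" largely comes down to how large a batch_size the hardware can sustain.

```python
# Minimal sketch of batched inference with faster-whisper's large-v3 model.
# BatchedInferencePipeline ships with recent faster-whisper releases; the
# audio file, batch_size, and compute_type below are placeholder values
# to be tuned to the server's GPU.
from faster_whisper import WhisperModel, BatchedInferencePipeline

model = WhisperModel("large-v3", device="cuda", compute_type="float16")
batched_model = BatchedInferencePipeline(model=model)

# batch_size trades throughput for VRAM; lower it if the GPU runs out of memory.
segments, info = batched_model.transcribe("audio.mp3", batch_size=16)

for segment in segments:
    print("[%.2fs -> %.2fs] %s" % (segment.start, segment.end, segment.text))
```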