[Help]: It takes too long to load the model during inference #322
Comments
Hi, what is the inference speed of this model? It is described as an NAR model, but there are two big models in this architecture, so I guess it will be no quicker than earlier AR-based pipelines.
Hi, you only need to load the models once. You can use the Gradio demo or a Jupyter Notebook to keep the models in memory.
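A minimal sketch of the "load once, infer many times" pattern this reply describes; `load_models` and `run_inference` are hypothetical stand-ins for whatever `maskgct_inference.py` actually exposes, not the real API:

```python
def load_models():
    # Stand-in for the expensive build/download/load stages.
    print("loading models (slow, runs once)...")
    return {"semantic": object(), "acoustic": object()}

def run_inference(models, text, prompt_wav):
    # Stand-in for the actual inference stage.
    return f"<audio for {text!r} conditioned on {prompt_wav}>"

models = load_models()  # pay the loading cost once, at startup

# Every later request reuses the models already resident in memory,
# which is what the Gradio demo and the Jupyter notebook do.
for sentence in ["Hello world.", "A second sentence."]:
    print(run_inference(models, sentence, "prompt.wav"))
```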
Problem Overview
When I run maskgct_inference.py, I found that it takes too long to load the model during inference.
Steps Taken
I broke the run down into four stages (a minimal timing sketch follows the list):
a. build stage
b. download stage
c. load stage
d. inference stage
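A rough way to measure where the time goes; the stage bodies below are placeholders for the corresponding parts of `maskgct_inference.py`, not the actual code:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(stage):
    # Print the wall-clock time spent in one stage.
    start = time.perf_counter()
    yield
    print(f"{stage}: {time.perf_counter() - start:.1f}s")

# The four stages mirror the breakdown above; each body is a placeholder
# for the corresponding section of maskgct_inference.py.
with timed("build"):
    pass  # construct the semantic and acoustic model objects
with timed("download"):
    pass  # fetch the checkpoints (e.g. from the Hugging Face Hub)
with timed("load"):
    pass  # read the checkpoint weights into the models
with timed("inference"):
    pass  # synthesize the audio
```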
Then I found that most of the time is spent in the load stage.
Expected Outcome
Does this mean that every time I want to run inference on a piece of audio, I have to wait a long time for the model to load?
Or is there something wrong with my settings?