Any chance of keeping the model loaded #182
StuartIanNaylor started this conversation in Ideas
-
Hi there, I'm looking to achieve the same thing, maybe by simply loading the model into memory once and reusing it on the next ./main run or so...
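A minimal sketch of that idea, assuming whisper.cpp's C API from whisper.h (whisper_init_from_file, whisper_full, and friends) plus a hypothetical load_wav_f32() helper for decoding audio to 16 kHz mono float PCM (the model path is just an example): pay the load cost once, then transcribe many files with the same context.

```cpp
#include "whisper.h"

#include <cstdio>
#include <string>
#include <vector>

// Hypothetical helper: decode a WAV file to 16 kHz mono float PCM.
// whisper.cpp's examples do this with dr_wav; decoding is omitted here.
std::vector<float> load_wav_f32(const std::string & path);

int main(int argc, char ** argv) {
    // Load the model once: this is the slow step ./main repeats on every run.
    struct whisper_context * ctx = whisper_init_from_file("models/ggml-base.en.bin");
    if (!ctx) return 1;

    whisper_full_params params = whisper_full_default_params(WHISPER_SAMPLING_GREEDY);

    // Reuse the same context for every file passed on the command line.
    for (int i = 1; i < argc; ++i) {
        std::vector<float> pcm = load_wav_f32(argv[i]);
        if (whisper_full(ctx, params, pcm.data(), (int) pcm.size()) == 0) {
            for (int s = 0; s < whisper_full_n_segments(ctx); ++s) {
                printf("%s\n", whisper_full_get_segment_text(ctx, s));
            }
        }
    }

    whisper_free(ctx);
    return 0;
}
```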
-
I have looked at installing it on a RAM disk but haven't finished testing that yet. I'm also trying to see what sort of wrapper could be used to create an API out of it so it is always in memory.
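Worth noting that a RAM disk mostly removes the disk I/O; each ./main run still has to parse the file and rebuild the context, so a resident wrapper should help more. A rough sketch of such a wrapper, under the same assumptions as the sketch above (whisper.cpp's C API plus the hypothetical load_wav_f32() helper): the process keeps the context loaded and reads one audio file path per line from stdin. A socket or HTTP front end could replace the stdin loop to make it a real API.

```cpp
#include "whisper.h"

#include <iostream>
#include <string>
#include <vector>

// Same hypothetical WAV-decoding helper as in the sketch above.
std::vector<float> load_wav_f32(const std::string & path);

int main() {
    // The context stays resident for the lifetime of the process.
    struct whisper_context * ctx = whisper_init_from_file("models/ggml-base.en.bin");
    if (!ctx) return 1;

    whisper_full_params params = whisper_full_default_params(WHISPER_SAMPLING_GREEDY);

    // Minimal "API": read one audio file path per line and transcribe it.
    std::string path;
    while (std::getline(std::cin, path)) {
        std::vector<float> pcm = load_wav_f32(path);
        if (whisper_full(ctx, params, pcm.data(), (int) pcm.size()) == 0) {
            for (int s = 0; s < whisper_full_n_segments(ctx); ++s) {
                std::cout << whisper_full_get_segment_text(ctx, s) << std::endl;
            }
        }
    }

    whisper_free(ctx);
    return 0;
}
```

You could feed it from a shell or a named pipe, e.g. `echo input.wav | ./wrapper`.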
-
Rather than streaming, could the model load once and just watch a folder, moving or deleting audio files as they are processed?
That way you wouldn't pay the model-load time on every file input the way the current ./main does.
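A sketch of that folder-watcher idea, under the same assumptions as the sketches above (whisper.cpp's C API plus the hypothetical load_wav_f32() helper; the "queue" directory is just an example): load the model once, poll a drop folder, transcribe each .wav that appears, and delete it afterwards. Polling keeps the example short; inotify on Linux (or an equivalent) would avoid the sleep loop, and in practice you'd also want files moved into the folder atomically so half-written ones aren't picked up.

```cpp
#include "whisper.h"

#include <chrono>
#include <filesystem>
#include <iostream>
#include <string>
#include <thread>
#include <vector>

namespace fs = std::filesystem;

// Same hypothetical WAV-decoding helper as in the earlier sketches.
std::vector<float> load_wav_f32(const std::string & path);

int main() {
    struct whisper_context * ctx = whisper_init_from_file("models/ggml-base.en.bin");
    if (!ctx) return 1;

    whisper_full_params params = whisper_full_default_params(WHISPER_SAMPLING_GREEDY);

    const fs::path watch_dir = "queue"; // hypothetical drop folder

    for (;;) {
        for (const auto & entry : fs::directory_iterator(watch_dir)) {
            if (entry.path().extension() != ".wav") continue;

            std::vector<float> pcm = load_wav_f32(entry.path().string());
            if (whisper_full(ctx, params, pcm.data(), (int) pcm.size()) == 0) {
                for (int s = 0; s < whisper_full_n_segments(ctx); ++s) {
                    std::cout << whisper_full_get_segment_text(ctx, s) << "\n";
                }
            }

            fs::remove(entry.path()); // delete the file once processed
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(500));
    }
}
```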