Video inference requires more than 24GB GPU memory? #26
KangYuan1233:
I am trying to do video inference on an NVIDIA 4090 using the Sa2VA-1B config, but I get an OutOfMemoryError:

Traceback (most recent call last):
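For reference, a minimal sketch of how a half-precision run of this kind might look. The model id `ByteDance/Sa2VA-1B`, the `predict_forward` entry point, and its argument names are assumptions based on the usual Hugging Face remote-code pattern, not confirmed by this thread; substitute whatever the repo's demo script actually uses:

```python
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

# Assumption: Sa2VA-1B is published as a remote-code model on the Hugging Face Hub.
model_id = "ByteDance/Sa2VA-1B"  # hypothetical id; use the checkpoint you downloaded

# Loading weights in bfloat16 roughly halves their memory versus float32.
model = AutoModel.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
).eval().cuda()
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

# Placeholder frames so the sketch is self-contained; in practice decode the video.
frames = [Image.new("RGB", (448, 448)) for _ in range(5)]

# Disabling gradient tracking keeps inference from retaining activations.
with torch.no_grad():
    # predict_forward and its keyword arguments are assumed, not verified.
    result = model.predict_forward(
        video=frames,
        text="<image>Describe the video.",
        tokenizer=tokenizer,
    )
```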
Hi @KangYuan1233, this is a bit strange. Could you please share the script you are using to run the code?
@HarborYuan Have we tried this on a 24 GB card or not?
@KangYuan1233 I guess you should use at least a 40 GB memory card for video inference.
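One quick way to confirm whether the card itself is the limit is to compare the device's total memory against the peak usage PyTorch reports; a minimal sketch, assuming the run is pure PyTorch:

```python
import torch

device = torch.device("cuda:0")
props = torch.cuda.get_device_properties(device)
print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB total")  # a 4090 reports ~24 GB

# After an inference attempt (even a failed one), peak usage shows how close you were.
peak_gb = torch.cuda.max_memory_allocated(device) / 1024**3
print(f"peak allocated by PyTorch: {peak_gb:.1f} GB")
```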