demo.py has a problem #33
Comments
Hi @LORDPQK, this usually happens when you pass only one image to the video argument.
Thank you for your response.
Can you show me the details of the code and the errors?
What is in "my_video_picture_folder"?
I think it may be because your image files do not follow what Line 53 in 6f59e55 expects, or there are some errors in the images.
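(A quick way to see which files the demo would actually pick up is to list and sort them yourself; the extensions and sorting below are only an assumption about the loader, not taken from the repo.)

```python
# Quick check (assumption: the loader globs common image extensions and sorts
# by filename -- this is a guess, not taken from the repo's Line 53).
import os

folder = "my_video_picture_folder"  # the folder mentioned above
frames = sorted(
    f for f in os.listdir(folder)
    if f.lower().endswith((".png", ".jpg", ".jpeg"))
)
print(f"{len(frames)} frames found in {folder}:")
for name in frames:
    print(" ", name)
```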
"My image format is .png, I think there shouldn't be any issues with my image format. I extracted the video frames from the video you provided at assets/videos/gf_exp1.mp4. However, after installing the requirements.txt, I found there were still bugs, so I reinstalled flash attention using flash_attn-2.7.4.post1+cu12torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl. Could this be the source of the problem?" RuntimeError: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):/root/anaconda3/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZNK3c105Error4whatEv |
I have the same problem when I use demo.ipynb: Floating point exception (core dumped)
When I use a single RTX 3090, I found that when PATH_TO_FOLDER (the folder of video frames) contains fewer than 5 images, I get the value-tensor shape problem, and when it contains 5 or more images it goes OOM. How can I make the most of a single 3090? (Measured on a single 3090, text+mask inference uses about 15 GB of VRAM.) Or how should I set up PATH_TO_FOLDER to avoid the problem?
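One way to pin down where the memory goes is to wrap the demo call and report peak GPU usage; this is a generic sketch and run_inference is only a placeholder, not an API from this repo:

```python
# Hedged sketch: wrap whatever call runs the demo and report peak GPU memory,
# to compare against the ~15 GB figure quoted above.
import torch

def report_peak_memory(fn, *args, **kwargs):
    torch.cuda.reset_peak_memory_stats()
    result = fn(*args, **kwargs)
    peak_gb = torch.cuda.max_memory_allocated() / 1024**3
    print(f"peak GPU memory: {peak_gb:.2f} GB")
    return result

# usage (hypothetical): report_peak_memory(run_inference, frames, prompt)
```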
Did you solve this problem? I ran into the same bug.
You mean "Floating point exception (core dumped)"?
This bug: RuntimeError: shape mismatch: value tensor of shape [256, 2048] cannot be broadcast to indexing result of shape [1280, 2048]
On my server I also hit this problem. I already know how to fix it, but I am working on something else and will fix it later (in about a week, I guess). For now, it should work as @wshiman mentioned: please provide at least 5 frames per video to test.
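For anyone hitting the shape mismatch, here is a minimal sketch for producing enough frames from the provided sample video (it assumes OpenCV is installed; the output filename pattern is a guess and should be matched to whatever Line 53 in 6f59e55 expects):

```python
# Minimal sketch: extract evenly spaced frames from the sample video so the
# folder passed to demo.py contains at least 5 images.
import os
import cv2

video_path = "assets/videos/gf_exp1.mp4"
out_dir = "video/v1"   # folder used in this thread
n_frames = 8           # any value >= 5

cap = cv2.VideoCapture(video_path)
total = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
os.makedirs(out_dir, exist_ok=True)
for i in range(n_frames):
    cap.set(cv2.CAP_PROP_POS_FRAMES, i * total // n_frames)
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imwrite(os.path.join(out_dir, f"{i:05d}.png"), frame)
cap.release()
```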
When I run demo.py on my own pictures (python demo.py video/v1), it has a bug:
RuntimeError: shape mismatch: value tensor of shape [256, 2048] cannot be broadcast to indexing result of shape [1280, 2048]
I also tried demo.ipynb, but it still has a bug:
Floating point exception (core dumped)