Overlay the image generated in Processing on top of another image #62
Comments
Hi, why not load the image into Processing and then, for every frame, draw the background image and all the elements on top of it? In other words, is there any advantage to bending the ffmpeg command to do what would be quite easy to do in Processing?
Yes, I am rendering videos that are hours long. If I modify the ffmpeg output instead, the Processing window can keep a smaller resolution, so I can render many more frames; the speed gain is very good.
I see. It's probably doable, but I have no time at the moment to find the right command. I hope someone knows :)
Thanks for your answer, I'll keep researching that.
I've never tried anything like that, but I can see where it would be useful. A quick Google search turns up this article that describes what you are trying to do, if I understand you correctly: https://video.stackexchange.com/questions/16975/how-do-i-put-the-image-behind-video-by-using-ffmpeg
That's part of what they're trying to do, but not all of it. The difference is that, on top of that, the video in question does not exist yet; it is being created frame by frame. Maybe someone with experience in juggling ffmpeg commands can suggest a solution :)
Right, I thought they could render the video from Processing first, then add the background image afterwards using the ffmpeg command from the article under "overlay filter". The example in the article adds a background to the sides of the video, which the (W-w)/2 part seems to handle; I imagine a complete border around the video is possible as well, just a matter of experimenting at the command line. Or am I misunderstanding what they intend for the final result?
ffmpeg -loop 1 -i image.png -i video.mp4 -filter_complex "overlay=(W-w)/2:shortest=1" output.mp4
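For the complete border, I think centering the video on the background in both axes should work, something along these lines (file names here are just placeholders):

```
# Center the existing video on the larger background image, leaving a border on all sides.
# shortest=1 stops the output when the video (the shorter input) ends.
ffmpeg -loop 1 -i background.png -i video.mp4 \
       -filter_complex "overlay=(W-w)/2:(H-h)/2:shortest=1" \
       -c:v libx264 -pix_fmt yuv420p output.mp4
```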
A possible issue with doing it in two steps is that the video would be compressed twice, losing quality. But if the first pass uses a lossless video format then it should be OK, I think.
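Something like this two-step version might work, keeping the first pass lossless (codec choices and file names here are only a guess):

```
# Step 1 (first pass): have the Processing render encoded losslessly, e.g. x264 with -qp 0,
# so the only lossy encode happens in step 2:
#   ... -c:v libx264 -qp 0 processing_pass.mkv
# Step 2 (second pass): overlay the lossless video on the background and encode once.
ffmpeg -loop 1 -i background.png -i processing_pass.mkv \
       -filter_complex "overlay=(W-w)/2:(H-h)/2:shortest=1" \
       -c:v libx264 -pix_fmt yuv420p -crf 18 output.mp4
```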
I'm trying to use a static background image and overlay the image generated by Processing on top of it, but I can't get it to work.
This is the normal configuration, and it works correctly.
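The pipe sends raw frames from the sketch into an ffmpeg command roughly like this (resolution, frame rate and file names are placeholders):

```
# Raw RGB frames arrive on stdin from the Processing sketch; ffmpeg simply encodes them.
ffmpeg -f rawvideo -pix_fmt rgb24 -s 640x360 -r 30 -i - \
       -c:v libx264 -pix_fmt yuv420p -crf 18 output_normal.mp4
```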
With this code I can do what I want, but I don't know how to use it together with the pipe.
I hope someone can help me. Thank you.
The green image would be the image generated in Processing, the one coming through the pipe, and the background would be a static image configured in the ffmpeg output.
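I imagine the command needs to read the piped frames from stdin and overlay them on the static background in a single pass, something along these lines (resolution, frame rate and file names below are just placeholders):

```
# Input 0: the static background image, looped for the whole duration.
# Input 1: raw RGB frames piped from Processing on stdin.
# The piped frames are overlaid centered on the background; output stops when the pipe closes.
ffmpeg -loop 1 -i background.png \
       -f rawvideo -pix_fmt rgb24 -s 640x360 -r 30 -i - \
       -filter_complex "[0:v][1:v]overlay=(W-w)/2:(H-h)/2:shortest=1" \
       -c:v libx264 -pix_fmt yuv420p -crf 18 output.mp4
```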