
Overlay the image generated in processing about another image #62

Open
dancodedev opened this issue Nov 20, 2020 · 8 comments
dancodedev commented Nov 20, 2020

I'm trying to set a default background image and overlay the image generated by Processing on top of it, but I can't get it to work.

This is the normal configuration, and it works correctly:

videoExport.setFfmpegVideoSettings(
    new String[]{
    "[ffmpeg]",
    "-y",
    "-f",        "rawvideo",
    "-vcodec",   "rawvideo",
    "-s",        "[width]x[height]",
    "-pix_fmt",  "rgb24",
    "-r",        "[fps]",
    "-i",        "pipe:0",
    "-an",
    "-vcodec",   "h264",
    "-pix_fmt",  "yuv420p",
    "-crf",      "[crf]",
    "-preset",   "ultrafast",
    "[output]"
    });

With this command I can do what I want, but I don't know how to use it with the pipe:

ffmpeg -loop 1 -framerate 1 -i 4k.jpg -framerate 30 -i 1k.jpg -filter_complex "overlay=0:0:shortest=1,format=yuv420p" -c:v libx264 -r 30 -movflags +faststart output.mp4
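For reference, here is an untested sketch (not from the thread) of how the two might be combined in a single pass: keep `pipe:0` as the raw-video input, add the background file as a second ffmpeg input, and route both through `-filter_complex`. The file name `4k.jpg` is an assumption carried over from the command above, and the bracketed placeholders are assumed to be substituted by VideoExport as in the default settings. Note the output `-pix_fmt` option is dropped because the `format=yuv420p` filter already sets it:

```java
videoExport.setFfmpegVideoSettings(
    new String[]{
    "[ffmpeg]",
    "-y",
    "-f",        "rawvideo",
    "-vcodec",   "rawvideo",
    "-s",        "[width]x[height]",
    "-pix_fmt",  "rgb24",
    "-r",        "[fps]",
    "-i",        "pipe:0",   // input 0: raw frames piped from Processing
    "-loop",     "1",
    "-i",        "4k.jpg",   // input 1: static background image (assumed file name)
    // background [1:v] below, piped frames [0:v] on top;
    // shortest=1 ends the output when the pipe closes
    "-filter_complex",
    "[1:v][0:v]overlay=0:0:shortest=1,format=yuv420p",
    "-an",
    "-vcodec",   "h264",
    "-crf",      "[crf]",
    "-preset",   "ultrafast",
    "[output]"
    });
```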

I hope someone can help me. Thank you

The green image would be the one generated in Processing (the one coming through the pipe), and the background would be a static image configured in the ffmpeg command.

@dancodedev dancodedev changed the title ¿How to overlay a larger image at the output of the ffmpg ? Overlay the image generated in processing on another image Nov 20, 2020
@dancodedev dancodedev changed the title Overlay the image generated in processing on another image Overlay the image generated in processing about another image Nov 20, 2020

hamoid commented Nov 20, 2020

Hi, why not load the image into Processing then for every frame draw the background image and all the elements it needs?

Or in other words, is there any advantage to twisting the command to do what would be quite easy with Processing?


dancodedev commented Nov 21, 2020

> Hi, why not load the image into Processing and then, for every frame, draw the background image and all the elements it needs?
>
> Or in other words, is there any advantage to twisting the command to do what would be quite easy with Processing?

Yes, I am rendering videos that are hours long. If I modify the ffmpeg output, I can render more frames per second in Processing because the window has a smaller resolution; the speed gain is very good.


hamoid commented Nov 21, 2020

I see. It's probably doable, but I have no time at the moment to find the right command. I hope someone knows :)

@dancodedev

Thanks for your answer, I'll keep researching for that.


bcoley commented Nov 24, 2020

I've never tried anything like that, but I can see where it would be useful. A quick Google search turns up this article that describes what you are trying to do, if I understand you correctly: https://video.stackexchange.com/questions/16975/how-do-i-put-the-image-behind-video-by-using-ffmpeg


hamoid commented Nov 24, 2020

That's part of what he's trying to do, but not all of it. The difference is that, on top of that, the video in question does not yet exist; it's being created frame by frame. Maybe someone with experience in juggling ffmpeg commands can suggest a solution :)


bcoley commented Nov 24, 2020

Right, I thought they could render the video from Processing, then add the background image afterwards using the ffmpeg command in the article under "overlay filter". The example in the article adds a background to the sides of the video, which the (W-w)/2 part seems to handle; I imagine a complete border around the video is possible as well, just a matter of experimenting at the command line. Or am I misunderstanding what they intend for the final result?

ffmpeg -loop 1 -i image.png -i video.mp4 -filter_complex "overlay=(W-w)/2:shortest=1" output.mp4


hamoid commented Nov 24, 2020

A possible issue with doing it in two steps is that the video would be compressed twice, losing quality. But if the first pass uses a lossless video format, then it should be OK, I think.
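As a sketch of that two-step idea (file names are assumptions; CRF 0 makes libx264 lossless, so the intermediate render loses nothing): render the small Processing video losslessly first, then composite it onto the large background:

```shell
# Step 1: in videoExport.setFfmpegVideoSettings, a lossless first pass
# could replace "[crf]" with "0" (CRF 0 is lossless for libx264).

# Step 2: overlay the small lossless video (small.mp4, assumed name),
# centered, onto a looping still background (4k.jpg, assumed name):
ffmpeg -loop 1 -i 4k.jpg -i small.mp4 \
  -filter_complex "overlay=(W-w)/2:(H-h)/2:shortest=1,format=yuv420p" \
  -c:v libx264 -crf 18 -movflags +faststart output.mp4
```

Here shortest=1 ends the output when small.mp4 ends, since the looped image input is otherwise infinite.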
