This repository has been archived by the owner on Aug 16, 2023. It is now read-only.

Possible to work with images? #110

Open

antgiant opened this issue Jul 14, 2019 · 5 comments
Comments

@antgiant

I have a use case that will output at a very low framerate, so it would be ideal if I could just feed it two source images and specify the number of rendered intermediate images I want as output. That way I could avoid all of the overhead of converting back and forth between images and video repeatedly. Is there any way I can use this project to accomplish this?

@dthpham
Owner

dthpham commented Jul 14, 2019

That's something I could add/support in the future. However, in its present state, BF only works with video inputs and can only output video.

@wagesj45

You could probably write a script to do something very similar using ffmpeg as a middleman. Have the script take in two images, use ffmpeg to create a 2-frame *.mp4 file with lossless compression, feed that temporary file to butterflow with the lossless flag, then use ffmpeg once more to convert the frames from the butterflow output file back into images.

If you're using Linux, it might look something like this:

#!/bin/bash
# Converts two images to an interpolated image set.

image1=$1
image2=$2
outFolder=$3
# Duration (in seconds) to hand to butterflow: requested frame count / 30 fps
frames=$(printf "%0.3f\n" "$(echo "$4/30" | bc -l)")
tempFolder="$(mktemp -d)"
tempImageMovie="$(mktemp --suffix=.mp4)"
tempInterpolatedMovie="$(mktemp --suffix=.mp4)"

cp "$image1" "$tempFolder/i1.bmp"
cp "$image2" "$tempFolder/i2.bmp"

# -y is needed because mktemp has already created the output file
ffmpeg -y -i "$tempFolder/i%d.bmp" -r 30 -c:v libx264 -crf 0 -preset ultrafast -hide_banner "$tempImageMovie"

butterflow -l -s a=00.000,b=end,dur=$frames -o "$tempInterpolatedMovie" "$tempImageMovie"

ffmpeg -i "$tempInterpolatedMovie" -hide_banner "$outFolder/img%04d.bmp"

rm -rf "$tempFolder"
rm "$tempImageMovie" "$tempInterpolatedMovie"

I don't have butterflow installed on any of my Linux machines, so I can't debug this script, but I hope you find it a useful starting point.

@antgiant
Author

@dthpham Thank you for even considering such a feature! That would be amazing for me!
(Also, I don't know if butterflow uses more than two images to generate the intermediate frames. If it does I'd be happy to pass in as many history images as it needs.)

@wagesj45 Thank you. I have been experimenting with something similar but hadn't quite gotten it working yet. Also, I didn't realize ffmpeg had a lossless option; that helps, thanks! My end goal is quasi-realtime, so I was also hoping to avoid the overhead of 3 or 4 trips through ffmpeg per source frame (two here, plus one or two inside butterflow).

@wagesj45

@antgiant Butterflow is great, but I don't think you'll be approaching anything close to real time with it. That is just the nature of the beast. Although, if you can get that kind of performance out of your GPU, I'd love to know how!

@antgiant
Author

antgiant commented Jul 18, 2019

@wagesj45 I have the benefit of not needing very many frames and of being able to control the image size. (Image size can make a huge performance difference.)
I'm still very much in the proof-of-concept stage, but this 10-minute clip took less than 10 minutes to render and upload. So at least in this special case, quasi-realtime is possible, and butterflow makes it look incredible!

Now for the challenge of getting the rest of the process to reliably work and finding a spare "server" somewhere to run the stream.

I am still very interested in working directly with images. That would let me reduce latency, because I could use the images as they are rendered instead of only after the complete render process is finished. For this project I estimate that would improve latency by about 10 minutes.
