Possible to work with images? #110
That's something I could add/support in the future. However, in its present state, BF only works with video inputs and can only output video.
You could probably write a script to do something very similar using ffmpeg. If you're using Linux, it might look something like this:

```bash
#!/bin/bash
# Converts two images to an interpolated image set.
# Usage: <script> <image1> <image2> <outFolder> <numFrames>
image1=$1
image2=$2
outFolder=$3
# Convert the requested frame count into a duration in seconds at 30 fps.
frames=$(printf "%0.3f\n" "$(echo "$4/30" | bc -l)")

tempFolder="$(mktemp -d)"
tempImageMovie="$(mktemp --suffix=.mp4)"
tempInterpolatedMovie="$(mktemp --suffix=.mp4)"

cp "$image1" "$tempFolder/i1.bmp"
cp "$image2" "$tempFolder/i2.bmp"

# Losslessly encode the two images into a short video; -crf 0 is lossless for
# libx264, and -y overwrites the file mktemp has already created.
ffmpeg -y -i "$tempFolder/i%d.bmp" -r 30 -c:v libx264 -crf 0 -preset ultrafast -hide_banner "$tempImageMovie"

# Stretch the whole clip to the requested duration so butterflow synthesizes
# the in-between frames.
butterflow -l -s a=00.000,b=end,dur=$frames -o "$tempInterpolatedMovie" "$tempImageMovie"

# Extract the interpolated frames back out as individual images.
ffmpeg -y -i "$tempInterpolatedMovie" -hide_banner "$outFolder/img%04d.bmp"

rm -rf "$tempFolder"
rm "$tempImageMovie" "$tempInterpolatedMovie"
```

I don't have butterflow installed on any of my Linux machines, so I can't debug this script, but I hope you find it a useful starting point.
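If you saved this as, say, interpolate.sh, you'd call it something like this (untested, for the same reason as above):

```bash
# Produce roughly 90 interpolated frames (a 3-second segment at 30 fps)
# between first.bmp and second.bmp, written to out/ as img0001.bmp, img0002.bmp, ...
chmod +x interpolate.sh
mkdir -p out
./interpolate.sh first.bmp second.bmp out 90
```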
@dthpham Thank you for even considering such a feature! That would be amazing for me!

@wagesj45 Thank you. I have been experimenting with something similar but hadn't quite gotten it working yet. Also, I didn't realize ffmpeg had a lossless flag; that helps, thanks! My end goal is quasi-realtime, so I was also hoping to avoid the overhead of three or four trips through ffmpeg per source frame (two here, plus one or two inside butterflow).
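For anyone else reading: the lossless flag in question is the `-crf 0` option used with libx264 in the script above. A minimal standalone example:

```bash
# -crf 0 makes libx264 encode losslessly, so the image -> video -> image
# round trip doesn't degrade the frames; faster presets trade file size for speed.
ffmpeg -i "i%d.bmp" -r 30 -c:v libx264 -crf 0 -preset ultrafast lossless.mp4
```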
@antgiant Butterflow is great, but I don't think you'll be approaching anything close to real time with it. That is just the nature of the beast. Although, if you can get that kind of performance out of your GPU, I'd love to know how!
@wagesj45 I have the benefit of not needing very many frames and of being able to control the image size. (Image size can make a huge performance difference.) Now for the challenge of getting the rest of the process to work reliably and finding a spare "server" somewhere to run the stream. I am still very interested in working directly with images, though. That would let me reduce latency, because I could use the images as they are rendered instead of only after the complete render process has finished. For this project, I estimate that would improve latency by about 10 minutes.
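In case it helps anyone, by controlling image size I just mean pre-scaling the sources with ffmpeg before they ever reach butterflow, something like this (the 50% factor is only an example):

```bash
# Halve each dimension before interpolation; fewer pixels means far less
# work for the optical-flow step.
ffmpeg -i i1.bmp -vf "scale=iw/2:ih/2" small1.bmp
ffmpeg -i i2.bmp -vf "scale=iw/2:ih/2" small2.bmp
```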
I have a use case that is going to output at a very low framerate, so it would be ideal for me if I could just feed it two source images and specify the number of rendered intermediate images I want as output. That way I could avoid all of the overhead of going back and forth between images and video repeatedly. Is there any way I can use this project to accomplish this?