As you can see, it finishes almost before you've had a chance to realise it's started to play, and isn't very helpful at showing why the moth is named after the hummingbird.
Now I'm sure there are many ways in which you could turn the video into a looping GIF, but I'm going to detail what I did, partly so I don't forget and partly because I wrote some software to deal with one particular issue.
My approach to almost any video-related task usually starts with FFmpeg, and this was no different: a simple command scales the video down and produces a GIF as output, making sure to keep the right frame rate.
ffmpeg -i 00002.MTS -vf "fps=25,scale=400:225" animated.gif
As you can see this works to produce an animated GIF, although there are a number of problems with it. Firstly, it's very grainy, and secondly, you can't really see the moth now that we've scaled the image down. We'll deal with the second problem first (as the first problem mostly goes away by the end). Again, a simple FFmpeg command allows us to crop the video:
ffmpeg -i 00002.MTS -vf "fps=25,scale=1920:1080,crop=800:450:400:225,scale=400:225" animated.gif
This gives us a much better view of the moth, but that jump as the animation loops around is very, very annoying. The problem is that even with just 14 frames there is enough camera movement between the first and last frame for the join to be really obvious. This is something you often see with animated GIFs produced from video, and you can clearly see why in this image of the first and last frames superimposed on one another.
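If you want to make that sort of comparison image yourself, a quick sketch along these lines will do it; this uses Pillow, and the two filenames are just placeholders for wherever you've saved the first and last frames:

# Blend the first and last frames 50/50 so any camera movement between
# them shows up as a ghosted double image (filenames are placeholders).
from PIL import Image

first = Image.open("frame001.png").convert("RGB")
last = Image.open("frame014.png").convert("RGB")

overlay = Image.blend(first, last, alpha=0.5)
overlay.save("superimposed.png")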
At this point I tried a number of filters in FFmpeg that are supposed to help remove camera shake etc., but none of them helped, as the camera movement is fairly smooth and what I want is to remove the movement altogether so that the plant stays still between the frames. While I couldn't find any way of doing this in FFmpeg, I did realise that some of the code I used in 3DAssembler might help.
I've never really described how 3DAssembler works, but essentially it aligns images. In fact it uses SURF to detect identical features in a pair of images and then determines the horizontal and vertical shift required to overlap as many of the features it found as possible. In 3DAssembler this allows you to automatically align photos to produce a usable stereo pair. Here, though, we can use the same approach to align the frames of the video.
The code I wrote (which is currently a mess of hard-coded values, so I'll release it once I've had time to clean it up) calculates the horizontal and vertical shift of each frame against the first frame and then crops each frame appropriately. If we superimpose the first and last of these corrected frames we can see how things have improved.
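I haven't cleaned the real code up enough to share yet, but the rough shape of the idea looks something like the sketch below. This isn't the actual 3DAssembler code: it uses OpenCV with ORB standing in for SURF (SURF needs the opencv-contrib build), and the numbers are illustrative rather than the values I actually used.

# Rough sketch of feature-based frame alignment (not the actual
# 3DAssembler code): estimate how far each frame has drifted from the
# first frame, then translate it back into place.
import cv2
import numpy as np

def estimate_shift(reference, frame, keep=50):
    """Estimate the (dx, dy) translation of frame relative to reference."""
    gray_ref = cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY)
    gray_frm = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # ORB stands in for SURF here so this works with a stock OpenCV install.
    orb = cv2.ORB_create(500)
    kp1, des1 = orb.detectAndCompute(gray_ref, None)
    kp2, des2 = orb.detectAndCompute(gray_frm, None)

    # Brute-force Hamming matching is the usual pairing for ORB descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    # Use the median shift of the strongest matches so the moving moth
    # (or the odd bad match) doesn't skew the estimate.
    shifts = [(kp2[m.trainIdx].pt[0] - kp1[m.queryIdx].pt[0],
               kp2[m.trainIdx].pt[1] - kp1[m.queryIdx].pt[1])
              for m in matches[:keep]]
    dx = float(np.median([s[0] for s in shifts]))
    dy = float(np.median([s[1] for s in shifts]))
    return dx, dy

def align_frame(reference, frame):
    """Translate frame so it lines up with reference."""
    dx, dy = estimate_shift(reference, frame)
    h, w = frame.shape[:2]
    return cv2.warpAffine(frame, np.float32([[1, 0, -dx], [0, 1, -dy]]), (w, h))

The sketch applies the shift as a simple translation, which leaves blank edges; cropping each frame by the computed offsets, as described above, achieves the same alignment without them.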
Producing the final animated GIF is then a multi-stage process. Firstly, we use FFmpeg to turn the video into a series of still images, taking care to deinterlace the original video:
ffmpeg -i 00002.MTS -vf "scale=1920:1080,yadif=1:0,hqdn3d,fps=25" frame%03d.png
My code then aligns, crops, and scales these frames down to the same size we were using before. The set of frames is then reassembled to produce the animated GIF:
convert -delay 4 frame*.png animated.gif
While there is still a jump in the background as the animation loops around, it is a lot less obvious than in the original. You'll also notice that the grain present in the original has disappeared. The grain is actually dithering, introduced to try and improve the image given that a GIF is limited to just 256 colours. I didn't apply a dither filter when assembling this GIF, which does mean you can see problems with the colour palette, especially in the grass at the bottom left.
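To see the effect for yourself, here's a quick sketch (using a recent version of Pillow, with a placeholder filename) that quantises a single frame down to a 256-colour palette both with and without Floyd-Steinberg dithering:

# Quantise one frame to a 256-colour palette with and without dithering,
# to see where the grain in the original GIF came from.
from PIL import Image

frame = Image.open("frame001.png").convert("RGB")

# Floyd-Steinberg dithering trades colour banding for grain...
dithered = frame.quantize(colors=256, dither=Image.Dither.FLOYDSTEINBERG)
# ...while no dithering keeps things smooth but can show banding in areas
# of subtle colour, like the grass at the bottom left.
flat = frame.quantize(colors=256, dither=Image.Dither.NONE)

dithered.save("frame_dithered.png")
flat.save("frame_no_dither.png")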
I'm not sure why GIF has such a limited colour palette, but I'm guessing it relates to keeping the file size down when storage was more expensive and bandwidth was a lot lower. Now that most people's internet connections can handle full-resolution HD video, we should probably move beyond GIF images. For single images, especially hand-drawn ones or those that use transparency, GIF has been replaced by PNG. The PNG format also supports animation. Unfortunately only Firefox currently has support for showing animated PNG images; in all the other browsers all you see is the first frame.
Fortunately it is possible to get animated PNGs to play in most modern browsers with a little bit of JavaScript trickery using the apng-canvas library. Unfortunately the way this library works means that you need to host both the image and the JavaScript in the same place, which makes it difficult to use with Blogger, not helped by the fact that you can't currently upload animated PNG files to Blogger either. Anyway, after a little bit of work, hopefully the following should be animated for everyone.
As you can see, this is much better than the GIF version, as we aren't limited to just 256 colours. It was produced in exactly the same way as the GIF version though, apart from the final command to assemble the file, which now looks like:
apngasm animated.png frame*.png 1 25
I'll clean up the code I used and make it available in case anyone else fancies producing weird little animated videos.
I just wish I had your knowledge. It seems everything I could do has been overtaken by computers. I do embrace them and I have spent a couple of hours tracking the video you sent in Blender but the result is much the same.
Great job.
Thanks Adrian. I guess the difference between us is that you wouldn't have messed up the video recording in the first place, and so wouldn't have had to resort to a looping GIF.
Interesting to hear it ends up pretty similar using Blender. I wouldn't know where to start to do that; I was just lucky in that I already had the code for image alignment kicking around, so it probably took longer to write the blog post than it did to figure out the image manipulation.
Mark, I would have missed the shot whilst looking for lenses.
I dread to think how many hours are spent making a BBC documentary like 'Spring Watch'. I do know there is miles of footage to edit but only a couple of folk doing it and the director.