I have been enlisted as a member of the Bell Helicopter/Dunbar High School FIRST Robotics team, which I’m really excited about. For Christmas, my brother gave me an HD Hero camera, which will be a really fun way of capturing some on-board shots of the robot during construction, and in action. That’s definitely a safer way of putting it to use than what the manufacturer seems to promote (skiing and BASE jumping, preferably one immediately following the other).
The Hero doesn’t just do video, though: it has a nice timelapse mode that takes 5 megapixel photos every 2, 5, 10, 30, or 60 seconds. This post is about how to take photos from the Hero’s timelapse mode and turn them into MPEG-4 video (on a Mac or Linux machine). There are lots of examples of how to make timelapse videos on the web, but I had to pull together information from several sources, including the FFmpeg FAQ, to actually make this work. There are also some potential frame-dropping issues that I couldn’t find covered anywhere; those are addressed at the end.
The Hero stores photos as 2592×1944 JPEGs named like “GOPR0031.JPG”. We start by simply copying all of these files off the Hero into a local directory. No software is needed; the Hero mounts as a USB storage device. A random example is shown here; the link to Flickr will get you the full-resolution copy.
FFmpeg is our tool of choice for producing high-quality MPEG-4 video. The simplest way to install it on Mac OS X (assuming you have Xcode installed) is via the Homebrew package manager. So, we run “brew install ffmpeg” and take a break.
[~]$ brew install ffmpeg
<...various output snipped...>
/usr/local/Cellar/yasm/1.1.0: 42 files, 3.7M, built in 42 seconds
/usr/local/Cellar/x264/r1713M-c276662: 7 files, 2.5M, built in 49 seconds
/usr/local/Cellar/faac/1.28: 10 files, 928K, built in 86 seconds
/usr/local/Cellar/faad2/2.7: 13 files, 868K, built in 23 seconds
/usr/local/Cellar/pkg-config/0.25: 8 files, 248K, built in 46 seconds
/usr/local/Cellar/libogg/1.2.0: 82 files, 508K, built in 19 seconds
/usr/local/Cellar/libvorbis/1.3.1: 117 files, 7.6M, built in 92 seconds
/usr/local/Cellar/theora/1.1.1: 91 files, 2.7M, built in 24 seconds
/usr/local/Cellar/libvpx/0.9.2: 19 files, 1.3M, built in 5 seconds
/usr/local/Cellar/xvid/1.2.2: 7 files, 1.3M, built in 22 seconds
/usr/local/Cellar/ffmpeg/0.6.1: 91 files, 15M, built in 46 seconds
FFmpeg has an annoying requirement that input files be named sequentially, starting from one. The files from the Hero may or may not meet that requirement, depending on whether you had other photos on the device before you started capturing. Mine were numbered with an offset, so we need to rename everything. To avoid mucking up the existing files, we’ll just hard-link them into a different directory. From the FFmpeg FAQ (assumes a Bourne shell):
[images]$ mkdir img_in_order
[images]$ x=1; for i in *JPG; do counter=$(printf %04d $x); ln "$i" img_in_order/img"$counter".jpg; x=$(($x+1)); done
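If the sequence has a gap or doesn’t start at img0001.jpg, FFmpeg stops reading at the first missing number, so it’s worth a quick sanity check. Here’s a self-contained sketch of the same loop in a throwaway directory (the filenames are dummies standing in for real photos), showing that arbitrary, gappy GoPro numbering comes out as a gapless sequence:

```shell
# Throwaway demo: dummy JPEG names with an offset and gaps, like a real card
dir=$(mktemp -d)
cd "$dir"
touch GOPR0029.JPG GOPR0031.JPG GOPR0042.JPG
mkdir img_in_order
x=1
for i in *JPG; do
  counter=$(printf %04d $x)
  ln "$i" img_in_order/img"$counter".jpg   # hard links, as in the FAQ snippet
  x=$(($x+1))
done
ls img_in_order   # should list img0001.jpg through img0003.jpg, no gaps
```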
img_in_order contains properly named links to the JPEGs. With FFmpeg, we can encode our video with the following command:
[images]$ ffmpeg -i img_in_order/img%04d.jpg -r 30 -s 640x480 -vcodec libx264 -vpre hq -b 1000k hero_bloglapse.mp4
That produces a 30fps 640×480 MPEG-4 video at 1 megabit/sec. Different sizes can be specified with the -s parameter, either as explicit resolutions or as identifiers like hd720 and hd1080. There are tons of resources covering all the various FFmpeg flags; the ones above produce a reasonable default for the Hero.
However, if we try one of the widescreen resolutions, the resulting video is slightly distorted, because the input JPEGs are 4:3 but the output is not. Since the Hero does not have an LCD screen, you’ll probably be setting up a timelapse shot with a good deal of buffer around the main subject, so you can use FFmpeg to crop parts of the image out and deliver an undistorted widescreen timelapse. As you can see from one of the original shots, there is plenty of space above my head. Since our source image is 2592×1944 and we want a target resolution of 1280×720, we can figure out that we need to shave 486 rows off the original image (mostly from the top). Immediately after the -i flag and its argument, adding -croptop 426 and -cropbottom 60 will crop the JPEG before video encoding.
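The 486 figure falls out of simple aspect-ratio arithmetic, which is easy to redo in the shell for other source or target sizes (the variable names here are just for illustration):

```shell
src_w=2592; src_h=1944          # Hero still resolution
keep_h=$((src_w * 9 / 16))      # rows kept for a 16:9 output: 1458
crop=$((src_h - keep_h))        # rows to shave off: 486
echo "keep $keep_h rows, crop $crop rows"
```

Split the 486 rows between -croptop and -cropbottom however your framing demands; 426/60 just happened to fit my shot.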
One final note: while FFmpeg is encoding, keep an eye on the “frame” and “drop” counts it displays. If a very low framerate is used (say, 8fps), a significant number of frames may be dropped: with -r placed after -i, the flag sets the output framerate, and FFmpeg discards input frames to hit the lower rate (placing -r before -i declares the input framerate instead, which avoids the mismatch). If your video seems much shorter than you were expecting, this is probably the reason. At higher framerates, everything seems to work fine.
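To get a feel for how severe the dropping can be, here’s a rough estimate under assumed numbers (an image sequence read at FFmpeg’s 25fps default, forced down to an 8fps output):

```shell
# Assumed numbers: image2 input read at the 25fps default, -r 8 on the output,
# and a 760-frame shoot like the one in this post.
in_fps=25; out_fps=8; frames=760
kept=$((frames * out_fps / in_fps))   # FFmpeg keeps roughly 8 of every 25 frames
echo "$kept of $frames frames kept"
```

Nearly 70% of the frames gone, which matches the “much shorter than expected” symptom.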
I don’t have anything exciting to show off for the final result, so instead I took a one-shot-every-two-seconds timelapse of myself writing this post. That totalled about 760 images, at 1.6GB. Cropping and encoding this with FFmpeg took 2 minutes.
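For perspective, the arithmetic on that shoot (numbers from the paragraph above) shows how much a timelapse compresses real time:

```shell
# Values from this post: 760 frames shot every 2 seconds, played back at 30fps
frames=760; interval=2; fps=30
span_min=$((frames * interval / 60))   # minutes of real time captured
length_s=$((frames / fps))             # seconds of video produced
echo "$span_min minutes of real time -> $length_s seconds of video"
```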