====== Encoding Ogg Theora Video from Almost Anything ======

Encoding Ogg Theora videos with free software tools is not as straightforward as it might seem: MEncoder has no Theora encoding support at all, and FFmpeg insists on using its own Theora encoder and Ogg multiplexer, which are broken and/or inferior to libtheora. This means you have to rely on Theora-specific tools such as the libtheora example encoder or [[http://v2v.cc/~j/ffmpeg2theora/|FFmpeg2theora]], which unfortunately do not support as many input formats.

If your input can be read by FFmpeg2theora, try that first; it is probably the easiest way, and it is packaged in Debian. If you cannot get your data into FFmpeg2theora directly, you can convert it to the YUV4MPEG format using MPlayer or FFmpeg and pipe the result.

==== From MPlayer ====

For anything MPlayer can read, the command line to pipe into ''theora_encoder_example''/''ffmpeg2theora'' is:

<code>
mplayer -really-quiet -nosound -vo yuv4mpeg:file=/dev/stdout inputfile \
  | theora_encoder_example -v 7 -o out.ogv -
</code>

You can also substitute ''ffmpeg2theora'' for ''theora_encoder_example'', since it accepts the same options. The libtheora example encoder can be found in the Debian package ''libtheora-bin''. The ''-v'' option sets the video quality.

==== From FFmpeg ====

You can do essentially the same using FFmpeg instead of MPlayer:

<code>
ffmpeg -an -i inputfile -f yuv4mpegpipe - \
  | theora_encoder_example -v 7 -o out.ogv -
</code>

==== From your program ====

If you want to encode output from your own software as Theora, spawning subprocesses connected by pipes is, in my opinion, easier than using GStreamer or libtheora directly. However, feeding raw pixel data to MPlayer or FFmpeg is a bit tricky, so I'll show what I did to read 32-bit RGB data.
The input is one second of 100x100 black pixels from /dev/zero:

<code>
dd if=/dev/zero bs=40000 count=25 \
  | mplayer -really-quiet -nosound -demuxer rawvideo \
      -rawvideo w=100:h=100:fps=25:format=bgra \
      -vo yuv4mpeg:file=/dev/stdout - \
  | theora_encoder_example -v 7 -o out.ogv -
</code>

The same can be done with FFmpeg. However, FFmpeg cannot convert raw RGB to YUV4MPEG directly, so you need to spawn two FFmpeg processes:

<code>
dd if=/dev/zero bs=40000 count=25 \
  | ffmpeg -an -f rawvideo -pix_fmt rgb32 -s 100x100 -i - \
      -f rawvideo -pix_fmt yuv420p - \
  | ffmpeg -an -f rawvideo -pix_fmt yuv420p -s 100x100 -i - \
      -f yuv4mpegpipe - \
  | theora_encoder_example -v 7 -o out.ogv -
</code>
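From a program, the same pipeline can be wired up with subprocesses. Here is a minimal Python sketch of that idea, assuming MPlayer and the libtheora example encoder are installed; ''encode_frames'' is a hypothetical helper name, not part of any tool shown above, and the command lines mirror the MPlayer pipeline from the shell example:

```python
import subprocess

# Parameters matching the dd example above: 100x100 BGRA at 25 fps
WIDTH, HEIGHT, FPS = 100, 100, 25
FRAME_SIZE = WIDTH * HEIGHT * 4  # 32-bit BGRA: 4 bytes per pixel

def encode_frames(frames, out_path="out.ogv"):
    """Pipe raw BGRA frames through MPlayer (raw RGB -> YUV4MPEG)
    into the libtheora example encoder. Hypothetical helper."""
    mplayer = subprocess.Popen(
        ["mplayer", "-really-quiet", "-nosound",
         "-demuxer", "rawvideo",
         "-rawvideo", "w=%d:h=%d:fps=%d:format=bgra" % (WIDTH, HEIGHT, FPS),
         "-vo", "yuv4mpeg:file=/dev/stdout", "-"],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE)
    encoder = subprocess.Popen(
        ["theora_encoder_example", "-v", "7", "-o", out_path, "-"],
        stdin=mplayer.stdout)
    mplayer.stdout.close()  # so the encoder sees EOF when mplayer exits
    for frame in frames:
        mplayer.stdin.write(frame)  # each frame is FRAME_SIZE bytes of BGRA
    mplayer.stdin.close()
    mplayer.wait()
    encoder.wait()

# Usage (one second of black video, as in the dd example):
# encode_frames([b"\x00" * FRAME_SIZE] * FPS)
```

The only part your program has to get right is writing exactly ''FRAME_SIZE'' bytes per frame; everything else is handled by the two subprocesses.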