These are all really great answers. Here's another suggestion. @user621442 is correct that the bottleneck is typically the writing of the image, so if you are writing png files to your video compressor, it will be pretty slow (even if you are sending them through a pipe instead of writing to disk). I found a solution using pure ffmpeg, which I personally find easier to use than matplotlib.animation or mencoder.
Also, in my case, I wanted to just save the image in an axis, instead of saving all of the tick labels, figure title, figure background, etc. Basically I wanted to make a movie/animation using matplotlib code, but not have it "look like a graph". I've included that code here, but you can make standard graphs and pipe them to ffmpeg instead if you want.
import matplotlib
matplotlib.use('agg', force=True)  # the 'warn' kwarg was removed in matplotlib 3.1
import matplotlib.pyplot as plt
import subprocess

# Create a figure window that is the exact size of the image,
# 400x500 pixels in my case.
# Don't draw any axis stuff ... thanks to @Joe Kington for this trick:
# https://mcmap.net/q/41454/-how-to-remove-frame-from-a-figure
f = plt.figure(frameon=False, figsize=(4, 5), dpi=100)
canvas_width, canvas_height = f.canvas.get_width_height()
ax = f.add_axes([0, 0, 1, 1])
ax.axis('off')

def update(frame):
    # your matplotlib drawing code goes here
    pass

# Open an ffmpeg process
outf = 'ffmpeg.mp4'
cmdstring = ('ffmpeg',
             '-y', '-r', '30',  # overwrite output file, 30 fps
             '-s', '%dx%d' % (canvas_width, canvas_height),  # size of each frame
             '-pix_fmt', 'argb',  # pixel format of the raw input
             '-f', 'rawvideo', '-i', '-',  # tell ffmpeg to expect raw video from the pipe
             '-vcodec', 'mpeg4', outf)  # output encoding
p = subprocess.Popen(cmdstring, stdin=subprocess.PIPE)

# Draw 1000 frames and write them to the pipe
for frame in range(1000):
    # draw the frame
    update(frame)
    plt.draw()

    # extract the canvas as a raw ARGB byte string
    string = f.canvas.tostring_argb()

    # write to the pipe
    p.stdin.write(string)

# Finish up
p.communicate()
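One caveat: on newer matplotlib releases, `tostring_argb()` has been deprecated in favour of `f.canvas.buffer_rgba()`, whose bytes you would feed to ffmpeg with `'-pix_fmt', 'rgba'` instead of `'argb'`. Here is a minimal sketch of just the frame-grabbing step on current matplotlib, independent of ffmpeg (the random image is only a stand-in for real frame data):

```python
import matplotlib
matplotlib.use('agg', force=True)
import matplotlib.pyplot as plt
import numpy as np

# Same frameless-figure setup as above: 400x500 pixels, no axis decorations
f = plt.figure(frameon=False, figsize=(4, 5), dpi=100)
ax = f.add_axes([0, 0, 1, 1])
ax.axis('off')
canvas_width, canvas_height = f.canvas.get_width_height()

# Draw one frame
ax.imshow(np.random.rand(50, 40), interpolation='nearest')
f.canvas.draw()

# Grab the raw RGBA bytes; these are what you would write to p.stdin,
# with the ffmpeg command using '-pix_fmt', 'rgba'
frame_bytes = bytes(f.canvas.buffer_rgba())
```

The buffer is one byte per channel, four channels per pixel, so its length should be exactly `canvas_width * canvas_height * 4`.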
Comments:

- You can extract the image as a string with buffer = fig.canvas.tostring_rgb(), and get the width and height of the figure in pixels with fig.canvas.get_width_height() (or fig.bbox.width, etc).
- I've been trying the -f image2pipe option (so that ffmpeg expects a series of images), or reading from a local socket (eg udp://localhost:some_port) and writing to the socket in python... So far, only partial success... I feel like I'm almost there, though... I'm just not familiar enough with ffmpeg...
- Try ffmpeg -f image2pipe -vcodec mjpeg -i - output.whatever. You can open a subprocess.Popen(cmdstring.split(), stdin=subprocess.PIPE) and write each frame to its stdin. I'll post a more detailed example if I get a chance...
- ...matplotlib (see my answer below)