How to convert array of png image data into video file
I am getting frames from a canvas through canvas.toDataURL().

Now I have an array of png images, but I want a video file.

How do I do this?

var canvas = document.getElementById("mycanvaselementforvideocapturing");
var pngimages = [];
...
setInterval(function(){pngimages.push(canvas.toDataURL())}, 1000);
Tolliver answered 12/8, 2016 at 18:47 Comment(0)
For a full browser-support way, you'll have to send your image batch to the server, then use some server-side program to do the encoding.

FFmpeg might be able to do it.
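For instance, assuming the frames have been saved server-side as sequentially numbered PNG files (the filenames and framerate below are illustrative, not from the question), a single FFmpeg invocation can encode them:

```shell
# Encode numbered PNG frames (frame0001.png, frame0002.png, ...) into an MP4.
# -framerate sets the input fps; yuv420p keeps the output playable in most players.
ffmpeg -framerate 30 -i frame%04d.png -c:v libx264 -pix_fmt yuv420p out.mp4
```

Swap the output container/codec (e.g. `out.webm` with `-c:v libvpx-vp9`) if you need webm instead of mp4.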

But in newer browsers the canvas.captureStream method has been implemented. It converts your canvas drawings into a webm video stream, recordable with a MediaRecorder. All of this is still not stabilized, though, and is only available in the latest versions of browsers, probably with some flags set in the user's preferences (e.g. Chrome needs "Experimental Web Platform features" enabled).

var cStream,
  recorder,
  chunks = [];

rec.onclick = function() {
  this.textContent = 'stop recording';
  // set the framerate to 30FPS
  cStream = canvas.captureStream(30);
  // create a recorder fed with our canvas' stream
  recorder = new MediaRecorder(cStream);
  // start it
  recorder.start();
  // save the chunks
  recorder.ondataavailable = saveChunks;

  recorder.onstop = exportStream;
  // change our button's function
  this.onclick = stopRecording;
};

function saveChunks(e) {

  chunks.push(e.data);

}

function stopRecording() {

  recorder.stop();

}


function exportStream(e) {
  // combine all our chunks into one blob
  var blob = new Blob(chunks, { type: 'video/webm' });
  // do something with this blob
  var vidURL = URL.createObjectURL(blob);
  var vid = document.createElement('video');
  vid.controls = true;
  vid.src = vidURL;
  vid.onended = function() {
    URL.revokeObjectURL(vidURL);
  }
  document.body.insertBefore(vid, canvas);
}

// make something move on the canvas
var x = 0;
var ctx = canvas.getContext('2d');

var anim = function() {
  x = (x + 2) % (canvas.width + 100);
  // there is no transparency in webm,
  // so we need to set a background otherwise every transparent pixel will become opaque black
  ctx.fillStyle = 'ivory';
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  ctx.fillStyle = 'black';
  ctx.fillRect(x - 50, 20, 50, 50);
  requestAnimationFrame(anim);
};
anim();
<canvas id="canvas" width="500" height="200"></canvas>
<button id="rec">record</button>

And since you asked for a way to add audio to this video: you can use cStream.addTrack(anAudioStream.getAudioTracks()[0]); before calling new MediaRecorder(cStream), but this will currently only work in Chrome. FF seems to have a bug in MediaRecorder which makes it record only the tracks the stream was originally defined with... A workaround for FF is to build the stream yourself with new MediaStream([videoTrack, audioTrack]);
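Putting those two notes together, a browser-only sketch of recording the canvas together with microphone audio might look like this (the function name is made up; getUserMedia prompts the user for permission, and the Chrome/Firefox routes are the ones described above):

```javascript
// Sketch: record canvas video + microphone audio in one webm recording.
// Browser-only APIs; getUserMedia requires a user permission prompt.
async function recordCanvasWithAudio(canvas) {
  const videoStream = canvas.captureStream(30);
  const audioStream = await navigator.mediaDevices.getUserMedia({ audio: true });

  // Chrome route: add the audio track to the canvas stream directly:
  //   videoStream.addTrack(audioStream.getAudioTracks()[0]);

  // Firefox workaround: build a fresh stream from both tracks.
  const mixed = new MediaStream([
    videoStream.getVideoTracks()[0],
    audioStream.getAudioTracks()[0]
  ]);

  const recorder = new MediaRecorder(mixed);
  const chunks = [];
  recorder.ondataavailable = e => chunks.push(e.data);
  recorder.onstop = () => {
    const blob = new Blob(chunks, { type: 'video/webm' });
    // use URL.createObjectURL(blob) as in the snippet above
  };
  recorder.start();
  return recorder; // call recorder.stop() when done
}
```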

[big thanks to @jib for letting me know how to actually use it...]

Edit: video.onend --> video.onended

Estrange answered 13/8, 2016 at 5:25 Comment(14)
Thanks. I must test it on firefox (as chrome is not supporting it). What does the 30 mean in canvas.captureStream? Is that the framerate?Tolliver
Yes it is, just like the comment says ;-)Estrange
Oh, oops! Missed that. Thanks a million.Tolliver
Just wondering.. Does this support audio, if audio is playing on the webpage.Tolliver
@DavidCallanan, I'm sorry, actually chrome does support recording the canvas stream too. It's just that both browsers' implementations are so different that I thought it didn't, and I might have been fooled by MDN's article about it... Anyway, I added a way of saving the canvas in both browsers, and a small note about recording audio, which currently won't be possible for your use case...Estrange
To get chunks in Firefox, call recorder.requestData() to trigger ondataavailable.Laquitalar
@jib, actually, reading your link, I'm not sure it would help a lot, since my horrible workaround to handle both implementations doesn't really need the chunks; it's just that chrome fires ondataavailable automatically after some short time, with the chunks in it. So requestData would let me reproduce the same behavior only by calling it manually in a timed loop, right? But then, wouldn't I get incorrect data for chrome? I hope all this will finally stabilize...Estrange
@Estrange Your workaround is unnecessary. Instead of checking state in ondataavailable, use recorder.onstop instead, and it'll work the same in both browsers regardless of the number of chunks.Laquitalar
Also, this is a w3c spec, not whatwg.Laquitalar
@Laquitalar ah that makes sense ;-) will update accordingly. The MDN article I point to in my answer is really misleading then. Also, do you know about the FF bug I mentioned, where MediaRecorder ignores tracks added via stream.addTrack and takes only the initial tracks? Or should I open a bug report?Estrange
@Estrange Yes, please file a bug. Did you try new MediaStream([canvasTrack, audioTrack])?Laquitalar
@Laquitalar "TypeError: Argument 1 is not valid for any of the 2-argument overloads of MediaRecorder.". Tried with both streams and with both tracks in an array, in a Blob containing both tracks/streams. Anyway, that would still be a bug that it doesn't record the manually assigned tracks.Estrange
@Estrange I've updated the MDN article. Thanks for pointing out it was outdated! Yes please file a bug.Laquitalar
@jib, not sure it needs an SO question either, since the only response I would get would be "file a bug", and that's now done: bugzilla.mozilla.org/show_bug.cgi?id=1296531 ps: OP asked for audio in comments, and anyway, chrome doesn't work either.Estrange
The MediaRecorder + canvas.captureStream approach in Kaiido's answer is definitely the way to go at the moment - unfortunately as of writing it's only supported in Chrome and Firefox.

Another approach that will work when browsers adopt webp encoding support (only Chrome has it, currently) is this:

let fps = 30; // whatever framerate your frames were captured at
let frames = []; // <-- frames must be *webp* dataURLs
let webmEncoder = new Whammy.Video(fps);
frames.forEach(f => webmEncoder.add(f));
let blob = await new Promise(resolve => webmEncoder.compile(false, resolve));
let videoBlobUrl = URL.createObjectURL(blob);

It uses the whammy library to join a bunch of webp images into a webm video. In browsers that support webp encoding you can write canvas.toDataURL("image/webp") to get a webp dataURL from a canvas. This is the relevant bug report for Firefox webp support.

One cross-browser approach as of writing seems to be to use libwebp.js to convert the png dataURLs output by canvas.toDataURL() into webp images, and then feed those into the whammy encoder to get your final webm video. Unfortunately the png-to-webp encoding process is very slow (several minutes for a few seconds of video on my laptop).

Edit: I've discovered that with the MediaRecorder/captureStream approach, you can end up with low-quality output videos compared to the whammy approach. So unless there's some way to control the quality of the captured stream, whammy seems like the best approach, with the other two as fall-backs. See this question for more details. Use this snippet to detect whether a browser supports webp encoding (and thus supports the whammy approach):

let webPEncodingIsSupported = document.createElement('canvas').toDataURL('image/webp').startsWith('data:image/webp');
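That detection result could then drive the choice between the three approaches discussed here. A small sketch (the function name and the returned labels are made up for illustration; browsers that can't encode webp silently fall back to a `data:image/png...` dataURL, which is what the check above relies on):

```javascript
// Decide which encoding path to take, given the dataURL returned by
// canvas.toDataURL('image/webp') and whether MediaRecorder is available.
function pickEncoderPath(webpProbeDataURL, hasMediaRecorder) {
  if (webpProbeDataURL.startsWith('data:image/webp')) {
    return 'whammy';        // browser can encode webp: best quality control
  }
  if (hasMediaRecorder) {
    return 'mediarecorder'; // captureStream + MediaRecorder fallback
  }
  return 'server-side';     // ship the png frames to the server (e.g. FFmpeg)
}
```

In a browser you would call it as `pickEncoderPath(document.createElement('canvas').toDataURL('image/webp'), 'MediaRecorder' in window)`.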
Stove answered 17/9, 2018 at 12:31 Comment(0)
