The MediaRecorder + canvas.captureStream approach in Kaiido's answer is definitely the way to go at the moment; unfortunately, as of writing it's only supported in Chrome and Firefox.
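For reference, the MediaRecorder + captureStream approach can be wrapped up in a small helper like the one below. This is only a sketch: `recordCanvas`, the 30 fps default, and the `video/webm` mime type are illustrative choices, not part of any library API.

```javascript
// Record a canvas for durationMs milliseconds and resolve with a blob URL
// that can be assigned to a <video> element's src.
function recordCanvas(canvas, durationMs, fps = 30) {
  return new Promise(resolve => {
    const stream = canvas.captureStream(fps);
    const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
    const chunks = [];
    recorder.ondataavailable = e => chunks.push(e.data);
    recorder.onstop = () =>
      resolve(URL.createObjectURL(new Blob(chunks, { type: 'video/webm' })));
    recorder.start();
    setTimeout(() => recorder.stop(), durationMs);
  });
}
```

Usage would look something like `recordCanvas(myCanvas, 3000).then(url => videoEl.src = url);` while your animation keeps drawing onto the canvas.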
Another approach, which will work once more browsers adopt webp encoding support (currently only Chrome has it), is this:
let frames = []; // <-- frames must be *webp* dataURLs
let webmEncoder = new Whammy.Video(fps); // fps = frame rate of the output video
frames.forEach(f => webmEncoder.add(f));
let blob = await new Promise(resolve => webmEncoder.compile(false, resolve));
let videoBlobUrl = URL.createObjectURL(blob);
It uses the whammy library to join a bunch of webp images into a webm video. In browsers that support webp encoding you can call canvas.toDataURL("image/webp") to get a webp dataURL from a canvas. This is the relevant bug report for Firefox webp support.
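Collecting the frames for whammy might look like the sketch below. `captureFrames` and its `drawFrame` callback are illustrative names (not whammy API); the second `toDataURL` argument is the standard optional quality parameter (0 to 1).

```javascript
// Render frameCount frames onto the canvas and capture each one as a webp
// dataURL, ready to be passed to Whammy.Video's add() method.
function captureFrames(canvas, drawFrame, frameCount, quality = 0.8) {
  const frames = [];
  for (let i = 0; i < frameCount; i++) {
    drawFrame(i); // caller draws frame i onto the canvas
    frames.push(canvas.toDataURL('image/webp', quality));
  }
  return frames;
}
```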
One cross-browser approach as of writing seems to be to use libwebp.js to convert the png dataURLs output by canvas.toDataURL() into webp images, and then feed those into the whammy encoder to get your final webm video. Unfortunately the png-to-webp encoding process is very slow (several minutes for a few seconds of video on my laptop).
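The shape of that fallback pipeline is sketched below. Note that `pngDataUrlToWebP` is a hypothetical stand-in for whatever conversion routine your libwebp.js build exposes (its API varies between builds), so treat this as pseudocode for the pipeline rather than a working implementation.

```javascript
// Cross-browser fallback: capture png frames (always supported), convert
// each to webp via a user-supplied libwebp.js wrapper, then feed the
// resulting webp dataURLs to whammy as before. This is the slow path.
async function framesViaLibWebP(canvas, drawFrame, frameCount, pngDataUrlToWebP) {
  const frames = [];
  for (let i = 0; i < frameCount; i++) {
    drawFrame(i);
    const png = canvas.toDataURL('image/png'); // webp encoding unsupported here
    frames.push(await pngDataUrlToWebP(png));  // slow: encoding happens in JS
  }
  return frames;
}
```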
Edit: I've discovered that the MediaRecorder/captureStream approach can produce noticeably lower-quality output than the whammy approach. So unless there's some way to control the quality of the captured stream, whammy seems like the best option, with the other two as fallbacks. See this question for more details. Use this snippet to detect whether a browser supports webp encoding (and thus supports the whammy approach):
let webPEncodingIsSupported = document.createElement('canvas').toDataURL('image/webp').startsWith('data:image/webp');
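Putting the detection together with the fallbacks, the selection logic could look like this. `pickApproach` and the returned labels are illustrative; the feature checks themselves are standard.

```javascript
// Choose an encoding approach based on what the browser supports:
// whammy where webp encoding exists, MediaRecorder where available,
// and the slow libwebp.js conversion path otherwise.
function pickApproach() {
  const webPOk = document.createElement('canvas')
    .toDataURL('image/webp').startsWith('data:image/webp');
  if (webPOk) return 'whammy';            // best quality
  if (typeof MediaRecorder !== 'undefined' &&
      'captureStream' in HTMLCanvasElement.prototype)
    return 'mediarecorder';               // easy, but lower quality
  return 'libwebp';                       // slow cross-browser fallback
}
```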