Convert ArrayBuffer into ImageData for drawing on canvas: optimization
I am streaming video over a WebSocket by sending each frame in the raw ImageData format (4 bytes per pixel in RGBA order). When I receive each frame on the client (as an ArrayBuffer), I want to paint this image directly onto the canvas as efficiently as possible, using putImageData.

This is my current solution:

// buffer is an ArrayBuffer holding one RGBA frame
var array = new Uint8ClampedArray(buffer);
var image = new ImageData(array, width, height);
ctx.putImageData(image, 0, 0); // ctx is the canvas's 2d context (putImageData lives on the context, not the canvas element)

But it is rather slow. My theories as to why:

  • the array (which is ~1MB in size) is being copied thrice, once into the Uint8ClampedArray, once into the ImageData, and lastly into the canvas, each frame (30 times per second).

  • I am using new twice for each frame, which may be a problem for the garbage collector.

Are these theories correct and if so, what tricks can I employ to make this as fast as possible? I am willing to accept an answer that is browser-specific.
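[Editor's note] One mitigation for the allocation theory is to stop constructing new objects per frame. The following is a minimal sketch under assumptions not stated in the question: the frame size is fixed for the whole stream, and startPainting, ctx, and socket are hypothetical names for your setup. It allocates the ImageData once with createImageData and copies each incoming frame into it with set().

```javascript
// Sketch: reuse one ImageData for every frame to avoid per-frame
// allocations. Assumes a fixed frame size; `ctx` is a canvas 2d
// context and `socket` a WebSocket, both supplied by the caller.
function startPainting(ctx, socket, width, height) {
  var image = ctx.createImageData(width, height); // allocated once
  socket.binaryType = 'arraybuffer';
  socket.onmessage = function (event) {
    // Copy the raw RGBA bytes into the existing ImageData in place.
    image.data.set(new Uint8Array(event.data));
    ctx.putImageData(image, 0, 0);
  };
}
```

This removes both new calls per frame; the remaining per-frame cost is one byte copy into image.data plus the putImageData blit.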

Shaman answered 21/9, 2016 at 3:18 Comment(0)

No, both your ImageData (image) and your typed array (array) share the exact same underlying buffer. The constructors only create views over that memory; your original buffer is never copied by them.

var ctx = document.createElement('canvas').getContext('2d');
var buffer = ctx.getImageData(0, 0, ctx.canvas.width, ctx.canvas.height).data.buffer;
var array = new Uint8ClampedArray(buffer);
var image = new ImageData(array, ctx.canvas.width, ctx.canvas.height);
console.log(array.buffer === buffer && image.data.buffer === buffer); // true

As for the processing-time issue, the fastest route would be to send the video stream straight to a video element and paint it onto the canvas with drawImage.
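[Editor's note] With a video element in place, the paint loop reduces to one drawImage call per animation frame. A sketch follows; paintLoop is a hypothetical helper, and the raf parameter is only there so the scheduler can be supplied explicitly (requestAnimationFrame in the browser).

```javascript
// Sketch: paint a playing <video> onto the canvas each animation frame.
// `video` is an HTMLVideoElement already receiving the stream; `raf`
// is the frame scheduler, supplied by the caller.
function paintLoop(ctx, video, raf) {
  function frame() {
    ctx.drawImage(video, 0, 0); // no per-frame buffers to allocate
    raf(frame);                 // schedule the next paint
  }
  raf(frame);
}
// In the browser:
// paintLoop(ctx, video, function (cb) { requestAnimationFrame(cb); });
```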

Lula answered 21/9, 2016 at 3:36 Comment(3)
Thanks, your example makes a lot of sense. Unfortunately I can't use a video element, since it has to be realtime and HTML5 video doesn't technically support live streaming. – Shaman
@Shaman Actually it is part of the specs. You should be able to get a MediaStream from a videoElement using the videoElement.captureStream() method, currently prefixed in FF as mozCaptureStream. Then you should be able to send it through WebRTC, socket.io, or WebSocket. Finally, you just have to set the client-side videoElement's srcObject to the MediaStream that was sent. I don't have the server-side knowledge, but here is a client-side demo converting a recorded file to a video stream: jsfiddle.net/usk05sfs – Lula
And if your stream only comes from the server, I think you can use the MediaSource API and send chunks of the file. developer.mozilla.org/en-US/docs/Web/API/MediaSource – Lula
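[Editor's note] The MediaSource route mentioned in the last comment can be outlined roughly as below. This is a sketch under strong assumptions: attachStream is a hypothetical helper, its dependencies are passed in explicitly, and the 'video/webm; codecs="vp8"' MIME type is a placeholder that must match whatever the server actually encodes.

```javascript
// Sketch: feed encoded chunks from a WebSocket into a MediaSource.
// Dependencies are parameters so the wiring is plain to follow; the
// MIME type is an assumption and must match the server's encoding.
function attachStream(video, socket, mediaSource, createObjectURL) {
  video.src = createObjectURL(mediaSource);
  mediaSource.addEventListener('sourceopen', function () {
    var sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
    socket.binaryType = 'arraybuffer';
    socket.onmessage = function (event) {
      sourceBuffer.appendBuffer(event.data); // hand each chunk to the decoder
    };
  });
}
// In the browser:
// attachStream(video, ws, new MediaSource(),
//              function (ms) { return URL.createObjectURL(ms); });
```

Note that appendBuffer can throw if called while the SourceBuffer is still updating; a production version would queue chunks and drain them on the updateend event.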
