HTML5 Video: Streaming Video with Blob URLs
I have an array of Blobs (binary data, really -- I can express it however is most efficient. I'm using Blobs for now but maybe a Uint8Array or something would be better). Each Blob contains 1 second of audio/video data. Every second a new Blob is generated and appended to my array. So the code roughly looks like so:

var arrayOfBlobs = [];
setInterval(function() {
    arrayOfBlobs.push(nextChunk());
}, 1000);

My goal is to stream this audio/video data to an HTML5 `<video>` element. I know that a Blob URL can be generated and played like so:

var src = URL.createObjectURL(arrayOfBlobs[0]);
var video = document.getElementsByTagName("video")[0];
video.src = src;

Of course this only plays the first 1 second of video. I also assume I can trivially concatenate all of the Blobs currently in my array somehow to play more than one second:

// Something like this (untested)
var concatenatedBlob = new Blob(arrayOfBlobs);
var src = ...

However this will still eventually run out of data. As Blobs are immutable, I don't know how to keep appending data as it's received.

I'm certain this should be possible because YouTube and many other video streaming services utilize Blob URLs for video playback. How do they do it?

Argus answered 14/5, 2018 at 15:16 Comment(2)
Might help: #21922290 -- Acreage
@HyyanAboFakher Just skimmed through that link; unfortunately nothing there mentions Blob URLs. It's more about encoding and transport streams than the mechanics of playback. The final answer ultimately says "encode in HLS and use HLS.js" -- but the crux of my question is how does HLS.js work? (In my case I don't have HLS, but I'm similarly getting video in chunks.) -- Argus
Solution

After some significant Googling I managed to find the missing piece to the puzzle: MediaSource

Effectively the process goes like this:

  1. Create a MediaSource
  2. Create an object URL from the MediaSource
  3. Set the video's src to the object URL
  4. On the sourceopen event, create a SourceBuffer
  5. Use SourceBuffer.appendBuffer() to add all of your chunks to the video

This way you can keep adding new bits of video without changing the object URL.

Caveats

  • The SourceBuffer object is very picky about codecs. These have to be declared, and must be exact, or it won't work
  • You can only append one blob of video data to the SourceBuffer at a time, and you can't append a second blob until the first one has finished (asynchronously) processing
  • If you append too much data to the SourceBuffer without calling .remove() then you'll eventually run out of RAM and the video will stop playing. I hit this limit around 1 hour on my laptop
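
The codec caveat can be checked up front with `MediaSource.isTypeSupported()`. A minimal sketch -- `mimeFor` is a hypothetical helper of mine, not part of any API, and the webm/opus/vp8 combination is simply the one used in the example below:

```javascript
// Build the MIME string expected by addSourceBuffer() / isTypeSupported().
// "mimeFor" is a hypothetical helper, not part of any browser API.
function mimeFor(container, codecs) {
    return container + '; codecs="' + codecs.join(",") + '"';
}

// Browser-only usage: bail out early if the codec combination is unsupported.
// if (!("MediaSource" in window) ||
//     !MediaSource.isTypeSupported(mimeFor("video/webm", ["opus", "vp8"]))) {
//     console.error("This browser cannot play the stream");
// }
```

Checking support before calling `addSourceBuffer()` turns a cryptic runtime exception into an explicit error message.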

Example Code

Depending on your setup, some of this may be unnecessary (particularly the part where we build a queue of video data before we have a SourceBuffer then slowly append our queue using updateend). If you are able to wait until the SourceBuffer has been created to start grabbing video data, your code will look much nicer.

<html>
<head>
</head>
<body>
    <video id="video"></video>
    <script>
        // As before, I'm regularly grabbing blobs of video data
        // The implementation of "nextChunk" could be various things:
        //   - reading from a MediaRecorder
        //   - reading from an XMLHttpRequest
        //   - reading from a local webcam
        //   - generating the files on the fly in JavaScript
        //   - etc
        var arrayOfBlobs = [];
        setInterval(function() {
            arrayOfBlobs.push(nextChunk());
            // NEW: Try to flush our queue of video data to the video element
            appendToSourceBuffer();
        }, 1000);

        // 1. Create a `MediaSource`
        var mediaSource = new MediaSource();

        // 2. Create an object URL from the `MediaSource`
        var url = URL.createObjectURL(mediaSource);

        // 3. Set the video's `src` to the object URL
        var video = document.getElementById("video");
        video.src = url;

        // 4. On the `sourceopen` event, create a `SourceBuffer`
        var sourceBuffer = null;
        mediaSource.addEventListener("sourceopen", function()
        {
            // NOTE: Browsers are VERY picky about the codec being EXACTLY
            // right here. Make sure you know which codecs you're using!
            sourceBuffer = mediaSource.addSourceBuffer("video/webm; codecs=\"opus,vp8\"");

            // If we requested any video data prior to setting up the SourceBuffer,
            // we want to make sure we only append one blob at a time
            sourceBuffer.addEventListener("updateend", appendToSourceBuffer);
        });

        // 5. Use `SourceBuffer.appendBuffer()` to add all of your chunks to the video
        function appendToSourceBuffer()
        {
            if (
                mediaSource.readyState === "open" &&
                sourceBuffer &&
                sourceBuffer.updating === false &&
                appendToSourceBuffer.busy !== true &&
                arrayOfBlobs.length > 0
            )
            {
                // appendBuffer() accepts a BufferSource (ArrayBuffer or typed
                // array), not a Blob, so convert the chunk first. The "busy"
                // flag stops a timer tick from appending a second chunk while
                // the asynchronous conversion is still in flight.
                appendToSourceBuffer.busy = true;
                arrayOfBlobs.shift().arrayBuffer().then(function (buffer) {
                    appendToSourceBuffer.busy = false;
                    sourceBuffer.appendBuffer(buffer);
                });
            }

            // Limit the total buffer size to 20 minutes
            // This way we don't run out of RAM
            if (
                sourceBuffer &&
                sourceBuffer.updating === false &&
                video.buffered.length &&
                video.buffered.end(0) - video.buffered.start(0) > 1200
            )
            {
                sourceBuffer.remove(0, video.buffered.end(0) - 1200);
            }
        }
    </script>
</body>
</html>

As an added bonus this automatically gives you DVR functionality for live streams, because you're retaining 20 minutes of video data in your buffer (you can seek by simply using video.currentTime = ...)
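
For example, a "jump back 60 seconds" control only needs the seek target clamped into the buffered window. A sketch -- `clampSeek` is a hypothetical helper, and `video` is the element from the example above:

```javascript
// Clamp a requested seek time into the buffered range [start, end].
// "clampSeek" is a hypothetical helper, not part of any browser API.
function clampSeek(target, start, end) {
    return Math.min(Math.max(target, start), end);
}

// Browser-only usage with the <video> element from the example:
// video.currentTime = clampSeek(
//     video.currentTime - 60,
//     video.buffered.start(0),
//     video.buffered.end(0)
// );
```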

Argus answered 15/5, 2018 at 15:37 Comment(10)
I can't understand the reference link. Can you please elaborate on the code? -- Identification
Well, this is an implementation that plays the recorded chunks of video on the same side. But how do you continuously play a received array of video Blobs in a video tag, from the server to the client side? Please see my question at: #51097270 -- Identification
Hello, I am reading video data from an AWS Kinesis stream. I am able to view the video in blobs with createObjectURL(blob), but I need it to continuously add, like your solution. How do you know your data's codec? On the upload side the codec is "video/x-matroska;codecs=avc1", but that is not supported for a SourceBuffer. Any help would be great -- Washin
@Washin If the video that you have is not intended to be delivered as chunks, it may not work. In my experience the best way to make a video work is to encode the video with ISO-BMFF. If you cannot pre-encode the video data, you can transmux it in JavaScript. This is how HLS.js and others work. Unfortunately how to create a JavaScript transmuxer is beyond the scope of a single SO comment. Another option is to try and convert the video to HLS and then just use HLS.js -- Argus
Hello @Argus, I am getting this error: "Failed to execute 'appendBuffer' on 'SourceBuffer': No function was found that matched the signature provided." Any help would be appreciated. -- Geography
@SachinKammar What device / browser are you using? iOS doesn't support Media Source Extensions, so that's the first thing that comes to mind. -- Argus
@Argus Thanks for the reply. I am able to play it; however, the error is still shown in the console. Now I have another issue: if I have already loaded my page and then start the streaming, it plays, but if streaming has already started and a new client joins, it doesn't play. The error is: "DOMException: Failed to load because no supported source was found." One more thing: do you have a new solution/approach to this, as it's 2020? -- Geography
@SachinKammar Regarding the console error, unfortunately I don't have many other ideas without looking at a site where the issue is occurring. Maybe try making a new SO question about the issue and link a JSFiddle with example code. Regarding a new client joining, "no supported source was found" usually means you have bad video data. Double-check that your video server is keeping the video data in memory instead of flushing it after sending it to one client. As for a new solution / approach, nothing on the HTML API side. Just utilize design patterns to abstract away the ugly parts :) -- Argus
For others who may come across this: your video source (mp4) must be fragmented for it to work with MediaSource. -- Tucson
Somebody should make a valid-codecs lister for MediaSource too. How do you expect people to find video/mp4; codecs="avc1.42E01E, mp4a.40.2"? And it is not even supported by MediaRecorder. I had to discover a codec that is OK for both MediaRecorder and MediaSource, as I am trying videoElement.captureStream() -> MediaRecorder -> delay chunks -> MediaSource SourceBuffer -> delayedVideoElement. MediaRecorder and MediaSource MUST use the same codec pair. -- Shatter
Adding to the previous answer...

Make sure to set sourceBuffer.mode = 'sequence' in the MediaSource sourceopen event handler to ensure the data is appended in the order it is received. The default value is 'segments', which buffers until the next 'expected' timeframe is loaded.

Additionally, make sure that you are not sending any packets with a data.size === 0, and make sure no backlog builds up by clearing the chunk queue on the broadcasting side -- unless you want to record it as an entire video, in which case just make sure the broadcast video is small enough and your internet speed is fast. The smaller the size and the lower the resolution, the more likely you can keep a realtime connection with a client, e.g. a video call.
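
The empty-packet guard can be as small as this sketch -- `enqueueChunk` is a hypothetical helper, assuming chunks expose a Blob-style `size` property:

```javascript
// Only queue chunks that actually contain data; Blobs expose .size in bytes.
// "enqueueChunk" is a hypothetical helper, not part of any API.
function enqueueChunk(queue, chunk) {
    if (chunk && chunk.size > 0) {
        queue.push(chunk);
        return true;
    }
    return false;
}
```

On the broadcasting side, a MediaRecorder's dataavailable event can hand you zero-byte Blobs (for example around stop), which a filter like this drops before they are sent.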

For iOS the broadcast needs to be made from an iOS/macOS application, and be in mp4 format. The video chunk gets saved to the app's cache and then removed once it is sent to the server. A client can connect to the stream using either a web browser or an app on nearly any device.

Fleshly answered 29/6, 2021 at 21:33 Comment(0)
