MediaSource vs MediaStream in JavaScript
My JavaScript application receives a WebM video stream over a WebSocket connection. There is no delay between the remote peer sending video frames and the application receiving them.

I create a MediaSource object in the application, to which I "append video frames", and let a video element show it:

video.src = window.URL.createObjectURL(mediaSource);

This works nicely, but there is some delay (less than a second), which arguably makes this solution suboptimal for video calls.
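For context, here is a minimal sketch of that setup, assuming the server sends self-contained WebM/VP8 segments as binary WebSocket messages (the URL and variable names are illustrative):

// Feed binary WebSocket messages into a MediaSource via a SourceBuffer.
// appendBuffer throws if an append is still pending, so chunks are
// queued and drained on 'updateend'.
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = window.URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
  const queue = [];

  sourceBuffer.addEventListener('updateend', () => {
    if (queue.length > 0 && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  });

  const ws = new WebSocket('wss://example.invalid/stream'); // illustrative URL
  ws.binaryType = 'arraybuffer';
  ws.onmessage = (event) => {
    if (sourceBuffer.updating || queue.length > 0) {
      queue.push(event.data);
    } else {
      sourceBuffer.appendBuffer(event.data);
    }
  };
});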

Evidently, some WebRTC applications use MediaStream instead:

video.srcObject = mediaStream;

...and these show no delay.

I could not determine from the documentation whether browsers handle src and srcObject differently.

Another thing I could not find is whether it is possible to create a MediaStream and append buffers to it, much like with MediaSource. I want to try that just to check whether srcObject avoids the aforementioned delay in my application.

If I use:

video.srcObject = mediaSource;

I get the error:

TypeError: Failed to set the 'srcObject' property on 'HTMLMediaElement': The provided value is not of type 'MediaStream'

Grating answered 14/8, 2018 at 14:11 Comment(4)
If you want low latency, you should use WebRTC. There's a lot more to it than just shuffling data over the wire. Codecs need to be tuned, congestion needs to be accommodated, jitter needs to be corrected, buffering needs to be configured for low latency, etc.Yoo
My (Windows) native application delivers a WebM stream over WebSocket, or I could just deliver a raw VP8 stream. MediaSource avoids the ICE/DTLS/SRTP stack needed by WebRTC... I would have had to implement all of that in my native application. What bothers me is why the same (VP8) video stream is displayed with a slight delay via MediaSource and with no delay via MediaStream. I would like to find out where the difference is. I tried to dig into the Chrome sources with no success...Grating
I told you why, in my comment.Yoo
Try fast-forwarding by pressing the right arrow key multiple times while the stream is ongoing.Rodrigo
You are asking very good questions, and all of us streaming-video developers encounter the same issues and share the same frustration when it comes to plugin-free, near-real-time streaming video in browsers.

Let me address your questions to the best of my knowledge. (I have implemented both WebRTC and Media Source Extensions in recent years for streaming server software.)

  1. " if it is possible to create a MediaStream and append buffers to it, like the MediaSource"

This one is easy - it is NOT possible. The MediaStream API (https://developer.mozilla.org/en-US/docs/Web/API/MediaStream) does not expose access to a MediaStream object's frame buffer; it handles everything internally via WebRTC, getting frames either from getUserMedia (a local webcam) or from an RTCPeerConnection (the network). With a MediaStream object you do not manipulate frames or segments directly (see the sketch below).

And, of course, video.srcObject = mediaSource will not work: video.srcObject must be a MediaStream object created by the WebRTC API, nothing else.
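To make the point concrete, the only standard producers of a MediaStream at the time were local capture and a peer connection; a minimal sketch, where localVideo and remoteVideo are assumed video elements and signaling is omitted:

// Local capture: the browser fills the stream internally.
navigator.mediaDevices.getUserMedia({ video: true })
  .then((stream) => { localVideo.srcObject = stream; })
  .catch((err) => console.error('getUserMedia failed:', err));

// Network: frames arrive over WebRTC, again assembled by the browser.
const pc = new RTCPeerConnection();
pc.ontrack = (event) => {
  remoteVideo.srcObject = event.streams[0];
};
// (offer/answer and ICE signaling omitted)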

  1. "I could not find in the documentation if browsers handle src and srcObject differently"

Hell yes, browsers do treat video.src and video.srcObject very differently; there is no documentation about it, and it doesn't make much sense. Politics play a large role in it.

Notorious examples from the Chrome browser:

a. Media Source Extensions (video.src) supports AAC audio, but WebRTC (video.srcObject) does not, and never will. The reason: Google bought too many audio compression companies, and one codec it backs - Opus - made it into the WebRTC specs. Google is pushing Opus to be the new "royalty-free" audio king, so there is no AAC support in video.srcObject, and the whole hardware world must now implement Opus. Google can add AAC support to Chrome, and is legally allowed to, because it already does so for Media Source Extensions (video.src). But it will never add AAC support to WebRTC.

b. Chrome uses different strategies for H264 video decoding in video.src and video.srcObject. This makes no sense, but it's a fact. For example, on Android only devices with hardware H264 decoding support will play H264 in WebRTC (video.srcObject). Older devices without hardware H264 support will not play H264 video via WebRTC, yet the same devices will play the same H264 video via Media Source Extensions (video.src). So video.src must fall back to a software decoder when no hardware one is available. Why can't the same be done in WebRTC?
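Both divergences can be probed from script. A hedged sketch, assuming Chrome and that RTCRtpReceiver.getCapabilities is available (it shipped after this answer was written):

// MSE path: Chrome accepts AAC and H264 through isTypeSupported.
console.log(MediaSource.isTypeSupported('audio/mp4; codecs="mp4a.40.2"')); // AAC
console.log(MediaSource.isTypeSupported('video/mp4; codecs="avc1.42E01E"')); // H264

// WebRTC path: the receiver capability list shows Opus but no AAC,
// and H264 only where the platform actually provides a decoder.
const mimeTypes = (kind) =>
  RTCRtpReceiver.getCapabilities(kind).codecs.map((c) => c.mimeType);
console.log(mimeTypes('audio')); // e.g. ['audio/opus', ...] - no AAC
console.log(mimeTypes('video')); // 'video/H264' only with decoder support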

Lastly, your VP8 stream will not play on iOS: neither via Media Source Extensions (iOS doesn't support MSE at all, ha ha ha) nor via WebRTC (iOS only supports H264 video in WebRTC, ha ha ha ha). You are asking why Apple does that? ha ha ha ha ha

Rusel answered 15/8, 2018 at 14:5 Comment(6)
In terms of pure decoding + rendering latency, which one do you think will provide lower latency? Assume that low-latency mode is enabled for MSE. Under what scenarios would you use MSE over WebRTC, and the reverse?Steels
WebRTC will provide lower latency. Latency in MSE can, unfortunately, fluctuate.Rusel
Does that mean that by using WebRTC we just have to trust the browser to handle all the work? I know that for MSE we can trick it (i.e. set the duration to infinite or 0) into entering a "live" low-latency mode, which allows us to have zero buffering. Can WebRTC do the same?Steels
Yes, in WebRTC you have to trust the browser to do all the work. On the following webpage you can compare the latency of live streaming from an IP camera using MSE and WebRTC players: umediaserver.net/umediaserver/demos.htmlRusel
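A sketch of the MSE "live mode" trick mentioned in this exchange; note that setting the duration to Infinity is implementation-dependent, and the one-second drift threshold is an arbitrary choice:

// Hint that the stream is live, then keep playback pinned near the
// buffered edge so internal buffering cannot grow the latency.
mediaSource.addEventListener('sourceopen', () => {
  try {
    mediaSource.duration = Infinity; // live hint; may throw in some browsers
  } catch (e) { /* fall back to edge-chasing only */ }
});

setInterval(() => {
  if (video.buffered.length === 0) return;
  const liveEdge = video.buffered.end(video.buffered.length - 1);
  if (liveEdge - video.currentTime > 1.0) {
    video.currentTime = liveEdge - 0.2; // jump back to near-live
  }
}, 1000);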
You should just edit your old answer to include the new information... that would make it easier for people to find.Windblown
This answer is somewhat outdated. Assigning a MediaSource to the srcObject property is now specified by WHATWG. For those who like to publish applications for Chrome only (shame! shame! shame!), the Google/Chromium team is experimenting with a custom media track generation API, but I cannot for the life of me find the link in my browser history, although I am 100% certain I was reading it a week ago.Checkrein
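The experimental API that comment likely refers to is MediaStreamTrackGenerator from Chromium's insertable-streams work; that name is an educated guess, and the sketch below assumes a Chromium build that ships it along with WebCodecs:

// Chromium-only sketch: generate a video track, write decoded frames
// into it, then play it through srcObject like any MediaStream.
const generator = new MediaStreamTrackGenerator({ kind: 'video' });
const writer = generator.writable.getWriter();
video.srcObject = new MediaStream([generator]);

// A WebCodecs decoder supplies the frames (VP8 here, as in the question):
const decoder = new VideoDecoder({
  output: (frame) => writer.write(frame), // the sink consumes the frame
  error: (e) => console.error(e),
});
decoder.configure({ codec: 'vp8' });
// decoder.decode(new EncodedVideoChunk({ type: 'key', timestamp: 0, data: bytes }));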
Quoting my earlier answer: "Lastly, your VP8 stream will not play on iOS: neither via Media Source Extensions (iOS doesn't support MSE at all, ha ha ha) nor via WebRTC (iOS only supports H264 video in WebRTC, ha ha ha ha)."

2020 update - iOS devices now support VP8 via WebRTC, and new iPads have started supporting Media Source Extensions. Way to go, Apple!

Rusel answered 19/4, 2020 at 22:22 Comment(0)
video.srcObject = mediaStream;
video.src = URL.createObjectURL(mediaSource);

Tested in Electron (so it should work in Chrome too).
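A hedged way to combine those two lines, preferring srcObject where the browser accepts a MediaSource for it and falling back to the object-URL form elsewhere:

if ('srcObject' in video) {
  try {
    video.srcObject = mediaSource;
  } catch (err) {
    if (err.name !== 'TypeError') throw err;
    // This browser only takes a MediaStream here; use the URL form.
    video.src = URL.createObjectURL(mediaSource);
  }
} else {
  video.src = URL.createObjectURL(mediaSource);
}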

Duax answered 3/10, 2020 at 18:44 Comment(0)
