Live streaming audio with WebRTC browser => server

I'm trying to send an audio stream from my browser to a server (UDP; I've also tried WebSockets). I'm recording the audio stream with WebRTC, but I'm having trouble transmitting the data from a Node.js client to my server. Any ideas? Is it possible to send an audio stream to the server using WebRTC (OpenWebRTC)?

Medarda answered 16/1, 2018 at 14:39 Comment(0)

To get audio from the browser to the server, you have a few different possibilities.

Web Sockets

Simply send the audio data over a binary web socket to your server. You can use the Web Audio API with a ScriptProcessorNode to capture raw PCM and send it losslessly. Or, you can use the MediaRecorder to record the MediaStream and encode it with a codec like Opus, which you can then stream over the Web Socket.
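As a minimal sketch of the raw-PCM approach (the `wss://example.com/audio` endpoint is a placeholder for whatever your server exposes):

```javascript
// Capture raw PCM with the Web Audio API and stream it over a binary
// WebSocket. The endpoint URL is a placeholder.
const ws = new WebSocket('wss://example.com/audio');

navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(stream);

  // 4096-sample buffers, mono in, mono out. ScriptProcessorNode is
  // deprecated in favor of AudioWorklet, but it's the simplest way to
  // get at the raw sample data.
  const processor = ctx.createScriptProcessor(4096, 1, 1);
  processor.onaudioprocess = (e) => {
    if (ws.readyState === WebSocket.OPEN) {
      // Copy the buffer before sending; the underlying memory may be reused.
      ws.send(e.inputBuffer.getChannelData(0).slice(0).buffer);
    }
  };

  source.connect(processor);
  processor.connect(ctx.destination); // some browsers require a connected output
});
```

The server then receives 32-bit float PCM at `ctx.sampleRate`. The MediaRecorder variant is the same shape, except you send the encoded chunks from its `ondataavailable` event instead of raw samples.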

There is a sample for doing this with video over on Facebook's GitHub repo. Streaming audio only is conceptually the same thing, so you should be able to adapt the example.

HTTP (future)

In the near future, you'll be able to use a WritableStream as the request body with the Fetch API, allowing you to make a normal HTTP PUT with a stream source from a browser. This is essentially the same as what you would do with a Web Socket, just without the Web Socket layer.
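As a hedged sketch of the shape that API takes (support is limited, and the `duplex` option and URL here are assumptions about how streaming request bodies are exposed, not something that will run everywhere today):

```javascript
// Sketch of streaming a request body with the Fetch API. The endpoint
// URL is a placeholder.
const { readable, writable } = new TransformStream();
const writer = writable.getWriter();

fetch('https://example.com/audio', {
  method: 'PUT',
  body: readable,
  duplex: 'half', // implementations that support streaming bodies require this
});

// Elsewhere, write audio chunks into the request as they are produced:
// writer.write(chunkAsUint8Array);
```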

WebRTC (data channel)

With a WebRTC connection and the server as a "peer", you can open a data channel and send that exact same PCM or encoded audio that you would have sent over Web Sockets or HTTP.

There's a ton of complexity added to this with no real benefit. Don't use this method.
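For completeness, a minimal sketch of the browser side (the signaling exchange with the server-as-peer is omitted, and the channel label is arbitrary):

```javascript
// Open a data channel and send the same bytes you would send over a
// WebSocket. The server must itself speak WebRTC to act as the remote
// peer; the offer/answer and ICE candidate exchange is not shown here.
const pc = new RTCPeerConnection();
const channel = pc.createDataChannel('audio');
channel.binaryType = 'arraybuffer';

channel.onopen = () => {
  // channel.send(pcmOrEncodedChunk); // same payloads as the WebSocket approach
};
```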

WebRTC (media streams)

WebRTC calls support direct handling of MediaStreams. You can attach a stream and let the WebRTC stack take care of negotiating a codec, adapting for bandwidth changes, dropping data that doesn't arrive, maintaining synchronization, and negotiating connectivity around restrictive firewall environments. While this makes things easier on the surface, that's a lot of complexity as well. There aren't any packages for Node.js that expose the MediaStreams to you, so you're stuck dealing with other software... none of it as easy to integrate as it could be.
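The browser side of this is short; the hard part lives on the server. A sketch, with the signaling exchange left out:

```javascript
// Attach the microphone to a WebRTC call and let the stack negotiate
// the codec and transport. Signaling (offer/answer, ICE) is omitted.
const pc = new RTCPeerConnection();

navigator.mediaDevices.getUserMedia({ audio: true }).then(async (stream) => {
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  // Send pc.localDescription to your signaling channel, then apply the
  // server's answer with pc.setRemoteDescription(answer).
});
```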

Most folks going this route run GStreamer as an RTP server to handle the media component. I'm not convinced this is the best way, but it's the best way I know of at the moment.

Burdened answered 17/1, 2018 at 5:43 Comment(4)
Is HTTP PUT now able to stream? – Cureton
Also, what are some good places to learn about live streaming architecture / system design? – Cureton
@MuhammadUmer No, HTTP PUT doesn't support a ReadableStream as a body yet, and now it seems there is opposition to it from browser devs. There is concern that proxies and other servers won't be able to handle the chunked transfer encoding. Frankly, I think the concern is far overblown, but the folks who know more about the standards are trying to watch out for everyone and keep things compatible. Maybe in HTTP/2. In the meantime, we're stuck hacking around this problem with Web Sockets. – Burdened
@MuhammadUmer As for your second question on where/how to learn... the only advice I have is to learn by doing. Streaming media can be pretty opaque... there are a lot of legacy standards that dictate how things go. But it's not too bad once you get into it! – Burdened
