How to use GStreamer to directly stream to a web browser?

There are many examples online of using a GStreamer pipeline with "tcpclientsink" or "udpsink" together with a NodeJS server that consumes the pipeline output and delivers it to a web browser.

But I could not find any example or documentation that clearly explains how to use the webrtcbin element with a NodeJS server to send a stream to a web browser. (An alternative to webrtcbin would be fine, too.)

I have the following GStreamer pipeline:

gst-launch-1.0 videotestsrc  \
! queue ! vp8enc ! rtpvp8pay \
! application/x-rtp,media=video,encoding-name=VP8,payload=96 \
! webrtcbin name=sendrecv

Can someone help with consuming this pipeline from a NodeJS-based server so the stream can be displayed in a web browser?

Here is a similar example, but it uses tcpclientsink: https://tewarid.github.io/2011/04/26/stream-live-webm-video-to-browser-using-node.js-and-gstreamer.html

Bowlder answered 17/9, 2020 at 21:44 Comment(0)

Update: In the end, I was able to achieve GStreamer-to-browser streaming using the NodeJS tutorial mentioned in the question. Here is proof-of-concept code that someone can use if needed (or in case the tutorial link disappears from the internet):

var express = require('express')
var http = require('http')
var net = require('net');
var child = require('child_process');
require('log-timestamp');   //adds timestamp in console.log()

var app = express();
app.use(express.static(__dirname + '/'));

var httpServer = http.createServer(app);
const port = 9001;  //change port number if required
var gstMuxer;       //holds the spawned GStreamer child process

//send the html page which holds the video tag
app.get('/', function (req, res) {
    res.sendFile(__dirname + '/index.html');
});

//stop the connection
app.post('/stop', function (req, res) {
    console.log('Connection closed using /stop endpoint.');

    if (gstMuxer != undefined) {
        gstMuxer.kill();    //killing GStreamer Pipeline
        console.log(`After gstkill in connection`);
    }
    gstMuxer = undefined;
    res.end();
});

//send the video stream
app.get('/stream', function (req, res) {

    res.writeHead(200, {
        'Content-Type': 'video/webm',
    });

    var tcpServer = net.createServer(function (socket) {
        socket.on('data', function (data) {
            res.write(data);
        });
        socket.on('close', function (had_error) {
            console.log('Socket closed.');
            res.end();
        });
    });

    tcpServer.maxConnections = 1;

    tcpServer.listen(function () {
        console.log("Connection started.");
        if (gstMuxer == undefined) {
            console.log("inside gstMuxer == undefined");
            var cmd = 'gst-launch-1.0';
            var args = getGstPipelineArguments(this);
            gstMuxer = child.spawn(cmd, args);  //assign to the outer gstMuxer (no 'var', or /stop could never kill it)

            gstMuxer.stderr.on('data', onSpawnError);
            gstMuxer.on('exit', onSpawnExit);

        }
        else {
            console.log("New GST pipeline rejected because gstMuxer != undefined.");
        }
    });
});

httpServer.listen(port);
console.log(`Camera Stream App listening at http://localhost:${port}`)

process.on('uncaughtException', function (err) {
    console.log(err);
});

//functions
function onSpawnError(data) {
    console.log(data.toString());
}

function onSpawnExit(code) {
    if (code != null) {
        console.log('GStreamer error, exit code ' + code);
    }
}

function getGstPipelineArguments(tcpServer) {
    //Replace 'videotestsrc', 'pattern=ball' with your camera source in the GStreamer pipeline arguments below.
    //Note: every pipeline token (element, property, '!') must be passed as its own string, as done below.
    //'vpuenc_h264' is the hardware H.264 encoder on NXP i.MX boards; substitute your platform's encoder if it is not available.
    var args =
        ['videotestsrc', 'pattern=ball',
            '!', 'video/x-raw,width=320,height=240,framerate=100/1',
            '!', 'vpuenc_h264', 'bitrate=2000',
            '!', 'mp4mux', 'fragment-duration=10',
            '!', 'tcpclientsink', 'host=localhost',
            'port=' + tcpServer.address().port];
    return args;
}
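
Note: vpuenc_h264 only exists on NXP i.MX boards; on other machines the pipeline fails with "no element vpuenc_h264" (as noted in the comments). As a hedged sketch for a generic desktop machine, assuming the stock vp8enc and webmmux elements are installed (WebM also matches the video/webm Content-Type that the /stream route sends), the arguments could look like this:

//Hypothetical software-encoder variant of getGstPipelineArguments() for machines without vpuenc_h264.
function getGstPipelineArgumentsSoftware(tcpServer) {
    return ['videotestsrc', 'pattern=ball',
        '!', 'video/x-raw,width=320,height=240,framerate=30/1',
        '!', 'vp8enc', 'deadline=1',              //realtime-friendly VP8 software encoder
        '!', 'webmmux', 'streamable=true',        //WebM container suitable for live HTTP streaming
        '!', 'tcpclientsink', 'host=localhost',
        'port=' + tcpServer.address().port];
}

To try it, call getGstPipelineArgumentsSoftware(this) instead of getGstPipelineArguments(this) inside the /stream handler and keep the Content-Type as video/webm.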

Here is the corresponding HTML (index.html):

<!DOCTYPE html>
<html>

<head>
    <title>GStreamer with NodeJS Demo</title>
    <meta name="viewport" content="width=device-width, initial-scale=0.9">

    <style>
        html,
        body {
            overflow: hidden;
        }
    </style>
    
    <script>
        function buffer() {
            //Start playback as soon as possible to minimize latency at startup 
            var dStream = document.getElementById('vidStream');

            //play() returns a Promise; use .catch() so an autoplay rejection is reported
            dStream.play().catch(function (error) {
                console.log("Error in buffer() method.");
                console.log(error);
            });

        }
    </script>
</head>

<body onload="buffer();">
    <video id="vidStream" width="640" height="480" muted>
        <source src="/stream" type="video/mp4" />
        <source src="/stream" type="video/webm" />
        <source src="/stream" type="video/ogg" />
        <!-- fallback -->
        Your browser does not support the <code>video</code> element.
    </video>
</body>

</html>
Bowlder answered 8/6, 2021 at 13:20 Comment(12)
Doing video over TCP does have some downsides, but glad it worked for you! WebRTC has congestion control + adaptive bitrate so will make sure you don't oversend video if you don't have enough bitrate available. Before GStreamer had webrtcbin I wrote github.com/pion/example-webrtc-applications/tree/master/… which could be helpful.Adulteress
Hi @SeanDuBois, yes we faced some issues with the bitrate initially but we fine-tuned it. And due to varying networkState and readyState, GStreamer does crash sometimes. But we are able to achieve a balance and reload the stream if this happens. On a new browser window (like a new window popup), we have found the stream very stable. It crashes once after 60+ mins of streaming and then we automatically reload the page. So far, it's running in an acceptable state. I will take a look at your implementation also. Thanks.Bowlder
what's the html page that the code refers to? When I get this running I just get a blank page with index.js when I load the site (sorry if this is obvious - I'm a noob with browser stuff). - nvm, it's at localhost:9001/streamHarkness
I've tried a bunch of sample html pages with video tags but none play the stream correctly, do you have an example?Harkness
@PawanPillai Where do you copy/save the html code? or what do you modify so node.js script can use that html.Thomsen
@Thomsen the HTML file needs to be in the same folder as the javascript file above. If your Node JS is working properly and this JS code is executed, then the app.get("/"...) line in javascript will load the index.html when you start the web page. Please check basic nodejs tutorials to get started and it should make sense.Bowlder
I don't think this should be the accepted answer. The question specifically asked to use webrtcbin with a web browser, and this answer uses an alternative to webrtcbin.Intuitive
@Intuitive I know the answer does not point towards webrtcbin but it solves the main requirement of running a GStreamer stream in a browser. I have used this method to run a realtime GStreamer feed from a robot onto a browser with less than 200ms latency. My main purpose in providing this solution was to help someone else who is looking for a similar solution. Hope this helps.Bowlder
@PawanPillai, I edited the question to reflect that.Intuitive
This answer sends the HTTP content type video/webm for MP4 content.Intuitive
@PawanPillai does this technique still work? Do you stream to chrome? I'm unable to run the pipeline as it says no element "vpuenc_h264" Also h264 is not supported on chrome anymore, do you have something newer for it?Boak
@MáriusRak I am not actively maintaining that project now, so do not know the latest changes in Chrome. But till I was maintaining it, I was getting very high performance stream with under 200ms delay.Bowlder

Unfortunately it's not that simple. You have to have some way to interact with the browser in order to exchange the SDP offer/answer, and also to exchange ICE candidates.

You can look at an example here
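
For a rough idea of what that signaling piece involves, here is a minimal NodeJS sketch, assuming the ws npm package; it just relays whatever SDP offer/answer and ICE candidate messages the two sides send (the message format is up to your application and is not defined here):

//Hypothetical signaling relay sketch (not the linked example's actual code).
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8443 });
const peers = new Set();

wss.on('connection', function (socket) {
    peers.add(socket);

    socket.on('message', function (message) {
        //Forward SDP offers/answers and ICE candidates to every other connected peer.
        for (const peer of peers) {
            if (peer !== socket && peer.readyState === WebSocket.OPEN) {
                peer.send(message.toString());
            }
        }
    });

    socket.on('close', function () {
        peers.delete(socket);
    });
});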

Intoxicating answered 18/9, 2020 at 5:39 Comment(4)
Yes, it does not look like an easy setup. I will take a look at the CPP code and let you know if it helps.Bowlder
any pointers on how to use that repo to get an MVP of a simple gstreamer pipeline streaming to a webpage?Harkness
@Harkness you can look at a complete app here: github.com/WebRTSP/ReStreamer (binaries: snapcraft.io/rtsp-to-webrtsp). Also you can find a minimal example at github.com/WebRTSP/Native/tree/master/Apps/BasicServer, but it's a little bit broken right now (I'll fix it soon)Intoxicating
@Harkness I've prepared a minimal working example: github.com/WebRTSP/ClockServer If you have any questions you can create an issue or start a discussion on GitHubIntoxicating

There is a nice integration test for GStreamer (and other applications such as browsers) available here: https://github.com/sipsorcery/webrtc-echoes/tree/master/gstreamer. It works with minimal quirks (at least in Chrome). It gets data from this GStreamer pipeline:

  pipeline =
     gst_parse_launch ("webrtcbin bundle-policy=max-bundle name=sendonly "
       "videotestsrc is-live=true pattern=ball ! videoconvert ! queue ! vp8enc deadline=1 ! rtpvp8pay ! "
       "queue ! " RTP_CAPS_VP8 " ! sendonly. "
       , &error);

and opens a web server from which the browser can obtain the stream. You have to open the index.html manually.
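
On the browser side, such a test page boils down to a plain RTCPeerConnection that sends its SDP offer to the server and attaches the answered video track to a <video> element. A minimal sketch, assuming the server accepts a JSON-encoded offer POSTed to an /offer endpoint (the endpoint name and payload format are assumptions; check the repository's index.html for the exact API):

//Hypothetical receive-only browser client; endpoint and element selection are illustrative.
const pc = new RTCPeerConnection();
pc.addTransceiver('video', { direction: 'recvonly' });

pc.ontrack = (event) => {
    //Attach the incoming stream to the first <video> element on the page.
    document.querySelector('video').srcObject = event.streams[0];
};

async function start() {
    const offer = await pc.createOffer();
    await pc.setLocalDescription(offer);

    const resp = await fetch('/offer', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(pc.localDescription)
    });
    const answer = await resp.json();   //expected shape: { type: 'answer', sdp: '...' }
    await pc.setRemoteDescription(answer);
}

start();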

Marielamariele answered 20/7, 2021 at 14:7 Comment(0)
