How to receive continuous chunks of video as Blob data over a WebSocket and set them on a video tag dynamically

I am trying to build my own broadcasting architecture. In this system I am using WebSockets to transfer the data, since they are suitable for continuous data transfer.


In my system there is a host who initiates a live webcam broadcast. I use MediaStreamRecorder.js, which records a 5-second chunk of video at a time and sends it to the server through the WebSocket as a Blob.

The server simply receives each chunk and forwards it to all clients connected to that session.

When a client connects, it receives the continuous 5-second video chunks as Blob data through the WebSocket.

My main problem is on the client side: how can I set the received Blob data as the HTML video source dynamically every 5 seconds, so that each 5-second chunk of video plays back continuously?

I am using GlassFish 4.0 as the server and JavaScript on the host and client side. Browser: Chrome. Source code:

ServerBroadCast.java

package websocket1;

import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.Collections;
import java.util.HashSet;
import java.util.Iterator;
import java.util.Set;

import javax.websocket.OnClose;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

@ServerEndpoint(value = "/liveStreamMulticast")
public class LiveStreamMultiCast {
    private static final Set<Session> sessions = Collections.synchronizedSet(new HashSet<Session>());

    @OnOpen
    public void whenOpening(Session session) {
        // session.setMaxBinaryMessageBufferSize(1024*512); // 512 KB
        sessions.add(session);
        System.out.println("You are Connected!");
        System.out.println("Total Connection are connected: " + sessions.size());

    }

    @OnMessage
    public void handleVideo(byte[] videoData, Session hostSession) {
        // System.out.println("Inside process video");

        try {
            if (videoData != null) {
                sendVideo(videoData, hostSession);
            }
        } catch (Throwable e) {
            System.out.println("Error sending message " + e.getMessage());
        }
    }


    @OnClose
    public void onClosing(Session session) {
        System.out.println("Goodbye!");
        sessions.remove(session);
    }

    private void sendVideo(byte[] videoData, Session hostSession) throws IOException {

        Iterator<Session> iterator = sessions.iterator();
        Session tempSession = null;

        while (iterator.hasNext()) {
            tempSession = iterator.next();

            // System.out.println("Sever send data to "+ tempSession);
            if (!tempSession.equals(hostSession))
                tempSession.getBasicRemote().sendBinary(ByteBuffer.wrap(videoData));

        }

    }
}

host.html

<html>
<head>
    <title>Demo</title>
    <script type="text/javascript" src="js/required/mediastream.js"></script>
</head>
<body>

<video id="video" autoplay=""></video>

<button id="stopButton" onclick="stop()">Stop</button>
<script type="text/javascript">

var url = "ws://localhost:8080/LiveTraining3Demo/liveStreamMulticast"; // 8080/application_name/value_given_in_annotation

var socket = new WebSocket(url);
    var video = document.querySelector('video');

socket.onopen = function(){

    console.log("Connected to Server!!");

}
socket.onmessage = function(msg){
    console.log("Message come from server");

}
/////////////////////////////////
var wholeVideo =[];
var chunks = [];
var mediaRecorder;
//////////////////////////////////////

  function gotMedia(stream) {
    video.srcObject = stream;
    mediaRecorder = new MediaStreamRecorder(stream);
    console.log("mediaRecorderCalled");
    mediaRecorder.mimeType = 'video/webm';
    mediaRecorder.start(5000);//
    console.log("recorder started");

    mediaRecorder.ondataavailable = (event) => {
        chunks.push(event.data);
        console.log("push B");
        wholeVideo.push(event.data);
        console.log("WholeVideo Size: " + wholeVideo.length);
        // send immediately; setTimeout(sendData(), 5010) would have *invoked*
        // sendData at once and scheduled its return value instead
        sendData();
    };



  }


  function sendData(){ 
    //var byteArray = new Uint8Array(recordedTemp);
    const superBuffer =  new Blob(chunks, {
        type: 'video/webm'
        });

     socket.send(superBuffer);
     console.log("Send Data");
      console.table(superBuffer);
      chunks = [];

  }


  // Legacy fallback (not used by the mediaDevices call below, kept for older browsers)
  navigator.getUserMedia = navigator.getUserMedia ||
                           navigator.webkitGetUserMedia ||
                           navigator.mozGetUserMedia ||
                           navigator.msGetUserMedia;

  navigator.mediaDevices.getUserMedia({video: true , audio: true})
      .then(gotMedia)
      .catch(e => { console.error('getUserMedia() failed: ' + e); });
    </script>

</body>
</html>

client.html

<html>
<head>

<title>Receive Video</title>

</head>
<body>
<video id="video" autoplay controls loop
    style="width: 700px; height: 500px; margin: auto">
    <source src="" type="video/webm">
</video>
<script>
    var url = "ws://localhost:8080/LiveTraining3Demo/liveStreamMulticast"; // 8080/application_name/value_given_in_annotation
    var check = true;
    var socket = new WebSocket(url);
    var videoData = [];
    var superBuffer = null;
    //var videoUrl;

    //socket.binaryType = 'arraybuffer';
    socket.onopen = function() {
        console.log("Connected!!");

    }

    socket.onmessage = function(videoStream) {

        var video = document.querySelector('video');
        var videoUrl = window.URL.createObjectURL(videoStream.data);
        video.src = videoUrl;
        video.load();
        video.onloadeddata = function() {
            URL.revokeObjectURL(video.src);
            video.play();
        }
        //video.srcObject

        //video.play();

        console.table(videoStream);

    }
    socket.onerror = function(err) {
        console.log("Error: " + err);
    }
</script>
</body>
</html>


When I run it, everything else looks fine, but in client.html only the video element is displayed and no video plays.

I have been working on this for a week. Maybe part of my implementation is wrong. I also know about WebRTC and the Mauz WebRTC broadcast example, but I would rather not go through that complexity if there is a simpler way. I would also prefer not to use a Node.js server, since I have to build this web application with Spring. Any idea would be appreciated. Thanks in advance!
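As a side note on the host page: newer Chrome builds ship a native MediaRecorder whose `start(timeslice)` argument already fires `ondataavailable` on a fixed interval, so MediaStreamRecorder.js and any timer bookkeeping are not strictly needed. A minimal sketch, assuming webm support; `startBroadcast` and `pickMimeType` are illustrative names, not part of the code above.

```javascript
// Pure helper: pick the first MIME type the environment reports as supported.
function pickMimeType(candidates, isSupported) {
  for (const type of candidates) {
    if (isSupported(type)) return type;
  }
  return ''; // empty string lets the browser choose its default container
}

// Browser wiring (requires a MediaStream and an open WebSocket):
function startBroadcast(stream, socket) {
  const mimeType = pickMimeType(
    ['video/webm; codecs="vp8,opus"', 'video/webm'],
    t => MediaRecorder.isTypeSupported(t)
  );
  const recorder = new MediaRecorder(stream, mimeType ? { mimeType } : {});
  recorder.ondataavailable = event => {
    if (event.data && event.data.size > 0) {
      socket.send(event.data); // each event.data is already a Blob
    }
  };
  recorder.start(5000); // emit one chunk every 5000 ms
  return recorder;
}
```

With this, `ondataavailable` replaces both `sendData()` and the `chunks` array, since each event delivers exactly one 5-second chunk.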

Onomastics answered 29/6, 2018 at 7:45 Comment(1)
I have implemented this concept in a project published on GitHub: github.com/sushant097/MultiLearningPlatform – Onomastics

On the client side you will receive an ArrayBuffer, so you need to convert each ArrayBuffer into a Blob and accumulate the Blobs in an array.

 let video = document.querySelector('video');
 let blobArray = [];
 socket.binaryType = 'arraybuffer'; // receive binary frames as ArrayBuffer
 socket.onmessage = event => {      // plain WebSocket API (socket.on is Socket.IO)
  // keep every chunk received so far; the MIME type must match what the recorder produces
  blobArray.push(new Blob([new Uint8Array(event.data)], {'type': 'video/webm'}));
  let currentTime = video.currentTime;      // remember the playback position
  let blob = new Blob(blobArray, {'type': 'video/webm'});
  if (video.src) window.URL.revokeObjectURL(video.src); // free the previous object URL
  video.src = window.URL.createObjectURL(blob);
  video.currentTime = currentTime;          // resume where playback left off
  video.play();
 };
Runlet answered 1/7, 2019 at 9:21 Comment(6)
Thank you so much! Don't know why it is so under-rated. – Mal
The main clue is in currentTime and the previously saved chunks. – Coney
This works, but it looks like the screen flashes. – Assailant
Creating a new blob URL each time a new chunk is received causes video interruptions: the video flashes and resets. There must be a way to update the video's data with the newly received chunk without creating a new blob URL. – Decreasing
Did anyone find a solution for this? – Orton
Any update here, for the flickering part? – Progesterone
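The interruption the last few comments describe can be avoided with the Media Source Extensions API: open one MediaSource, append each incoming chunk to a SourceBuffer, and let the element keep playing instead of rebuilding a blob URL per chunk. A sketch, assuming the recorder emits `'video/webm; codecs="vp8,opus"'` (adjust to the real mimeType); `createChunkQueue` and `attachMediaSource` are illustrative names.

```javascript
// Pure helper: queue chunks while the SourceBuffer is busy, flush when idle.
function createChunkQueue(appendFn, isBusyFn) {
  const pending = [];
  function flush() {
    // appendBuffer() throws if called while a previous append is still
    // updating, so only feed chunks while the buffer reports idle
    while (pending.length > 0 && !isBusyFn()) {
      appendFn(pending.shift());
    }
  }
  function push(chunk) {
    pending.push(chunk);
    flush();
  }
  return { push, flush, get size() { return pending.length; } };
}

// Browser wiring (requires DOM; shown for completeness):
function attachMediaSource(video, socket) {
  const mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource);
  mediaSource.addEventListener('sourceopen', () => {
    const sb = mediaSource.addSourceBuffer('video/webm; codecs="vp8,opus"');
    const queue = createChunkQueue(
      buf => sb.appendBuffer(buf),
      () => sb.updating
    );
    sb.addEventListener('updateend', () => queue.flush());
    socket.binaryType = 'arraybuffer';
    socket.onmessage = e => queue.push(e.data);
  });
}
```

Because the element's `src` never changes, playback continues across chunk boundaries and the flicker/reset problem does not occur; the queue exists only to respect the one-append-at-a-time rule of `SourceBuffer`.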
