Is it possible to stream video with SignalR?

Well, I'm trying to build a proof of concept for video streaming, working with ASP.NET and C#. I'm kind of lost. Do you have any ideas or suggestions?

Argue answered 22/10, 2012 at 17:50 Comment(1)
It will not be the best tool for the job. Some people use a screwdriver like a chisel, with a brick instead of a hammer, and also drive screws into the wood with that same brick, and they still end up with a masterpiece. If you can afford them, though, the right tools will make the job a lot easier and cause you fewer headaches along the way. The techniques used in SignalR are not of much use for streaming video. Rather, try porting IceCast to .NET.Allare

No, SignalR is based on standards (WebSockets, Long Polling, Forever Frame, etc.) that only stream text-based JSON messages. You're probably better off looking into the WebRTC specification. That said, you could bring the two technologies together by sending control messages with SignalR that trigger some JavaScript to change the WebRTC feed the browser is currently showing.
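
To illustrate that combined approach, here is a minimal sketch of a hub (written against ASP.NET Core SignalR, with hub and method names that are purely illustrative) that only relays WebRTC signaling data, while the video itself flows peer-to-peer over WebRTC:

using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

// Sketch only: SignalR carries the small control messages (SDP offers/answers,
// ICE candidates); WebRTC carries the actual audio/video between peers.
public class WebRtcSignalingHub : Hub
{
    // Forward an SDP offer or answer to a specific peer.
    public Task SendSdp(string targetConnectionId, string sdp) =>
        Clients.Client(targetConnectionId)
               .SendAsync("ReceiveSdp", Context.ConnectionId, sdp);

    // Forward an ICE candidate to a specific peer.
    public Task SendIceCandidate(string targetConnectionId, string candidateJson) =>
        Clients.Client(targetConnectionId)
               .SendAsync("ReceiveIceCandidate", Context.ConnectionId, candidateJson);
}

The browser-side JavaScript would react to ReceiveSdp and ReceiveIceCandidate by feeding them to its RTCPeerConnection.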

Faradism answered 23/10, 2012 at 1:12 Comment(1)
MessagePack is now supported. You could even inject your custom hub message serializer.Fluoresce
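
As a side note on that comment: in ASP.NET Core SignalR the MessagePack hub protocol is an opt-in package. A minimal sketch of enabling it (the DemoHub type and /demo path are placeholders, not from this thread):

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.SignalR;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// MessagePack is an opt-in hub protocol: add the
// Microsoft.AspNetCore.SignalR.Protocols.MessagePack package on the server
// and the matching protocol package on each client.
builder.Services.AddSignalR()
       .AddMessagePackProtocol();

var app = builder.Build();
app.MapHub<DemoHub>("/demo"); // placeholder hub mapping
app.Run();

// Minimal placeholder hub so the sample is self-contained.
public class DemoHub : Hub { }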

I implemented video streaming on top of SignalR. You can find my example at http://weblogs.asp.net/ricardoperes/archive/2014/04/24/video-streaming-with-asp-net-signalr-and-html5.aspx.

Imphal answered 24/4, 2014 at 16:7 Comment(6)
link is broken :)Appreciative
It’s not broken, it’s apparently a problem with weblogs.asp.net... should be fixed soon, hopefully!Imphal
Still not working. Any alternate links, a repo, or something?Appreciative
See github.com/rjperes/…Imphal
The site is backImphal
Much appreciated.Appreciative

Yes, recent versions of ASP.NET Core SignalR support streaming. Microsoft provides some samples in its streaming documentation.
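
As a rough sketch (the hub, method, and helper names here are illustrative, not taken from Microsoft's samples), a server-to-client stream can be exposed as a hub method returning IAsyncEnumerable<byte[]>:

using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

public class VideoStreamHub : Hub
{
    // Server-to-client stream: yields one encoded frame at a time until the
    // client disconnects or cancels (cancellation simply ends the stream).
    public async IAsyncEnumerable<byte[]> StreamFrames(
        [EnumeratorCancellation] CancellationToken cancellationToken)
    {
        while (!cancellationToken.IsCancellationRequested)
        {
            // GetNextEncodedFrameAsync is a hypothetical placeholder for a real frame source.
            yield return await GetNextEncodedFrameAsync(cancellationToken);
            await Task.Delay(33, cancellationToken); // roughly 30 fps pacing
        }
    }

    private static Task<byte[]> GetNextEncodedFrameAsync(CancellationToken ct) =>
        Task.FromResult(System.Array.Empty<byte>()); // stand-in for a real encoder
}

A client built with Microsoft.AspNetCore.SignalR.Client would consume this with an await foreach loop over connection.StreamAsync<byte[]>("StreamFrames", cancellationToken).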

Scriber answered 20/1, 2022 at 11:55 Comment(0)

I do not know whether SignalR is intended for working with video streams, but SignalR is a hub container for client-to-client, client-to-server, and server-to-client communication. If I want a video chat, why can't I use it as my hub? Anyway, SignalR can handle arrays of bytes too, not only strings, so let's try sending each frame as a byte[]. At least when I use only .NET, I can push byte[] through the hub. When I bring Python in, I need to serialize each frame to a Base64 string (see the sketch after the hub below), and that works from my Pi too. Take a look at the lab solution I pushed to my Git: https://github.com/Guille1878/VideoChat

SignalR Hub (Default, not serverless)

using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

namespace ChatHub
{
    public interface IVideoChatClient
    {
        // Pushes one encoded frame down to the client.
        Task DownloadStream(byte[] stream);
    }

    public class VideoChatHub : Hub<IVideoChatClient>
    {
        // Relays a frame received from one client to all connected clients.
        public async Task UploadStream(byte[] stream)
        {
            await Clients.All.DownloadStream(stream);
        }
    }
}
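
For the Python/Raspberry Pi sender mentioned above, which serializes frames to Base64, here is a hedged sketch of a string-based variant (the VideoChatBase64Hub class and UploadStreamBase64 method names are hypothetical, not taken from the linked repo):

using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

namespace ChatHub
{
    // Sketch of a hub method a non-.NET client could call with a Base64 payload.
    public class VideoChatBase64Hub : Hub
    {
        public async Task UploadStreamBase64(string base64Frame)
        {
            // Decode the Base64 string back to raw JPEG bytes before fanning out.
            byte[] frame = Convert.FromBase64String(base64Frame);
            await Clients.All.SendAsync("DownloadStream", frame);
        }
    }
}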

Video-Sender: (UWP)

while (isStreamingOut)
{
      // Grab the current preview frame from the camera at its native resolution.
      var previewProperties = mediaCapture.VideoDeviceController.GetMediaStreamProperties(MediaStreamType.VideoPreview) as VideoEncodingProperties;

      VideoFrame videoFrame = new VideoFrame(BitmapPixelFormat.Bgra8, (int)previewProperties.Width, (int)previewProperties.Height);

      var frame = await mediaCapture.GetPreviewFrameAsync(videoFrame);

      if (frame == null)
      {
            await Task.Delay(delayMilliSeconds);
            continue;
      }

      // Encode the frame as a JPEG into an in-memory stream.
      var memoryRandomAccessStream = new InMemoryRandomAccessStream();
      var encoder = await BitmapEncoder.CreateAsync(BitmapEncoder.JpegEncoderId, memoryRandomAccessStream);
      encoder.SetSoftwareBitmap(frame.SoftwareBitmap);
      encoder.IsThumbnailGenerated = false;
      await encoder.FlushAsync();

      try
      {
            // Rewind before copying the encoded JPEG out of the stream.
            memoryRandomAccessStream.Seek(0);

            var array = new byte[memoryRandomAccessStream.Size];
            await memoryRandomAccessStream.ReadAsync(array.AsBuffer(), (uint)memoryRandomAccessStream.Size, InputStreamOptions.None);

            // Push the frame to the hub, which fans it out to all receivers.
            if (array.Any())
                  await connection.InvokeAsync("UploadStream", array);
      }
      catch (Exception ex)
      {
            System.Diagnostics.Debug.WriteLine(ex.Message);
      }

      await Task.Delay(5);
}

Video-receiver: (UWP)

private async void StreamVideo_Click(object sender, RoutedEventArgs e)
{
      isStreamingIn = StreamVideo.IsChecked ?? false;
      if (isStreamingIn)
      {
            // Queue every incoming frame on the UI thread; BuildImageFrames drains the queue.
            hubConnection.On<byte[]>("DownloadStream", (stream) =>
            {
                  _ = this.Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
                  {
                        if (isStreamingIn)
                              StreamedArraysQueue.Enqueue(stream);
                  });
            });

            if (hubConnection.State == HubConnectionState.Disconnected)
                  await hubConnection.StartAsync();

            _ = BuildImageFrames();
      }
}

private async Task BuildImageFrames()
{
      while (isStreamingIn)
      {
            await Task.Delay(5);

            StreamedArraysQueue.TryDequeue(out byte[] buffer);

            if (!(buffer?.Any() ?? false))
                  continue;

            try
            {
                  // Decode the received JPEG bytes back into a bitmap and show it.
                  var randomAccessStream = new InMemoryRandomAccessStream();
                  await randomAccessStream.WriteAsync(buffer.AsBuffer());
                  randomAccessStream.Seek(0);
                  await randomAccessStream.FlushAsync();

                  var decoder = await BitmapDecoder.CreateAsync(randomAccessStream);
                  var softwareBitmap = await decoder.GetSoftwareBitmapAsync();
                  var imageSource = await ConvertToSoftwareBitmapSource(softwareBitmap);

                  ImageVideo.Source = imageSource;
            }
            catch (Exception ex)
            {
                  System.Diagnostics.Debug.WriteLine(ex.Message);
            }
      }
}

I am using SignalR Core.

Laurentium answered 31/7, 2020 at 9:52 Comment(2)
Not right now. My code sends images (frames) and text messages, but I need voice too, so I'm going to work on live voice sending later. If you only want to send a short audio clip, as in WhatsApp, it's easy: you can send the audio as a byte[] stream. Live-streaming audio has more challenges, though; I'll take it on in my next lab.Laurentium
If you managed to get that working, please let me know. I really need that piece of code! Thanks in advance.Tankoos
