
Well, I'm trying to do a proof of concept for video streaming; I'm working with ASP.NET and C#. I'm kind of lost. Do you have any ideas or suggestions?

luis_laurent
  • It will not be the best tool for the job. Some people use a screwdriver as a chisel with a brick instead of a hammer, drive screws into the wood with the same brick, and still end up with a masterpiece. If you can afford them, the right tools will make the job a lot easier and cause you fewer headaches along the way. The techniques used in SignalR are not of much use for streaming video; rather, try porting IceCast to .NET. – Louis Somers Oct 19 '20 at 19:58

3 Answers


I implemented video streaming on top of SignalR. You can find my example at http://weblogs.asp.net/ricardoperes/archive/2014/04/24/video-streaming-with-asp-net-signalr-and-html5.aspx.

Ricardo Peres

No. SignalR is built on transports (WebSockets, long polling, forever frame, etc.) that only stream text-based JSON messages. You're probably better off looking into the WebRTC specification. That said, you could bring the two technologies together by sending control messages with SignalR that trigger some JavaScript to change the WebRTC feed the browser is currently showing.
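To make that split concrete, here is a minimal sketch of the browser side, under stated assumptions: the hub broadcasts a control message (called "SwitchFeed" here, an illustrative name, not part of SignalR's API) naming which feed to show, and the WebRTC MediaStreams have already been negotiated separately. The connection is passed in as anything with an `on(method, handler)` method, matching the shape of the `@microsoft/signalr` HubConnection:

```typescript
// SignalR carries only the small control messages; WebRTC carries the media.
// "SwitchFeed", wireFeedSwitching, and the handler names are illustrative.

type ControlConnection = {
  on(method: string, handler: (feedId: string) => void): void;
};

function wireFeedSwitching(
  connection: ControlConnection,
  feeds: Map<string, unknown>,     // feedId -> MediaStream in a real browser
  show: (stream: unknown) => void, // e.g. (s) => { video.srcObject = s; }
): void {
  // When the hub says "SwitchFeed", look up the named WebRTC stream
  // and hand it to the display callback; unknown ids are ignored.
  connection.on("SwitchFeed", (feedId) => {
    const stream = feeds.get(feedId);
    if (stream !== undefined) {
      show(stream);
    }
  });
}
```

In a real page, `show` would assign the stream to a `<video>` element's `srcObject`, and `connection` would be a started `HubConnection`.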

Drew Marsh

I do not know whether SignalR is intended for working with video streams or not, but SignalR is a hub for client-to-client, client-to-server, and server-to-client communication, so if I want a video chat, why can't I use it as my hub? SignalR can handle arrays of bytes too, not only strings, so let's try sending each frame as a byte[]. When I use only .NET, I can send byte[] through the hub directly; when I add a Python client, I need to serialize it to a Base64 string, and it works from my Raspberry Pi too. Take a look at the lab solution I pushed to my Git repository: https://github.com/Guille1878/VideoChat

SignalR Hub (Default, not serverless)

using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

namespace ChatHub
{
    public interface IVideoChatClient
    {
        // Invoked on every client with the bytes of one encoded frame.
        Task DownloadStream(byte[] stream);
    }

    public class VideoChatHub : Hub<IVideoChatClient>
    {
        // Relays each uploaded frame to all connected clients.
        public async Task UploadStream(byte[] stream)
        {
            await Clients.All.DownloadStream(stream);
        }
    }
}

Video sender (UWP):

while (isStreamingOut)
{
    var previewProperties = mediaCapture.VideoDeviceController.GetMediaStreamProperties(MediaStreamType.VideoPreview) as VideoEncodingProperties;

    VideoFrame videoFrame = new VideoFrame(BitmapPixelFormat.Bgra8, (int)previewProperties.Width, (int)previewProperties.Height);

    var frame = await mediaCapture.GetPreviewFrameAsync(videoFrame);

    if (frame == null)
    {
        await Task.Delay(delayMilliSeconds);
        continue;
    }

    // Encode the captured frame as JPEG into an in-memory stream.
    var memoryRandomAccessStream = new InMemoryRandomAccessStream();
    var encoder = await BitmapEncoder.CreateAsync(BitmapEncoder.JpegEncoderId, memoryRandomAccessStream);
    encoder.SetSoftwareBitmap(frame.SoftwareBitmap);
    encoder.IsThumbnailGenerated = false;
    await encoder.FlushAsync();

    try
    {
        // Rewind before reading; after FlushAsync the position is at the end.
        memoryRandomAccessStream.Seek(0);
        var array = new byte[memoryRandomAccessStream.Size];
        await memoryRandomAccessStream.ReadAsync(array.AsBuffer(), (uint)memoryRandomAccessStream.Size, InputStreamOptions.None);

        if (array.Any())
            await connection.InvokeAsync("UploadStream", array);
    }
    catch (Exception ex)
    {
        System.Diagnostics.Debug.WriteLine(ex.Message);
    }

    await Task.Delay(5);
}

Video receiver (UWP):

private async void StreamVideo_Click(object sender, RoutedEventArgs e)
{
    isStreamingIn = StreamVideo.IsChecked ?? false;
    if (isStreamingIn)
    {
        hubConnection.On<byte[]>("DownloadStream", (stream) =>
        {
            // Marshal back to the UI thread before touching the queue.
            _ = this.Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
            {
                if (isStreamingIn)
                    StreamedArraysQueue.Enqueue(stream);
            });
        });

        if (hubConnection.State == HubConnectionState.Disconnected)
            await hubConnection.StartAsync();

        _ = BuildImageFrames();
    }
}

private async Task BuildImageFrames()
{
    while (isStreamingIn)
    {
        await Task.Delay(5);

        StreamedArraysQueue.TryDequeue(out byte[] buffer);

        if (!(buffer?.Any() ?? false))
            continue;

        try
        {
            // Rebuild a bitmap from the received JPEG bytes and display it.
            var randomAccessStream = new InMemoryRandomAccessStream();
            await randomAccessStream.WriteAsync(buffer.AsBuffer());
            randomAccessStream.Seek(0);
            await randomAccessStream.FlushAsync();

            var decoder = await BitmapDecoder.CreateAsync(randomAccessStream);
            var softwareBitmap = await decoder.GetSoftwareBitmapAsync();
            var imageSource = await ConvertToSoftwareBitmapSource(softwareBitmap);

            ImageVideo.Source = imageSource;
        }
        catch (Exception ex)
        {
            System.Diagnostics.Debug.WriteLine(ex.Message);
        }
    }
}

I am using SignalR Core.
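As noted above, a non-.NET client (the Python one on the Raspberry Pi) has to serialize each byte[] frame to a Base64 string before sending it through the hub. A minimal sketch of that framing, shown here in TypeScript; the helper names are illustrative, not part of SignalR:

```typescript
// Base64 framing for clients that must send binary frames as text.
// frameToPayload / payloadToFrame are illustrative helpers, not SignalR API.

function frameToPayload(frame: Uint8Array): string {
  // Encode the raw JPEG bytes so they survive a JSON text message.
  return Buffer.from(frame).toString("base64");
}

function payloadToFrame(payload: string): Uint8Array {
  // Decode the Base64 text back into the original frame bytes.
  return new Uint8Array(Buffer.from(payload, "base64"));
}
```

On the .NET side, `Convert.ToBase64String` and `Convert.FromBase64String` do the same job.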

Wille Esteche
  • Not right now. My code sends images (frames) and text messages, but I need voice too, so I'm going to work on live voice sending later. If you only want to send a short audio clip, like WhatsApp does, it's easy: you can send the audio as a byte[] stream. Live audio streaming has more challenges, though. I'll take it on in my next lab. – Wille Esteche Aug 20 '20 at 06:08
  • If you manage to get that working, please let me know. I really need that piece of code! Thanks in advance. – AmirHossein Parsapour Aug 21 '20 at 08:22