Upload live streaming video from iPhone like Ustream or Qik

How do I live stream video from an iPhone to a server, the way Ustream or Qik do? I know Apple has something called HTTP Live Streaming, but most resources I've found only talk about streaming video from a server to the iPhone.

Is Apple's HTTP Live Streaming something I should use here? Or something else? Thanks.

Spy answered 25/12, 2009 at 8:26 Comment(2)
They're not using HTTP Live Streaming. All of the recently approved apps are actually using a private API for capturing the screen. Almost inexplicably, Apple reversed the policy on this specific set of CoreGraphics calls and allowed these apps in. Expect a true API for this feature in a future iPhone OS release - these apps will be required to use that when it is available. In the meantime, these currently private calls are okay.Bytom
Hi, I found that we might need a media server like Wowza to allow RTSP streaming, though you can also do something similar over HTTP without using RTSP. I'm a bit clueless on this topic right now, so correct me if I'm wrong. I understand that people use a private API to capture the screen, but what does that have to do with streaming the frames to a server? Thanks!Spy

There isn't a built-in way to do this, as far as I know. As you say, HTTP Live Streaming is for downloads to the iPhone.

The way I'm doing it is to set up an AVCaptureSession whose video data output has a delegate callback that runs on every frame. That callback sends each frame over the network to the server, which has a custom setup to receive it.

Here's the flow: https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5-SW2

And here's some code:

// make input device
NSError *deviceError;
AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *inputDevice = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:&deviceError];

// make output device
AVCaptureVideoDataOutput *outputDevice = [[AVCaptureVideoDataOutput alloc] init];
[outputDevice setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

// initialize capture session
AVCaptureSession *captureSession = [[[AVCaptureSession alloc] init] autorelease];
[captureSession addInput:inputDevice];
[captureSession addOutput:outputDevice];

// make preview layer and add so that camera's view is displayed on screen
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.frame = view.bounds;
[view.layer addSublayer:previewLayer];

// go!
[captureSession startRunning];

Then the output device's delegate (here, self) has to implement the callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CGSize imageSize = CVImageBufferGetEncodedSize(imageBuffer);
    // also in the 'mediaSpecific' dict of the sampleBuffer

    NSLog(@"frame captured at %.fx%.f", imageSize.width, imageSize.height);
}
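
As a rough illustration of "sends each frame over the network", one naive approach is to convert each sample buffer to a JPEG and POST it to your server. This is only a sketch under assumptions, not the original code: the upload URL is a placeholder, the CIContext should really be created once and reused, and (as noted in the comments below) per-frame uploads turn out to be slow in practice.

// requires CoreImage, CoreMedia, CoreVideo and UIKit in addition to AVFoundation
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // render the pixel buffer into a JPEG (expensive; reuse the CIContext in real code)
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:[ciImage extent]];
    NSData *jpegData = UIImageJPEGRepresentation([UIImage imageWithCGImage:cgImage], 0.5);
    CGImageRelease(cgImage);

    // POST the frame; http://example.com/upload is a placeholder for your own endpoint
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:
        [NSURL URLWithString:@"http://example.com/upload"]];
    request.HTTPMethod = @"POST";
    request.HTTPBody = jpegData;
    [request setValue:@"image/jpeg" forHTTPHeaderField:@"Content-Type"];

    // fire-and-forget; a real implementation would queue, batch and throttle uploads
    [NSURLConnection sendAsynchronousRequest:request
                                       queue:[NSOperationQueue mainQueue]
                           completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {}];
}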

EDIT/UPDATE

Several people have asked how to do this without sending the frames to the server one by one. The answer is complex...

Basically, in the didOutputSampleBuffer function above, you append the sample buffers to an AVAssetWriter. I actually had three asset writers active at a time -- past, present, and future -- managed on different threads.

The past writer is in the process of closing the movie file and uploading it. The current writer is receiving the sample buffers from the camera. The future writer is in the process of opening a new movie file and preparing it for data. Every 5 seconds, I set past=current; current=future and restart the sequence.

This then uploads video to the server in 5-second chunks. You can stitch the chunks together with ffmpeg if you want, or repackage them into MPEG-2 transport streams for HTTP Live Streaming. The video data itself is already H.264-encoded by the asset writer, so "transcoding" here only rewrites the container format; the video is not re-encoded.
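
For a more concrete picture of the writer rotation, here is a stripped-down sketch, not the original code: it collapses the past/present/future trio into a single writer that is closed and replaced from the capture callback, drops audio and error handling, and the names ChunkedUploader and uploadChunk: are placeholders.

// rough sketch of chunked recording with AVAssetWriter (AVFoundation + CoreMedia)
@interface ChunkedUploader : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, retain) AVAssetWriter *writer;
@property (nonatomic, retain) AVAssetWriterInput *writerInput;
@property (nonatomic, assign) CMTime chunkStart;
@end

@implementation ChunkedUploader

- (void)startNewChunk
{
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:
        [NSString stringWithFormat:@"chunk-%.0f.mp4", [NSDate timeIntervalSinceReferenceDate]]];

    // real code should check the error and -canAddInput:
    NSError *error = nil;
    AVAssetWriter *newWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                         fileType:AVFileTypeMPEG4
                                                            error:&error];

    // hardware H.264 encoding happens here; 640x480 is just an example size
    AVAssetWriterInput *input = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
        outputSettings:@{ AVVideoCodecKey  : AVVideoCodecH264,
                          AVVideoWidthKey  : @640,
                          AVVideoHeightKey : @480 }];
    input.expectsMediaDataInRealTime = YES;
    [newWriter addInput:input];

    self.writer = [newWriter autorelease];
    self.writerInput = input;
}

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    if (self.writer == nil) {
        [self startNewChunk];
        [self.writer startWriting];
        [self.writer startSessionAtSourceTime:pts];
        self.chunkStart = pts;
    }

    if ([self.writerInput isReadyForMoreMediaData]) {
        [self.writerInput appendSampleBuffer:sampleBuffer];
    }

    // close the current chunk every ~5 seconds and hand it off for upload
    if (CMTimeGetSeconds(CMTimeSubtract(pts, self.chunkStart)) >= 5.0) {
        AVAssetWriterInput *finishedInput = [[self.writerInput retain] autorelease];
        AVAssetWriter *finished = [[self.writer retain] autorelease];
        self.writer = nil;
        self.writerInput = nil;

        [finishedInput markAsFinished];
        [finished finishWriting]; // blocks; newer code should use finishWritingWithCompletionHandler:
        [self uploadChunk:finished.outputURL];
    }
}

- (void)uploadChunk:(NSURL *)fileURL
{
    // placeholder: POST the finished chunk to your server (or queue it for upload)
    NSLog(@"finished chunk ready for upload: %@", fileURL);
}

@end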

Debidebilitate answered 8/4, 2011 at 21:39 Comment(15)
I should add that I'm not doing it this way anymore, since frame-by-frame upload turned out to be too slow for me. But if you're looking for a way to edit frames as they come in from the device's camera, this is it.Debidebilitate
Can you please share code, or a hint, for the faster video-upload mechanism you mentioned?Teece
Well, to speed up the data transfer, the video has to be compressed. So, two possibilities: 1) Compress it on the fly, requiring a codec library plus lots of CPU; or 2) Use the iPhone's built-in, hardware-accelerated mp4 compression -- but that only supports streaming to disk. I am streaming to disk, changing target files every few seconds and uploading the finished files. It's very tricky and complex, even without the workarounds for several Apple bugs I found. You can't easily use a single file as a pipe, because the frame index doesn't get written until the file is closed.Debidebilitate
@NoMoreWishes My list of solutions above is stated a different way in this answer.Debidebilitate
How can we implement live broadcasting from an iOS device to a server using the above answer?Therein
Hi, how did you achieve this without frame-by-frame upload?Shower
I understand this is old, but I am stuck on the server side of this very topic. How did you configure your server to handle the stream of image frames?Overwrite
@Overwrite - I uploaded short MP4s instead, and concatenated them with ffmpeg. See the final paragraph of my edit above.Debidebilitate
You said HTTP Live Streaming... can it be done with an RTMP server like Wowza?Ranchman
@Ranchman I think you're talking about downloading (i.e., watching videos)? This question is about getting the videos from an iPhone to a server. What you do with your videos once they're on the server is a separate topic.Debidebilitate
@Debidebilitate I was talking about broadcasting... I was just asking how you send those frames to a media server like Wowza over RTMP?Ranchman
@Ranchman HTTP Live Streaming is for downloads, so it's not a direct comparison. I don't know anything about Wowza or RTMP, sorry.Debidebilitate
@Debidebilitate What I'm trying to ask: you said the "callback sends each frame over the network to the server"... I just want to know how to send the frames to the server.Ranchman
@Ranchman I was using HTTP POST on each frame. It wasn't efficient. As I said above, I later switched to uploading 5-second videos, still by HTTP POST.Debidebilitate
Could somebody please take a look at my question regarding live streaming directly from iPhone to iPhone? Thanks #20895310Alyshaalysia

I have found one library that will help you with this.

HaishinKit Streaming Library

The library above gives you options for streaming via RTMP or HLS.

Just follow the library's setup steps and read all of the instructions carefully. Don't run the bundled example code directly -- it has some errors -- instead, pull the required classes and the pod into your own demo app.

I have just done this; with it you can record the screen, camera, and audio.

Roughhew answered 19/12, 2019 at 6:2 Comment(2)
What kind of latency do you experience with this approach? Say from the instant the image is captured to when it is received by the app and when it is sent to the server?Hannie
I have used this for screen recording and creating an m3u8 file. @HannieRoughhew

I'm not sure you can do that with HTTP Live Streaming. HTTP Live Streaming segments the video into chunks of roughly 10 seconds each and creates a playlist of those segments. So if you want the iPhone to be the streaming server with HTTP Live Streaming, you will have to figure out a way to segment the video file and create the playlist yourself.

How to do it is beyond my knowledge. Sorry.
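
For reference, the playlist mentioned above is just a plain-text .m3u8 file listing the segments. A minimal, illustrative example is shown below; for a live stream you would omit the final #EXT-X-ENDLIST tag and keep appending new segments as they are produced:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
#EXT-X-ENDLIST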

Inquiring answered 21/2, 2011 at 8:31 Comment(1)
Streaming media involves four steps: (1) encoding the data from hardware, (2) transferring the data to a server, (3) transcoding the data to the right downstream format, (4) downloading, decoding and playing the data. The question is about step 2. Your answer concerns step 3.Romney
