How do I stream video from iPhone acting as a server?

I'm working on an app for iOS, where one iPhone has to live stream its camera recordings to another iPhone (to keep things simple, both are in the same Wi-Fi network).
The streaming should work without a physical interconnect (e.g. a server used for routing the stream to clients). In fact, the recording iPhone should be the server which serves the other iPhone (or more other iOS devices in the network) with the live stream.

So, what I need to do is:

  1. Get the live pictures from the camera
  2. Process this data if needed
  3. Send frame by frame to the connected clients (TCP?)
  4. Receive the frames on the client and display them in real time

What I have and what I'm stuck with:

  1. I have already solved problem 1. I use an AVCaptureSession which constantly returns CMSampleBufferRefs (found here); a minimal sketch of my setup follows this list.

  2. I'm not so sure yet what I need to do with the CMSampleBufferRef. I do know how to transform it into a CGImage or a UIImage (thanks to Benjamin Loulier's great blog post), but I have no idea what specifically I need to send and whether I need to encode the frames somehow.
    As mentioned by @jab in the answer linked above (this), it is possible to write those samples to a file with one or more AVAssetWriters. But then he says those 5-second video snippets are to be uploaded to a server which turns them into a streamable movie file (and that movie can then be streamed to an iOS device via HTTP Live Streaming, I suppose).

  3. As I already indicated, my app (i.e. the video capturing "server" device) has one or multiple clients connected to it and needs to send the video frames in real time to them.
    One idea that came to mind is to use a simple TCP connection where the server sends every single frame in a serialised format to the connected clients in a loop. More specifically: once one buffered frame has been successfully sent to the client, the server takes the most recent frame as the next one to send.
    Now: is this the right approach, or is there another protocol that is much better suited for this kind of task?
    Remember: I want to keep it simple (simple for me, i.e. so that I don't need to study too many new programming topics) and fast. I already know a few things about TCP; I wrote servers and clients with it in C at school, so I'd prefer to apply the knowledge I already have to this project...

  4. Last but not least, the receiving client:
    I imagine, if I really do use a TCP connection, that on the client side I receive frame after frame from the server, decode the received bytes into the format used (CMSampleBuffer, CGImage, UIImage) and just display them in a CALayer or UIImageView, right? The movie effect would be achieved simply by keeping that image view constantly updated.
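
For reference, here is roughly what my capture setup looks like: a minimal Swift sketch assuming the standard AVCaptureVideoDataOutput delegate approach (the class name CameraCapture and the onFrame callback are just placeholders I made up for this post).

    import AVFoundation

    final class CameraCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let session = AVCaptureSession()
        private let queue = DispatchQueue(label: "camera.frames")

        // Called once per captured frame; this is where each CMSampleBuffer arrives.
        var onFrame: ((CMSampleBuffer) -> Void)?

        func start() throws {
            session.sessionPreset = .medium            // lower preset = smaller frames to send later

            guard let camera = AVCaptureDevice.default(for: .video) else { return }
            let input = try AVCaptureDeviceInput(device: camera)
            if session.canAddInput(input) { session.addInput(input) }

            let output = AVCaptureVideoDataOutput()
            output.alwaysDiscardsLateVideoFrames = true // drop frames the delegate can't keep up with
            output.setSampleBufferDelegate(self, queue: queue)
            if session.canAddOutput(output) { session.addOutput(output) }

            session.startRunning()
        }

        // AVCaptureVideoDataOutputSampleBufferDelegate
        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            onFrame?(sampleBuffer)
        }
    }

Every call to captureOutput(_:didOutput:from:) hands me one CMSampleBuffer, which is exactly where I'm stuck on step 2.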

Please give me some ideas on how to reach this goal. It is very important because it's part of my school graduation project... Any sample code is also appreciated ;-) Or just refer me to another site, tutorial, Stack Overflow answer, etc.

If you have any questions about this, just leave a comment and I'll update the post.

Commercialize answered 3/1, 2014 at 1:7 Comment(2)
Do you want to stream audio as well? – Samarskite
@Samarskite Streaming audio would be the next step, but not really necessary. Anyway, I'd prefer to have it in the finished project :) Do you have any suggestions for that? – Commercialize
  1. Sounds OK?

  2. Video frames are really big. You're going to have bandwidth problems streaming video from one device to another. You can compress the frames as JPEGs using UIImageJPEGRepresentation on a UIImage, but that's computationally expensive on the "server", and still may not make them small enough to stream well. You can also reduce your frame rate and/or resolution by dropping frames, downsampling the UIImages, and fiddling with the settings in your AVCaptureSession. Alternatively, you can send small (5-second) videos, which are hardware-compressed on the server and much easier to handle bandwidth-wise, but that will of course give you a 5-second lag in your stream. (A compression sketch follows this list.)

  3. If you can require iOS 7, I'd suggest trying MultipeerConnectivity.framework. It's not terribly difficult to set up, and I believe it supports multiple clients. Definitely use UDP rather than TCP if you're going to roll your own networking - this is a textbook application for UDP, and it has lower overhead. (A minimal session sketch follows this list.)

  4. Frame by frame, just turn the JPEGs into UIImages and use a UIImageView (see the display sketch after this list). There's significant computation involved, but I believe you'll still be limited by bandwidth rather than CPU. If you're sending little videos, you can use MPMoviePlayerController. There will probably be small glitches between videos as it "prepares" each one for playback, which will also mean it takes 5.5 seconds or so to play each 5-second video. I wouldn't recommend HTTP Live Streaming unless you can get a real server into the mix somewhere. Or you could use an ffmpeg pipeline (feed videos in, pop individual frames out) if you can/want to compile ffmpeg for iOS.
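
For point 2, here's a rough, untested sketch of what the CMSampleBuffer → UIImage → JPEG step could look like (in Swift; jpegData(compressionQuality:) is the modern spelling of UIImageJPEGRepresentation, and the helper name jpegData(from:) is just something I made up):

    import AVFoundation
    import CoreImage
    import UIKit

    let ciContext = CIContext()   // reuse one context; creating it per frame is expensive

    // Turn one captured CMSampleBuffer into JPEG data small enough to send over the network.
    func jpegData(from sampleBuffer: CMSampleBuffer, quality: CGFloat = 0.5) -> Data? {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)

        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return nil }
        let uiImage = UIImage(cgImage: cgImage)

        // Lower quality = smaller packets = less bandwidth, at the cost of image quality.
        return uiImage.jpegData(compressionQuality: quality)
    }

If bandwidth is still a problem, dropping the quality value and reducing the capture resolution is where I'd start.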
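
For point 3, a rough sketch of the "server" side with MultipeerConnectivity. The class name FrameBroadcaster and the service-type string are made up; the important part is the .unreliable send mode, which gives you UDP-like behaviour (stale frames are simply dropped rather than retransmitted):

    import MultipeerConnectivity

    final class FrameBroadcaster: NSObject, MCSessionDelegate, MCNearbyServiceAdvertiserDelegate {
        private let peerID = MCPeerID(displayName: "camera-server")
        private lazy var session = MCSession(peer: peerID, securityIdentity: nil, encryptionPreference: .none)
        private lazy var advertiser = MCNearbyServiceAdvertiser(peer: peerID, discoveryInfo: nil, serviceType: "cam-stream")

        override init() {
            super.init()
            session.delegate = self
            advertiser.delegate = self
            advertiser.startAdvertisingPeer()   // make the "server" visible to clients on the network
        }

        // Send the most recent JPEG frame to every connected client.
        func broadcast(_ frame: Data) {
            guard !session.connectedPeers.isEmpty else { return }
            try? session.send(frame, toPeers: session.connectedPeers, with: .unreliable)
        }

        // Accept every invitation (fine for a prototype on a trusted Wi-Fi network).
        func advertiser(_ advertiser: MCNearbyServiceAdvertiser,
                        didReceiveInvitationFromPeer peerID: MCPeerID,
                        withContext context: Data?,
                        invitationHandler: @escaping (Bool, MCSession?) -> Void) {
            invitationHandler(true, session)
        }

        // Remaining MCSessionDelegate requirements, unused on the sending side.
        func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
        func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {}
        func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
        func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
        func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, with error: Error?) {}
    }

Call broadcast(_:) with the JPEG data produced from each captured frame; clients discover and join the session with MCNearbyServiceBrowser or MCBrowserViewController, which I've left out to keep the sketch short.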
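
And for point 4, the receiving side can be as simple as decoding each received Data packet into a UIImage and pushing it into a UIImageView on the main thread. Again an untested sketch with made-up names (the empty delegate methods are only there so it compiles; joining the session via a browser is omitted):

    import MultipeerConnectivity
    import UIKit

    final class FrameViewerController: UIViewController, MCSessionDelegate {
        let imageView = UIImageView()

        override func viewDidLoad() {
            super.viewDidLoad()
            imageView.frame = view.bounds
            imageView.contentMode = .scaleAspectFit
            view.addSubview(imageView)
        }

        // Each received Data blob is assumed to be one JPEG frame from the server.
        func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {
            guard let frame = UIImage(data: data) else { return }
            // The delegate is called on a background queue; UIKit must be updated on the main thread.
            DispatchQueue.main.async {
                self.imageView.image = frame   // the "movie" is just this image view being replaced constantly
            }
        }

        // Remaining MCSessionDelegate requirements, unused on the receiving side.
        func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
        func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
        func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
        func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, with error: Error?) {}
    }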

Let me know if you need clarification on any of these points. It's a lot of work but relatively straightforward.

Vassalize answered 3/1, 2014 at 16:22 Comment(7)
Thanks for your answer. It gave me some good new pieces of advice. Regarding point 2: a lag of any noticeable duration in the stream is not an option for me :( I'll see what I can do with compression etc. – Commercialize
Point 3: MultipeerConnectivity.framework is definitely worth a look, since it also supports sending files and it's a high-level implementation (unlike dealing with CFSockets directly). I've read a bit about UDP just now and I think I'll use it, because it's not too hard to switch over from TCP :) – Commercialize
Here's a good tutorial including MultipeerConnectivity. It discusses it in the context of Bluetooth, but the framework also uses Wi-Fi when available. – Vassalize
One more question: do I need to work with UIImages (they first have to be created from a CMSampleBufferRef, which again costs CPU)? Can't I send a CGImage or even the CMSampleBuffer itself? – Commercialize
You could, but the sample buffer is by definition uncompressed. I think you should first try trading CPU for bandwidth, but that's just a hypothesis to test. Streaming to more clients will cost more bandwidth but almost no extra CPU. – Vassalize
Okay. And what if I changed the "definition" of my project to "single client", so there's no need for multiple connections? Should I still use MultipeerConnectivity? And would that change any of the other things radically? – Commercialize
I'd still say MultipeerConnectivity is the way to go with a single client. You'll probably still be bandwidth-limited, but it shouldn't be crippling. Note that network latency is variable, so your playback won't be perfect. (Also, don't forget that your packets may be received on a background thread, but you have to update the UI on the main thread. I frequently forget that.) – Vassalize

If you need an off-the-shelf solution, you could try one of the ready-made streaming libraries. The one I have experience with is the angl streaming lib. It works pretty well with RTMP output to a Wowza media server.

Willis answered 15/1, 2014 at 17:41 Comment(0)
