I'm working on an iOS app in which one iPhone has to live-stream its camera feed to another iPhone (to keep things simple, both are on the same Wi-Fi network).
The streaming should work without any intermediary (e.g. a server that routes the stream to clients). In fact, the recording iPhone should itself be the server, serving the other iPhone (or further iOS devices in the network) with the live stream.
So, what I need to do is:
1. Get the live pictures from the camera
2. Process this data if needed
3. Send the frames one by one to the connected clients (TCP?)
4. Receive the frames on the client and display them in real time
What I have and what I'm stuck with:
I have already solved problem 1: I use an `AVCaptureSession` which constantly returns `CMSampleBufferRef`s (found here). I'm not so sure yet what I need to do with the `CMSampleBufferRef`. I do know how to transform it into a `CGImage` or a `UIImage` (thanks to Benjamin Loulier's great blog post), but I have no idea what exactly I need to send, and whether I need to encode the frames somehow.
As mentioned by @jab in the answer linked above (this), it is possible to write those samples to a file with one or more `AVAssetWriter`s. But then again, he says those 5-second video snippets are to be uploaded to a server, which makes a streamable movie file out of them (and that movie can then be streamed to an iOS device via `HTTP Live Streaming`, I suppose). As I already indicated, my app (i.e. the video-capturing "server" device) has one or more clients connected to it and needs to send the video frames to them in real time.
One idea that came to my mind is to use a simple `TCP` connection, where the server sends every single frame in a serialised format to the connected clients in a loop. More specifically: when one buffered frame has been sent to a client successfully, the server takes the most recent frame as the next one to be sent.
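To make that idea concrete, here is a minimal Swift sketch of the sending side as I picture it (`FrameSender` is a made-up name, and the JPEG serialisation plus the 4-byte length prefix are just my own assumptions about how the frames could be framed on the wire):

```swift
import UIKit

/// Hypothetical sender: keeps only the most recent frame and writes it
/// to one client over an already-open TCP `OutputStream`.
final class FrameSender {
    private let connection: OutputStream   // opened TCP stream to one client
    private let queue = DispatchQueue(label: "frame.sender")
    private var latestFrame: UIImage?      // overwritten by the capture callback

    init(connection: OutputStream) {
        self.connection = connection
    }

    // The capture side hands over each new frame; unsent older frames are dropped.
    func enqueue(_ frame: UIImage) {
        queue.async { self.latestFrame = frame }
    }

    // Sends the most recent frame: 4-byte big-endian length header, then JPEG bytes.
    func sendLatest() {
        queue.async {
            guard let frame = self.latestFrame,
                  let jpeg = frame.jpegData(compressionQuality: 0.5) else { return }
            var length = UInt32(jpeg.count).bigEndian
            let header = Data(bytes: &length, count: MemoryLayout<UInt32>.size)
            if self.writeAll(header) { _ = self.writeAll(jpeg) }
        }
    }

    // Loops until every byte of `data` is written (TCP writes can be partial).
    private func writeAll(_ data: Data) -> Bool {
        var offset = 0
        while offset < data.count {
            let written = data.withUnsafeBytes { raw -> Int in
                let bytes = raw.bindMemory(to: UInt8.self)
                return connection.write(bytes.baseAddress! + offset,
                                        maxLength: data.count - offset)
            }
            if written <= 0 { return false }   // stream closed or errored
            offset += written
        }
        return true
    }
}
```

The point of keeping only `latestFrame` is that a slow client always receives the newest available frame instead of falling further and further behind.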
Now: is this the right way to go about it? Or is there another protocol that is much better suited to this kind of task?
Remember: I want to keep it simple (simple for me, i.e. so that I don't need to study too many new programming topics) and fast. I already know some things about TCP, and I wrote servers and clients with it in `C` at school, so I'd prefer to apply the knowledge I already have to this project... Last but not least, the receiving client:
I imagine, if I'm really going to use a `TCP` connection, that on the client side I receive frame after frame from the server, cast the received byte package into the format used (`CMSampleBuffer`, `CGImage`, `UIImage`) and just display it on a `CALayer` or a `UIImageView`, right? The effect of a movie would be achieved by simply keeping that image view constantly updated.
Please give me some ideas on how to reach this goal. It is very important, because it's part of my school graduation project... Any sample code is also appreciated ;-) Or just refer me to another site, tutorial, Stack Overflow answer, etc.
If you have any questions about this, just leave a comment and I'll update the post.