How to stream camera video from one iOS device to another using Multipeer Connectivity
How can we efficiently transfer a camera feed from one iOS device to another over Bluetooth or Wi-Fi in iOS 7? Below is the code for getting the stream buffer.

- (void)captureOutput:(AVCaptureOutput *)captureOutput
         didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
         fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];


}

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
      bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}

Here we get the image currently being captured by the iOS camera.

Can we send the sample buffer info directly to another device using Multipeer Connectivity, or is there a more efficient way to stream the data to other iOS devices?

Thank you.

Spicule answered 12/9, 2014 at 11:46 Comment(3)
Multipeer Connectivity sounds like a valid option. But you'll need to check the performance. Sending uncompressed images will likely require too much bandwidth, so you'll probably have to create a real video stream in order to be able to transfer live capture. – Manisa
Edits must be 6 characters, so unless we come up with filler, this post will eternally steam the feed – Radicle
Good question Sandipbhai, upvoted. – Tremaine
I found a way to do it: we can use Multipeer Connectivity to stream compressed images, so that it looks like a live camera stream.

The peer sending the stream uses this code in the captureOutput: delegate method:

NSData *imageData = UIImageJPEGRepresentation(cgBackedImage, 0.2);

// maybe not always the correct input? just using this to send current FPS...
AVCaptureInputPort *inputPort = connection.inputPorts[0];
AVCaptureDeviceInput *deviceInput = (AVCaptureDeviceInput *)inputPort.input;
CMTime frameDuration = deviceInput.device.activeVideoMaxFrameDuration;

// Timestamp of this frame, taken from the sample buffer
NSNumber *timestamp = @(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)));

NSDictionary *dict = @{
                       @"image": imageData,
                       @"timestamp": timestamp,
                       @"framesPerSecond": @(frameDuration.timescale)
                       };
NSData *data = [NSKeyedArchiver archivedDataWithRootObject:dict];

[_session sendData:data toPeers:_session.connectedPeers withMode:MCSessionSendDataReliable error:nil];

And at the receiving side :

- (void)session:(MCSession *)session didReceiveData:(NSData *)data fromPeer:(MCPeerID *)peerID {

//    NSLog(@"(%@) Read %d bytes", peerID.displayName, data.length);

    NSDictionary* dict = (NSDictionary*) [NSKeyedUnarchiver unarchiveObjectWithData:data];
    UIImage* image = [UIImage imageWithData:dict[@"image"] scale:2.0];
    NSNumber* framesPerSecond = dict[@"framesPerSecond"];


}

We also receive the FPS value, so we can set parameters accordingly to pace the streamed images.

Hope it helps.

Thank you.

Spicule answered 15/9, 2014 at 12:58 Comment(1)
Sandip, did you try streaming audio files between two iPhone devices using Multipeer Connectivity? – Lifeline
Here's the best way to do it (and I explain why at the end):

On the iOS device sending the image data:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    CVPixelBufferLockBaseAddress(imageBuffer,0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);


    UIImage *image = [[UIImage alloc] initWithCGImage:newImage scale:1 orientation:UIImageOrientationUp];
    CGImageRelease(newImage);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    if (image) {
        NSData *data = UIImageJPEGRepresentation(image, 0.7);
        NSError *err;
        [((ViewController *)self.parentViewController).session sendData:data toPeers:((ViewController *)self.parentViewController).session.connectedPeers withMode:MCSessionSendDataReliable error:&err];
    }
}

On the iOS device receiving the image data:

typedef struct {
    size_t length;
    void *data;
} ImageCacheDataStruct;

- (void)session:(nonnull MCSession *)session didReceiveData:(nonnull NSData *)data fromPeer:(nonnull MCPeerID *)peerID
{
    dispatch_async(self.imageCacheDataQueue, ^{
        dispatch_semaphore_wait(self.semaphore, DISPATCH_TIME_FOREVER);

        // Copy the bytes out of the NSData: the pointer returned by -bytes is
        // only valid while the NSData object itself is alive.
        size_t dataLength = [data length];
        void *dataCopy = malloc(dataLength);
        memcpy(dataCopy, [data bytes], dataLength);

        ImageCacheDataStruct *imageCacheDataStruct = calloc(1, sizeof(ImageCacheDataStruct));
        imageCacheDataStruct->data = dataCopy;
        imageCacheDataStruct->length = dataLength;

        static const void *kMyKey = &kMyKey;
        dispatch_queue_set_specific(self.imageDisplayQueue, kMyKey, imageCacheDataStruct, NULL);

        dispatch_sync(self.imageDisplayQueue, ^{
            ImageCacheDataStruct *cached = dispatch_queue_get_specific(self.imageDisplayQueue, kMyKey);
            NSData *imageData = [NSData dataWithBytes:cached->data length:cached->length];
            free(cached->data);
            free(cached);

            UIImage *image = [UIImage imageWithData:imageData];
            if (image) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [((ViewerViewController *)self.childViewControllers.lastObject).view.layer setContents:(__bridge id)image.CGImage];
                    dispatch_semaphore_signal(self.semaphore);
                });
            } else {
                // Still release the semaphore so one bad frame can't stall the stream
                dispatch_semaphore_signal(self.semaphore);
            }
        });
    });
}

The reason for the semaphore and the separate GCD queues is simple: you want the frames to display at equal time intervals. Otherwise the video seems to slow down at times, then speed up past normal to catch up. This scheme ensures that each frame plays one after another at the same pace, regardless of network bandwidth bottlenecks.

Kovach answered 21/8, 2017 at 17:58 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.