Convert incoming NSStream to View
I'm successfully sending a stream of NSData. The delegate method below receives that stream and appends the bytes to an NSMutableData, self.data. How do I take this data and turn it into a UIView/AVCaptureVideoPreviewLayer (something that shows video)? I feel like I'm missing another conversion: AVCaptureSession > NSStream > MCSession > NSStream > ?

- (void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)eventCode {
    switch (eventCode) {
        case NSStreamEventHasBytesAvailable:
        {
            if (!self.data) {
                self.data = [NSMutableData data];
            }
            uint8_t buf[1024];
            // -read:maxLength: returns NSInteger; it is negative on error
            NSInteger len = [(NSInputStream *)stream read:buf maxLength:sizeof(buf)];
            if (len > 0) {
                [self.data appendBytes:(const void *)buf length:len];
            } else {
                NSLog(@"no buffer!");
            }

            // Code here to take self.data and convert the NSData to UIView/Video
            break;
        }
        default:
            break;
    }
}
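
For reference, one possible direction for the placeholder comment above: rebuild a CGImage from the accumulated bytes and show it in a plain UIImageView. This is only a sketch, and it assumes the frames are raw 32BGRA pixels (the sender code below never actually sets a pixel format, so you would have to configure that). frameWidth, frameHeight, frameBytesPerRow and previewImageView are hypothetical names; the raw stream carries no dimensions or frame boundaries, so they would have to be agreed out of band.

// Sketch only: runs once self.data holds exactly one complete frame.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate((void *)self.data.bytes,
                                             frameWidth, frameHeight,
                                             8, frameBytesPerRow, colorSpace,
                                             (CGBitmapInfo)(kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst));
CGImageRef cgImage = CGBitmapContextCreateImage(context);
UIImage *frame = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);

dispatch_async(dispatch_get_main_queue(), ^{
    self.previewImageView.image = frame;   // an ordinary UIImageView in your view hierarchy
});
[self.data setLength:0];                   // start accumulating the next frame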

I send the stream with this:

-(void) captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection
{

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer,0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
//    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);

    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);


    NSError *error;
    self.oStream = [self.mySession startStreamWithName:@"videoOut" toPeer:[[self.mySession connectedPeers]objectAtIndex:0] error:&error];
    self.oStream.delegate = self;
    [self.oStream scheduleInRunLoop:[NSRunLoop mainRunLoop]
                            forMode:NSDefaultRunLoopMode];
    [self.oStream open];

    [self.oStream write:[data bytes] maxLength:[data length]];

    CGSize imageSize = CVImageBufferGetEncodedSize(imageBuffer);
    // also in the 'mediaSpecific' dict of the sampleBuffer

    NSLog( @"frame captured at %.fx%.f", imageSize.width, imageSize.height );
}
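
One small note on the sending code: -[NSOutputStream write:maxLength:] returns the number of bytes it actually accepted, which can be less than the length passed in, so a single call may silently drop part of a frame. A defensive loop (just a sketch) would look like this:

// Sketch: keep writing until the whole NSData has been handed to the stream.
NSInteger totalWritten = 0;
const uint8_t *bytes = [data bytes];
while (totalWritten < (NSInteger)[data length]) {
    NSInteger written = [self.oStream write:bytes + totalWritten
                                  maxLength:[data length] - totalWritten];
    if (written <= 0) {
        NSLog(@"stream write failed: %@", self.oStream.streamError);
        break;
    }
    totalWritten += written;
}
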
Flu asked 13/3, 2014 at 20:54 · Comments (8)
You might want to see if you can use OpenGL: take your data, convert it into GL textures, then use GL to show it. There's probably a higher-level API for this. The data isn't in any standard format? – Benefice
What's the video format? A UIView? What's the link with a video? – Publea
The video format is AVCaptureSession. – Flu
Do you have to lock/unlock the pixels each frame? I wonder if that will be costly time-wise. – Benefice
I was unaware that's what I was doing. Where do you see that in the code? – Flu
If your idea is to stream video, this is probably a very naive approach. Understand that CMSampleBufferGetImageBuffer() returns the raw data that makes up the image (4 bytes per pixel, RGBA). So if you stream this, you are still missing the timing and description info (and audio) that make up a video. Also, keep in mind that you will need at a minimum 20 frames (images) per second to make it look smooth. If you intend to follow this path, your options are either to create a new file at the receiver using AVAssetWriter (a rough sketch of that follows these comments), or render each image using OpenGL. – Hormone
The problem with creating a file is writing the whole file before displaying it. I want to stream the video over MCSession. I can't believe I need a custom API for this. AVCaptureSession > NSStream > MCSession > NSStream > ? I can't get the stream back to AV. – Flu
Again, you're only passing raw image data. Here is a similar question: #20150837. It contains a link to a GitHub repo where the author has started on a similar project. Taking a quick look at the code, it doesn't appear to render video, but rather to display images. It's not a complex task to add in the missing OpenGL render code. – Hormone
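
Following up on the AVAssetWriter route mentioned in the comment above, here is a heavily simplified sketch of what the receiving side could look like. Everything not shown in the question is an assumption: outputURL, frameWidth, frameHeight, frameBytesPerRow and presentationTime are hypothetical values you would have to supply yourself, because the raw byte stream carries no dimensions or timing.

// Sketch: one-time writer setup for raw 32BGRA frames (all sizes/URLs assumed).
NSError *error = nil;
AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:outputURL
                                                  fileType:AVFileTypeQuickTimeMovie
                                                     error:&error];
AVAssetWriterInput *input =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:@{ AVVideoCodecKey  : AVVideoCodecH264,
                                                         AVVideoWidthKey  : @(frameWidth),
                                                         AVVideoHeightKey : @(frameHeight) }];
input.expectsMediaDataInRealTime = YES;
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
        sourcePixelBufferAttributes:@{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) }];
[writer addInput:input];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];

// Per received frame: wrap the raw bytes in a CVPixelBuffer (no copy) and append it.
CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferCreateWithBytes(kCFAllocatorDefault, frameWidth, frameHeight,
                             kCVPixelFormatType_32BGRA,
                             (void *)self.data.bytes, frameBytesPerRow,
                             NULL, NULL, NULL, &pixelBuffer);
if (pixelBuffer && input.readyForMoreMediaData) {
    // presentationTime has to be reconstructed by you; it is not in the stream.
    [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime];
}
if (pixelBuffer) CVPixelBufferRelease(pixelBuffer);
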
I think you need AVCamCaptureManager (from Apple's AVCam sample code); see if the code below works for you.

AVCamCaptureManager *manager = [[AVCamCaptureManager alloc] init];
[self setCaptureManager:manager];

[[self captureManager] setDelegate:self];

if ([[self captureManager] setupSession]) {
     // Create video preview layer and add it to the UI
    AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[[self captureManager] session]];
    UIView *view = self.videoPreviewView; // a view in your XIB where you want to show the video
    CALayer *viewLayer = [view layer];
    [viewLayer setMasksToBounds:YES];
    CGRect bounds = [view bounds];

    [newCaptureVideoPreviewLayer setFrame:bounds];

    [newCaptureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    [viewLayer insertSublayer:newCaptureVideoPreviewLayer below:[[viewLayer sublayers] objectAtIndex:0]];

    [self setCaptureVideoPreviewLayer:newCaptureVideoPreviewLayer];

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [[[self captureManager] session] startRunning];
    });
}
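
If you also need the per-frame callback the question already uses (captureOutput:didOutputSampleBuffer:fromConnection:), one way is to attach an AVCaptureVideoDataOutput to the same session. This is only a sketch and assumes the manager exposes its AVCaptureSession through -session, as in the code above:

// Sketch: add a video data output so the delegate gets one sample buffer per frame.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
videoOutput.alwaysDiscardsLateVideoFrames = YES;
[videoOutput setSampleBufferDelegate:self
                               queue:dispatch_queue_create("video.frames", DISPATCH_QUEUE_SERIAL)];
AVCaptureSession *session = [[self captureManager] session];
if ([session canAddOutput:videoOutput]) {
    [session addOutput:videoOutput];
}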

Implement the delegate methods:

- (void)captureManager:(AVCamCaptureManager *)captureManager didFailWithError:(NSError *)error
{

}

- (void)captureManagerRecordingBegan:(AVCamCaptureManager *)captureManager
{

}

- (void)captureManagerRecordingFinished:(AVCamCaptureManager *)captureManager outputURL:(NSURL *)url
{



}

- (void)captureManagerStillImageCaptured:(AVCamCaptureManager *)captureManager
{



}

- (void)captureManagerDeviceConfigurationChanged:(AVCamCaptureManager *)captureManager
{

}

I hope it helps.

Tieshatieup answered 21/3, 2014 at 20:11 · Comments (2)
None of the captureManager's delegate methods have a way to handle video. Am I missing something? – Flu
@Flu See this if it helps: developer.apple.com/library/ios/samplecode/AVCam/Introduction/… – Tieshatieup
You can make a UIImageView in your stream handleEvent: method like this:

UIImageView *iv = [[UIImageView alloc] initWithImage:[UIImage imageWithData:self.data]];

You can also allocate it just once and then only give it the new image each time.

Each time you receive data from the socket, update the UIImageView with it and show it by adding that UIImageView to a UIView.
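
For example (a sketch; previewImageView is a hypothetical UIImageView created once and already added to your view hierarchy). Note that +[UIImage imageWithData:] only works if the received bytes are an encoded image such as JPEG or PNG, not the raw pixel data the question is currently streaming:

// Sketch: reuse one image view and just swap its image per received frame.
UIImage *frame = [UIImage imageWithData:self.data];
if (frame) {
    dispatch_async(dispatch_get_main_queue(), ^{
        self.previewImageView.image = frame;
    });
}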

Sorry for my English; I'm not sure I've understood you correctly.

Catchpenny answered 21/3, 2014 at 11:51 · Comments (0)
