How to make a movie from a set of images using UIGetScreenImage
I have used this method to capture multiple images. I am able to successfully create a movie, but my problem is that when I play the movie it seems to play too fast, i.e. the movie does not contain all the frames. Here is my code.

-(UIImage *)uiImageScreen
{
    CGImageRef screen = UIGetScreenImage();
    UIImage *image = [UIImage imageWithCGImage:screen];
    CGImageRelease(screen);
    // Note: writing every frame to the photo album is slow and will
    // limit the capture rate.
    UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);
    return image;
}

// Release callback so the CFData backing each pixel buffer is freed
// once the writer is done with the buffer (the original code leaked it).
static void ReleaseImageData(void *releaseRefCon, const void *baseAddress)
{
    CFRelease((CFDataRef)releaseRefCon);
}

-(void)writeSample:(NSTimer *)_timer
{
    if (assetWriterInput.readyForMoreMediaData) {
        CVReturn cvErr = kCVReturnSuccess;

        // get screenshot image!
        CGImageRef image = (CGImageRef)[[self uiImageScreen] CGImage];
        NSLog(@"made screenshot");

        // prepare the pixel buffer
        CVPixelBufferRef pixelBuffer = NULL;
        CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image));
        NSLog(@"copied image data");
        cvErr = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                             FRAME_WIDTH,
                                             FRAME_HEIGHT,
                                             kCVPixelFormatType_32BGRA,
                                             (void *)CFDataGetBytePtr(imageData),
                                             CGImageGetBytesPerRow(image),
                                             ReleaseImageData,
                                             (void *)imageData,
                                             NULL,
                                             &pixelBuffer);
        NSLog(@"CVPixelBufferCreateWithBytes returned %d", cvErr);

        // calculate the time
        CFAbsoluteTime thisFrameWallClockTime = CFAbsoluteTimeGetCurrent();
        CFTimeInterval elapsedTime = thisFrameWallClockTime - firstFrameWallClockTime;
        NSLog(@"elapsedTime: %f", elapsedTime);
        CMTime presentationTime = CMTimeMake(elapsedTime * 600, 600);

        // write the sample
        BOOL appended = [assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer
                                                    withPresentationTime:presentationTime];
        CVPixelBufferRelease(pixelBuffer);

        if (appended) {
            NSLog(@"appended sample at time %lf", CMTimeGetSeconds(presentationTime));
        }
        else {
            NSLog(@"failed to append");
        }
    }
}

Then I call this method to create the movie.

-(void)StartRecording
{
    NSString *moviePath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:OUTPUT_FILE_NAME];
    if ([[NSFileManager defaultManager] fileExistsAtPath:moviePath]) {
        [[NSFileManager defaultManager] removeItemAtPath:moviePath error:nil];
    }

    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
    NSLog(@"path=%@", movieURL);
    NSError *movieError = nil;
    [assetWriter release];
    assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL
                                            fileType:AVFileTypeQuickTimeMovie
                                               error:&movieError];
    NSDictionary *assetWriterInputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              [NSNumber numberWithInt:320], AVVideoWidthKey,
                                              [NSNumber numberWithInt:480], AVVideoHeightKey,
                                              nil];
    assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                          outputSettings:assetWriterInputSettings];
    assetWriterInput.expectsMediaDataInRealTime = YES;
    [assetWriter addInput:assetWriterInput];

    [assetWriterPixelBufferAdaptor release];
    assetWriterPixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                                     initWithAssetWriterInput:assetWriterInput
                                     sourcePixelBufferAttributes:nil];
    [assetWriter startWriting];

    firstFrameWallClockTime = CFAbsoluteTimeGetCurrent();
    [assetWriter startSessionAtSourceTime:CMTimeMake(0, 1000)];

    // start writing samples to it
    [assetWriterTimer release];
    assetWriterTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                        target:self
                                                      selector:@selector(writeSample:)
                                                      userInfo:nil
                                                       repeats:YES];
}
Intellect answered 12/7, 2012 at 9:14 Comment(7)
I am still waiting for an answer... (Intellect)
Please help me, I am new to iOS. Is there any way to sort out this problem? (Intellect)
Hey, I'm also stuck on the same thing! Did you sort out your problem? (Kaffraria)
You are right, you are not getting all the frames, probably because your grabbing method is slow. The first thing I would do is remove those NSLog lines; they slow the code down immensely. Another thing I would do is create an array to act as a buffer for the screenshots, plus an asynchronous method that reads from that array on a second thread and writes the frames to the stream. (Aconcagua)
Try these links: 1. binpress.com/app/ios-screen-capture-view/1038 2. github.com/gabriel/CaptureRecord (Cosmo)
Please help me with this. I am still unable to solve this problem. (Intellect)
Which APIs have you been using until now? (Avow)
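The buffering scheme suggested in the comments is a classic producer/consumer queue: the capture timer pushes frames and never blocks, while a writer thread pops and encodes them. A plain-C sketch with pthreads (the frame payload here is a placeholder int; a real implementation would store CVPixelBufferRefs or raw BGRA data):

```c
#include <assert.h>
#include <pthread.h>
#include <string.h>

#define QUEUE_CAPACITY 8

/* Fixed-size ring buffer shared between the capture and writer threads. */
typedef struct {
    int frames[QUEUE_CAPACITY]; /* placeholder frame payloads */
    int head, tail, count;
    pthread_mutex_t lock;
    pthread_cond_t not_empty;
} FrameQueue;

static void frame_queue_init(FrameQueue *q)
{
    memset(q, 0, sizeof *q);
    pthread_mutex_init(&q->lock, NULL);
    pthread_cond_init(&q->not_empty, NULL);
}

/* Called from the capture timer: never blocks for long; drops the
 * frame if the queue is full (better than stalling the UI thread). */
static int frame_queue_push(FrameQueue *q, int frame)
{
    pthread_mutex_lock(&q->lock);
    if (q->count == QUEUE_CAPACITY) {
        pthread_mutex_unlock(&q->lock);
        return 0; /* dropped */
    }
    q->frames[q->tail] = frame;
    q->tail = (q->tail + 1) % QUEUE_CAPACITY;
    q->count++;
    pthread_cond_signal(&q->not_empty);
    pthread_mutex_unlock(&q->lock);
    return 1;
}

/* Called from the writer thread: blocks until a frame is available. */
static int frame_queue_pop(FrameQueue *q)
{
    pthread_mutex_lock(&q->lock);
    while (q->count == 0)
        pthread_cond_wait(&q->not_empty, &q->lock);
    int frame = q->frames[q->head];
    q->head = (q->head + 1) % QUEUE_CAPACITY;
    q->count--;
    pthread_mutex_unlock(&q->lock);
    return frame;
}
```

On iOS the writer side would more idiomatically live in a GCD serial queue or use `requestMediaDataWhenReadyOnQueue:usingBlock:`, but the decoupling idea is the same.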
Try this method:

if (![videoWriterInput isReadyForMoreMediaData]) {
    NSLog(@"Not ready for video data");
}
else {
    @synchronized (self) {
        UIImage *newFrame = [self.currentScreen retain];
        CVPixelBufferRef pixelBuffer = NULL;
        CGImageRef cgImage = CGImageCreateCopy([newFrame CGImage]);
        CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

        CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, avAdaptor.pixelBufferPool, &pixelBuffer);
        if (status != kCVReturnSuccess) {
            // could not get a buffer from the pool
            NSLog(@"Error creating pixel buffer: status=%d", status);
        }
        else {
            // set image data into pixel buffer
            CVPixelBufferLockBaseAddress(pixelBuffer, 0);
            uint8_t *destPixels = CVPixelBufferGetBaseAddress(pixelBuffer);
            // XXX: will work only if the pixel buffer is contiguous and
            // has the same bytesPerRow as the input data
            CFDataGetBytes(image, CFRangeMake(0, CFDataGetLength(image)), destPixels);
            CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

            BOOL success = [avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
            if (!success)
                NSLog(@"Warning: unable to write buffer to video");
            CVPixelBufferRelease(pixelBuffer);
        }

        // clean up
        [newFrame release];
        CFRelease(image);
        CGImageRelease(cgImage);
    }
}
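The XXX comment in the code above matters in practice: CVPixelBuffers are often row-padded for alignment, so their bytesPerRow can differ from the CGImage's. The safe approach is to copy row by row, advancing each pointer by its own stride. A plain-C sketch of that copy (in the answer's code the strides would come from CVPixelBufferGetBytesPerRow and CGImageGetBytesPerRow):

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Copy `height` rows of `row_bytes` meaningful bytes each from src to
 * dst, where the two sides may have different row strides (padding). */
static void copy_rows(uint8_t *dst, size_t dst_bytes_per_row,
                      const uint8_t *src, size_t src_bytes_per_row,
                      size_t row_bytes, size_t height)
{
    for (size_t y = 0; y < height; y++) {
        memcpy(dst + y * dst_bytes_per_row,
               src + y * src_bytes_per_row,
               row_bytes); /* copy only the meaningful pixels per row */
    }
}
```

With matching strides this degenerates to the single CFDataGetBytes call in the answer; with mismatched strides the single call would shear the image.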
Griffie answered 23/5, 2013 at 10:34 Comment(1)
Hi, welcome to SO. It may be useful to include an explanation of the key idea in your solution. (Ayurveda)

© 2022 - 2024 — McMap. All rights reserved.