AVAssetWriterInputPixelBufferAdaptor returns null pixel buffer pool

I'm sure something's wrong with my buffer attributes, but it's not clear to me what -- it's not well documented what's supposed to go there, so I'm guessing based on CVPixelBufferPoolCreate -- and Core Foundation is pretty much a closed book to me.

    // "width" and "height" are const ints
    CFNumberRef cfWidth = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &width);
    CFNumberRef cfHeight = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &height);

    CFStringRef keys[] = {
        kCVPixelBufferWidthKey,
        kCVPixelBufferHeightKey,
        kCVPixelBufferCGImageCompatibilityKey
    };
    CFTypeRef values[] = {
        cfWidth,
        cfHeight,
        kCFBooleanTrue
    };
    int numValues = sizeof(keys) / sizeof(keys[0]);

    CFDictionaryRef bufferAttributes = CFDictionaryCreate(kCFAllocatorDefault,
                                                          (const void **)keys,
                                                          (const void **)values,
                                                          numValues,
                                                          &kCFTypeDictionaryKeyCallBacks,
                                                          &kCFTypeDictionaryValueCallBacks);

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [[AVAssetWriterInputPixelBufferAdaptor 
                                                      assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                      sourcePixelBufferAttributes:(NSDictionary*)bufferAttributes] retain];
    CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
    NSParameterAssert(bufferPool != NULL); // fails
Bloodstain answered 27/4, 2011 at 21:43 Comment(5)
Hi, I have the same problem. Have you found a solution?Ampliate
Not really. I'm just creating a pixel buffer for every frame instead of using the pool. :(Bloodstain
OK, we found the same solution. Thanks!Ampliate
Hi @DavidMoles, did you find a solution, or do you have working code that "creates a pixel buffer for every frame"?Lineate
@IraniyaNaynesh Sorry, I haven't looked at this in years. But the docs for pixelBufferPool now say “This property is NULL before the first call to startSessionAtTime: on the associated AVAssetWriter object.” So maybe that was the issue?Bloodstain

When pixelBufferPool returns NULL, check the following (see the sketch below the list):

    1. The output file for the AVAssetWriter doesn't already exist.
    2. Use the pixel buffer pool only after calling startSessionAtSourceTime: on the AVAssetWriter.
    3. The settings for the AVAssetWriterInput and the AVAssetWriterInputPixelBufferAdaptor are correct.
    4. The presentation times passed to appendPixelBuffer:withPresentationTime: are not all the same.
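
A minimal sketch of that order of operations, assuming an outputURL, writerInput, and adaptor configured roughly as elsewhere in this thread (the names are illustrative):

// Remove any leftover file first (check #1) -- the writer won't overwrite an existing file.
[[NSFileManager defaultManager] removeItemAtURL:outputURL error:NULL];

NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];
[writer addInput:writerInput];                  // input/adaptor settings already configured (check #3)

[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];  // only after this is pixelBufferPool expected to be non-NULL (check #2)

CVPixelBufferRef buffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, adaptor.pixelBufferPool, &buffer);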
Tort answered 21/11, 2013 at 0:37 Comment(4)
#1 did the trick for me too. Such a simple solution! I was really looking too deep. BTW, my failure was that CVPixelBufferPoolCreatePixelBuffer was returning -6661 (kCVReturnInvalidArgument).Stonehenge
How did you guys fix this with #1? I'm using NSHomeDirectory in Swift. Can someone post a GitHub example?Designer
#1 for me as well.Baden
I don't understand #1 -- why would the file have to exist? If I'm recording a new video, it may not exist. I've never had to create the file first in any other app I've made. --- UPDATE --- you will get the error if the file DOES exist and you are trying to write over it. Try renaming your destination file.Antependium

I had the same problem, and I think it is possibly because you have not configured your AVAssetWriterInput correctly. My pool started working after I did that; in particular, the pool would not give me pixel buffers until I had provided data for AVVideoCompressionPropertiesKey. First, create and fully configure the AVAssetWriter (look in /Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS4.3.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoSettings.h for keys and values for outputSettings and compressionSettings):

NSError * err = nil;
AVAssetWriter * outputWriter = [AVAssetWriter
    assetWriterWithURL: [NSURL fileURLWithPath:outputPath]
              fileType: AVFileTypeAppleM4V
                 error: & err];

NSMutableDictionary * outputSettings
    = [[NSMutableDictionary alloc] init];
[outputSettings setObject: AVVideoCodecH264
                   forKey: AVVideoCodecKey];
[outputSettings setObject: [NSNumber numberWithInt: width_]
                   forKey: AVVideoWidthKey];
[outputSettings setObject: [NSNumber numberWithInt: height_]
                   forKey: AVVideoHeightKey];

NSMutableDictionary * compressionProperties
    = [[NSMutableDictionary alloc] init];
[compressionProperties setObject: [NSNumber numberWithInt: 1000000]
                          forKey: AVVideoAverageBitRateKey];
[compressionProperties setObject: [NSNumber numberWithInt: 16]
                          forKey: AVVideoMaxKeyFrameIntervalKey];
[compressionProperties setObject: AVVideoProfileLevelH264Main31
                          forKey: AVVideoProfileLevelKey];

[outputSettings setObject: compressionProperties
                   forKey: AVVideoCompressionPropertiesKey];

AVAssetWriterInput * writerInput = [AVAssetWriterInput
    assetWriterInputWithMediaType: AVMediaTypeVideo
                   outputSettings: outputSettings];

[compressionProperties release];
[outputSettings release];

Create the pixel buffer adaptor:

NSMutableDictionary * pixBufSettings = [[NSMutableDictionary alloc] init];
[pixBufSettings setObject: [NSNumber numberWithInt: kCVPixelFormatType_32BGRA]
                   forKey: (NSString *) kCVPixelBufferPixelFormatTypeKey];
[pixBufSettings setObject: [NSNumber numberWithInt: width_]
                   forKey: (NSString *) kCVPixelBufferWidthKey];
[pixBufSettings setObject: [NSNumber numberWithInt: height_]
                   forKey: (NSString *) kCVPixelBufferHeightKey];

AVAssetWriterInputPixelBufferAdaptor * outputPBA =
    [AVAssetWriterInputPixelBufferAdaptor
    assetWriterInputPixelBufferAdaptorWithAssetWriterInput: writerInput
                               sourcePixelBufferAttributes: nil];

Then retrieve pixel buffers from its pool using:

CVPixelBufferRef outputFrame = NULL;
CVReturn res = CVPixelBufferPoolCreatePixelBuffer(NULL,
                                                  [outputPBA pixelBufferPool],
                                                  &outputFrame);
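
Once the pool hands back a buffer, a hedged sketch of passing it on to the adaptor (the fill step is omitted; frameIndex and the 30 fps timescale are illustrative, not part of the original answer):

// outputFrame comes back from the pool with a +1 retain count (Create rule).
CMTime frameTime = CMTimeMake(frameIndex, 30);
if (![outputPBA appendPixelBuffer:outputFrame withPresentationTime:frameTime]) {
    NSLog(@"append failed: %@", outputWriter.error);
}
CVPixelBufferRelease(outputFrame);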
Duvalier answered 27/5, 2011 at 9:56 Comment(2)
You didn't use pixBufSettings.Cinerarium
But if you pass nil for sourcePixelBufferAttributes, there is no chance the adaptor is going to create any pixel buffers whatsoever.Jamieson

According to the documentation:

"This property is NULL before the first call to startSessionAtTime:on the associated AVAssetWriter object."

So if you'e trying to access the pool too early, it will be NULL. I'm just learning this stuff myself so I can't really elaborate at the moment.

Sinistrocular answered 20/6, 2011 at 19:59 Comment(1)
What do you mean by too early? I can assure you that I call CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, _pixelBufferAdaptor.pixelBufferPool, &pixelBuffer) after calling the methods that are supposed to be called on the assetWriter and, still, it keeps returning an error because apparently no buffer pool has been created.Jamieson

For everyone still looking for a solution: first, make sure your AVAssetWriter is actually working by checking its status (see the sketch below). I had this problem, and after checking the status I realized that although I had called start, the writer had never actually started. (In my case, I had pointed the output path at an existing file; after deleting it, everything worked like a charm.)
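
A minimal sketch of that status check, assuming videoWriter is your AVAssetWriter (the variable name is illustrative):

if (videoWriter.status == AVAssetWriterStatusFailed) {
    // e.g. startWriting never succeeded because the output file already existed
    NSLog(@"writer failed: %@", videoWriter.error);
}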

Nobell answered 15/8, 2015 at 17:22 Comment(0)

I got it all working! With the compatibility keys set in the options dictionary, they say it's possible to use the buffer pool. Here is a working sample with code that writes without the pool, but it's a good place to start.

Here is the sample code link

Here is the code you need:

- (void) testCompressionSession
{
CGSize size = CGSizeMake(480, 320);


NSString *betaCompressionDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];

NSError *error = nil;

unlink([betaCompressionDirectory UTF8String]);

//----initialize compression engine
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:betaCompressionDirectory]
                                                       fileType:AVFileTypeQuickTimeMovie
                                                          error:&error];
NSParameterAssert(videoWriter);
if(error)
    NSLog(@"error = %@", [error localizedDescription]);

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                               [NSNumber numberWithInt:size.height], AVVideoHeightKey, nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];

NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                       [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];

AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                                                                 sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);

if ([videoWriter canAddInput:writerInput])
    NSLog(@"I can add this input");
else
    NSLog(@"i can't add this input");

[videoWriter addInput:writerInput];

[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];

//---
// insert demo debugging code to write the same image repeated as a movie

CGImageRef theImage = [[UIImage imageNamed:@"Lotus.png"] CGImage];

dispatch_queue_t    dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL);
int __block         frame = 0;

[writerInput requestMediaDataWhenReadyOnQueue:dispatchQueue usingBlock:^{
    while ([writerInput isReadyForMoreMediaData])
    {
        if(++frame >= 120)
        {
            [writerInput markAsFinished];
            [videoWriter finishWriting];
            [videoWriter release];
            break;
        }

        CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:theImage size:size];
        if (buffer)
        {
            if(![adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frame, 20)])
                NSLog(@"FAIL");
            else
                NSLog(@"Success:%d", frame);
            CFRelease(buffer);
        }
    }
}];

NSLog(@"outside for loop");

}


- (CVPixelBufferRef )pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey, 
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, &pxbuffer);
// CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pxbuffer);

NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL); 

CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);

CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8, CVPixelBufferGetBytesPerRow(pxbuffer), rgbColorSpace, kCGImageAlphaPremultipliedFirst); // use the buffer's actual bytes-per-row in case Core Video added row padding
NSParameterAssert(context);

CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);

CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);

CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

return pxbuffer;
}
Avril answered 1/6, 2011 at 18:21 Comment(2)
Could you explain which pixelBuffer this solution is supposed to use.Jamieson
Apple should fire the whole department that writes their awful, appalling, incomplete, vague, misleading documentation. They are a disgrace.Caa

It works when there is no file at the outputURL passed to the AVAssetWriter.

extension FileManager {
    func removeItemIfExist(at url: URL) {
        do {
            if fileExists(atPath: url.path) {
                try removeItem(at: url)
            }
        } catch {
            fatalError("\(error)")
        }
    }
}

Usage

FileManager.default.removeItemIfExist(at: outputURL) // clear any leftover file before writing starts
let assetWriter = try? AVAssetWriter(outputURL: outputURL, fileType: .mov)
// do something
Rhea answered 9/8, 2018 at 7:18 Comment(0)
