ReplayKit: startCaptureWithHandler() not delivering video-type CMSampleBufferRef in the capture handler
I've implemented an RPScreenRecorder that records the screen as well as mic audio. After multiple recordings are completed, I stop the recording, merge each audio file with its video using AVMutableComposition, and then merge all the videos into a single video.
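For context, the merge step can be sketched roughly like this. This is a minimal sketch with AVMutableComposition, not the exact code in question; videoURL, audioURL, and outputURL are assumed placeholder names:

```objc
// Sketch: merge one recorded audio file into one video file.
// videoURL, audioURL, and outputURL are placeholders.
AVURLAsset *videoAsset = [AVURLAsset assetWithURL:videoURL];
AVURLAsset *audioAsset = [AVURLAsset assetWithURL:audioURL];

AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];

NSError *error = nil;
CMTimeRange range = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
[videoTrack insertTimeRange:range
                    ofTrack:[videoAsset tracksWithMediaType:AVMediaTypeVideo].firstObject
                     atTime:kCMTimeZero
                      error:&error];
[audioTrack insertTimeRange:range
                    ofTrack:[audioAsset tracksWithMediaType:AVMediaTypeAudio].firstObject
                     atTime:kCMTimeZero
                      error:&error];

// Export the combined composition to a single movie file.
AVAssetExportSession *export =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetHighestQuality];
export.outputURL = outputURL;
export.outputFileType = AVFileTypeQuickTimeMovie;
[export exportAsynchronouslyWithCompletionHandler:^{
    // Check export.status / export.error here.
}];
```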

For screen recording and getting the video and audio files, I am using

- (void)startCaptureWithHandler:(nullable void (^)(CMSampleBufferRef sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error))captureHandler completionHandler:(nullable void (^)(NSError * _Nullable error))completionHandler;

To stop the recording, I call this function:

- (void)stopCaptureWithHandler:(void (^)(NSError *error))handler;

Both are pretty straightforward.

Most of the time it works great: I receive both video and audio CMSampleBuffers. But sometimes startCaptureWithHandler sends me only audio buffers, never video buffers. Once I encounter this problem, it won't go away until I restart my device and reinstall the app. This makes my app very unreliable for the user. I think this is a ReplayKit issue, but I have been unable to find related reports from other developers. Let me know if any of you have come across this issue and found a solution.

I have checked multiple times but haven't found any issue in the configuration. Here it is anyway.

NSError *videoWriterError;
videoWriter = [[AVAssetWriter alloc] initWithURL:fileString fileType:AVFileTypeQuickTimeMovie
                                           error:&videoWriterError];


NSError *audioWriterError;
audioWriter = [[AVAssetWriter alloc] initWithURL:audioFileString fileType:AVFileTypeAppleM4A
                                           error:&audioWriterError];

CGFloat width = UIScreen.mainScreen.bounds.size.width;
NSString *widthString = [NSString stringWithFormat:@"%f", width];
CGFloat height = UIScreen.mainScreen.bounds.size.height;
NSString *heightString = [NSString stringWithFormat:@"%f", height];

NSDictionary  * videoOutputSettings= @{AVVideoCodecKey : AVVideoCodecTypeH264,
                                       AVVideoWidthKey: widthString,
                                       AVVideoHeightKey : heightString};
videoInput  = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:videoOutputSettings];

videoInput.expectsMediaDataInRealTime = true;

AudioChannelLayout acl;
bzero( &acl, sizeof(acl));
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
NSDictionary * audioOutputSettings = [ NSDictionary dictionaryWithObjectsAndKeys:
                                      [ NSNumber numberWithInt: kAudioFormatAppleLossless ], AVFormatIDKey,
                                      [ NSNumber numberWithInt: 16 ], AVEncoderBitDepthHintKey,
                                      [ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
                                      [ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,
                                      [ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
                                      nil ];

audioInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings];

[audioInput setExpectsMediaDataInRealTime:YES];

[videoWriter addInput:videoInput];
[audioWriter addInput:audioInput];

[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:nil];

[RPScreenRecorder.sharedRecorder startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable myError) {

    // handler body (shown below)

} completionHandler:nil];

The capture handler itself is pretty straightforward as well:

[RPScreenRecorder.sharedRecorder startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable myError) {

    dispatch_sync(dispatch_get_main_queue(), ^{

        if (CMSampleBufferDataIsReady(sampleBuffer))
        {
            if (self->videoWriter.status == AVAssetWriterStatusUnknown)
            {
                self->writingStarted = true;
                [self->videoWriter startWriting];
                [self->videoWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];

                [self->audioWriter startWriting];
                [self->audioWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
            }
            if (self->videoWriter.status == AVAssetWriterStatusFailed) {
                return;
            }

            if (bufferType == RPSampleBufferTypeVideo)
            {
                if (self->videoInput.isReadyForMoreMediaData)
                {
                    [self->videoInput appendSampleBuffer:sampleBuffer];
                }
            }
            else if (bufferType == RPSampleBufferTypeAudioMic)
            {
                if (self->writingStarted)
                {
                    if (self->audioInput.isReadyForMoreMediaData)
                    {
                        [self->audioInput appendSampleBuffer:sampleBuffer];
                    }
                }
            }
        }
    });

} completionHandler:nil];

Also, when this situation occurs, the system screen recorder gets corrupted as well. Tapping the system recorder shows this error:


The error says "Screen recording has stopped due to: Failure during recording due to Mediaservices error".

There can be two reasons:

  1. ReplayKit itself is buggy, which is why it starts failing after some usage.
  2. I have implemented some problematic logic, which causes ReplayKit to crash.

If it's issue no. 1, then no problem. If it's issue no. 2, then I need to know where I might be wrong.

Opinions and help will be appreciated.

Roti answered 20/7, 2018 at 10:13 Comment(0)

So, I have come across some scenarios where ReplayKit totally crashes and the system recorder shows an error every time until you restart the device.

1st Scenario

When you start recording and then stop it in the completion handler:

[RPScreenRecorder.sharedRecorder startCaptureWithHandler:^(CMSampleBufferRef  _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
    printf("recording");
} completionHandler:^(NSError * _Nullable error) {
    [RPScreenRecorder.sharedRecorder stopCaptureWithHandler:^(NSError * _Nullable error) {
        printf("Ended");
    }];
}];

2nd Scenario

When you start recording and then stop it directly in the capture handler:

__block BOOL stopDone = NO;
[RPScreenRecorder.sharedRecorder startCaptureWithHandler:^(CMSampleBufferRef  _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error) {
    if (!stopDone){
        [RPScreenRecorder.sharedRecorder stopCaptureWithHandler:^(NSError * _Nullable error) {
            printf("Ended");
        }];
        stopDone = YES;
    }
    printf("recording");
} completionHandler:^(NSError * _Nullable error) {}];

More scenarios are yet to be discovered, and I will keep updating this answer.

Update 1

It is true that the system screen recorder gives an error when we stop recording right after starting, but it seems to work fine after we call startCapture again.

I have also encountered a scenario where only my app stops receiving video buffers while the system screen recorder works fine. I will update the answer soon.

Update 2

So here is the issue: my actual app is old, and it is maintained and updated regularly. When ReplayKit becomes erroneous, my original app can't receive video buffers. I don't know whether some configuration is making this happen.

But a new sample app seems to work fine even after ReplayKit becomes erroneous: when I call startCapture the next time, ReplayKit recovers. Weird.

Update 3

I observed a new issue. When the permission alert shows up, the app goes to the background. Since I coded the app so that whenever it goes to the background some UI changes occur and the recording is stopped, this led to the error:

Recording interrupted by multitasking and content resizing

I am not yet certain which particular UI change causes this failure, but it only occurs when the permission alert shows up and the UI changes are made. If anyone has pinned down a particular case for this issue, please let us know.
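One possible workaround, offered as an assumption rather than a confirmed fix: the permission alert only makes the app resign active, while a genuine switch away from the app also posts UIApplicationDidEnterBackgroundNotification. Stopping the recording only on the latter notification would avoid reacting to the alert:

```objc
// Assumption: react to didEnterBackground rather than willResignActive,
// so the ReplayKit permission alert (which only resigns active) does not
// trigger the stop-recording path.
[[NSNotificationCenter defaultCenter]
    addObserverForName:UIApplicationDidEnterBackgroundNotification
                object:nil
                 queue:NSOperationQueue.mainQueue
            usingBlock:^(NSNotification *note) {
    [RPScreenRecorder.sharedRecorder stopCaptureWithHandler:^(NSError * _Nullable error) {
        // Handle the error and finish the asset writers here.
    }];
}];
```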

Roti answered 13/9, 2018 at 5:31 Comment(5)
#54485867 Can you help me with this?Leund
Can you please help me?Abomb
This answer does not provide a solution; it only lists more failure modes. I would be interested in an actual answer, as my application has the same issue.Timi
@Timi This answer tells you the ways ReplayKit can fail, so that you can avoid these scenarios. ReplayKit was quite buggy when I used it, and there was (and maybe still is) no straightforward solution for its bugs.Roti
@TalhaAhmadKhan: I am facing the same issue you mentioned in Update 3. Did you find any solution for this weird issue, or any other way to record the screen?Imprecise

If the screen has no change, ReplayKit does not call processSampleBuffer() with video. For example, in a PowerPoint presentation, processSampleBuffer() is called only when a new slide is shown; no video processSampleBuffer() call arrives for 10 seconds or even a minute. Sometimes ReplayKit does not call processSampleBuffer() on a new slide at all, in which case the user misses a slide. That is a critical, show-stopper bug.

On the other hand, processSampleBuffer() with audio is called every 500 ms on iOS 11.4.

Peroxidize answered 23/7, 2018 at 23:38 Comment(1)
Yes, it makes sense that when there is no change there should be no screen buffer. But in my case, the buffer is sometimes not received even when the screen is changing.Roti

In videoOutputSettings, make the AVVideoWidthKey and AVVideoHeightKey values NSNumber instead of NSString.

In audioOutputSettings, remove AVEncoderBitDepthHintKey and AVChannelLayoutKey. Add AVEncoderBitRateKey with an NSNumber value of 64000, and change the AVFormatIDKey value from kAudioFormatAppleLossless to kAudioFormatMPEG4AAC.
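Put together, the suggested settings would look roughly like this. This is a sketch of the corrections described above applied to the question's configuration, not verified code from either poster:

```objc
// Video settings with width/height as NSNumber (not NSString).
NSDictionary *videoOutputSettings = @{
    AVVideoCodecKey  : AVVideoCodecTypeH264,
    AVVideoWidthKey  : @(UIScreen.mainScreen.bounds.size.width),
    AVVideoHeightKey : @(UIScreen.mainScreen.bounds.size.height)
};

// Audio settings: AAC instead of Apple Lossless, explicit bit rate,
// no bit-depth hint and no channel layout.
NSDictionary *audioOutputSettings = @{
    AVFormatIDKey         : @(kAudioFormatMPEG4AAC),
    AVEncoderBitRateKey   : @64000,
    AVSampleRateKey       : @44100.0,
    AVNumberOfChannelsKey : @1
};
```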

In my project I faced a similar problem, and as far as I can remember, the cause was my output settings.

You can also try moving all the code in your startCaptureWithHandler capture block inside a synchronous dispatch to the main queue:

dispatch_sync(dispatch_get_main_queue(), ^{
    // your block code
});
Veneer answered 3/9, 2018 at 10:51 Comment(1)
I think your solution is a good precaution, and it prevents another issue where the video writer fails to save. But the problem mentioned in the question is about RPScreenRecorder not delivering video buffers.Roti

I had exactly the same issue. I changed many things and rewrote the code again and again. I finally understood that the cause of the problem was the main window.

If you change anything about the main window (for instance its windowLevel), reverting it back will solve the problem.

P.S. If you're wondering about the relationship between the main window and ReplayKit: ReplayKit records the main window.
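A minimal illustration of that revert, assuming windowLevel was the property that had been changed (the property name here is an example, not taken from the answer's code):

```objc
// Restore the main window to its default level before recording.
// keyWindow is used here for brevity; on iOS 13+ you would look the
// window up through the connected scenes instead.
UIWindow *mainWindow = UIApplication.sharedApplication.keyWindow;
mainWindow.windowLevel = UIWindowLevelNormal;
```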

Regenerate answered 4/5, 2021 at 18:57 Comment(0)
