iOS 8 iPad AVCaptureMovieFileOutput drops / loses / never gets audio track after 13 - 14 seconds of recording

I have the following code, which works on iOS 6 and 7.x.

On iOS 8.1 I have a strange issue: if you record for about 13 seconds or longer, the resulting AVAsset has only one track (video); the audio track simply isn't there.

If you record for a shorter period, the AVAsset has two tracks (video and audio) as expected. I have plenty of disk space, and the app has permission to use the camera and microphone.

I created a new project with minimal code, and it reproduces the issue.

Any ideas would be greatly appreciated.

#import "ViewController.h"

@interface ViewController ()

@end

@implementation ViewController
{
    enum RecordingState { Recording, Stopped };
    enum RecordingState recordingState;

    AVCaptureSession *session;
    AVCaptureMovieFileOutput *output;
    AVPlayer *player;
    AVPlayerLayer *playerLayer;
    bool audioGranted;
}

- (void)viewDidLoad {
    [super viewDidLoad];

    [self setupAV];
    recordingState = Stopped;
}

-(void)setupAV
{
    session = [[AVCaptureSession alloc] init];
    [session beginConfiguration];
    AVCaptureDevice *videoDevice = nil;

    for ( AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] ) {
        if ( device.position == AVCaptureDevicePositionBack ) {
            videoDevice = device;
            break;
        }
    }
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    if (videoDevice && audioDevice)
    {
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
        [session addInput:input];

        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
        [session addInput:audioInput];

        NSURL *recordURL = [self tempUrlForRecording];
        [[NSFileManager defaultManager] removeItemAtURL:recordURL error:nil];

        output= [[AVCaptureMovieFileOutput alloc] init];
        output.maxRecordedDuration = CMTimeMake(45, 1);
        output.maxRecordedFileSize = 1028 * 1028 * 1000;
        [session addOutput:output];
    }
    [session commitConfiguration];
}

- (IBAction)recordingButtonClicked:(id)sender {
    if(recordingState == Stopped)
    {
        [self startRecording];
    }
    else
    {
        [self stopRecording];
    }
}

-(void)startRecording
{
    recordingState = Recording;
    [session startRunning];
    [output startRecordingToOutputFileURL:[self tempUrlForRecording] recordingDelegate:self];

}

-(void)stopRecording
{
    recordingState = Stopped;
    [output stopRecording];
    [session stopRunning];
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    AVAsset *cameraInput = [AVAsset assetWithURL:[self tempUrlForRecording]];
    //DEPENDING ON HOW LONG RECORDED THIS DIFFERS (<14 SECS - 2 Tracks, >14 SECS - 1 Track)
    NSLog(@"Number of tracks: %i", cameraInput.tracks.count);
}

-(NSURL *)tempUrlForRecording
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [paths objectAtIndex:0];

    NSString *path = @"camerabuffer.mp4";
    NSString *pathCameraInput =[documentsDirectoryPath stringByAppendingPathComponent: path];
    NSURL *urlCameraInput = [NSURL fileURLWithPath:pathCameraInput];

    return urlCameraInput;
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end
Shona answered 4/11, 2014 at 12:52 Comment(4)
I should also mention that no error is reported (it's nil) in didFinishRecordingToOutputFileAtURL – Shona
OK, setting the fragmentInterval to be greater than the length the recording is going to be fixes it. But I'm sure I shouldn't need this: CMTime fragmentInterval = CMTimeMake(5, 1); [movieOutput setMovieFragmentInterval:fragmentInterval]; – Shona
What happens if you don't use the maxRecordedDuration and stop the recording manually after 45 seconds? – Travelled
I have the same issue. I found out that if you transcode the stream with ffmpeg, explicitly setting the volume (e.g. ffmpeg -i movie.mp4 -vol 256 movie2.mp4), you get the sound back. – Moscow

This will help you fix it:

[movieOutput setMovieFragmentInterval:kCMTimeInvalid];

I think this is a bug. The documentation says the sample table is not written if the recording does not complete successfully, which implies it should be written automatically when the recording does complete successfully. But now it seems that it isn't.

Any ideas?
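
For reference, a minimal sketch of how this would look in the asker's setupAV (assuming the same session and output ivars from the question):

output = [[AVCaptureMovieFileOutput alloc] init];
output.maxRecordedDuration = CMTimeMake(45, 1);
output.maxRecordedFileSize = 1028 * 1028 * 1000;

// Disable movie fragment writing so the sample tables (including the
// audio track's) are written once, when the recording finishes, rather
// than being appended as fragments during the recording.
output.movieFragmentInterval = kCMTimeInvalid;

[session addOutput:output];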

Rech answered 5/12, 2014 at 4:51 Comment(5)
Wow, that worked -- I've been pulling my hair out over this bug. For others' reference, I didn't have a max duration or size set. – Brittney
I have a max size and duration. I was previously using movieOutput.movieFragmentInterval = CMTime(value: 2, timescale: 1) and was occasionally getting audio but no video. Setting it to kCMTimeInvalid solved 40% of my issues. My videos are 8 seconds long, so fragments aren't needed. – Nepos
Hi @Nepos, could you provide a demo? I did not encounter this issue. – Rech
@dusty I'd be happy to share a sample offline. This bug is also part of my trouble: https://mcmap.net/q/683779/-consecutive-calls-to-startrecordingtooutputfileurl – Nepos
self.movieOutput!.movieFragmentInterval = kCMTimeInvalid (in Swift) – Antilogarithm

I had this issue, and the way I fixed it in Swift 4 is the following:

  • Do not set movieFileOutput.maxRecordedDuration. There seems to be a bug where, if you set it, videos recorded for longer than 12-13 seconds will have no audio.

  • Instead, use a timer to stop the recording, and set movieFragmentInterval like this:

movieFileOutput.movieFragmentInterval = CMTime.invalid

Here is a whole block of code just to show you how I did it:

var seconds = 20
var timer = Timer()
var movieFileOutput = AVCaptureMovieFileOutput()

func startRecording(){
    movieFileOutput.movieFragmentInterval = CMTime.invalid
    movieFileOutput.startRecording(to: URL(fileURLWithPath: getVideoFileLocation()), recordingDelegate: self)
    startTimer()
}

func stopRecording(){
    movieFileOutput.stopRecording()
    timer.invalidate()
}

func startTimer(){
    timer = Timer.scheduledTimer(timeInterval: 1, target: self, selector: #selector(updateTimer), userInfo: nil, repeats: true)
}

@objc func updateTimer(){
    seconds -= 1
    if(seconds == 0){
        stopRecording()
    }
}

func getVideoFileLocation() -> String {
    return NSTemporaryDirectory().appending("myrecording.mp4")
}


extension FTVideoReviewViewController : AVCaptureFileOutputRecordingDelegate{
    public func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
        print("Finished recording: \(outputFileURL)")
        // do stuff here when recording is finished
    }
}
Subjoin answered 28/9, 2018 at 14:41 Comment(0)
