Blank frame on merging videos using AVMutableComposition
This question has been asked many times before, but nothing has helped me. I am merging multiple videos using AVMutableComposition. After merging, I get blank frames between 30–40% of the videos; the others merge fine. I play the composition directly with AVPlayer as an AVPlayerItem. Code is below:

    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];

    NSMutableArray *instructions = [NSMutableArray new];
    CGSize size = CGSizeZero;

    CMTime time = kCMTimeZero;
    for (AVURLAsset *asset in assets)
    {
        AVAssetTrack *assetTrack;
        assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVAssetTrack *audioAssetTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;

        NSError *error;
        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration )
                                       ofTrack:assetTrack
                                        atTime:time
                                         error:&error];


        if (error) {
            NSLog(@"asset url :: %@",assetTrack.asset);
            NSLog(@"Error - %@", error.debugDescription);
        }

        [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                                       ofTrack:audioAssetTrack
                                        atTime:time
                                         error:&error];


        if (error) {
            NSLog(@"Error - %@", error.debugDescription);
        }
        AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        videoCompositionInstruction.timeRange = CMTimeRangeMake(time, assetTrack.timeRange.duration);
        videoCompositionInstruction.layerInstructions = @[[AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack]];
        [instructions addObject:videoCompositionInstruction];

        time = CMTimeAdd(time, assetTrack.timeRange.duration);

        if (CGSizeEqualToSize(size, CGSizeZero)) {
            size = assetTrack.naturalSize;
        }
    }

    AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
    mutableVideoComposition.instructions = instructions;
    mutableVideoComposition.frameDuration = CMTimeMake(1, 30);
    mutableVideoComposition.renderSize = size;

    playerItem = [AVPlayerItem playerItemWithAsset:mutableComposition];
    playerItem.videoComposition = mutableVideoComposition;
Myronmyrrh asked 21/5/2015 at 10:54. Comments (2):
Your layerInstructions is not incorrect; take a look by commenting out the last line: playerItem.videoComposition = mutableVideoComposition; – Lawman
Do you mean "not correct"? What is incorrect in the instructions? After commenting out that line, I get a black frame between all videos. – Myronmyrrh
As far as I know, an AVMutableVideoCompositionLayerInstruction cannot simply be "appended" or "added" the way your code does it.

From your code, I guess you want to keep the video-instruction information when merging video assets, but the instructions cannot be "copied" directly.

If you want to do this, see docs for AVVideoCompositionLayerInstruction, e.g.

    getTransformRampForTime:startTransform:endTransform:timeRange:
    setTransformRampFromStartTransform:toEndTransform:timeRange:
    setTransform:atTime:

    getOpacityRampForTime:startOpacity:endOpacity:timeRange:
    setOpacityRampFromStartOpacity:toEndOpacity:timeRange:
    setOpacity:atTime:

    getCropRectangleRampForTime:startCropRectangle:endCropRectangle:timeRange:
    setCropRectangleRampFromStartCropRectangle:toEndCropRectangle:timeRange:
    setCropRectangle:atTime:

You should call the getFoo... methods on the source track's instruction, compute the insert time or time range for the final track, call the matching setFoo... methods, and then append the result to the layerInstructions of the final videoComposition.
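As a rough sketch of that get/shift/set pattern (names like sourceInstruction, mergedLayerInstruction, and insertTime are illustrative, and this assumes you can reach the source asset's own layer instruction):

    CGAffineTransform startTransform, endTransform;
    CMTimeRange ramp;
    // sourceInstruction: the layer instruction from the source asset's own
    // videoComposition (hypothetical; obtain it however your pipeline allows)
    if ([sourceInstruction getTransformRampForTime:kCMTimeZero
                                    startTransform:&startTransform
                                      endTransform:&endTransform
                                         timeRange:&ramp]) {
            // Shift the ramp by the time at which this asset lands in the merged track
            CMTimeRange shifted = CMTimeRangeMake(CMTimeAdd(ramp.start, insertTime),
                                                  ramp.duration);
            [mergedLayerInstruction setTransformRampFromStartTransform:startTransform
                                                        toEndTransform:endTransform
                                                             timeRange:shifted];
    }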

Yes, it is a little complicated. Most importantly, you cannot recover all of the video effects that were applied to the source asset.

So what is your purpose? And what are your source assets backed by?

If you just want to merge some mp4/mov files, simply loop over the tracks and append them to an AVMutableCompositionTrack; no videoComposition is needed. I tested your code this way and it works.
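For that simple case, a minimal sketch (no videoComposition; error handling omitted for brevity, and assets is assumed to be your array of AVURLAssets) could look like:

    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime cursor = kCMTimeZero;
    for (AVURLAsset *asset in assets) {
            AVAssetTrack *track = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
            // Insert each source track at the current cursor, then advance the cursor
            [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, track.timeRange.duration)
                                ofTrack:track
                                 atTime:cursor
                                  error:NULL];
            cursor = CMTimeAdd(cursor, track.timeRange.duration);
    }
    // Play directly; no videoComposition is needed for plain concatenation
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];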

If you want to merge AVAssets that carry video instructions, see the explanation and docs above. My best practice is to save those AVAssets to files using AVAssetExportSession before merging, and then merge the flat video files.
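A sketch of that export step (outputURL is a hypothetical destination file URL; asset and videoComposition are the source asset and its instructions):

    AVAssetExportSession *exportSession =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPresetHighestQuality];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.videoComposition = videoComposition; // bake the instructions into the flat file
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
            if (AVAssetExportSessionStatusCompleted == exportSession.status) {
                    // The exported file is now a plain mp4 that can be merged
                    // like any other, as in the simple case
            }
    }];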

P.S. There may also be issues with your test files or source assets.

Code from my Vine-like project:

    - (BOOL)generateComposition
    {
            [self cleanComposition];

            NSUInteger segmentsCount = self.segmentsCount;
            if (0 == segmentsCount) {
                    return NO;
            }

            AVMutableComposition *composition = [AVMutableComposition composition];
            AVMutableVideoComposition *videoComposition = nil;
            AVMutableVideoCompositionInstruction *videoCompositionInstruction = nil;
            AVMutableVideoCompositionLayerInstruction *videoCompositionLayerInstruction = nil;
            AVMutableAudioMix *audioMix = nil;

            AVMutableCompositionTrack *videoTrack = nil;
            AVMutableCompositionTrack *audioTrack = nil;
            AVMutableCompositionTrack *musicTrack = nil;
            CMTime currentTime = kCMTimeZero;

            for (MVRecorderSegment *segment in self.segments) {
                    AVURLAsset *asset = segment.asset;
                    NSArray *videoAssetTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
                    NSArray *audioAssetTracks = [asset tracksWithMediaType:AVMediaTypeAudio];

                    CMTime maxBounds = kCMTimeInvalid;

                    CMTime videoTime = currentTime;
                    for (AVAssetTrack *videoAssetTrack in videoAssetTracks) {
                            if (!videoTrack) {
                                    videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
                                    videoTrack.preferredTransform = CGAffineTransformIdentity;

                                    videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
                                    videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
                            }

                            /* Fix orientation */
                            CGAffineTransform transform = videoAssetTrack.preferredTransform;
                            if (AVCaptureDevicePositionFront == segment.cameraPosition) {
                                    transform = CGAffineTransformMakeTranslation(self.config.videoSize, 0);
                                    transform = CGAffineTransformScale(transform, -1.0, 1.0);
                            } else if (AVCaptureDevicePositionBack == segment.cameraPosition) {
                                    // Back camera: keep the track's preferredTransform as-is
                            }
                            [videoCompositionLayerInstruction setTransform:transform atTime:videoTime];

                            /* Append track */
                            videoTime = [MVHelper appendAssetTrack:videoAssetTrack toCompositionTrack:videoTrack atTime:videoTime withBounds:maxBounds];
                            maxBounds = videoTime;
                    }

                    if (self.sessionConfiguration.originalVoiceOn) {
                            CMTime audioTime = currentTime;
                            for (AVAssetTrack *audioAssetTrack in audioAssetTracks) {
                                    if (!audioTrack) {
                                            audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
                                    }
                                    audioTime = [MVHelper appendAssetTrack:audioAssetTrack toCompositionTrack:audioTrack atTime:audioTime withBounds:maxBounds];
                            }
                    }

                    currentTime = composition.duration;
            }

            if (videoCompositionInstruction && videoCompositionLayerInstruction) {
                    videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
                    videoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction];

                    videoComposition = [AVMutableVideoComposition videoComposition];
                    videoComposition.renderSize = CGSizeMake(self.config.videoSize, self.config.videoSize);
                    videoComposition.frameDuration = CMTimeMake(1, self.config.videoFrameRate);
                    videoComposition.instructions = @[videoCompositionInstruction];
            }


            // Add the background music track (musicTrack)
            NSURL *musicFileURL = self.sessionConfiguration.musicFileURL;
            if (musicFileURL && musicFileURL.isFileExists) { // isFileExists: custom NSURL category
                    AVAsset *musicAsset = [AVAsset assetWithURL:musicFileURL];
                    AVAssetTrack *musicAssetTrack = [musicAsset tracksWithMediaType:AVMediaTypeAudio].firstObject;
                    if (musicAssetTrack) {
                            musicTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
                            if (CMTIME_COMPARE_INLINE(musicAsset.duration, >=, composition.duration)) {
                                    // If the background music is at least as long as the whole video, insert it once
                                    [musicTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, composition.duration) ofTrack:musicAssetTrack atTime:kCMTimeZero error:NULL];
                            } else {
                                    // Otherwise, loop the background music until it covers the video
                                    CMTime musicTime = kCMTimeZero;
                                    CMTime bounds = composition.duration;
                                    while (true) {
                                            musicTime = [MVHelper appendAssetTrack:musicAssetTrack toCompositionTrack:musicTrack atTime:musicTime withBounds:bounds];
                                            if (CMTIME_COMPARE_INLINE(musicTime, >=, composition.duration)) {
                                                    break;
                                            }
                                    }
                            }
                    }
            }

            // Set up the audio mix
            if (musicTrack) {
                    AVMutableAudioMixInputParameters *audioMixParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:musicTrack];

                    /* Fade the background music in and out */
                    AVAsset *musicAsset = musicTrack.asset;
                    CMTime crossfadeDuration = CMTimeMake(15, 10); // 1.5 seconds at each end
                    CMTime halfDuration = CMTimeMultiplyByFloat64(musicAsset.duration, 0.5);
                    crossfadeDuration = CMTimeMinimum(crossfadeDuration, halfDuration);
                    CMTimeRange crossfadeRangeBegin = CMTimeRangeMake(kCMTimeZero, crossfadeDuration);
                    CMTimeRange crossfadeRangeEnd = CMTimeRangeMake(CMTimeSubtract(musicAsset.duration, crossfadeDuration), crossfadeDuration);
                    [audioMixParameters setVolumeRampFromStartVolume:0.0 toEndVolume:self.sessionConfiguration.musicVolume timeRange:crossfadeRangeBegin];
                    [audioMixParameters setVolumeRampFromStartVolume:self.sessionConfiguration.musicVolume toEndVolume:0.0 timeRange:crossfadeRangeEnd];

                    audioMix = [AVMutableAudioMix audioMix];
                    [audioMix setInputParameters:@[audioMixParameters]];
            }

            _composition = composition;
            _videoComposition = videoComposition;
            _audioMix = audioMix;

            return YES;
    }


    - (AVPlayerItem *)playerItem
    {
            AVPlayerItem *playerItem = nil;
            if (self.composition) {
                    playerItem = [AVPlayerItem playerItemWithAsset:self.composition];
                    if (!self.videoComposition.animationTool) {
                            playerItem.videoComposition = self.videoComposition;
                    }
                    playerItem.audioMix = self.audioMix;
            }
            return playerItem;
    }

    ///=============================================
    /// MVHelper
    ///=============================================

    + (CMTime)appendAssetTrack:(AVAssetTrack *)track toCompositionTrack:(AVMutableCompositionTrack *)compositionTrack atTime:(CMTime)atTime withBounds:(CMTime)bounds
    {
            // Bail out early on missing inputs
            if (!track || !compositionTrack) {
                    return atTime;
            }

            CMTimeRange timeRange = track.timeRange;
            atTime = CMTimeAdd(atTime, timeRange.start);

            if (CMTIME_IS_VALID(bounds)) {
                    CMTime currentBounds = CMTimeAdd(atTime, timeRange.duration);
                    if (CMTIME_COMPARE_INLINE(currentBounds, >, bounds)) {
                            timeRange = CMTimeRangeMake(timeRange.start, CMTimeSubtract(timeRange.duration, CMTimeSubtract(currentBounds, bounds)));
                    }
            }
            if (CMTIME_COMPARE_INLINE(timeRange.duration, >, kCMTimeZero)) {
                    NSError *error = nil;
                    [compositionTrack insertTimeRange:timeRange ofTrack:track atTime:atTime error:&error];
                    if (error) {
                            MVLog(@"Failed to append %@ track: %@", compositionTrack.mediaType, error);
                    }
                    return CMTimeAdd(atTime, timeRange.duration);
            }

            return atTime;
    }
Lawman answered 29/5/2015 at 10:21
