Using AVMutableComposition iPhone

I am using the code below to stream two videos sequentially. But it does not show any video in the simulator; the screen is totally blank.

Also, how can I seek through these two videos? For example, if one video is 2 minutes long and the second is 3 minutes, I need the total time of both videos and the ability to seek through them: when I slide the slider bar to 4 minutes, the second video should play from its 2-minute mark onwards.

Is it possible?

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    NSURL *url1 = [NSURL URLWithString:@"http://www.tools4movies.com/dvd_catalyst_profile_samples/Harold%20Kumar%203%20Christmas%20bionic.mp4"];
    NSURL *url2 = [NSURL URLWithString:@"http://www.tools4movies.com/dvd_catalyst_profile_samples/Harold%20Kumar%203%20Christmas%20tablet.mp4"];

    NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];

    // create the composition once (the original code allocated it twice,
    // leaking the first instance under MRC)
    AVMutableComposition *composition = [AVMutableComposition composition];

    asset1 = [[AVURLAsset alloc] initWithURL:url1 options:options];
    AVURLAsset *asset2 = [[AVURLAsset alloc] initWithURL:url2 options:options];

    CMTime insertionPoint = kCMTimeZero;
    NSError *error = nil;

    if (![composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset1.duration) 
                              ofAsset:asset1 
                               atTime:insertionPoint 
                                error:&error]) 
    {
        NSLog(@"error: %@",error);
    }

    insertionPoint = CMTimeAdd(insertionPoint, asset1.duration);

    if (![composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset2.duration) 
                              ofAsset:asset2 
                               atTime:insertionPoint 
                                error:&error]) 
    {
        NSLog(@"error: %@",error);
    }

    AVPlayerItem * item = [[AVPlayerItem alloc] initWithAsset:composition];
    player = [AVPlayer playerWithPlayerItem:item];
    AVPlayerLayer * layer = [AVPlayerLayer playerLayerWithPlayer:player];

    [layer setFrame:CGRectMake(0, 0, 320, 480)];
    [[[self view] layer] addSublayer:layer];
    [player play];   
}

Can anyone tell me what the error in my code is?

Quarles answered 30/4, 2012 at 12:11 Comment(8)
Have you tested this code on the device?Approximation
See the answer to this question: #8318922Lundt
@Approximation No, I haven't checked it on a real device.Quarles
@madmw Then is there any other way to achieve the above scenario?Quarles
AVQueuePlayer? Two AVPlayers? Downloading the files first? There are options. You need to know the duration of each video and make some calculations before deciding which video to play and which time to seek.Lundt
Yes, I tried it, but its playback is not smooth/gapless between the videos. Also, the network files need to be streamed, not downloaded, and should be seekable in the way I described above.Quarles
I think the error is in how you calculate the time. The insertionPoint variable seems to be wrong: you need to take the length of the composition, not the length of your asset. See how I calculate time in my answer below; I use 600 as the timescale constant, so the ticks are very precise.Colophon
@madmw I have found another way around this issue: I made multiple instances of AVPlayer. But now the problem is that when I pause avplayer1, set the AVPlayerLayer to avplayer2 and play it, there is a jerk between the two videos, or the AVPlayerLayer view shows its background. How can I remove that, so the next video shows immediately and the AVPlayerLayer view's background is never visible?Quarles

The simulator is NOT ABLE to display video. Neither the built-in UIImagePickerController nor any video controller will work; it is not implemented and mostly appears black or red in the iOS simulator. You have to debug on an actual iOS device. Sometimes debugging will not work properly there either; use NSLog() instead. That always works (e.g. when you compile 'release' code without debug information).

You can seek using the player. If mp is your media player:

[mp pause];
CMTime position = mp.currentTime;

// maybe replace something
[mp replaceCurrentItemWithPlayerItem:[AVPlayerItem playerItemWithAsset:self.composition]];

// jump back to the saved position in the new item
[mp seekToTime:position];
[mp play];
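
Since the composition is one continuous timeline, the slider scenario from the question reduces to a single seek. A minimal sketch, assuming an AVPlayer named mp and a UISlider (range 0..1) wired to this hypothetical action:

- (IBAction)sliderValueChanged:(UISlider *)sender
{
    // total length of the whole composition (e.g. 5 min for 2 min + 3 min)
    CMTime total = mp.currentItem.duration;

    if (CMTIME_IS_NUMERIC(total)) {
        // map the slider position onto the composition's timeline;
        // seeking to minute 4 automatically lands at minute 2 of the second video
        Float64 seconds = CMTimeGetSeconds(total) * sender.value;
        [mp seekToTime:CMTimeMakeWithSeconds(seconds, 600)];
    }
}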

Summary:
Editing: use the composition and the player item.
Seeking: use the player.

Here is a short formal example of how to do this (already thread-safe). It assumes it runs on a background thread, since it calls dispatch_sync onto the main queue:

AVMutableComposition *_composition = [AVMutableComposition composition];

// iterate through all files
// and build the mutable composition
for (int i = 0; i < filesCount; i++) {

    NSURL *movieURL = [NSURL fileURLWithPath:[paths objectAtIndex:i]];
    AVURLAsset *sourceAsset = [AVURLAsset URLAssetWithURL:movieURL options:nil];

    // calculate the time range; a timescale of 600 keeps the ticks precise
    CMTimeRange editRange = CMTimeRangeMake(CMTimeMake(0, 600), sourceAsset.duration);

    // append each asset at the current end of the composition
    NSError *editError = nil;
    BOOL result = [_composition insertTimeRange:editRange
                                        ofAsset:sourceAsset
                                         atTime:_composition.duration
                                          error:&editError];
    if (!result) {
        NSLog(@"error: %@", editError);
    }

    dispatch_sync(dispatch_get_main_queue(), ^{

        // maybe you need a progress bar
        self.loaderBar.progress = (float)i / filesCount;
        [self.loaderBar setNeedsDisplay];
    });
}

// make the composition thread-safe if you need it later:
// store an immutable copy, since AVMutableComposition is not thread-safe
self.composition = [[_composition copy] autorelease];

// the player wants the main thread
dispatch_sync(dispatch_get_main_queue(), ^{

    mp = [AVPlayer playerWithPlayerItem:[[[AVPlayerItem alloc] initWithAsset:self.composition] autorelease]];

    self.observer = [mp addPeriodicTimeObserverForInterval:CMTimeMake(60, 600) queue:nil usingBlock:^(CMTime time) {

        // this is our callback block to update the progress bar
        if (mp.status == AVPlayerStatusReadyToPlay && time.value > 0) {

            float actualTime = (float)time.value / time.timescale;

            CMTime length = mp.currentItem.asset.duration;
            float lengthTime = (float)length.value / length.timescale;

            // avoid division by zero
            if (lengthTime > 0) {
                self.progressBar.value = actualTime / lengthTime;
            } else {
                self.progressBar.value = 0.0f;
            }
        }
    }];

    // create our playerLayer
    self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:mp];
    self.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    self.playerLayer.frame = [self view].layer.bounds;

    // insert it into our view (make it visible)
    [[self view].layer insertSublayer:self.playerLayer atIndex:0];

    // and now do the playback; maybe mp is global (self.mp),
    // this depends on your needs
    [mp play];
});

I hope this helps.

Colophon answered 8/5, 2012 at 12:56 Comment(7)
so, does it mean the above scenario of streaming multiple videos sequentially without gap, and seeking them properly is possible through AVMutableComposition ?Quarles
Yes. But it's not easy to add videos on the fly (there may be a small interruption, and you have to pause the player first). If you FIRST compose your videos and THEN play them using the player, it will feel like one contiguous video. The benefit of the mutable composition is that you can concatenate further videos onto the end of your composition; this can be done with a for or while loop. With the mutable composition you are free to edit. Afterwards simply wrap it into a player item, then into a player, and you can play everything back seamlessly.Colophon
This can be done using AVQueuePlayer, too. But AVQueuePlayer loads the next video only after the first one has finished, and the queued player will not calculate the total video length. In other words, a mutable composition makes ONE video out of many, while the queued player plays a 'playlist'.Colophon
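
For reference, a minimal AVQueuePlayer sketch of that 'playlist' approach (asset1 and asset2 here stand in for your own assets); note that total duration and cross-video seeking would then have to be calculated by hand:

// each item keeps its own timeline; there is no combined duration
AVPlayerItem *item1 = [AVPlayerItem playerItemWithAsset:asset1];
AVPlayerItem *item2 = [AVPlayerItem playerItemWithAsset:asset2];
AVQueuePlayer *queuePlayer = [AVQueuePlayer queuePlayerWithItems:
                                  [NSArray arrayWithObjects:item1, item2, nil]];
[queuePlayer play];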
Last but not least: be careful about storing an AVMutableComposition in global scope (i.e. in a property). A mutable composition is NOT thread-safe and may crash or not be displayed inside a block or on another thread. If you would like to store your composition inside your class, convert it to a standard (immutable) composition first.Colophon
Thanx, let me check and get back :)Quarles
I have tried it, but it does not stream: you used local files in the code, and I need to stream network files... any idea?Quarles
I think streaming directly from a URL is not possible with assets, because the whole video must be available to calculate its length. You first have to download the files and then assemble your composition. The video player can stream HTTP Live Streaming and .ts files, but that does not seem to be what you are looking for. Try "NSData *video = [NSData dataWithContentsOfURL:yourURL]; [video writeToFile:yourFilename atomically:YES];" to download your videos, and then play them from your sandbox using "yourFilename".Colophon
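
A sketch of that download-then-compose approach, assuming an urls array of remote files and the app's Documents directory (both hypothetical); NSData blocks the calling thread, so it must not run on the main queue:

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{

    // Documents directory of the app sandbox
    NSString *docsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                              NSUserDomainMask, YES) lastObject];

    for (NSURL *remoteURL in urls) {
        // synchronous download; blocks this background thread only
        NSData *video = [NSData dataWithContentsOfURL:remoteURL];
        NSString *localPath = [docsPath stringByAppendingPathComponent:
                                   [remoteURL lastPathComponent]];
        [video writeToFile:localPath atomically:YES];
    }

    // ...then assemble the composition from the local files as shown in the answer
});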
