AVPlayer streaming progress

I'm successfully using AVPlayer to stream audio from a server, and what I want to do now is show a custom UISlider that shows the progress of the buffering.

Something like this:

(screenshot: a slider showing the playback position with the buffered portion indicated behind it)

With AVPlayer there doesn't seem to be a way to get the total download size or the current downloaded amount for the audio file, only the current playing time and total play time.

Are there any workarounds for this?

Bemean answered 7/10, 2011 at 19:14 Comment(4)
Did you ever implement the UI part of this? I need exactly this, and would rather not roll my own if there's already something out there.Hola
See the top answer to this question for the UI part: #4495933Aleasealeatory
A simple way to implement the above UI is to put a UIProgressView underneath a UISlider and set the slider's maximumTrackTintColor to [UIColor clearColor] (see the sketch just after these comments).Renny
Follow this answer: #4218590. It might be helpful for you.Danialdaniala
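
A minimal sketch of that layering approach, assuming a plain view controller (the class and view names below are illustrative, not from any answer here): a UIProgressView sits underneath a UISlider whose maximumTrackTintColor is clear, so the bar shows the buffered range while the slider thumb shows playback position.

import UIKit

final class BufferSliderViewController: UIViewController {

    // The progress view underneath shows how much has buffered;
    // the slider on top shows (and scrubs) the playback position.
    private let bufferProgressView = UIProgressView(progressViewStyle: .default)
    private let slider = UISlider()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Hide the slider's right-hand track so the progress view shows through.
        slider.maximumTrackTintColor = .clear

        bufferProgressView.translatesAutoresizingMaskIntoConstraints = false
        slider.translatesAutoresizingMaskIntoConstraints = false

        // Add the progress view first so the slider renders on top of it.
        view.addSubview(bufferProgressView)
        view.addSubview(slider)

        NSLayoutConstraint.activate([
            slider.leadingAnchor.constraint(equalTo: view.leadingAnchor, constant: 16),
            slider.trailingAnchor.constraint(equalTo: view.trailingAnchor, constant: -16),
            slider.centerYAnchor.constraint(equalTo: view.centerYAnchor),
            bufferProgressView.leadingAnchor.constraint(equalTo: slider.leadingAnchor),
            bufferProgressView.trailingAnchor.constraint(equalTo: slider.trailingAnchor),
            bufferProgressView.centerYAnchor.constraint(equalTo: slider.centerYAnchor),
        ])
    }
}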

I am just working on this, and so far have the following:

- (NSTimeInterval)availableDuration
{
    // loadedTimeRanges is an array of NSValue-wrapped CMTimeRange structs
    // describing which parts of the item have been buffered.
    NSArray *loadedTimeRanges = [[self.player currentItem] loadedTimeRanges];
    CMTimeRange timeRange = [[loadedTimeRanges objectAtIndex:0] CMTimeRangeValue];
    Float64 startSeconds = CMTimeGetSeconds(timeRange.start);
    Float64 durationSeconds = CMTimeGetSeconds(timeRange.duration);

    // End of the first buffered range = how far playback can proceed right now.
    NSTimeInterval result = startSeconds + durationSeconds;
    return result;
}
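
One way to feed this value into a slider or progress view is to poll it from a periodic time observer. A rough sketch in Swift (the player, bufferProgressView, and the observer setup are my assumptions, not part of the original answer):

import AVFoundation
import UIKit

// Poll the buffered duration once a second and turn it into a 0...1 fraction.
// Retain the returned token for as long as you want updates.
func startObservingBuffer(player: AVPlayer, bufferProgressView: UIProgressView) -> Any {
    let interval = CMTime(seconds: 1.0, preferredTimescale: 600)
    return player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { _ in
        guard let item = player.currentItem else { return }

        let duration = CMTimeGetSeconds(item.duration)
        guard duration.isFinite, duration > 0 else { return }

        // Same calculation as availableDuration above: end of the first loaded range.
        let buffered = item.loadedTimeRanges.first.map {
            CMTimeGetSeconds(CMTimeRangeGetEnd($0.timeRangeValue))
        } ?? 0

        bufferProgressView.progress = Float(buffered / duration)
    }
}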
Oceanus answered 11/10, 2011 at 18:30 Comment(7)
To add to this, the value you're looking for is the AVPlayerItem property loadedTimeRanges. It's an NSArray that contains an NSValue wrapping a CMTimeRange. This code chunk is what I was having trouble coming up with, that is, how to get that data into something useful.Trometer
Why do you take the first range, not the last ([loadedTimeRanges lastObject])?Cermet
I remember I wrote a function to watch how this behaved, and was always seeing just one element in the array, so that is probably moot/arbitrary to take first or last. A more theoretically 'correct' calculation would be to use this array, and find the max duration based on the max start + duration, but wasn't necessary in practice.Oceanus
The answerer here also confirms that there is 1 element to the array, so my memory is good: #3999728Oceanus
Carefully check whether the array is non-empty; on iOS 8 it sometimes returns an empty array, causing the above code to crash.Clung
Is it me, or does this code always give the same value as the duration of the song? As if the asset was completely buffered instantly.Quinary
Where must I put it? I don't understand.Kirbie

It should work well:

Objective-C:

- (CMTime)availableDuration
{
    NSValue *range = self.player.currentItem.loadedTimeRanges.firstObject;
    if (range != nil){
        return CMTimeRangeGetEnd(range.CMTimeRangeValue);
    }
    return kCMTimeZero;
}

Swift version:

func availableDuration() -> CMTime
{
    if let range = self.player?.currentItem?.loadedTimeRanges.first {
        return CMTimeRangeGetEnd(range.timeRangeValue)
    }
    return .zero
}

To inspect the value you can use CMTimeShow([self availableDuration]); in Objective-C, or CMTimeShow(availableDuration()) in Swift.
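
If you want the value for UI rather than the console, convert the CMTime to seconds first. A small sketch (the non-finite check is my own addition, useful before the range is known):

import CoreMedia

// Convert the CMTime returned by availableDuration() into seconds for UI code;
// CMTimeShow only prints to the console.
let available = availableDuration()
let availableSeconds = CMTimeGetSeconds(available)
if availableSeconds.isFinite {
    print("Buffered up to \(availableSeconds) seconds")
}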

Diagnostician answered 7/7, 2015 at 10:25 Comment(0)

Personally, I do not agree that loadedTimeRanges will always have a count of 1.

According to the documentation:

The array contains NSValue objects containing a CMTimeRange value indicating the time ranges for which the player item has media data readily available. The time ranges returned may be discontinuous.

So this may have values similar to:

[(start1, end1), (start2, end2)]

From my experience with the hls.js framework within the desktop web world, the holes between these time ranges could be very small or large depending on a multitude of factors, ex: seeking, discontinuities, etc.

So to correctly get the total buffered length, you would need to loop through the array, take the duration of each range, and sum them.

If you are looking for the buffered time ahead of the current play head, you would need to find the time range whose start is before the current time and whose end is after it, i.e. the range that contains the current time.

import AVFoundation

public extension AVPlayerItem {

    /// Total seconds buffered across all loaded time ranges.
    func totalBuffer() -> Double {
        return self.loadedTimeRanges
            .map { $0.timeRangeValue }
            .reduce(0) { acc, range in
                // Sum only the durations; adding the start times would overcount.
                acc + CMTimeGetSeconds(range.duration)
            }
    }

    /// Seconds buffered ahead of the current play head, or -1 if the play head
    /// is not inside any loaded range.
    func currentBuffer() -> Double {
        let currentTime = self.currentTime()

        guard let timeRange = self.loadedTimeRanges
            .map({ $0.timeRangeValue })
            .first(where: { $0.containsTime(currentTime) }) else { return -1 }

        return CMTimeGetSeconds(timeRange.end) - currentTime.seconds
    }
}
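
A possible usage sketch for driving the UI with the extension above (the duration check and the clamped fraction are my own additions; playerItem and bufferSlider are placeholder names):

import AVFoundation
import UIKit

func updateBufferUI(for playerItem: AVPlayerItem, bufferSlider: UISlider) {
    let duration = CMTimeGetSeconds(playerItem.duration)
    guard duration.isFinite, duration > 0 else { return }

    // Seconds already played plus seconds buffered ahead of the play head,
    // clamped to a 0...1 fraction of the total duration.
    let ahead = max(playerItem.currentBuffer(), 0)
    let played = playerItem.currentTime().seconds
    let fraction = min(max((played + ahead) / duration, 0), 1)

    bufferSlider.value = Float(fraction)   // UISlider's default range is 0...1
}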
Aucoin answered 22/3, 2018 at 17:28 Comment(0)

This method will return the buffered time interval for your UISlider:

public var bufferAvail: TimeInterval {

    // Check that the player has a current item that is ready to play
    if let item = player.currentItem, item.status == .readyToPlay {

        // Guard against an empty loadedTimeRanges array
        if let timeRange = item.loadedTimeRanges.first?.timeRangeValue {
            let startTime = CMTimeGetSeconds(timeRange.start)
            let loadedDuration = CMTimeGetSeconds(timeRange.duration)

            return startTime + loadedDuration
        }
    }

    return CMTimeGetSeconds(CMTime.invalid)
}
Inoculum answered 9/6, 2016 at 10:37 Comment(1)
Could you explain a little more what the code does and how it solves the OP's question? In its current format, it's a bit hard to read.Kuhlman

The selected answer may cause problems if the returned array is empty. Here's a fixed function:

- (NSTimeInterval) availableDuration
{
    NSArray *loadedTimeRanges = [[_player currentItem] loadedTimeRanges];
    if ([loadedTimeRanges count])
    {
        CMTimeRange timeRange = [[loadedTimeRanges objectAtIndex:0] CMTimeRangeValue];
        Float64 startSeconds = CMTimeGetSeconds(timeRange.start);
        Float64 durationSeconds = CMTimeGetSeconds(timeRange.duration);
        NSTimeInterval result = startSeconds + durationSeconds;
        return result;
    }
    return 0;
}
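
Instead of polling, you can also be notified when new data arrives by key-value observing loadedTimeRanges. A sketch (the BufferObserver helper is an illustration, not from this answer):

import AVFoundation

final class BufferObserver {
    private var observation: NSKeyValueObservation?

    // Calls the handler with the buffered end time (in seconds) whenever
    // the item's loaded time ranges change.
    // Note: KVO callbacks may arrive off the main thread; dispatch to main
    // before touching UI.
    func observe(_ item: AVPlayerItem, onChange: @escaping (TimeInterval) -> Void) {
        observation = item.observe(\.loadedTimeRanges, options: [.new]) { item, _ in
            let buffered = item.loadedTimeRanges.first.map {
                CMTimeGetSeconds(CMTimeRangeGetEnd($0.timeRangeValue))
            } ?? 0
            onChange(buffered)
        }
    }
}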
Maximilian answered 4/8, 2016 at 17:23 Comment(0)

The code from Suresh Kansujiya's answer, in Objective-C:

NSTimeInterval bufferAvail;

if (player.currentItem != nil) {

    AVPlayerItem *item = player.currentItem;
    if (item.status == AVPlayerItemStatusReadyToPlay) {
        NSArray *timeRangeArray = item.loadedTimeRanges;
        CMTimeRange aTimeRange = [[timeRangeArray objectAtIndex:0] CMTimeRangeValue];
        Float64 startTime = CMTimeGetSeconds(aTimeRange.start);
        Float64 loadedDuration = CMTimeGetSeconds(aTimeRange.duration);

        bufferAvail = startTime + loadedDuration;

        NSLog(@"%@ - %f", [self class], bufferAvail);
    } else {
        NSLog(@"%@ - %f", [self class], CMTimeGetSeconds(kCMTimeInvalid));
    }
} else {
    NSLog(@"%@ - %f", [self class], CMTimeGetSeconds(kCMTimeInvalid));
}
Buffoon answered 5/11, 2016 at 18:48 Comment(0)
