I am designing a custom HTML5 video player. It will have its own custom slider that mirrors the video's progress, so I need to understand the whole buffering shebang of an HTML5 video.
I came across this article: Video Buffering. It says that the buffered object consists of several time ranges in linear order of start time. But I couldn't find answers to the following:
Say the video starts. It continues up to 1:45 on its own (occasionally stalling perhaps, waiting for further data), after which I suddenly jump to 32:45. Now after some time, if I jump back to 1:27 (within the time range initially loaded and played through, before I made the jump), will it start playing immediately, since it was already loaded before? Or is that portion lost because I made a jump, and it will have to be fetched again? Either way, is the behaviour consistent for all such scenarios?
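To make that concrete, here is roughly the check I have in mind (just a sketch of my own; isTimeBuffered is a helper name I made up, assuming a plain &lt;video&gt; element):

```js
// Check whether a target time falls inside one of the ranges reported by
// video.buffered (a TimeRanges object).
function isTimeBuffered(video, time) {
  const ranges = video.buffered;
  for (let i = 0; i < ranges.length; i++) {
    if (time >= ranges.start(i) && time <= ranges.end(i)) {
      return true; // the time lies within an already-buffered range
    }
  }
  return false;
}

// e.g. before jumping back to 1:27 (87 seconds), ask whether that spot
// is still held in the buffer:
const video = document.querySelector('video');
console.log('1:27 still buffered?', isTimeBuffered(video, 87));
```

What I don't know is whether the answer to that check stays true after the jump to 32:45, which is the heart of my question.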
Say I make 5 or 6 such jumps, each time waiting a few seconds for some data to load after the jump. Does that mean the buffered object will hold all of those time ranges? Or might some get lost? Is it a stack-like arrangement, where earlier ranges get popped off as more ranges get loaded due to further jumps?
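This is the kind of thing I'd like to observe directly (again just a debugging sketch of mine, not something from the article):

```js
// Log every range in video.buffered each time more data arrives, to watch
// whether ranges accumulate or disappear after repeated jumps.
const video = document.querySelector('video');

video.addEventListener('progress', () => {
  const ranges = video.buffered;
  const parts = [];
  for (let i = 0; i < ranges.length; i++) {
    parts.push(`[${ranges.start(i).toFixed(1)}s - ${ranges.end(i).toFixed(1)}s]`);
  }
  console.log(`buffered ranges (${ranges.length}):`, parts.join(' '));
});
```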
Will checking whether the buffered object has exactly one time range, starting at 0 (forget live streaming) and ending at the video's duration, guarantee that the entire video resource has been fully loaded? If not, is there some way to know that the entire video has been downloaded, so that any position is seekable and the video can play continuously to the end without a moment's delay?
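In other words, is a test along these lines sound? (My own tentative sketch; looksFullyBuffered and the 0.5-second tolerance are arbitrary choices of mine, and whether this is actually reliable is exactly what I'm asking.)

```js
// Tentative "fully downloaded" check: a single buffered range that starts
// at (about) 0 and ends at (about) video.duration.
function looksFullyBuffered(video, tolerance = 0.5) {
  const ranges = video.buffered;
  if (ranges.length !== 1 || !isFinite(video.duration)) return false;
  return ranges.start(0) <= tolerance &&
         ranges.end(0) >= video.duration - tolerance;
}
```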
The W3C specs are not very clear on this, and I also can't find a suitably large (say, more than an hour long) remote video resource to test against.