Retry when connection disconnects not working
I am using youtube-dl to download videos from YouTube, but in my office the internet connection drops after roughly every 20 MB of download (Error: Connection forcibly closed by remote server).

I have to type the URL again to resume the download, and it disconnects again after another 20 MB. I want youtube-dl to reconnect and retry the download on its own.

I tried the --retries switch, but it does not retry once the connection has dropped.

Is there any built-in method or workaround for this?

Amie answered 4/4, 2016 at 11:39 Comment(0)

Educated guess

My best guess would be to specify a cache directory and use the -c flag to force youtube-dl to resume partially downloaded files.

Source: youtube-dl man page

--cache-dir DIR
       Location in the filesystem where youtube-dl can store some downloaded information permanently. By default $XDG_CACHE_HOME/youtube-dl or ~/.cache/youtube-dl. At the moment, only YouTube player files (for videos with obfuscated signatures) are cached, but that may change.

-c, --continue
       Force resume of partially downloaded files. By default, youtube-dl will resume downloads if possible.

Alternative solution

If you want to give Python a try, this script should do what you need with some minor tweaking.

import sys
import youtube_dl

def download_no_matter_what(url, options):
    # Loop instead of recursing, so a long outage cannot hit
    # Python's recursion limit.
    while True:
        try:
            youtube_dl.YoutubeDL(options).download([url])
            return
        except (OSError, youtube_dl.utils.DownloadError):
            continue  # connection dropped: retry
        except KeyboardInterrupt:
            sys.exit()

if __name__ == '__main__':
    # Read the URL from the command line
    url = sys.argv[1]

    # Extra youtube-dl options; 'continuedl' resumes partial files (-c)
    options = {'continuedl': True}

    # GET THAT VIDEO!
    download_no_matter_what(url, options)

Reference for the youtube_dl API: https://github.com/rg3/youtube-dl/blob/master/README.md#readme

Moxley answered 7/4, 2016 at 4:31 Comment(4)
No, this is about the partially downloaded file, not about reconnecting. Resuming happens, but I have to type the URL manually for it to resume.Amie
Gotcha, sorry. What system are you on? Also, are you comfortable running/editing Python?Moxley
Check out my edit. With some minor tweaking, it should do what you want. Cheers!Moxley
In Python the recursion limit is 1000. So after exactly 1000 retries this script would crash.Aureus

Get a bash shell, either via Steve's win-bash, the new Windows 10 Ubuntu subsystem, or Cygwin.

Call youtube-dl like this:

while ! youtube-dl <video_uri> -c --socket-timeout 5; do echo DISCONNECTED; done

You may want to add some sleep time between retries.

while ! youtube-dl <video_uri> -c --socket-timeout 5; do echo DISCONNECTED; sleep 5; done

There should be a PowerShell equivalent, or an ugly batch while-loop checking ERRORLEVEL.
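The same loop can also be packaged as a small reusable shell function, shown here as a sketch (the name retry_until_ok and the RETRY_SLEEP variable are inventions for this example, not youtube-dl features):

```shell
#!/bin/sh
# Retry any flaky command until it exits successfully,
# sleeping between attempts (RETRY_SLEEP seconds, default 5).
retry_until_ok() {
    until "$@"; do
        echo "DISCONNECTED, retrying..." >&2
        sleep "${RETRY_SLEEP:-5}"
    done
}

# Example, using the invocation from this answer:
# retry_until_ok youtube-dl "<video_uri>" -c --socket-timeout 5
```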

Diablerie answered 13/4, 2016 at 13:20 Comment(2)
This works nice and simple. Helps when on an unstable connection and the bandwidth slips away without YT-DL calling it quits. Or even if Youtube-DL itself errors out.Protomorphic
works well with a text file of videos to download while ! youtube-dl -a TextFile.txt -c --socket-timeout 5; do echo DISCONNECTED; sleep 5; donePrem

PowerShell equivalent:

Do { youtube-dl.exe <video_uri> -c } until ($?)
Malaspina answered 17/11, 2018 at 20:45 Comment(0)

Try retry-cli. You will need to install Node.js (with npm) first:

npm install --global retry-cli
retry youtube-dl <URL>
Nonpayment answered 19/2, 2021 at 13:31 Comment(0)

Batch equivalent:

for /L %%? in (0,0,1) do @(youtube-dl <video_uri> -c --socket-timeout 5 && break)

This includes a 5 second sleep:

for /L %%? in (0,0,1) do @(youtube-dl <video_uri> -c --socket-timeout 5 && break || timeout /t 5 >NUL)
Fredericton answered 27/7, 2020 at 18:18 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.