Does wget time out?

I'm running a PHP script via cron using Wget, with the following command:

wget -O - -q -t 1 http://www.example.com/cron/run

The script will take a maximum of 5-6 minutes to do its processing. Will WGet wait for it and give it all the time it needs, or will it time out?

Trona answered 18/2, 2010 at 19:21 Comment(0)

According to the man page of wget, there are a couple of options related to timeouts -- and there is a default read timeout of 900s -- so I say that, yes, it could time out.


Here are the options in question:

-T seconds
--timeout=seconds

Set the network timeout to seconds seconds. This is equivalent to specifying --dns-timeout, --connect-timeout, and --read-timeout, all at the same time.


And for those three options:

--dns-timeout=seconds

Set the DNS lookup timeout to seconds seconds.
DNS lookups that don't complete within the specified time will fail.
By default, there is no timeout on DNS lookups, other than that implemented by system libraries.

--connect-timeout=seconds

Set the connect timeout to seconds seconds.
TCP connections that take longer to establish will be aborted.
By default, there is no connect timeout, other than that implemented by system libraries.

--read-timeout=seconds

Set the read (and write) timeout to seconds seconds.
The "time" of this timeout refers to idle time: if, at any point in the download, no data is received for more than the specified number of seconds, reading fails and the download is restarted.
This option does not directly affect the duration of the entire download.
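The read timeout's "idle time" semantics can be illustrated with a short Python sketch (a stand-in for wget's behaviour, not wget itself): a local listener completes the TCP handshake but never sends a byte, so the read times out.

```python
# Sketch: a read timeout fires on *idle* connections. The local listener
# below completes the TCP handshake (on Linux, even without accept()) but
# never sends any data, so the client's read stalls and times out.
# Python's per-socket timeout stands in for wget's --read-timeout here.
import socket
import urllib.request

srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)  # never accept(): the connection is queued, but no data ever arrives

url = "http://127.0.0.1:%d/" % srv.getsockname()[1]
try:
    urllib.request.urlopen(url, timeout=1).read()
    timed_out = False
except OSError:  # socket timeouts surface as TimeoutError, an OSError subclass
    timed_out = True

print(timed_out)  # → True after ~1 s of silence
```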


I suppose using something like

wget -O - -q -t 1 --timeout=600 http://www.example.com/cron/run

should make sure wget does not time out before your script finishes: 600 seconds is comfortably longer than the 5-6 minutes it needs.

(Yeah, that's probably the most brutal solution possible ^^ )
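For completeness, a matching crontab entry (the every-10-minutes schedule is hypothetical; adjust it to your own) could look like:

```
# m h dom mon dow  command -- run every 10 minutes, with wget's timeouts raised to 600s
*/10 * * * * wget -O - -q -t 1 --timeout=600 http://www.example.com/cron/run
```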

Incorporator answered 18/2, 2010 at 19:26 Comment(4)
If I set -t 0, will it wait indefinitely?Trona
The -t option seems to be an alias of --tries, which sets the number of retries. It doesn't relate to any kind of timeout, but to the number of times wget will retry the download if there is an error -- and you probably don't want a timeout to be treated as an error, with the script being re-called.Incorporator
--timeout=0 will disable the timeout.Bozeman
I'm probably the only one on the planet, but on Ubuntu 16 wget just ignores values for --timeout. Probably because I'm using basic auth in the command.Nika

The default read timeout is 900 seconds. You can specify a different timeout.

-T seconds
--timeout=seconds

The default is to retry 20 times. You can specify a different number of tries.

-t number
--tries=number

Link: the wget man page

Prolactin answered 18/2, 2010 at 19:26 Comment(0)

Prior to version 1.14, wget timeout arguments were not adhered to if downloading over https due to a bug.

Adrenalin answered 14/8, 2014 at 6:54 Comment(3)
Damn! CentOS 6 ships with wget 1.12, I got the same issue with https linksErotica
I'm having the exact same error! wget stopped at an https link with an expired certificate!Incorporated
I have a Linux box with wget version 1.20.3 which also ignores "-T x" and never terminates on a wrong IP, for example, which is why I use the "timeout" command instead: "timeout 5 wget ..." works perfectly in my case.Emerson
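The `timeout`-wrapper trick from the last comment gives a hard wall-clock cap independent of wget's own flags. The same idea can be sketched from Python with subprocess's built-in timeout (here `sleep 5` stands in for a hung wget, and the 1-second cap is arbitrary):

```python
# Hard wall-clock cap on a child process, independent of any wget flags.
# subprocess.run(timeout=...) kills the child and raises if it overruns.
import subprocess

try:
    subprocess.run(["sleep", "5"], timeout=1)  # "sleep 5" stands in for a hung wget
    timed_out = False
except subprocess.TimeoutExpired:
    timed_out = True

print(timed_out)  # → True: the child was killed after 1 s
```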

Since you said in your question that it's a PHP script, maybe the best solution is to simply add this to your script:

ignore_user_abort(TRUE);

This way, even if wget terminates, the PHP script keeps running, at least until it exceeds the max_execution_time limit (an ini directive, 30 seconds by default).

As for wget anyway, you should not need to change its timeout: according to the manual, the default wget read timeout is 900 seconds (15 minutes), which is much larger than the 5-6 minutes you need.

Claro answered 19/12, 2012 at 15:57 Comment(0)

None of the wget timeout values have anything to do with how long it takes to download a file.

If the PHP script that you're triggering sits there idle for 5 minutes and returns no data, wget's --read-timeout will trigger if it's set to less than the time it takes to execute the script.

If you are actually downloading a file, or if the PHP script sends some data back, like a progress indicator, then the read timeout won't be triggered as long as the script is doing something.
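A small Python sketch (a stand-in for wget, with a hypothetical local server) shows this: the whole transfer takes about 1.5 seconds, yet a 1-second idle timeout never fires, because data keeps trickling in.

```python
# Sketch: an idle (read) timeout does not cap total download time.
# A local server trickles 5 bytes, one every 0.3 s; the full transfer
# takes ~1.5 s, but a 1 s read timeout never fires, because no single
# read waits longer than ~0.3 s.
import http.server
import threading
import time
import urllib.request

class TrickleHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "5")
        self.end_headers()
        for byte in b"hello":
            self.wfile.write(bytes([byte]))  # one byte at a time
            time.sleep(0.3)

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), TrickleHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = "http://127.0.0.1:%d/" % server.server_port
body = urllib.request.urlopen(url, timeout=1).read()  # 1 s idle timeout
server.shutdown()

print(body == b"hello")  # → True, despite total time exceeding the timeout
```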

wget --help tells you:

  -T,  --timeout=SECONDS           set all timeout values to SECONDS
       --dns-timeout=SECS          set the DNS lookup timeout to SECS
       --connect-timeout=SECS      set the connect timeout to SECS
       --read-timeout=SECS         set the read timeout to SECS

So if you use --timeout=10 it sets the timeouts for DNS lookup, connecting, and reading bytes to 10s.

When downloading files you can set the timeout value pretty low; as long as you have good connectivity to the site you're connecting to, you can still download a large file in 5 minutes with a 10s timeout. If there is a temporary connection or DNS failure, the transfer will time out after 10s and then retry (if --tries, aka -t, is > 1).

For example, here I am downloading a file from NVIDIA that takes 4 minutes to download, and I have wget's timeout values set to 10s:

$ time wget --timeout=10 --tries=1 https://developer.download.nvidia.com/compute/cuda/11.2.2/local_installers/cuda_11.2.2_460.32.03_linux.run
--2021-07-02 16:39:21--  https://developer.download.nvidia.com/compute/cuda/11.2.2/local_installers/cuda_11.2.2_460.32.03_linux.run
Resolving developer.download.nvidia.com (developer.download.nvidia.com)... 152.195.19.142
Connecting to developer.download.nvidia.com (developer.download.nvidia.com)|152.195.19.142|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 3057439068 (2.8G) [application/octet-stream]
Saving to: ‘cuda_11.2.2_460.32.03_linux.run.1’

cuda_11.2.2_460.32.03_linux.run.1        100%[==================================================================================>]   2.85G  12.5MB/s    in 4m 0s

2021-07-02 16:43:21 (12.1 MB/s) - ‘cuda_11.2.2_460.32.03_linux.run.1’ saved [3057439068/3057439068]


real    4m0.202s
user    0m5.180s
sys 0m16.253s

4m to download, timeout is 10s, everything works just fine.

In general, timing out DNS, connections, and reads using a low value is a good idea. If you leave it at the default value of 900s you'll be waiting 15m every time there's a hiccup in DNS or your Internet connectivity.

Thirteenth answered 2/7, 2021 at 23:50 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.