Here is a simple wget command:
wget http://www.example.com/images/misc/pic.png
How can I make wget skip the download if pic.png is already available?
Try the following parameter:
-nc, --no-clobber: skip downloads that would download to existing files.
Sample usage:
wget -nc http://example.com/pic.png
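For illustration, if pic.png is already present from an earlier run, repeating the command is a no-op; wget normally reports something like the following and exits without downloading (exact wording may vary by version):
wget -nc http://example.com/pic.png
File 'pic.png' already there; not retrieving.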
[ ! -e "$(basename $URL)" ] && wget $URL
See also the --recursive option.
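As a sketch, the same check written out as a complete snippet (the URL is just the example from the question):
URL="http://www.example.com/images/misc/pic.png"
[ ! -e "$(basename "$URL")" ] && wget "$URL"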
wget -nc -i list.txt. I don't think it's possible for a server to crawl 3k links in a tenth of a second!
-N, --timestamping says "don't re-retrieve files unless newer than local", which is preferable if you are looking to sync, in case some remote files might actually be worth re-downloading (edit: I see another answer now that says the same).
The -nc, --no-clobber option isn't the best solution, as newer files will not be downloaded. One should use -N instead, which will download and overwrite the file only if the server has a newer version, so the correct answer is:
wget -N http://www.example.com/images/misc/pic.png
When running Wget with -N, with or without -r or -p, the decision as to whether or not to download a newer copy of a file depends on the local and remote timestamp and size of the file. -nc may not be specified at the same time as -N.
-N, --timestamping: Turn on time-stamping.
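As an illustrative sketch (the host is just the example domain from the question), combining timestamping with recursive retrieval re-fetches only those files the server reports as newer than the local copies:
wget -N -r -p http://www.example.com/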
-N may fail, in which case wget will always redownload, so sometimes -nc is the better solution.
In that case wget will complain "Last-modified header missing"; this is exactly the situation outlined.
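To check whether the server sends a Last-Modified header at all (which -N depends on), one option is to inspect the response headers without downloading anything, for example:
wget -S --spider http://www.example.com/images/misc/pic.png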
-O always creates a new file with the current timestamp, and hence -N and -O exclude each other.
The problems with the -N option are real, but using -nc does not solve them. If something goes bad as described above (partial download, incorrect timestamp), the file will exist locally, so -nc will force wget to skip it.
An alternative is aria2c:
aria2c --file-allocation=none -c -x 16 -s 16 --log-level=warn --summary-interval=1 {File_URL} -d {Directory} {Filename}
The answer I was looking for is at https://unix.stackexchange.com/a/9557/114862.
Using the -c flag when the local file is of greater or equal size to the server version will avoid re-downloading.
wget -i filelist.txt -c will resume a failed download of a list of files.
-c means continue. If the file was changed to a bigger file with different content, wget will start downloading at the end of the local file and append the new contents, so you may end up with garbage.
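For a single interrupted transfer, a minimal resume looks like this (same example URL as above); keep the caveat above in mind if the remote file may have changed:
wget -c http://www.example.com/images/misc/pic.png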
When running Wget with -r or -p, but without -N, -nd, or -nc, re-downloading a file results in the new copy simply overwriting the old.
Adding -nc prevents this behavior, instead causing the original version to be preserved and any newer copies on the server to be ignored.
I had issues with -N, as I wanted to save the output under a different file name (a possible workaround is sketched after the list below).
A file is considered new if one of these two conditions is met:
- A file of that name does not already exist locally.
- A file of that name does exist, but the remote file was modified more recently than the local file.
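Since -N and -O exclude each other (see the comment above), one possible workaround for keeping timestamping while ending up with a different local name is to let wget keep the original file name and then copy it; the target name here is purely illustrative:
wget -N http://www.example.com/images/misc/pic.png
cp -p pic.png local-copy.png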
Using test:
test -f stackoverflow.html || wget -O stackoverflow.html https://stackoverflow.com/
If the file does not exist, test evaluates to false, so wget will be executed.
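As a small sketch building on the same idea (the list file name is just an example), the check can be applied to every URL in a list, deriving each local name with basename:
while read -r url; do
  [ -e "$(basename "$url")" ] || wget "$url"
done < filelist.txt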