Skip download if files already exist in wget?

Here is a simple wget command:

wget http://www.example.com/images/misc/pic.png

How can I make wget skip the download if pic.png is already available?

Lefkowitz answered 9/2, 2011 at 11:33 Comment(1)
Does this answer your question? How to (re)download file with wget only when the file is newer or the size changed? – Developing

Try the following parameter:

-nc, --no-clobber: skip downloads that would download to existing files.

Sample usage:

wget -nc http://example.com/pic.png
Open answered 9/2, 2011 at 11:40 Comment(4)
As noted on the linked question, I disagree: if no-clobber is used and the filename exists, wget exits without even sending a HEAD request. Even if this weren't the case, check whether you have the file to begin with :-) [ ! -e "$(basename $URL)" ] && wget $URL – Open
I think I may be getting different results because I'm using the --recursive option. – Music
Great answer! Going to disagree with ma11hew28. I just tested this on a list of 3,000 URLs with GNU Wget 1.14 and wget -nc -i list.txt. I don't think it's possible for a server to crawl 3k links in a tenth of a second! – Sobriquet
Additionally, -N, --timestamping says don't re-retrieve files unless newer than local, if you are looking to sync, in case some remote files might ACTUALLY be worth re-downloading (edit: I see another answer now that says the same). – Posen
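
A minimal sketch of the existence-check idea from the first comment, extended to a list of URLs (urls.txt is a hypothetical file with one URL per line):

# Skip the network entirely when the target file already exists locally;
# -nc is kept as a safety net in case two URLs share a basename.
while read -r url; do
  [ -e "$(basename "$url")" ] || wget -nc "$url"
done < urls.txt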

The -nc, --no-clobber option isn't the best solution, as newer files will not be downloaded. One should use -N instead, which downloads and overwrites the file only if the server has a newer version, so the correct answer is:

wget -N http://www.example.com/images/misc/pic.png

When running Wget with -N, with or without -r or -p, the decision whether to download a newer copy of a file depends on the local and remote timestamp and size of the file. -nc may not be specified at the same time as -N.

-N, --timestamping: Turn on time-stamping.
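
For a sync-style run, -N is typically combined with recursion. A minimal sketch, assuming the example URL from the question and a placeholder mirror/ output directory:

# Fetch only files whose remote timestamp (and size) differ from the local copy.
wget -N -r -l 1 -P mirror/ http://www.example.com/images/misc/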

Ocko answered 30/5, 2013 at 16:0 Comment(9)
When the server is not configured properly, -N may fail and wget will always redownload. So sometimes -nc is the better solution. – Recognizor
What would be a scenario in which 'the server is not configured properly' occurs? – Diapositive
When you are downloading from a location that was copied, changing all the timestamps. – Fragrant
Whether this is best depends on context. For example, I'm downloading ~1600 files from a list, and then updated the list to include some more files. The files don't change, so I don't care about the latest version, and I don't want wget to check the server for new versions of the 1600 files that I already have. – Cobra
@AjayKumarBasuthkar: When the server doesn't support any way of checking for a newer file, wget will complain Last-modified header missing; this is exactly the situation outlined. – Osteoporosis
If download failures are possible, -N should be avoided. wget only applies timestamps when a download completes, so failed downloads end up with current timestamps that are always newer than those on the server, causing those downloads to remain in a partial state until a future update on the server. – Zawde
Beware that -O always creates a new file with a current timestamp, and hence -N and -O exclude each other. – Papillose
While all the problems mentioned with the -N option are real, the use of -nc does not solve them. If something goes bad as described above (partial download, incorrect timestamp), the file will exist locally, so -nc will make wget skip it. – Sparkle
So in other words, wget is incapable of solving this issue reliably. It would be better to use aria2 when dealing with large file downloads; no fuss with that, as it detects identical files by default too. Example code: aria2c --file-allocation=none -c -x 16 -s 16 --log-level=warn --summary-interval=1 {File_URL} -d {Directory} {Filename} – Uitlander

The answer I was looking for is at https://unix.stackexchange.com/a/9557/114862.

Using the -c (--continue) flag when the local file is of equal or greater size than the server's version will avoid re-downloading.
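
Applied to the original question, that would look like this (a sketch; note the caveat in the comments below that -c can produce garbage if the remote content changed):

# Nothing is fetched if the local pic.png is already at least as large as the remote file.
wget -c http://www.example.com/images/misc/pic.png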

Trefler answered 8/11, 2017 at 13:40 Comment(3)
This is especially great when you are downloading a bunch of files with the -i flag. wget -i filelist.txt -c will resume failed downloads from a list of files. – Windpipe
I am downloading from a server which provides neither the Length header nor the Last-modified header (mentioned elsewhere on this page). So I'd like to check only whether a file with the same name exists on disk and skip the re-download if it does. Still looking for that solution. – Overhasty
-c means continue. If the remote file was changed to a bigger file with different content, wget will start downloading at the end of the local file and append the new file's contents. You may end up with garbage. – Nanny

When running Wget with -r or -p, but without -N, -nd, or -nc, re-downloading a file will result in the new copy simply overwriting the old.

So adding -nc will prevent this behavior, instead causing the original version to be preserved and any newer copies on the server to be ignored.

See more info in the GNU Wget manual.
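
A short sketch of that combination, using the example host from the question and an arbitrary depth:

# The first run downloads everything; re-runs with -nc preserve existing local
# files instead of overwriting them with fresh copies from the server.
wget -r -nc -l 2 http://www.example.com/images/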

Arethaarethusa answered 9/2, 2011 at 11:42 Comment(1)
The following doesn't work, as it is not recursive; the already downloaded files should be parsed for links so that everything that's missing is downloaded: wget -w 10 -r -nc -l inf --no-remove-listing -H "<URL>" – Blomquist

I had issues with -N, as I wanted to save the output to a different file name.

From the Timestamping section of the wget docs:

A file is considered new if one of these two conditions are met:

  1. A file of that name does not already exist locally.
  2. A file of that name does exist, but the remote file was modified more recently than the local file.

Using test:

test -f stackoverflow.html || wget -O stackoverflow.html https://stackoverflow.com/

If the file does not exist, test will evaluate to false, so wget will be executed.
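
Wrapped in a small helper for reuse (the name fetch_if_missing is illustrative, not part of wget):

# Download URL ($2) to a file ($1) only when that file is absent.
fetch_if_missing() {
  [ -f "$1" ] || wget -O "$1" "$2"
}

fetch_if_missing stackoverflow.html https://stackoverflow.com/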

Kristin answered 3/12, 2021 at 12:15 Comment(0)