Can I use wget to check for a 404 and not actually download the resource? If so, how? Thanks
There is a command-line parameter, --spider, exactly for this. In this mode, wget does not download the file, and its return value is zero if the resource was found and non-zero if it was not. Try this (in your favorite shell):
wget -q --spider address
echo $?
If you want full output, leave the -q off, so just wget --spider address. The -nv option shows some output, but not as much as the default.
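Because --spider reports the result through the exit status, it slots neatly into shell scripting. A minimal sketch, assuming the helper name url_exists and the URL are placeholders of my own, not part of the answer above:

```shell
#!/bin/sh
# Return 0 if the URL is reachable, non-zero otherwise.
# --spider makes wget probe the URL without saving anything;
# -q suppresses output so only the exit status matters.
url_exists() {
    wget -q --spider "$1"
}

# Example: branch on the result (placeholder URL).
if url_exists "http://example.com/"; then
    echo "resource found"
else
    echo "resource missing"
fi
```

The if-guard keeps the script going whether the probe succeeds or fails, which is the usual pattern in monitoring scripts.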
wget --spider sends a HEAD request, not a GET.
wget --spider does a HEAD and, if successful, follows with a GET to the same URL. Thus, with the recursive option, it is useful for building the cache for a server-side website.
If you want to check quietly via $? without the hassle of grepping wget's output, you can use:
wget -q "http://blah.meh.com/my/path" -O /dev/null
This works even on URLs with just a path, but it has the disadvantage that something is really downloaded, so it is not recommended when checking big files for existence.
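The same idea works in a conditional; a sketch, assuming the helper name check_url is my own and the URL is the placeholder from the answer above:

```shell
#!/bin/sh
# Quiet existence check that discards the downloaded body to /dev/null.
# Nothing lands on disk, but the body is still transferred over the
# network, so avoid this for very large files.
check_url() {
    wget -q "$1" -O /dev/null
}

if check_url "http://blah.meh.com/my/path"; then
    echo "exists"
else
    echo "does not exist"
fi
```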
The --spider arg does set a return code. But maybe that's because after 4 years, 3 months and 7 days, the spider has got smarter.
Yes, easy:
wget --spider www.bluespark.co.nz
That will give you:
Resolving www.bluespark.co.nz... 210.48.79.121
Connecting to www.bluespark.co.nz[210.48.79.121]:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
200 OK
You can use the following option to check for the file; it downloads the file and then deletes the local copy:
wget --delete-after URL
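Since --delete-after removes each file after downloading it, the disk stays clean but the transfer itself still happens. A sketch, assuming the helper name probe and the URL are placeholders of my own:

```shell
#!/bin/sh
# Probe a URL with --delete-after: the file is downloaded and then
# deleted locally, so nothing is left on disk, but bandwidth is still
# consumed for the full body.
probe() {
    wget -q --delete-after "$1"
}

# Example usage (placeholder URL).
if probe "http://example.com/"; then
    echo "reachable"
fi
```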
Yes. To use wget to check, but not download, the target URL/file, just run:
wget --spider -S www.example.com
The -S option prints the server response headers, so you can see the HTTP status.
If you are in a directory where only root has permission to write, you can directly use wget www.example.com/wget-test from a standard user account. It will hit the URL, but because there is no write permission, the file won't be saved.
This method works fine for me, as I am using it for a cronjob.
Thanks.
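For a cronjob, --spider avoids relying on filesystem permissions altogether; a hypothetical crontab entry (the URL and schedule are placeholders, not from the answer above):

```shell
# m h dom mon dow  command
# Check every 5 minutes; --spider downloads nothing and -q stays silent,
# so cron only mails you if wget itself prints an error.
*/5 * * * * wget -q --spider http://example.com/health
```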
Use --spider, which does exactly what the OP asks.