Does robots.txt accept regular expressions? I have many URLs in this format:
https://example.com/view/99/title-sample-text
ID ----------------------^
Title ----------------------^
I used this:

Disallow: /view

but it doesn't seem to work, because Google has indexed more pages. So I want to do this with a regex, something like:

Disallow: /view/([0-9]+)/([^/]*)

Is that format correct and valid in robots.txt?
robots.txt supports * globs. – Knopp
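For what it's worth, the plain prefix rule from the question can be sanity-checked locally. Here is a minimal sketch using Python's standard urllib.robotparser, which (like the base robots.txt standard) does simple path-prefix matching rather than regex, so Disallow: /view/ already covers /view/99/title-sample-text:

```python
# Sketch: check robots.txt rules locally with the standard library.
# Note: urllib.robotparser does plain prefix matching only; it does not
# implement regex or Google-style '*' wildcards.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /view/",   # prefix rule, matches every URL under /view/
])

# Blocked: the path starts with the disallowed prefix /view/
print(rp.can_fetch("*", "https://example.com/view/99/title-sample-text"))  # False
# Not blocked: different prefix
print(rp.can_fetch("*", "https://example.com/about"))  # True
```

This only checks what a prefix-matching parser would do; how Google itself handles wildcard rules is a separate question.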