Should a sitemap have *every* URL?
I have a site with a huge number (well, thousands or tens of thousands) of dynamic URLs, plus a few static URLs.

In theory, due to some cunning SEO linkage on the homepage, it should be possible for any spider to crawl the site and discover all the dynamic URLs via a spider-friendly search.

Given this, do I really need to worry about expending the effort to produce a dynamic sitemap index that includes all these URLs, or should I simply ensure that all the main static URLs are in there?

The actual way in which I would generate this isn't a concern - I'm just questioning the need to actually do it.

Indeed, the Google FAQ about this (and yes, I know they're not the only search engine!) recommends including URLs in the sitemap that might not be discovered by a crawl. On that basis, if every URL in your site is reachable from another, surely the only URL you really need as a baseline in your sitemap for a well-designed site is your homepage?
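For reference, the sitemap index I'd be generating is just an XML file in the sitemaps.org format that points at one or more child sitemap files, each of which lists the actual URLs. A minimal illustration (the domain and file names here are made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one child sitemap for the static URLs -->
  <sitemap>
    <loc>http://example.com/sitemap-static.xml</loc>
    <lastmod>2010-06-23</lastmod>
  </sitemap>
  <!-- and one (or more) for the dynamic URLs -->
  <sitemap>
    <loc>http://example.com/sitemap-dynamic-1.xml</loc>
    <lastmod>2010-06-23</lastmod>
  </sitemap>
</sitemapindex>
```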

Ointment asked 23/6, 2010 at 8:20 Comment(1)
A fair point perhaps - I could have mentioned that I'm using Asp.Net MVC and writing in C#, and building a dynamic sitemap in Asp.Net MVC has its own issues (see the sketch below). But I figured this question applies to anybody designing a new site, or architecting a website and agonizing over how much time and tech to devote to building their sitemap.Ointment
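For concreteness, a minimal sketch of what a dynamic sitemap action in Asp.Net MVC might look like. GetAllContentUrls() is a hypothetical stand-in for the site's own data access; this is an illustration under those assumptions, not a definitive implementation:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web.Mvc;
using System.Xml.Linq;

public class SitemapController : Controller
{
    // Serves the sitemap XML (route registration to /sitemap.xml not shown).
    public ContentResult Index()
    {
        XNamespace ns = "http://www.sitemaps.org/schemas/sitemap/0.9";

        // One <url> element (with a <loc> child) per canonical content URL.
        var urlset = new XElement(ns + "urlset",
            GetAllContentUrls().Select(u =>
                new XElement(ns + "url", new XElement(ns + "loc", u))));

        var doc = new XDocument(new XDeclaration("1.0", "utf-8", null), urlset);
        return Content(doc.Declaration + Environment.NewLine + doc.ToString(),
                       "application/xml");
    }

    // Hypothetical stand-in: replace with a query against your own datastore
    // that yields the single main URL for each piece of content.
    private static IEnumerable<string> GetAllContentUrls()
    {
        yield return "http://example.com/";
        yield return "http://example.com/widgets/42";
    }
}
```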

If there is more than one way to get to a page, you should pick one main URL for each piece of content and put those URLs in the sitemap. That is, the sitemap should link to the actual content, not list every possible URL that leads to the same content.
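For example (URLs made up), the sitemap contains exactly one entry per piece of content, under its main URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one entry per piece of content, using its main URL only;
       duplicates like /widgets/42?ref=home are deliberately absent -->
  <url>
    <loc>http://example.com/widgets/42</loc>
  </url>
</urlset>
```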

Also consider putting a canonical link element with this main URL in each page's head, so that spiders can recognise a page even if it's reachable through several different dynamic URLs.
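That is, every dynamic URL that renders the same page would emit the same canonical link element (illustrative URL again):

```html
<!-- emitted on /widgets/42, /widgets/42?ref=home, /search?id=42, ... -->
<link rel="canonical" href="http://example.com/widgets/42" />
```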

Spiders only spend a limited time crawling each site, so you should make it easy for them to find the actual content as soon as possible. A sitemap can be a great help here, as you can use it to point directly to the actual content so that the spider doesn't have to look for it.

We have had pretty good results using these methods, and Google now indexes 80-90% of our dynamic content. :)

Illustrator answered 23/6, 2010 at 8:31 Comment(0)

In an SO podcast they talked about limits on the number of links you can include/submit in a sitemap (around 500 per page, with a page limit based on PageRank?) and how you would need to break them up across multiple pages.

> Given this, do I really need to worry about expending the effort to produce a dynamic sitemap index that includes all these URLs, or should I simply ensure that all the main static URLs are in there?

I was under the impression that the sitemap isn't necessarily about otherwise-unreachable pages so much as about increasing how thoroughly existing pages get crawled. In my experience, when a site includes a sitemap, minor pages are more likely to appear in Google results, even when they are already prominently linked to. Depending on your site's PageRank, inbound links, etc., this may be less of an issue.

Sprint answered 23/6, 2010 at 8:22 Comment(2)
Yeah, this is one of the (understandable) pains with sitemaps - having to break them up based either on size or on number of links. Clearly, if the datastore that your sitemap mirrors is weighty, it can be quite a load to keep such a thing up to date - so in that case, by focussing on good linking (after all, it's page content and link count that must matter most for search engine ranking) you should be able to avoid the pain. But is it an unnecessary gamble to assume this and skip the sitemap?Ointment
Having read the Google FAQ a bit more (google.com/support/webmasters/bin/…), it does suggest that you can only gain by having a good sitemap with 100% coverage.Ointment
