sitemap for multiple domains of same site

Here is the situation: I have a website that can be accessed from multiple domains, let's say www.domain1.com, www.domain2.net, and www.domain3.com. The domains serve the exact same code base, but depending on the domain, different CSS, graphics, etc. are loaded.

Everything works fine, but now my question is: how do I deal with the sitemap.xml? I wrote the sitemap.xml for the default domain (www.domain1.com), but what about when the site is accessed from the other domains? The sitemap.xml will then contain URLs for the wrong domain.

I read that I can add multiple sitemap files to robots.txt, so does that mean that I can, for example, create sitemap-domain2.net.xml and sitemap-domain3.com.xml (containing the links with the matching domains) and simply add them to robots.txt?

Somehow I have doubts that this would work, so I turn to you experts to shed some light on the subject :)

Thanks!

Dustin answered 21/6, 2011 at 14:12 Comment(0)

You should use server-side code to serve the correct sitemap, based on the domain name, for requests to /sitemap.xml.
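One way to do that (a rough sketch, not spelled out in the original answer): rewrite /sitemap.xml to a small PHP script that picks a pre-generated, domain-specific sitemap based on the Host header. The sitemap.php name and the file paths below are assumptions for illustration:

<?php
// sitemap.php - sketch only: pick a pre-generated sitemap per domain.
// In .htaccess you would point the URL at this script, e.g.:
//   RewriteRule ^sitemap\.xml$ sitemap.php [L]
$host = isset($_SERVER['HTTP_HOST']) ? strtolower($_SERVER['HTTP_HOST']) : 'www.domain1.com';

// Hypothetical per-domain sitemap files kept next to this script.
$sitemaps = array(
    'www.domain1.com' => __DIR__ . '/sitemaps/sitemap-domain1.com.xml',
    'www.domain2.net' => __DIR__ . '/sitemaps/sitemap-domain2.net.xml',
    'www.domain3.com' => __DIR__ . '/sitemaps/sitemap-domain3.com.xml',
);

// Fall back to the default domain if the host is unknown.
$file = isset($sitemaps[$host]) ? $sitemaps[$host] : $sitemaps['www.domain1.com'];

header('Content-Type: application/xml; charset=UTF-8');
readfile($file);

Instead of readfile(), the script could of course also build the XML on the fly from your page data.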

Eternal answered 21/6, 2011 at 14:16 Comment(2)
How would I go about that approach? Specifying some rewrite rule in .htaccess that will instead serve a PHP file that returns the correct XML depending on the domain? Heck, I like the sound of that, any idea about the mod_rewrite part? :D – Dustin
According to Google's guidelines (support.google.com/webmasters/answer/183668), a sitemap doesn't need to be named sitemap.xml. – Sacramentalist

Apache rewrite rules for /robots.txt requests

If you're using Apache as a webserver, you can create a directory called robots and put a robots.txt file for each website you run on that vhost, using rewrite rules in your .htaccess file like this:

# URL rewrite solution for robots.txt for multiple domains on a single docroot
# - not an existing directory
RewriteCond %{REQUEST_FILENAME} !-d
# - not an existing file
RewriteCond %{REQUEST_FILENAME} !-f
# - and a robots file for this host exists
RewriteCond %{DOCUMENT_ROOT}/robots/%{HTTP_HOST}.txt -f
RewriteRule ^robots\.txt$ robots/%{HTTP_HOST}.txt [L]
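The same idea also works for the sitemap itself; here is a hedged variant of the rule above, assuming you keep per-domain sitemaps in a sitemaps directory (that directory name is my assumption, not part of the original answer):

# Serve a domain-specific sitemap the same way
RewriteCond %{DOCUMENT_ROOT}/sitemaps/%{HTTP_HOST}.xml -f
RewriteRule ^sitemap\.xml$ sitemaps/%{HTTP_HOST}.xml [L]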

NginX mapping for /robots.txt requests

When using NginX as a webserver (taking yourdomain1.tld and yourdomain2.tld as example domains), you can achieve the same goal as the Apache approach above with the following map variable (place this outside your server directive):

map $host $robots_file {
    default /robots/default.txt;
    yourdomain1.tld /robots/yourdomain1.tld.txt;
    yourdomain2.tld /robots/yourdomain2.tld.txt;
}

This way you can use this variable in a try_files statement inside your server directive:

location = /robots.txt {
    try_files $robots_file =404;
}

Content of /robots/*.txt

After setting up the aliases to the domain-specific robots.txt files, add the sitemap to each of the robots files (e.g. /robots/yourdomain1.tld.txt) using this syntax at the bottom of the file:

# Sitemap for this specific domain
Sitemap: https://yourdomain1.tld/sitemaps/yourdomain1.tld.xml

Do this for all domains you have, and you'll be set!

Boman answered 22/1, 2021 at 13:35 Comment(0)

You have to make sure the URLs in each XML sitemap match the domain/subdomain they are served from. But if you really want, you can host all sitemaps on one domain using "Sitemaps & Cross Submits".
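For example, with a cross submit the sitemap for www.domain2.net could be hosted on www.domain1.com, as long as the robots.txt served from www.domain2.net points to it (the file path here is only illustrative):

# robots.txt served from www.domain2.net
Sitemap: https://www.domain1.com/sitemaps/sitemap-domain2.net.xml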

Mychael answered 12/7, 2011 at 22:5 Comment(0)

I'm facing a similar situation in a project I'm working on right now, and Google Search Central actually has the following answer:

If you have multiple websites, you can simplify the process of creating and submitting sitemaps by creating one or more sitemaps that include URLs for all your verified sites, and saving the sitemap(s) to a single location. All sites must be verified in Search Console.

So it seems that as long as you have added the different domains as properties in Google Search Console, Google at least will know how to deal with the rest, even if you upload the sitemaps for the other domains to only one of your properties in Search Console.

For my use case, I then use server-side code to generate sitemaps in which all the dynamic pages with English content get a location on my .io domain, and the pages with German content get a location on the .de domain:

<url>
    <loc>https://www.mydomain.io/page/some-english-content</loc>
    <changefreq>weekly</changefreq>
</url>
<url>
    <loc>https://www.mydomain.de/page/some-german-content</loc>
    <changefreq>weekly</changefreq>
</url>

And then Google handles the rest. See docs.
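As an illustration only (the answer doesn't say which stack is used), such server-side generation could look roughly like this in PHP, with made-up page data:

<?php
// Sketch: emit one sitemap whose <loc> domain depends on the page language.
// $pages and the language-to-domain mapping are invented for this example.
$pages = array(
    array('slug' => 'page/some-english-content', 'lang' => 'en'),
    array('slug' => 'page/some-german-content',  'lang' => 'de'),
);
$domains = array('en' => 'https://www.mydomain.io', 'de' => 'https://www.mydomain.de');

header('Content-Type: application/xml; charset=UTF-8');
echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach ($pages as $page) {
    $loc = $domains[$page['lang']] . '/' . $page['slug'];
    echo "  <url>\n";
    echo "    <loc>" . htmlspecialchars($loc) . "</loc>\n";
    echo "    <changefreq>weekly</changefreq>\n";
    echo "  </url>\n";
}
echo "</urlset>\n";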

Suboceanic answered 17/9, 2021 at 15:41 Comment(0)

I'm not an expert with this, but I have a similar situation.

In my case I have one domain with 3 sub-domains, and each sub-domain contains its own sitemap.xml.

My setup uses a different directory for each sub-domain, but I'm pretty sure a sitemap.xml can be specified for each domain either way.

Kathlyn answered 21/6, 2011 at 14:18 Comment(0)

The easiest method that I have found to achieve this is to use an XML sitemap generator to create a sitemap for each domain name. Place each /sitemap.xml in the root directory of its domain or sub-domain. Then go to Google Search Console, create a separate property for each domain name, and submit the appropriate sitemap for each domain there. The submission will show success.

Anallise answered 25/5, 2020 at 3:43 Comment(0)
