
Robots and sitemap multistore

How do you configure multiple robots.txt files and sitemaps in a multistore environment?


In a multistore environment, multiple sitemaps and robots.txt files quickly become an issue: you want each store view to be crawled individually, with its own unique URL.

In this article we explain how to add a configuration to accomplish this.

We assume you are familiar with the location and folder structure of the Nginx configurations described here.

Set up multiple robots.txt files

To set up multiple robots.txt files, create a separate robots.txt for each store view. Make sure each robots.txt is placed in its own subfolder, named after the store view's store code.

In our example we have three store views with the following store codes:

  • nl
  • de
  • fr

In the shop's folder structure, create the three folders listed above, each holding its own robots.txt file, as shown in the example layout below.
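
As an illustration, assuming the shop's document root is /home/<username>/domains/<domain>/public_html (adjust the path to your own setup), the layout would look like this:

/home/<username>/domains/<domain>/public_html/nl/robots.txt
/home/<username>/domains/<domain>/public_html/de/robots.txt
/home/<username>/domains/<domain>/public_html/fr/robots.txt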

Next, add the following Nginx configuration in the /home/<username>/domains/<domain>/var/etc/ folder:

location /robots.txt {
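    # Rewrite /robots.txt to the robots.txt of the active store view ($mage_run_code)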
    rewrite ^/robots\.txt$ /$mage_run_code/robots.txt;
}
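
Each store view can now serve its own crawler rules. As a purely illustrative example (the disallowed path and the domain are placeholders, not part of this configuration), nl/robots.txt could contain:

User-agent: *
Disallow: /checkout/
Sitemap: https://www.example.nl/sitemap.xml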

After adjusting the configuration, Nginx needs to be reloaded using the nginx-reload command.

Configure multiple sitemaps

We can set up the sitemaps in a multistore environment in the same way as the robots.txt files: create a subfolder for each store code in the sitemaps directory of the shop, and place each store view's sitemap.xml in its own subfolder, as in the layout shown below.
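
Assuming the sitemaps directory lives in the same document root as above (again, adjust to your own setup), the resulting layout would be:

/home/<username>/domains/<domain>/public_html/sitemaps/nl/sitemap.xml
/home/<username>/domains/<domain>/public_html/sitemaps/de/sitemap.xml
/home/<username>/domains/<domain>/public_html/sitemaps/fr/sitemap.xml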

We place the following Nginx configuration in the /home/<username>/domains/<domain>/var/etc/ folder:

location /sitemap.xml {
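    # Rewrite /sitemap.xml to the sitemap of the active store view ($mage_run_code)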
    rewrite ^/sitemap\.xml$ /sitemaps/$mage_run_code/sitemap.xml;
}
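
As with the robots.txt configuration, reload Nginx with the nginx-reload command after adding this block. A request for /sitemap.xml on, for example, the de store view is then served internally from /sitemaps/de/sitemap.xml.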