There are a couple of reasons to develop a site distributed across multiple subdomains, e.g. subdomain1.domain.com, subdomain2.domain.com, etc. Each subdomain can be a distinct mini-site, which offers better separation of topics and even protects the main site's rankings to some extent.

However, the subdomain approach has one issue that needs to be solved: some search engine spiders treat each subdomain as a separate site. This leads to an insane number of concurrent hits. The worst offender is Yandex, a Russian search engine whose spider's crawl list appears to be ordered alphabetically, so many of your mini-sites get scanned for content at the same time.

This might not be a problem if each of your mini-sites is hosted separately rather than on the same server. Nor would it be a problem if your subdomains are simple, cheap-to-serve pages. In other cases you have to rely on page caching and server capacity, or throttle the crawler at the server level, as in the sketch below.
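As a minimal sketch of the server-level option, assuming nginx fronts all the subdomains: nginx can rate-limit requests whose User-Agent matches Yandex's crawler across every subdomain at once, which is exactly what per-subdomain robots.txt cannot do. Here domain.com, the zone size, and the 1 request/second rate are placeholder values; YandexBot is the token Yandex's crawler sends in its User-Agent string.

    # Key requests from Yandex's crawler by client address;
    # all other requests get an empty key and are not limited.
    map $http_user_agent $yandex_bot {
        default     "";
        ~*YandexBot $binary_remote_addr;
    }

    # One shared zone, so the limit applies in aggregate
    # no matter which subdomain the crawler hits.
    limit_req_zone $yandex_bot zone=yandex:10m rate=1r/s;

    server {
        listen 80;
        server_name .domain.com;   # matches domain.com and all subdomains

        location / {
            limit_req zone=yandex burst=5;   # queue short bursts, then delay
            root /var/www/site;
        }
    }

Because the zone key is the crawler's IP address rather than the host name, a burst of hits spread over dozens of subdomains still counts against one shared budget.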

There is a Crawl-delay directive that Yandex respects. However, it can be set only in robots.txt, and robots.txt is per-subdomain, so it throttles a single subdomain only.
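For a single subdomain the directive looks like this (the delay is in seconds; the value here is illustrative):

    User-agent: Yandex
    Crawl-delay: 2

Each subdomain would need to serve its own copy of this file, and even then the delay is enforced per host, not in aggregate across the whole site.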

I once decided to block Yandex from indexing one of my sites entirely. Luckily, that site can live without visitors from Russia, as it is geared towards a different region. But now I doubt I would build a heavily multi-subdomain (300+ subdomains) site knowing that I would have to rely on Yandex traffic as well.
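For reference, the block itself is two lines of robots.txt, served from every subdomain (Yandex is the user-agent token its crawler matches on):

    User-agent: Yandex
    Disallow: /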


Giedrius Majauskas

I am an internet company owner and project manager living in Lithuania. I am interested in computer security, health, and technology topics.
