Separate robots.txt for each showcase
Is a single robots.txt file shared by all of your showcases causing you problems?
Nowadays most visitors come from organic search, and search robots such as Yandex and Google request the robots.txt file first to determine which information they may and may not access. If you have more than one showcase, you will face the problem that one robots.txt file is shared by all of them. This leads to a conflict: the Host and Sitemap directives must contain the website address, but you cannot set more than one address in a single file. We solved this issue. After you install our module, you can create a separate robots.txt for each of your showcases.
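For illustration, per-showcase files might look like the sketch below. The domain names, paths, and sitemap URLs are hypothetical placeholders, not values produced by the module:

```
# robots.txt served for showcase-one.example.com
User-agent: *
Disallow: /cart/
Host: showcase-one.example.com
Sitemap: https://showcase-one.example.com/sitemap.xml

# robots.txt served for showcase-two.example.com
User-agent: *
Disallow: /cart/
Host: showcase-two.example.com
Sitemap: https://showcase-two.example.com/sitemap.xml
```

Because each showcase serves its own file, the Host and Sitemap directives can each point to the correct domain.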
This module makes it easy to change the access rules for search robots on each of your showcases by adding a separate robots.txt file for every showcase.
You can watch the module configuration process in this video: