I am trying to add a robots.txt file to our File Cabinet (under Web Site Hosting Files > Live Hosting Files). However, we have multiple domains in this account, all associated with one website record (ex: domain1.com, domain2.com, domain3.com).

How can I disable crawling for a specific domain (ex: domain1.com) while still allowing crawling for the other domains?
We also have a second website record that needs to keep crawling enabled.
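
To illustrate what I am after (just a sketch, using the example domain names above): domain1.com would serve a block-all robots.txt, while the other domains and the second website record would serve an allow-all robots.txt (or none at all).

    # Intended robots.txt for domain1.com only: block all crawlers from the whole site
    User-agent: *
    Disallow: /

    # Intended robots.txt for domain2.com, domain3.com, and the other website record: allow everything
    User-agent: *
    Disallow:

Since all of these domains point at the same Live Hosting Files folder, I am not sure how to serve different robots.txt content per domain.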