Thread: robots.txt file and sub domains
-
09-12-2004, 07:33 PM #1 Junior Guru Wannabe
- Join Date
- Dec 2002
- Location
- esperanto
- Posts
- 50
robots.txt file and sub domains
Hi All, another n00b question please,
I'm puzzled about how robots.txt works for subdomains. If I want to disallow robots from sub_domain.my_site_com, can I do:
1).
User-agent: *
Disallow: /sub_domain/
Am I doing this right? And what about sub_sub_domain.sub_domain.my_site_com?
2).
User-agent: *
Disallow: /sub_domain/sub_sub_domain/
Or if I disallow spidering "sub_domain", does that disallow spidering "sub_sub_domain" too? I mean, do I only need the No. 1 rules, or do I need both No. 1 and No. 2 in the robots.txt file?
Thank you,
-
09-12-2004, 09:23 PM #2 Linux Guru
- Join Date
- Mar 2004
- Location
- Odessa, Ukraine
- Posts
- 610
You need to put a robots.txt in each subdomain. A robots.txt only applies to the host it is served from, so a Disallow path in my_site_com's file just blocks a directory on that host; crawlers fetch sub_domain.my_site_com/robots.txt separately.
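To see why the path-style rules in the original post don't reach a subdomain, here is a small sketch using Python's standard-library robots.txt parser (the hostnames are the hypothetical ones from the question). A rule like Disallow: /sub_domain/ only matches a URL path on whatever host served the file:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt as if it were served from http://my_site_com/robots.txt
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /sub_domain/",
])

# The rule blocks the /sub_domain/ *directory path* on the main host...
print(rp.can_fetch("*", "http://my_site_com/sub_domain/page.html"))  # False

# ...while other paths on the same host stay allowed.
print(rp.can_fetch("*", "http://my_site_com/other/page.html"))  # True
```

Crawlers never apply this file to sub_domain.my_site_com at all; they request http://sub_domain.my_site_com/robots.txt as a separate file, which is why each subdomain (and sub-subdomain) needs its own copy.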
-
09-13-2004, 12:55 AM #3 Junior Guru Wannabe
- Join Date
- Dec 2002
- Location
- esperanto
- Posts
- 50
Thank you!