Thread: Need help with robots.txt
-
08-04-2011, 03:41 AM #1Junior Guru Wannabe
- Join Date
- Aug 2007
- Posts
- 72
Need help with robots.txt
So I have a site that hosts thousands of files (GBs' worth), all just indexed, nothing spectacular about it. However, it seems all the search engines (especially Google and Yahoo) love to download ALL the files hosted on the site, consuming my bandwidth. How can I prevent this with a robots.txt file?
All files have extension .uz3
This is what I have done, not sure if it works or not yet.
User-agent: *
Disallow: /*.uz3$
-
08-04-2011, 03:48 AM #2Newbie
- Join Date
- Jul 2011
- Posts
- 8
This is very simple: restrict the bots in robots.txt using the same pattern you already have. The leading forward slash before *.uz3$ is optional, since the * wildcard matches any path anyway.
Just write
User-agent: *
Disallow: *.uz3$
This will block crawling of all files with the .uz3 extension. Note that * and $ are wildcard extensions honored by the major engines (Google, Yahoo, Bing) but not part of the original robots.txt standard, so smaller bots may ignore them.
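If you want to check which URLs a pattern like this would block before relying on it, here is a minimal sketch that converts a Google-style robots.txt path pattern (where * matches any characters and a trailing $ anchors the end of the URL) into a regular expression. The function name and test paths are hypothetical, chosen just for illustration:

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    """Convert a Google-style robots.txt path pattern to a regex.

    '*' matches any run of characters; a trailing '$' anchors the
    end of the URL path. Everything else is matched literally.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape literal parts, rejoin with '.*' where the '*' wildcards were.
    regex = ".*".join(re.escape(part) for part in pattern.split("*"))
    if anchored:
        regex += "$"
    return re.compile(regex)

rule = robots_pattern_to_regex("/*.uz3$")
print(bool(rule.match("/files/archive.uz3")))      # → True  (blocked)
print(bool(rule.match("/files/archive.uz3.txt")))  # → False (not blocked)
```

With this you can also confirm the point about the slash: "/*.uz3$" and "*.uz3$" match the same set of URL paths, since every path starts with "/" and "*" absorbs it either way.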
-
08-04-2011, 03:51 AM #3Junior Guru Wannabe
- Join Date
- Aug 2007
- Posts
- 72
Oh, the forward slash wasn't needed? Thanks!
-
08-06-2011, 01:50 AM #4Temporarily Suspended
- Join Date
- Jul 2011
- Posts
- 107
Yes, this will definitely work, but you don't have to use the forward slash. The rest of it is all right.
-
08-09-2011, 07:59 AM #5New Member
- Join Date
- Jul 2011
- Posts
- 1
You can also point crawlers at your sitemap from robots.txt (the directive is Sitemap:, not Site:, and it takes an absolute URL):
User-agent: *
Sitemap: http://sitemap.com/sitemap.xml
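Putting the two answers together, a complete robots.txt for this case might look like the following (the domain is a hypothetical stand-in; use your own site's absolute sitemap URL):

```
User-agent: *
Disallow: /*.uz3$

Sitemap: http://www.example.com/sitemap.xml
```

The Sitemap: line is independent of any User-agent group, so it can sit anywhere in the file.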