  1. #1

    Need help with robots.txt

    So I have a site that hosts thousands of files (GBs' worth), all plainly indexed, nothing spectacular about it. However, the search engines (especially Google and Yahoo) love to download ALL of the files hosted on the site, consuming my bandwidth. How can I prevent this with a robots.txt file?

    All files have extension .uz3

    This is what I have so far; I'm not sure whether it works yet.

    User-agent: *
    Disallow: /*.uz3$
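    To sanity-check how a wildcard-aware crawler such as Googlebot would read that rule, here is a small Python sketch of the matching logic (the function name and example paths are illustrative, not a real crawler API):

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Match a URL path against a Google-style robots.txt pattern,
    where '*' matches any run of characters and a trailing '$'
    anchors the match to the end of the URL."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"        # wildcard: any run of characters
        elif ch == "$":
            regex += "$"         # anchor to the end of the URL
        else:
            regex += re.escape(ch)
    return re.match(regex, path) is not None

print(rule_matches("/*.uz3$", "/files/archive.uz3"))       # True  -> blocked
print(rule_matches("/*.uz3$", "/files/archive.uz3.html"))  # False -> still crawlable
```

    So the rule as written should block every URL whose path ends in .uz3, for crawlers that honor the wildcard extensions.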

  2. #2
    Your rule should work as written: Google, Bing, and Yahoo all support the * wildcard and the $ end-of-URL anchor in robots.txt. Google's parser also tolerates the pattern without the leading forward slash:

    User-agent: *
    Disallow: *.uz3$

    Either form will stop compliant crawlers from fetching any URL whose path ends in .uz3, but the standard form keeps the leading slash, since robots.txt paths begin with /.

  3. #3
    Oh, the forward slash wasn't needed? Thanks!

  4. #4
    Yes, this will definitely work, though for Google's parser the forward slash is optional. The rest of it is all right.

  5. #5


    You can also point crawlers at your sitemap from robots.txt. Note that the directive is Sitemap: (not Site:) and it takes a full URL:

    User-agent: *
    Sitemap: https://sitemap.com/sitemap.xml

    A sitemap reference doesn't block anything by itself, though; you still need the Disallow rule to stop the downloads.
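    Putting the suggestions from this thread together, a complete robots.txt for the original problem might look like the sketch below (the sitemap URL is this thread's placeholder; a real one must be a full URL on your own domain):

```
User-agent: *
Disallow: /*.uz3$

Sitemap: https://sitemap.com/sitemap.xml
```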

