Hackers can view your robots.txt to look at your directory structure and find any valuable files. What I would recommend is disallowing folders containing valuable information (though bear in mind certain bots don't follow the rules of robots.txt at all).
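For example, a robots.txt that asks well-behaved crawlers to stay out of a couple of folders might look like this (the folder names here are just placeholders for illustration):

```
User-agent: *
Disallow: /admin/
Disallow: /private-files/
```

Remember this is only a polite request to crawlers, and it also advertises those paths to anyone who reads the file, which is why the advice above is paired with actual access controls.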
You should block known bad bots using .htaccess, and you can also add rules so that when any folder without an index page is accessed, visitors are redirected to another page (preventing them from browsing your subfolders).
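A minimal sketch of both ideas in an .htaccess file, assuming an Apache server with mod_rewrite enabled (the bot names and the error page path are made-up examples, not a real blocklist):

```apache
# Block a couple of bad bots by their User-Agent string
# (BadBot / EvilScraper are placeholder names)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper) [NC]
RewriteRule .* - [F,L]

# Turn off directory listings, so folders without an index
# page return 403 Forbidden instead of showing their contents
Options -Indexes

# Send those forbidden requests to a page of your choosing
ErrorDocument 403 /index.html
```

The `[F]` flag returns 403 Forbidden to the matched bots, and `Options -Indexes` is what stops people from seeing a raw file listing in index-less folders.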
I'm confused by all this.
I thought that .htaccess was the missing-file page shown when someone goes to a place that doesn't exist?
And can't you set the chmod on folders so that no one can read or write them?
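On a Unix host you can indeed tighten folder permissions with chmod. A quick sketch (the folder name is hypothetical, and note the web server itself still needs execute permission on a folder to serve files from inside it):

```shell
# Create a demo folder and make it execute-only for group/others:
# they can open a file inside it by exact name, but can't list
# the contents or write to it.
mkdir -p private_dir
chmod 711 private_dir

# Show the resulting octal permissions
stat -c '%a' private_dir   # prints 711 on a GNU/Linux host
```

Permissions like 711 hide the folder's contents from casual browsing, but they are a server-side control and separate from what robots.txt or .htaccess do over HTTP.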
.htaccess is a configuration file (the whole filename is effectively the extension, there's nothing before the dot) which controls the folder you put it in and all the subfolders below it. More than likely your webspace/server will already have one of these in your public_html/www folder (or whatever it's called).
It can be used to set passwords for directories, create custom error pages, set up redirects, and keep bad bots from snooping through your files.
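For instance, password-protecting a directory takes just a few directives, assuming Apache and an .htpasswd file you've created yourself with the `htpasswd` tool (the paths and realm name below are placeholders):

```apache
# .htaccess inside the folder you want to protect
AuthType Basic
AuthName "Restricted Area"
AuthUserFile /home/youruser/.htpasswd
Require valid-user

# A custom error page for the same folder
ErrorDocument 404 /errors/not-found.html
```

The AuthUserFile path should point somewhere outside your public web root so the password file itself can't be downloaded.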