I have a site that will be going offline soon, and I'd like to download a copy of it before then so I can preserve it. Can anyone recommend a spidering tool? I don't have command-line access to the server, only web access.
It's not a big site, and it only has static pages. I do, however, want the files the site serves as well (i.e., it's a small software site, so there are downloads alongside the pages). If I don't end up using a spider tool, I'll take you up on your offer, lightwave. I'd like to try a spidering tool first, though.
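For anyone finding this later: one common approach is GNU wget's mirroring mode, which works over plain web access and follows links to downloadable files too. This is just a sketch; the URL is a placeholder, and you may need extra flags depending on the site:

```shell
#!/bin/sh
# Mirror a small static site with GNU wget (URL below is a placeholder).
# --mirror          : recursive download with timestamping
# --convert-links   : rewrite links so the copy browses locally
# --page-requisites : also fetch images, CSS, and other page assets
# --no-parent       : don't climb above the starting directory
wget --mirror --convert-links --page-requisites --no-parent \
     https://example.com/
```

HTTrack is a popular GUI alternative that does roughly the same thing if you'd rather not use a command-line tool.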