Results 1 to 8 of 8
  1. #1

    How do I wget multiple files?

    There is a page like http://localhost/test.html that contains more than ten download URLs, like http://1234.com/test1.zip , http://2345.com/test2.zip , http://4532.com/test3.zip and so on.

    What is the best way to get all the files listed in test.html? Thanks

  2. #2
    Join Date
    Jul 2003
    Location
    Kuwait
    Posts
    5,099
    From the wget manual:

    Code:
     -i file
           --input-file=file
               Read URLs from file, in which case no URLs need to be on the command line.  If there are URLs both on the command line and in an
               input file, those on the command lines will be the first ones to be retrieved.  The file need not be an HTML document (but no harm
               if it is)---it is enough if the URLs are just listed sequentially.
    
               However, if you specify --force-html, the document will be regarded as html.  In that case you may have problems with relative
               links, which you can solve either by adding "<base href="url">" to the documents or by specifying --base=url on the command line.
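    In the simplest case the input file is just a plain list of URLs, one per line, and no --force-html is needed. A minimal sketch using the example URLs from post #1 (urls.txt is a hypothetical file name; the actual download is commented out because the hosts are fictional):

```shell
# Build a plain-text URL list, one URL per line.
cat > urls.txt <<'EOF'
http://1234.com/test1.zip
http://2345.com/test2.zip
http://4532.com/test3.zip
EOF

# Then fetch everything in the list:
# wget -i urls.txt
```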
    In order to understand recursion, one must first understand recursion.
    If you feel like it, you can read my blog
    Signal > Noise

  3. #3
    Thanks for your help, but I still don't get it. ;(

    Can you show me in more detail?

    Say there are two links, http://1234.com/asdf.zip and http://1234.com/jkl.zip , in http://localhost.com/test.html

    How would I use --input-file in that case, like you just showed me? Thanks

  4. #4
    Join Date
    Jul 2003
    Location
    Kuwait
    Posts
    5,099
    Code:
    wget -i --force-html http://localhost.com/test.html
    should work
    In order to understand recursion, one must first understand recursion.
    If you feel like it, you can read my blog
    Signal > Noise

  5. #5
    It doesn't work for me. It just downloads the .html file, not the files linked inside it. This is what showed up on the screen:

    --force-html: No such file or directory
    No URLs found in --force-html.
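    That error happens because -i takes a filename as its argument, so wget read the literal string --force-html as the input file. Putting --force-html before -i, and giving -i a local copy of the page, avoids it. A sketch, with a runnable fallback that extracts the links itself (test.html here is a stand-in built from the two example links; the wget calls are commented out since the hosts are fictional):

```shell
# Wrong:  wget -i --force-html URL      (-i consumes "--force-html")
# Better: wget -O test.html http://localhost.com/test.html
#         wget --force-html --base=http://localhost.com/ -i test.html
#
# Fallback: pull the links out yourself and feed the list to -i.
cat > test.html <<'EOF'
<a href="http://1234.com/asdf.zip">one</a>
<a href="http://1234.com/jkl.zip">two</a>
EOF
grep -o 'http://[^"]*\.zip' test.html > urls.txt
# wget -i urls.txt
cat urls.txt
```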

  6. #6
    Join Date
    Apr 2004
    Location
    Scottsdale, Arizona
    Posts
    28
    Are these files linked on that HTML page?

    If they are, and they live in the same directory or below it, try this:

    wget -r -np http://www.test.com/test1.html

    Take a look at the manual page.
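    To keep the recursion from pulling the whole site, it can be limited to one level and filtered to .zip files. A sketch with standard wget flags (the host is a placeholder, so the command is only printed here, not run):

```shell
# -r     recurse into the page's links
# -np    never ascend to the parent directory
# -l 1   limit recursion depth to one hop (the direct links)
# -A zip accept only files ending in .zip
cmd="wget -r -np -l 1 -A zip http://www.test.com/test1.html"
echo "$cmd"   # print rather than run: the host is a placeholder
```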

  7. #7
    Join Date
    Nov 2003
    Location
    Ljubljana, Slovenija, Europe
    Posts
    298
    Perhaps there are some security or privilege issues you are unaware of? Airnine

  8. #8
    Join Date
    Jun 2003
    Location
    UK
    Posts
    6,601
    If you just want to mirror a directory, you can do wget -mp http://....
    Russ Foster - Industry Curmudgeon
