  1. #1
    Join Date
    Jan 2004
    Posts
    397

    FTP 10,000 files to another server (PHP FTP can't handle it)

    Hi,

    I have a problem (I posted on another forum a week ago, but no one seems able to help, so I'm trying here).

    My users upload about 1,800 files per day (avg 4.3MB/upload), some as big as 60MB. The problem is that 90% of all uploads above 5MB fail, while occasionally one uploads okay (rare cases). Everything on the server has been set up correctly. I've been told the problem is PHP's standard FTP functions, which can't handle that kind of traffic. All files are uploaded successfully by the upload form (I can echo the filesizes), so I'm positive the error occurs when I FTP the file from the tmp dir to the correct fileserver.

    So my question is:
    - How would you create a system capable of uploading 10,000 files per day via FTP, with an average filesize of 5MB?
    (main server: upload form -> FTP -> fileserver)
    ...

  2. #2
    Join Date
    Jun 2003
    Posts
    961
    You use a PHP form to upload the files to your main server, then FTP them (with PHP) to your fileserver?
    If so, you can set up a cronjob to batch-upload (FTP) the files.
    Or does the transfer to the fileserver have to be instant?
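    A batch run from cron could look something like this (just a sketch -- the spool dir, key path, and destination here are all made up, and the scp command can be swapped for whatever transfer tool you settle on):

```shell
#!/bin/sh
# Hypothetical cron batch uploader. Crontab entry (every 5 minutes):
#   */5 * * * * /usr/local/bin/push_uploads.sh
SPOOL="${SPOOL:-/var/spool/uploads}"              # where the web form drops files
DEST="${DEST:-ftpuser@fileserver:/data/files}"    # copy target on the fileserver
SCP="${SCP:-scp -q -i /root/.ssh/upload_key}"     # transfer command (overridable)

push_uploads() {
    for f in "$SPOOL"/*; do
        [ -f "$f" ] || continue       # skip when the glob matched nothing
        if $SCP "$f" "$DEST"; then
            rm -f "$f"                # delete only after a successful copy
        fi
    done
}

push_uploads
```

    The point of the spool-and-sweep shape is that a failed transfer just leaves the file in place for the next run, instead of losing it.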

  3. #3
    Join Date
    Jan 2004
    Posts
    397
    you use a php form to upload the files to your main server, then ftp them (with php) to your fileserver?
    - Yes, and the upload from the upload form to the main server is okay; it's just the FTP move that can't handle it.

    if so, you can make a cronjob to batch upload (ftp) the files
    or does the transfer to the fileserver have to be instant?
    - It has to be instant.
    ...

  4. #4
    Join Date
    Jan 2004
    Posts
    397
    Maybe I should phrase the question like this:

    How would you design a system capable of handling 10,000 uploads per day (4MB average filesize) using a main server and several fileservers?
    ...

  5. #5
    Join Date
    Jun 2003
    Posts
    961
    If you don't have to use FTP, you could try mounting the fileserver's directory on your main server with NFS.

    Or you could try executing the system's ftp client to upload the file instead of using PHP's FTP functions.
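    Driving the system's ftp client from a script means feeding it commands on stdin, roughly like this (a sketch -- the host, login, and paths are made up):

```shell
#!/bin/sh
# Build a command script for the stock ftp client.
make_ftp_script() {
    # $1 = user, $2 = password, $3 = local file, $4 = remote path
    printf 'user %s %s\nbinary\nput %s %s\nquit\n' "$1" "$2" "$3" "$4"
}

# From PHP you would exec() something along these lines (names hypothetical):
#   make_ftp_script ftpuser secret /tmp/php1234.dat /files/upload.dat \
#       | ftp -n fileserver.example.com
```

    The -n flag stops the client from auto-logging-in, so the "user" command in the script does the login instead.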

  6. #6
    Join Date
    Jan 2003
    Posts
    1,715
    Are the file servers for storage or mirroring?

    For storage, I would suggest uploading to the file servers directly, since the main server will be a constant bottleneck. For mirroring, I would use squid proxies for the mirrors, rather than a normal web server.
    Game Servers are the next hot market!
    Slim margins, heavy support, fickle customers, and moronic suppliers!
    Start your own today!

  7. #7
    Join Date
    Jan 2004
    Posts
    397
    I've been told that NFS will be too slow, and I'm not sure how the last suggestion would work, or whether it would work at all.
    ...

  8. #8
    Join Date
    Jan 2004
    Posts
    397
    Originally posted by hiryuu
    Are the file servers for storage or mirroring?

    For storage, I would suggest uploading to the file servers directly, since the main server will be a constant bottleneck. For mirroring, I would use squid proxies for the mirrors, rather than a normal web server.
    The fileservers are used for storing the users' files so others can download them (like Putfile).
    What do you mean by FTPing directly? Isn't that what I'm doing already, or do you mean skipping the upload to the tmp dir before FTPing to the fileserver? If so, how does that work (I haven't been able to figure it out or find any info about it)?
    ...

  9. #9
    Join Date
    Jan 2003
    Posts
    1,715
    The question is, basically: why not run the web/PHP script on the fileserver nodes themselves, skipping the centralized upload and FTP stage entirely? You could still run a central DB server and whatnot for authentication purposes.

  10. #10
    Join Date
    Apr 2003
    Location
    UK
    Posts
    2,560
    We have a similar system, whereby we copy an average of 8-9k files per day. We do this using scp with a generic key. Admittedly we copy from about 80-90 servers to one central machine, but the files are anywhere up to 180MB, usually. As long as your fileserver is beefed up it might get slow, but it should hold up okay. AFAIK scp has very little overhead, so it might be a better choice.

  11. #11
    Join Date
    Jan 2004
    Posts
    397
    Slidey: that is definitely worth a try.
    I found this article about SCP, but I'm having trouble finding information on how to use it with my PHP scripts.

    http://www.umbc.edu/oit/sans/helpdesk/ftpguide.html
    ...

  12. #12
    Join Date
    Apr 2003
    Location
    UK
    Posts
    2,560
    Using a *nix-based system, I'd generate a keypair, and then the syntax would be something like: scp -i privatekey localfile [email protected]:/remotefile
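    Spelled out end to end, it would look something like this (every name and path below is a placeholder):

```shell
#!/bin/sh
# Generate a passwordless keypair for unattended copies.
KEY="${KEY:-./upload_key}"
rm -f "$KEY" "$KEY.pub"              # start clean so ssh-keygen won't prompt
ssh-keygen -q -t rsa -b 2048 -N "" -f "$KEY"

# Then append upload_key.pub to ~/.ssh/authorized_keys on the fileserver, and
# each transfer becomes (hypothetical host and paths):
#   scp -i "$KEY" /tmp/php1234.dat ftpuser@fileserver:/data/files/
```

    An empty -N passphrase is what makes the copy non-interactive, which is what you need when PHP is the one invoking it.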

  13. #13
    Join Date
    Jan 2004
    Posts
    397
    Hmm... that made me think.
    If the fileserver's uplink port is maxed out when a user tries to upload, will that upload fail using a standard ftp_put (or another protocol, for that matter)?
    ...

  14. #14
    Join Date
    Apr 2003
    Location
    UK
    Posts
    2,560
    chances are you'd get a connection timeout or thereabouts

  15. #15
    Join Date
    Jan 2004
    Posts
    397
    I don't get any "official" error message, only a blank page.
    Does this seem to be caused by a timeout error?
    ...

  16. #16
    Join Date
    Jun 2003
    Posts
    961
    A blank page is possible when execution exceeds the max time; in that case you'd have a "PHP Fatal error: Maximum execution time of X seconds exceeded" entry in your Apache error log.
    Got any of those?
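    A quick way to check from the shell (the log path varies by distro, so adjust it):

```shell
#!/bin/sh
# Scan the Apache error log for PHP execution-time fatals.
LOG="${LOG:-/var/log/apache2/error_log}"
grep -h 'Maximum execution time' "$LOG" 2>/dev/null \
    || echo "no timeout errors found in $LOG"

# If they do show up, raise the limits in php.ini (values are examples only):
#   max_execution_time = 300
#   upload_max_filesize = 64M
#   post_max_size = 64M
```

    Note that an FTP transfer inside the script counts against max_execution_time, so a 60MB file on a slow link can blow the default 30-second limit easily.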

  17. #17
    Join Date
    Jan 2004
    Posts
    397
    Let me check.
    Main server or fileserver?
    ...

  18. #18
    Join Date
    Jan 2004
    Posts
    397
    I searched the logs on the main server for "PHP Fatal error", but got 0 results.

    Same with the fileserver.
    ...

  19. #19
    Join Date
    Jun 2003
    Posts
    961
    It would have been the main server. Hmm, any Apache/PHP errors connected with the script at all?

  20. #20
    Join Date
    Jan 2004
    Posts
    397
    not that I can see.
    ...

  21. #21
    Join Date
    Jan 2004
    Posts
    397
    Any suggestions on how to build a system capable of FTPing 10,000 files per day (4.3MB avg)?
    I don't know if scp will work, and I need an example of how to FTP a file from one server to another using PHP code.
    ...

  22. #22
    Join Date
    Jan 2004
    Posts
    397
    I've been told that curl might be the best solution.
    Any comments?
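    From what I can find, curl can do FTP uploads from the command line, which at least makes it easy to test outside PHP first (host and login below are placeholders):

```shell
#!/bin/sh
# FTP upload via curl, wrapped in a small helper.
upload_with_curl() {
    # $1 = local file, $2 = ftp URL (keep the trailing slash), $3 = user:pass
    curl --silent --show-error --upload-file "$1" "$2" --user "$3"
}

# Usage (hypothetical) -- from PHP this could be exec()'d, or done natively
# with the curl extension (curl_init/curl_setopt with CURLOPT_UPLOAD):
#   upload_with_curl /tmp/php1234.dat ftp://fileserver.example.com/files/ ftpuser:secret
```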
    ...
