  1. #1

    Throttling Bandwidth Question

    Hi,

    I'm not sure if this question should go in the programming forum or here (or both), apologies if it's in the wrong place.

    We are planning to set up a file hosting site, like Rapidshare, that kind of thing. We are going to start with one or a few servers (LAMP). Like on most sites of this type, there will be one download speed for free users and a superfast speed for paying members.

    Our programmer says that he can throttle the bandwidth to free users in the code.

    However, my friend, who is a sysadmin, tells me that this approach will kill the CPU if there are a few dozen users at a time. He says the best way is to have several servers - one for free users, one for members - mirror the content across both, but throttle the bandwidth at the hardware level. Then serve files to free users from the slow server and files to members from the full-speed server.

    I am not sure which one is correct.

    Could anyone here let me know what the ideal situation is for this kind of project?

    Your assistance is appreciated, thanks.

  2. #2
    Join Date
    Dec 2006
    Posts
    4,151
    The ideal situation is not to throttle at all.
    Ironically, throttling itself incurs a performance penalty: resources (CPU/RAM/IO) are tied up for far longer, because each throttled download takes longer to complete.

    On the other hand, if throttling is required by your business model (a bad idea - try things like wait timers, ads, etc. instead), your sysadmin friend's suggestion will yield better performance.
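    To make the resource argument above concrete: an in-code throttle is essentially a loop that sends a chunk and then sleeps, which holds its worker process (and its memory) for the entire, now-lengthened download. A hypothetical sketch, not any specific implementation:

```python
import time

def throttled_send(data: bytes, rate_bps: int, chunk_size: int = 1024):
    """Yield `data` in chunks, sleeping so throughput stays near rate_bps.

    The worker running this loop is occupied for the whole (slowed)
    download - which is exactly why throttling in code ties up resources.
    """
    sent = 0
    start = time.monotonic()
    for i in range(0, len(data), chunk_size):
        yield data[i:i + chunk_size]
        sent += chunk_size
        # Sleep until wall-clock time catches up with the target rate.
        expected = sent / rate_bps
        elapsed = time.monotonic() - start
        if expected > elapsed:
            time.sleep(expected - elapsed)
```

    At 50 KB/s, a 100 MB file holds its worker for over half an hour; a hundred free users means a hundred workers parked in sleep calls.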

  3. #3
    Join Date
    May 2010
    Location
    The Netherlands
    Posts
    1,418
    Throttling in code will strain the CPU, so I think you should not throttle.

  4. #4
    Quote Originally Posted by BigTed View Post

    Our programmer says that he can throttle the bandwidth to free users in the code.

    However, my friend who is a sysadmin tells me that this approach will kill the cpu if there is a few dozen users at a time. ...
    You can throttle with nginx, which works far better than throttling in software with Apache. Apache has to keep a copy of itself, plus PHP, open for each simultaneously connected downloading user. This uses a bit of CPU, but the bigger problem is that it uses massive amounts of RAM, often as much as 10MB per connected user. On top of that, Apache tends to fall over entirely once you reach 500-1000 connected users. Throttling only makes this worse, because each user's Apache slot is tied up for as long as the download is running.

    Nginx handles this much better. The way I do things like this with proxies is:

    Nginx -> Apache -> PHP

    Users connect to nginx, which proxies to Apache, which runs PHP. PHP then replies with a reproxy header that tells nginx "hey, send the user this file", at which point Apache and PHP have done their job and are freed up for other requests. Nginx can handle thousands of connections without any CPU or memory issues. In the same reply where PHP tells nginx which file to send, you can add another header telling nginx to cap the transfer speed as well. This is going to be your best way to get this done. You could also drop Apache entirely and use FastCGI or PHP-FPM, but that's a bit harder to set up correctly and won't really be necessary anyway.
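    A minimal sketch of that layout in nginx terms - the reproxy header described above is nginx's X-Accel-Redirect; ports, paths, and the 50k rate here are all placeholders, not a recommendation:

```nginx
server {
    listen 80;

    # All dynamic requests are proxied to Apache/PHP.
    location / {
        proxy_pass http://127.0.0.1:8080;
    }

    # PHP replies with "X-Accel-Redirect: /protected/some_file.zip";
    # nginx then serves the file itself, freeing the Apache slot.
    location /protected/ {
        internal;               # not reachable directly by clients
        alias /var/www/files/;  # real files live here
        limit_rate 50k;         # static cap for free-user transfers
    }
}
```

    Instead of a static limit_rate, PHP can also send an X-Accel-Limit-Rate header alongside X-Accel-Redirect, so members and free users get different caps from the same location block.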

  5. #5
    Join Date
    May 2008
    Posts
    858
    You don't need a second server. You could just run a second copy of Apache/nginx/whatever on a secondary IP that comes with the server and configure a bandwidth limit for it (for Apache, there are modules and even some built-in options that cap connections to a maximum speed).
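    In nginx terms, the two-instances-on-one-box idea is just two server blocks - one per IP, with a cap on the free one. The IPs and the rate below are placeholders:

```nginx
# One box, two vhosts: full speed on the primary IP, capped on the secondary.
server {
    listen 203.0.113.10:80;   # members (example IP)
    root /var/www/files;
}

server {
    listen 203.0.113.11:80;   # free users (example IP)
    root /var/www/files;
    limit_rate 50k;           # per-connection cap
}
```

    Note limit_rate is per connection, so you would also want to limit concurrent connections per client to make the cap meaningful.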

    funkywizard's comment above describes a pretty smart way to do it, though it is somewhat complex. Either way, limiting transfer speed in code should be your last resort, as it would use the most CPU and memory.

    P.S. You'll also have to work out some way to make the links expire after a period of time if your business relies on people waiting and clicking ads before they get the download (maybe a hash in the URL generated from the user's IP and a timestamp, or something like that).
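    The hash-in-the-URL idea can be sketched like this - a hypothetical scheme (secret, field layout, and MD5 choice are all illustrative; nginx's secure_link module implements a similar pattern natively):

```python
import hashlib
import time

SECRET = "change-me"  # hypothetical shared secret, known only to the app

def make_link(path: str, client_ip: str, ttl: int = 3600) -> str:
    """Build a download URL whose token expires ttl seconds from now."""
    expires = int(time.time()) + ttl
    token = hashlib.md5(f"{SECRET}{path}{client_ip}{expires}".encode()).hexdigest()
    return f"{path}?token={token}&expires={expires}"

def link_valid(path: str, client_ip: str, token: str, expires: int) -> bool:
    """Reject expired links or links generated for another IP."""
    if time.time() > expires:
        return False
    expected = hashlib.md5(f"{SECRET}{path}{client_ip}{expires}".encode()).hexdigest()
    return token == expected
```

    Because the client IP is baked into the hash, a free user cannot share the link; because the expiry time is baked in, the link dies after the window closes.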

  6. #6
    Join Date
    Aug 2003
    Posts
    599
    See this example for an efficient way to send protected files via nginx.

    http://wiki.nginx.org/XSendfile

  7. #7


    Hey TSJ/HostSlim/funkywizard/mariushm/topgun,

    That's some great info there. Good point on the irony of throttling - it actually has more overhead than not throttling.

    The last three posts are very interesting; I had not heard of nginx before. FunkyWizard, thanks for your comprehensive explanation. If we can use this to run everything on one server initially, it will be a great way to keep costs down while the company is getting set up. It also sounds like it does what we need, so I will go and research it. Thanks for pointing it out - it is very much appreciated.

    I have another question, actually, on a related but different topic, and I'm not sure where to post it either! If anyone can point me to the right forum it would also be appreciated.

    Assuming we use nginx on one server and things run well, when we want to start adding storage servers to the site, would that (again) need to be done via the PHP software, or is there a better way? I keep hearing about this "cloud" thing (sorry, a bit out of the tech loop at the moment!) - would that be useful for what we are trying to do? Anyway, I'm not sure whether to post this in the dedicated forum, the programming forum, or the cloud forum. Maybe some kind person could point me in the right direction.

    Thanks again for all your help.

    Ted.

  8. #8
    Join Date
    Aug 2003
    Posts
    599
    You really need to speak with your developer about this, as it depends on whether you want dedicated web/database/storage servers or a hybrid system.

    I would be inclined to go with a dedicated database server (which can also be used for storage while the load is low); it would hold metadata about each uploaded file, such as name, size, and storage node.

    When a user requests a file, it will be read from an NFS mount of the storage node over a dedicated 1Gbps backend LAN.

    i.e. when the user requests a file, PHP will simply add the header:

    X-Accel-Redirect: /mount/server2/protected/2011/04/11/my_movie.avi;

    Somehow you will need to make sure the user stays on the same web server, as otherwise you will have no way to limit concurrent downloads the way most hosts do.
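    For that X-Accel-Redirect header to work, the web node's nginx config needs a matching internal location pointing at the NFS mount - a sketch with illustrative mount points, to go inside the server block:

```nginx
# Web node serves files that physically live on a storage node over NFS.
location /mount/server2/ {
    internal;             # only reachable via X-Accel-Redirect from PHP
    alias /mnt/server2/;  # NFS mount of the storage node (example path)
    limit_rate 50k;       # optional free-tier cap (placeholder value)
}
```

    One such location per storage node lets PHP pick the node simply by choosing the path it puts in the header.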

  9. #9
    Hi topgun,

    Thanks for your reply.

    Yes, we will be aiming to work with dedicated servers on this project.

    I will have a word with my developer anyway. I guess, as you say, we somehow need to manage users through software on one main web server but serve files from various servers, if that is possible. I'm not sure how we would do that.

  10. #10
    I have the same situation...

