Server with high number of outgoing connections
Does anyone have recommendations on box stats to have for a server with a large number of outgoing connections?
I have a server that will on request go out and crawl a user defined page. Ideally I'd like to be able to handle as many connections as possible to limit the wait time for each user.
I assume one of the biggest attributes to look for is a fast connection speed to fetch the page as fast as possible. (Are any specific data centers known for fast connections?)
What other factors affect the max number of connections a server can handle?
Would lighttpd be better suited than apache for this?
And finally, any estimate on the max number of connections I could reasonably achieve (given the right stats)?
Any thoughts would be much appreciated.
Last edited by WishIwasntH1; 06-30-2009
Should also mention that the connections will likely be made with cURL inside of PHP. (Unless I find a faster, easier alternative.)
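If you stay with cURL in PHP, the curl_multi_* functions let one process drive many transfers at once instead of one blocking curl_exec() per page. The batching idea looks like this, sketched in Python purely for illustration (the URLs, the 20-connection cap, and the injectable fetch function are made up for the example, not anything from your setup):

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def default_fetch(url, timeout=10):
    # Plain blocking fetch; the pool below supplies the concurrency.
    with urlopen(url, timeout=timeout) as resp:
        return resp.read()

def crawl(urls, fetch=default_fetch, max_connections=20):
    """Fetch many pages through a bounded pool so a burst of user
    requests can't spawn an unbounded number of connections."""
    def safe_fetch(url):
        try:
            return url, fetch(url)
        except OSError as exc:
            return url, exc  # keep crawling; report the failure per-URL
    with ThreadPoolExecutor(max_workers=max_connections) as pool:
        return dict(pool.map(safe_fetch, urls))
```

The cap matters more than the raw count: one slow remote site ties up one slot instead of one whole PHP process.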
Given that you want to achieve the highest possible number of concurrent threads in your application, the more cores the better.
The maximum number of connections is defined by the amount of memory and the connection rate (switch port speed and bandwidth). In your particular case, if each of your PHP processes consumes 5MB on average, you'll need about 1GB of RAM for 200 simultaneous users. This doesn't account for memory consumed by the webserver itself, which could be as much as 10MB (approx.) per Apache process.
So for 200 simultaneous connections you're probably looking at a server with 4GB of RAM. Yes, lighttpd can help trim down the amount of memory each user consumes on average.
Your bandwidth requirement depends on how much data you're pulling down per fetch (simple math can help you estimate it).
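To put numbers on the "simple math" above, here's the back-of-the-envelope version using the figures from this reply (5MB per PHP process, ~10MB per Apache process, 200 users; the page size and fetch rate are assumed values, plug in your own):

```python
# Rough capacity math; every number here is an estimate, not a benchmark.
php_mb_per_proc = 5       # average PHP process footprint (from above)
apache_mb_per_proc = 10   # approximate Apache overhead per process
users = 200

ram_mb = users * (php_mb_per_proc + apache_mb_per_proc)
print(ram_mb)  # 3000 MB, so a 4GB box leaves headroom for the OS and caches

# Bandwidth: average fetched page size times sustained fetch rate.
avg_page_kb = 100         # assumed average page size
fetches_per_sec = 50      # assumed sustained fetch rate
mbps = avg_page_kb * fetches_per_sec * 8 / 1000
print(mbps)  # 40 Mbit/s sustained
```

Swap in your real averages; the point is that RAM scales with concurrent processes while bandwidth scales with page size times fetch rate.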
A high number of threads just causes thrashing from constant context switching.
What you need is a small thread pool managing *all* the connections. Memory is not a big deal. You should be able to do this in under 50MB.
The biggest chokepoint will be DNS resolution if you're using anything based on gethostbyname(), as it is synchronous. What you need to overcome that is an async DNS resolver library.
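To show why the synchronous resolver hurts and what the workaround buys you: a full async resolver library (c-ares is the usual one in C) is ideal, but even just pushing lookups onto worker threads stops one slow nameserver from stalling every other fetch. A rough sketch in Python (the worker count and the injectable resolver are illustrative assumptions):

```python
import socket
from concurrent.futures import ThreadPoolExecutor

def resolve_all(hosts, resolver=socket.gethostbyname, workers=50):
    """Resolve many hostnames in parallel so one slow lookup can't
    stall the whole crawl; hosts that fail to resolve map to None."""
    def lookup(host):
        try:
            return host, resolver(host)
        except OSError:
            return host, None
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(lookup, hosts))
```

With gethostbyname() inline, total crawl time is the *sum* of every lookup; with a pool like this it's roughly the *slowest* lookup in each batch.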
Awesome, thanks for all of the replies, that's very helpful.
Someone else I talked to suggested looking into Amazon's Cloud service for this sort of thing. Anyone have any experience in regards to how fast those connections are?