It really depends on what kind of site you will be hosting. If you plan on hosting a small forum (15-20 users on at any time), it would be fine. If you plan on running a hosting company, then you might look for something more robust. I believe (correct me if I am wrong) that Apache launches a new process every time someone connects to your site, so a cap of 20 processes would mean a maximum of 20 users on your site at any given time.
Nice point! I will have to ask them about the processes to find out...
Basically I am going to use it for a "guide to the Internet" site, a sort of directory, but listing only the best sites per category.
So not many dynamic features, but possibly quite a few visitors at any given time (and bear in mind that the site might grow over time), so your point about how many visitors are allowed at any one time is very important...
You seem like an experienced guy to me, so how important is the "CPU 180 sec." limitation? I don't know that much, so can you guess any particular script (I mean ones commonly used on sites) that would have a problem running? I mean, for a database I guess "CPU 180 sec." is OK... Am I right?
If you take a closer look at my first line, you will notice that I used the word "shared".
I believe (correct me if I am wrong) that Apache launches a new process every time someone connects to your site, so a cap of 20 processes would mean a maximum of 20 users on your site at any given time.
When you start up Apache, it creates several child processes. Each child process handles one request at a time and can serve multiple requests over its lifetime.
If you are talking about processes that are forked from a child process, like CGI, then yes: if somebody executes a CGI script via the web, the child httpd process starts a new CGI process.
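To make the process model concrete, here is an illustrative httpd.conf fragment for the prefork model described above. The directive names are real Apache ones, but the values are made up for illustration and are not any particular host's configuration:

```apache
# Illustrative prefork settings; values are examples only.
StartServers          5     # child processes spawned at startup
MinSpareServers       5     # keep at least this many idle children around
MaxSpareServers      10     # kill off idle children beyond this count
MaxClients          150     # hard cap on simultaneous child processes
MaxRequestsPerChild 1000    # recycle a child after serving this many requests
```

Each child serves one request at a time, so MaxClients is effectively the cap on simultaneous connections; CGI processes are forked on top of that.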
so how important is the "CPU 180 sec." limitation? I don't know that much, so can you guess any particular script (I mean ones commonly used on sites) that would have a problem running? I mean, for a database I guess "CPU 180 sec." is OK... Am I right?
If you have a script that runs for 180 seconds of CPU time per request, or a MySQL query that executes for 180 seconds, then there must be something wrong with the script.
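For what it's worth, per-process CPU-seconds caps like "CPU 180 sec." are typically enforced through the operating system's resource limits. Here is a minimal, hypothetical Python sketch of that mechanism (not this host's actual setup); it uses a 1-second cap so the demo finishes quickly:

```python
import os
import resource

def run_with_cpu_limit(seconds):
    """Fork a child, cap its CPU time, and return its wait status."""
    pid = os.fork()
    if pid == 0:
        # Child: soft limit triggers SIGXCPU, whose default action terminates
        # the process; the hard limit one second later is the backstop.
        resource.setrlimit(resource.RLIMIT_CPU, (seconds, seconds + 1))
        while True:        # burn CPU until the kernel stops us
            pass
    _, status = os.waitpid(pid, 0)
    return status

status = run_with_cpu_limit(1)
print(os.WIFSIGNALED(status))  # → True: the child was killed by a signal
```

A host doing this per request means a runaway loop or query dies after the allotted CPU seconds instead of dragging the whole shared server down; 180 seconds of pure CPU time is far more than any well-behaved page render or database query should need.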
I came upon this thread about our company today and I'd just like to make some clarifications about the limits we have in place on our shared hosting servers.
First of all we have recently raised the limits to the following:
Memory (RAM): 65MB per process
CPU: 180 sec.
Simultaneously running processes (for PHP/CGI scripts): 20
The upgrade was a direct result of new features/modules compiled into our PHP installation.
The Memory (RAM) limit is per process; this means that each process can consume up to 65MB of RAM. It is not a combined limit for all the processes of a single hosting user.
This limit is in place to ensure that a single user cannot bring the server down. In other words, a user can start up to 20 processes, each consuming up to 65MB of RAM, so a single user can consume at most 1.3GB of RAM at any given time.
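The worst-case figure above is simply the two limits multiplied together; a quick sanity check (using the decimal MB/GB convention of the post):

```python
# Worst case for one user: 20 PHP/CGI processes, each at the 65MB cap.
processes = 20
mb_per_process = 65
total_mb = processes * mb_per_process
print(total_mb)          # → 1300
print(total_mb / 1000)   # → 1.3 (GB, decimal convention)
```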
The 20 Processes limit applies only to PHP/CGI scripts. It does not affect static HTML pages.
We are hosting more than 80,000 sites on our servers, and the aforementioned limits have proven to work well.
Of course, there have been occasions where we raised these limits for a particular customer depending on their application's requirements. For example, some graphics-processing scripts require more RAM to operate properly, a weblog may require more memory to rebuild its pages, etc.
I hope this information helped. Thank you for your time.