Hello, I am a very small ISP; I am not doing the $5.00-a-month hosting plans. I started building out a new server the other day, running PHP 6, Apache 2, and MySQL. BIND, Postfix, Dovecot, etc. run on another box that is not shared.
I know all my clients, and I am not worried about them being malicious. However, I want the shared environment to be safer for them, so that a bad WordPress install, or some other installed app, would be sandboxed to a degree. There is no SSH access currently, just FTP and FTP with TLS; I will probably be mandating TLS on this server.
I am building from source and managing via the command line; I am not looking for a control panel. I spent the better part of a day looking into how to secure PHP, but most of the material is old (much of it about safe mode) and a lot of it is all over the board. Is there a definitive guide I am missing?
The current issue is that the usual tactics of file_get_contents('../../db.inc.php') or exec('cat ../../../foo.txt') will allow someone to read other users' files. There are many ways to ../../ and recurse.
I am using mod_php. From what I can gather, the other options (suEXEC, etc.) are performance hits, and they share the same weakness if the users do not know what they are doing, which most of mine do not.
Of course, I set a virtual host and a Directory block in Apache 2, locking the httpd process to that directory. That does not stop PHP from looking elsewhere, since all files are owned by a user and group that Apache can serve.
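For context, this is roughly the shape of the per-client vhost I mean (the paths and ServerName are just examples, and the access-control lines are Apache 2.2 syntax; 2.4 uses "Require all granted" instead):

```apache
<VirtualHost *:80>
    ServerName client1.example.com
    DocumentRoot /home/client1/public_html

    <Directory /home/client1/public_html>
        # No directory listings, no following symlinks out of the tree,
        # and no .htaccess overrides by users
        Options -Indexes -FollowSymLinks
        AllowOverride None
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>
```

As noted, this only constrains what Apache serves over HTTP; PHP code running inside the process can still open any file the Apache user can read.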
Looking into open_basedir: this seems to solve a lot of this, sandboxing a user to their own area. It does not appear, as far as I can tell, to sandbox system calls like exec(). I will of course limit those, though how many off-the-shelf forums and other packages would break? Or is this really a case of those functions not being used in community software?
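A sketch of how this could look with mod_php, assuming hypothetical per-client paths: open_basedir can be set per vhost with php_admin_value (which user scripts and .htaccess cannot override), but note that disable_functions is a php.ini-only setting, so the exec() restriction ends up being global:

```apache
<VirtualHost *:80>
    ServerName client1.example.com
    DocumentRoot /home/client1/public_html

    # Confine PHP file operations to the client's tree plus a tmp dir.
    # The trailing slash matters: "/home/client1" would also
    # match "/home/client1evil".
    php_admin_value open_basedir "/home/client1/:/tmp/"

    # Keep uploads and sessions inside the sandbox too
    php_admin_value upload_tmp_dir "/home/client1/tmp"
    php_admin_value session.save_path "/home/client1/tmp"
</VirtualHost>
```

Then, in php.ini (server-wide, since disable_functions cannot be set per vhost):

```ini
; Shell-execution functions that open_basedir does not cover
disable_functions = exec,passthru,shell_exec,system,proc_open,popen
```

In my reading, most mainstream packages (WordPress, phpBB, etc.) run fine without the exec family; the usual casualties are plugins that shell out to ImageMagick or similar.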
I am looking for some general pointers on how to secure a shared hosting environment so users cannot read files other than those in a defined directory, and so an exploit of something like WordPress would be confined to that user's files and DB as well.
I am assuming every $5.00-a-month host has figured this out; I just cannot find anything current.
Hmmm, I've been trying suPHP, DSO, and FastCGI. suPHP seems to be a memory eater, or perhaps even a CPU killer (literally), FastCGI seems to be throwing loads of Error 500s, and DSO (mod_php) seems to be okay so far... I'm just really concerned about the security issues it raises, and hopefully someone can enlighten me about this?
Compile Apache with mod_fcgid and PHP with FastCGI support, and configure Apache to handle the child processes, timeouts, etc. You will see 500 errors when using FastCGI with PHP handling the timeouts itself. There is quite a bit of memory usage involved, but coupled with suEXEC you can provide a pretty secure environment; just get the permissions right on the system. I'm assuming all of this works correctly on Linux, though memory management may be a little different there. We run a similar config on all of our shared servers (FreeBSD). No problems.
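By "Apache handling child processes and timeouts" I mean something along these lines. These are real mod_fcgid directives, but the values are only starting points you would tune for your load, and the wrapper path is an example (Apache config does not allow trailing comments, so each comment sits on its own line):

```apache
<IfModule mod_fcgid.c>
    AddHandler fcgid-script .php
    # Wrapper script that execs the php-cgi binary (example path)
    FcgidWrapper /usr/local/bin/php-wrapper .php

    # Total PHP children across all vhosts
    FcgidMaxProcesses 100
    # Children per wrapper, i.e. per user
    FcgidMaxProcessesPerClass 8
    # Kill idle children after two minutes
    FcgidIdleTimeout 120
    # Recycle children hourly
    FcgidProcessLifeTime 3600
    # Raise these if long-running scripts return 500s
    FcgidIOTimeout 300
    FcgidBusyTimeout 300
</IfModule>
```

The companion change is to stop PHP managing itself: in the wrapper script, export PHP_FCGI_MAX_REQUESTS (and leave PHP_FCGI_CHILDREN unset or 0) so mod_fcgid, not PHP, spawns and reaps the processes. Letting both sides manage lifetimes is a common source of those 500 errors.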
Hmm, that seems like a good configuration to try out. But what do you mean by configuring Apache to handle child processes, timeouts, etc.? Is there a guide for it, since I'm quite a beginner and keep getting these 500 errors? Thanks.