I have a very large, heavy-traffic website with more than 700,000,000 pages, all of them generated dynamically.
I'm looking to optimize the website using a cache engine and want to cache all 700,000,000 pages. I found in the forum that mmcache is the best cache engine for PHP, but the problem is that it stores all the cached pages in a single directory. Storing 700,000,000 files in a single directory will create a lot of problems for the server.
Is there any caching engine out there that can store the cached pages in multiple subdirectories based on directory names? i.e. /tmp/cache/1/2/3/123234324.something instead of /tmp/cache/123234324.something
MMCache is open source, so you could probably modify it to store files differently. You could do it yourself, or if you'd rather not, make a post in the offers forum and get someone else to do it for a small fee.
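The fan-out scheme you describe is simple to compute on its own; here is a minimal sketch in PHP (the helper name cache_path() is my own, not part of mmcache, and I've used an md5 hash for the bucket levels rather than the raw digits from your example, since a hash spreads files more evenly):

```php
<?php
// Map a cache key to a fanned-out path such as /tmp/cache/a/b/c/abc123....cache
// The three sub-directory levels come from the first characters of an md5
// of the key, giving 16^3 = 4096 buckets instead of one flat directory.
function cache_path($baseDir, $key)
{
    $hash = md5($key);
    $dir  = $baseDir . '/' . $hash[0] . '/' . $hash[1] . '/' . $hash[2];
    if (!is_dir($dir)) {
        mkdir($dir, 0755, true);   // create the nested directories recursively
    }
    return $dir . '/' . $hash . '.cache';
}
```

With 700,000,000 files and 4096 buckets you would still have ~170,000 files per directory, so in practice you would want more levels (e.g. four or five characters of the hash), but the idea is the same.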
I can make the changes myself if you can guide me a little.
I have already installed mmcache on the server.
Do I need to get the mmcache source code, modify it, and recompile for this to work, or can I modify the already-compiled module to do the job?
I have always used jpcache; it works in all PHP environments. I also modified it to give basic protection from DoS attacks and bandwidth stealing: if a user refreshes a few times in a row, the user is blocked for an adjustable time. This reduces server load and bandwidth, since jpcache uses gzip compression to send the data.
jpcache + mmcache = incredible performance.
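The refresh-blocking idea above can be sketched as a stand-alone function (this is my own minimal version, not jpcache's actual code; the throttle() name and the per-IP timestamp-file layout are assumptions):

```php
<?php
// Block a client that re-requests too often. Returns true if the request
// should be served, false if the client is currently blocked.
// $maxHits requests are allowed inside a $window-second sliding window;
// hit timestamps are kept in a tiny file per IP address.
function throttle($ip, $dir = '/tmp/throttle', $maxHits = 5, $window = 60)
{
    if (!is_dir($dir)) {
        mkdir($dir, 0755, true);
    }
    $file = $dir . '/' . md5($ip);
    $now  = time();
    $hits = array();
    if (is_file($file)) {
        // keep only the hits that are still inside the window
        foreach (file($file) as $t) {
            if ((int)$t > $now - $window) {
                $hits[] = (int)$t;
            }
        }
    }
    if (count($hits) >= $maxHits) {
        return false;               // too many refreshes: block this client
    }
    $hits[] = $now;
    file_put_contents($file, implode("\n", $hits));
    return true;
}
```

You would call it at the top of the page with the visitor's IP and send a 503 (or just a short "slow down" page) when it returns false.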
If you are a good programmer, you can do whatever you need; you can modify jpcache and mmcache to do the job.
You can also implement your own cache system based on jpcache, or modify jpcache to do your task, exactly as we do.
If you explain your problem a bit more clearly, we will all be glad to help you.
Something else: what do you mean by many files in one directory causing problems on the server? We've been using mmcache for a while on shared hosting servers, and we never faced a single problem.
Almahdi, in my experience, if you put a lot of files, say more than 10,000, in one directory, the OS and CPU have a hard time accessing them. It's advisable to separate the files into different directories so the OS can access them easily.
Actually, I want to cache each page of my website in 3 parts: header, body, and footer. The header and footer will change according to the reseller account, but the body will be the same for every reseller while containing unique content for each page.
I will see how easy it is to modify jpcache to cache different parts of the same page instead of the whole page.
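A sketch of what that three-part split could look like as stand-alone code (this is my own illustration in modern PHP with closures, not jpcache's API; all function names here are assumptions):

```php
<?php
// Cache the three parts of a page under separate keys, so the body can be
// shared between resellers while the header/footer stay per-reseller.
// fetch_part() returns cached HTML if it is fresh, or regenerates it
// by calling $generate() and storing the result.
function fetch_part($cacheDir, $key, $ttl, $generate)
{
    $file = $cacheDir . '/' . md5($key) . '.html';
    if (is_file($file) && filemtime($file) > time() - $ttl) {
        return file_get_contents($file);          // cache hit
    }
    $html = $generate();                          // cache miss: rebuild
    file_put_contents($file, $html);
    return $html;
}

// Assemble a page: header/footer are keyed per reseller,
// the body is keyed per page only, so every reseller shares it.
function render_page($cacheDir, $reseller, $pageId, $parts)
{
    return fetch_part($cacheDir, "header-$reseller", 3600, $parts['header'])
         . fetch_part($cacheDir, "body-$pageId",     3600, $parts['body'])
         . fetch_part($cacheDir, "footer-$reseller", 3600, $parts['footer']);
}
```

The point of the split is that a body cached once for page 42 is reused by every reseller, so the expensive part is generated only once.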
Quote: "if you put a lot of files, say more than 10,000, in one directory, it will give the OS & CPU a hard time accessing those files. It's advisable to separate files into different directories so the OS can access them easily."
We've been running shared hosting servers, and they all run mmcache for caching with no problems at all.
Anyway, you can modify jpcache to do the job easily... if you know what you are doing.
700,000,000 <-- this number seems a bit far-fetched to me. You say the pages are generated dynamically, so there aren't actually that many static files.
Unless you plan to invest a large amount of money in a real caching/high-availability system, you must decide which pages need caching and which are okay to generate dynamically. For the pages that need caching, have a script generate them and store them on disk (turning them into static files). Then cache only those files.
I really think that you are approaching the problem from the wrong direction.