I was trying to set up my client's CGI script when I ran into a very strange situation. Apache keeps refusing to execute the script even though all the permissions are right and the Perl interpreter runs fine. The error log shows an entry that says "No such file or directory /home/site/cgi-bin/script.cgi".
I am using the RPM install of Apache 1.3.20 on Red Hat 7.2, and the script is a proxy script. I am running mod_cgi.
This is also strange in another respect: there are two sites set up, and one of them runs the script fine while the other gives the error, yet both sites have identical configurations. I have tried many different configurations, and Apache is evidently able to read the file, because I can download it (rather than execute it) if I don't set the ExecCGI option. Another strange thing: if I copy the script from the working site to the non-working site, the copy stops working too.
So far, the only explanation I have is that there is some "easter egg" in Apache or Linux or something in between that got activated by that directory (renaming it doesn't seem to help either).
I hope this information is not too confusing, but I am very puzzled myself. I would appreciate any kind of help.
Clocker, thanks for the suggestion. Right now I am just trying to get my client's site up as soon as possible. I plan to do some compiling later.
David, as mentioned above, I did check the log. The log says Apache cannot find the file, even though I have verified the permissions and that the file exists. I'm guessing it is not a vhost configuration problem because I have another site with the exact same configuration and directory layout, and it runs fine.
The log entry says:
"script not found or unable to stat: /home/site/script.cgi"
(edited to take out path name)
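In case it helps anyone checking the same thing, this is the sort of verification I mean (a sketch, assuming util-linux's namei is installed; path trimmed as above):

    # Walk every component of the path and show the permission bits
    namei -m /home/site/script.cgi
    # List the file itself to confirm it exists and is executable
    ls -l /home/site/script.cgi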
Do you have suEXEC enabled? If so, have you tried looking in the suEXEC error log? You've already made sure the script exists where Apache thinks it is, which makes me think it's either a vhost problem or a suEXEC problem. Also check directory permissions all the way back to the root directory. Good luck with this one!
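If you're not sure whether suEXEC is in play, something like this should tell you (assuming the stock Red Hat paths; yours may differ):

    # Apache 1.3 logs a notice at startup when the suEXEC wrapper is active
    grep -i suexec /var/log/httpd/error_log
    # The wrapper binary itself, if the RPM installed it
    ls -l /usr/sbin/suexec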
I am not sure if I have suEXEC, but I didn't notice any log named suexec (I looked). Also, that still wouldn't explain why one site works and the other doesn't, despite the same configuration (I copied and pasted it).
Actually, I have the problem solved, but not in a typical way. In a desperate attempt, we removed vsftpd and installed wu-ftpd. The script magically worked afterward. I had thought about the FTP upload mode, but that still wouldn't explain why scripts copied from the working site stop working when placed on the non-working site.
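Thinking about it more, my best guess (unconfirmed, just a sketch) is line endings: if I understand vsftpd's defaults right, it ignores ASCII transfer mode on uploads while wu-ftpd honors it, so a script edited on Windows could arrive with carriage returns under one daemon and clean under the other. A CR on the first line makes the kernel look for an interpreter literally named "/usr/bin/perl\r", which would produce exactly the "No such file or directory" error, and it would even explain the copied script, if the copy went back through FTP. A quick check and fix, assuming perl and od are on hand:

    # Show the first line with control characters made visible;
    # a \r before the \n means DOS line endings
    head -1 script.cgi | od -c | head -2
    # Strip the carriage returns in place
    perl -pi -e 's/\r$//' script.cgi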
Anyway, the whole thing is over. However, I appreciate all the help I have gotten.
If you get an internal server error when you browse to it, the error in your error log might suggest that the path to perl on the first line of the script doesn't match where perl actually lives on the server. To check:
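For example (using the path from the first post):

    # Where perl actually lives on this server
    which perl
    # What the script's first line claims
    head -1 /home/site/cgi-bin/script.cgi

The two should agree, e.g. a first line of #!/usr/bin/perl when perl sits at /usr/bin/perl.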
If you don't get an internal server error but a file not found instead, then most likely cgi-bin has been set up as an alias to a different path on the server. Your best bet is to search through httpd.conf, if you have access to it. If not, try creating a different cgi directory called /cgi-local, stick a .htaccess file in it with ExecCGI, and see if that works.
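A minimal sketch of what that .htaccess might contain, assuming httpd.conf has AllowOverride set high enough to honor it:

    Options +ExecCGI
    AddHandler cgi-script .cgi

And if the existing cgi-bin is aliased, you would see a line in httpd.conf along these lines, where the second path is the real location on disk:

    ScriptAlias /cgi-bin/ /home/site/cgi-bin/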
I did check to make sure the perl interpreter is at the right path. Also, as mentioned, there were two sites using the exact same configuration (with only the path names different); one works and the other doesn't, hence the puzzling part.
Anyway, I have already got this (mysteriously) solved, as mentioned above. And the client is no longer staying with me.