Results 1 to 6 of 6
  1. #1
    Join Date
    Oct 2005
    Posts
    130

    mod_security whitelist

    googlebot always gets blocked.

    ----------
    [Tue Sep 23 03:02:37 2008] [error] [client 66.249.72.243] ModSecurity: Access denied with code 500 (phase 4). Pattern match "<b>Warning<\\/b>.{0,100}?:.{0,1000}?\\bon line\\b" at RESPONSE_BODY. [file "/usr/local/apache/conf/modsecurity/modsecurity_crs_50_outbound.conf"] [line "42"] [id "970009"] [msg "PHP Information Leakage"] [severity "WARNING"] [tag "LEAKAGE/ERRORS"] [hostname "www.mysite.com"] [uri "/index.php"] [unique_id "wK49MEVIheIAAAF85nQAAAAd"]
    -----------

    How do I whitelist an IP range or a domain in mod_security?

    Thanks.

  2. #2
    Join Date
    Sep 2008
    Location
    Bangalore
    Posts
    77
    My suggestion would be to just disable mod_security locally for this one account using an .htaccess file.

    Just add the following to it:

    SecFilterEngine Off
    SecAuditEngine Off
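
    If you do go this route, a minimal .htaccess sketch would look like the following. Note this assumes mod_security 1.x, where these directives are allowed in .htaccess context; ModSecurity 2.x does not honor them there.

    ```apache
    # Disable ModSecurity 1.x filtering and audit logging for this account only.
    # The server config must permit these directives via AllowOverride.
    <IfModule mod_security.c>
        SecFilterEngine Off
        SecAuditEngine Off
    </IfModule>
    ```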

  3. #3
    Join Date
    Mar 2003
    Location
    Canada
    Posts
    8,873
    If you're using mod_security v2, you can whitelist your website by adding the following entry to the mod_security configuration file:

    SecRule SERVER_NAME "website.com" "phase:1,nolog,allow,ctl:ruleEngine=off"
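
    To answer the original question about whitelisting an IP range: a hedged ModSecurity 2.x sketch, using a regex on REMOTE_ADDR to cover Googlebot's 66.249.64.0/19 block (the 66.249.x.x prefix is inferred from the log entry above; verify Google's current crawler ranges yourself before relying on it):

    ```apache
    # Skip all rule processing for requests from 66.249.64.0 - 66.249.95.255
    # (regex approximation of the /19; adjust to the ranges you actually trust).
    SecRule REMOTE_ADDR "^66\.249\.(6[4-9]|[78][0-9]|9[0-5])\." \
        "phase:1,nolog,allow,ctl:ruleEngine=off"
    ```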
    Patrick William | RACK911 Labs | Software Security Auditing
    400+ Vulnerabilities Found - Free Quote @ https://www.RACK911Labs.com

    www.HostingSecList.com - Security notices for the hosting community.

  4. #4
    Join Date
    Oct 2005
    Posts
    130
    sabarishks, that is your best solution? You probably don't own a server. There's no point in me installing mod_security if I'm just going to disable it on some accounts. Especially since that's the account that always gets hacked.

    Pat, thanks for your help.
    Last edited by Joomla; 09-23-2008 at 04:43 PM.

  5. #5
    Join Date
    Apr 2000
    Location
    California
    Posts
    3,051
    I wouldn't disable it for a whole site, though there can be good reasons to. A better approach is to whitelist specific clients, such as spiders/bots, by checking the User-Agent header, which would make this easier. Exactly what is it flagging, though? It looks like it was denying access to index.php, and whether or not the client is a spider, that probably shouldn't be happening. If it's blocking a search engine spider, it might be blocking legitimate, normal users as well. What was in the response body that matched "on line"?
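
    Worth noting: rule 970009 fires when PHP warning text ("Warning ... on line ...") appears in the response body, so instead of whitelisting the client you can stop the leak at its source by not printing PHP errors into the page. A sketch, assuming mod_php where php_flag is valid in .htaccess:

    ```apache
    # Stop PHP from printing warnings/errors into the response body,
    # so outbound leakage rules like 970009 have nothing to match.
    php_flag display_errors off
    # Keep the errors in the error log instead:
    php_flag log_errors on
    ```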

  6. #6
    Join Date
    Oct 2005
    Posts
    130
    I got the rules from gotroot.com. I think their rules are a bit more restrictive. I've been analyzing the logs and I've found a couple of false positives.

    Do you have any example rules so I can quickly whitelist the spiders? The mod_security syntax is sometimes difficult to follow.
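
    One way to whitelist spiders in ModSecurity 2.x is to match on the User-Agent header. A sketch (keep in mind the User-Agent string is trivially spoofed, so an IP-based check like the one above is safer on its own):

    ```apache
    # Skip rule processing when the request claims to be Googlebot.
    # Caution: User-Agent can be faked; combine with IP/rDNS checks if possible.
    SecRule REQUEST_HEADERS:User-Agent "Googlebot" \
        "phase:1,nolog,allow,ctl:ruleEngine=off"
    ```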
