Log Wget Commands

#1
Engineer
Join Date: Jan 2005
Location: Scotland, UK
Posts: 2,533

Log Wget Commands


Logging wget commands is a simple process. Remember, you will have to set permissions on the binaries along with the directory; you don't want people looking to see what the new file is called. Generally, this is an excellent way to pick up problems, because most attacks nowadays are automated and do not put checks in place.

What this simple but useful script does is this: first you move the wget binary to a new file name; for extra security you can even move it to a completely different directory. Please remember that your wget path will differ from distro to distro; get the path by typing `whereis wget`.

Now, let's move our wget to a new name; for this example I chose ekigrowbwo. On my distro, wget is located at /usr/bin/wget.

To move it we use the mv command; for more information on this command, read the man page: `man mv`

mv /usr/bin/wget /usr/bin/ekigrowbwo

This renames wget to /usr/bin/ekigrowbwo. Now all we have to do is create our own wget script, so just open it up with pico:
pico /usr/bin/wget

Now place this simple script inside:

#!/bin/bash
ME=`whoami`
TIME=`date`
DIR=`pwd`
echo "$TIME - $ME - $1 - >> $DIR" >> /usr/bin/wget.log
/usr/bin/ekigrowbwo $1

and save and exit (Ctrl+X).

You should now have a simple script in place of your wget binary. Next, create the log file; the script logs to /usr/bin/wget.log, so just create that with the touch command.

touch /usr/bin/wget.log

That's you all done; you have now created a simple yet effective wget wrapper. Test it by typing `wget http://www.google.com`, then `cat /usr/bin/wget.log`; you should see what was downloaded, when, where, and by whom.
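Before touching /usr/bin, the wrapper logic can be tried out in a scratch directory. The sketch below is illustrative only: the /tmp paths and the stub "real" binary are stand-ins, not the paths from the guide.

```shell
# Dry run of the wrapper in a scratch directory, so you can see the log
# format before replacing the real binary.
mkdir -p /tmp/wgetlog-demo
cd /tmp/wgetlog-demo

# Stand-in for the renamed real binary.
printf '#!/bin/bash\necho "real wget called with: $1"\n' > ekigrowbwo
chmod +x ekigrowbwo

# The wrapper from the guide, pointed at the scratch paths.
cat > wget <<'EOF'
#!/bin/bash
ME=`whoami`
TIME=`date`
DIR=`pwd`
echo "$TIME - $ME - $1 - >> $DIR" >> /tmp/wgetlog-demo/wget.log
/tmp/wgetlog-demo/ekigrowbwo $1
EOF
chmod +x wget
touch wget.log

./wget http://www.google.com
cat wget.log    # one line: date - user - URL - >> working directory
```

Once that looks right, the same wrapper with the real /usr/bin paths behaves identically.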

I hope this helps you.
-Written by HostGeekZ.com
http://www.hostgeekz.com/guides/cPan...t_Commands.htm

__________________
Server Management - AdminGeekZ.com
Infrastructure Management, Web Application Performance, mySQL DBA. System Automation.
WordPress/Magento Performance, Apache to Nginx Conversion, Varnish Implementation, DDoS Protection, Custom Nginx Modules
Check our wordpress varnish plugin. Contact us for quote: sales@admingeekz.com



#2
New Member
Join Date: Jun 2005
Posts: 1
Excellent! Thanks very much for this. I have been spending a lot of time in log files, among other things, trying to find out who might be using the wget command. I also wish there were an easier way to find out where the junk found in the tmp folder comes from.

Thanks again and I'll give this a try.

#3
Engineer
Join Date: Jan 2005
Location: Scotland, UK
Posts: 2,533
Most of the "junk" found in /tmp is from webshells nowadays.

Just get the file name and go to your Apache logs directory. For cPanel users this is /usr/local/apache/domlogs.

Then run:

cat * | grep filename

and you have what they exploited and what was "wgeted".
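As a side note, grep can take the file list itself, so the extra cat isn't needed, and `grep -l` shows which domlog matched. The sketch below runs against a scratch directory with a made-up log line; on cPanel the real logs live in /usr/local/apache/domlogs.

```shell
# Build a fake domlog to search (illustrative file name and request line).
mkdir -p /tmp/domlogs-demo
echo '1.2.3.4 - - [10/Jul/2005] "GET /vuln.php?cmd=wget%20evil.sh HTTP/1.1" 200' \
    > /tmp/domlogs-demo/example.com

grep -l 'evil.sh' /tmp/domlogs-demo/*   # which vhost log matched
grep 'evil.sh' /tmp/domlogs-demo/*      # the matching request lines themselves
```

Searching for the fetched file name this way points you straight at the vulnerable script and the domain it belongs to.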


#4
Junior Guru Wannabe
Join Date: Jul 2005
Posts: 62
I really feel kind of bad pointing this out, but that doesn't stop anyone from uploading their own copy of wget and running it in their own home directory. It most definitely doesn't stop users from connecting to remote servers via CGI scripts or local applications. Additionally, there are a plethora of other commands that have "wget functionality." For example, fetch, ncftp, and the lynx text browser with the correct "just dump it here" flags.

Additionally, the 'wget' program runs as the user executing it. This means that any log file you create is going to have to be world writable. Once that's done, there is nothing stopping *any* user from simply removing that file (never mind the fact that the wget wrapper is by nature world readable).

Limiting the command is superficial. The real goal should be hardening the system to the point that it doesn't matter *what* they download.

If anyone is really interested in logging commands that are run by users, look into standard Unix process logging.

__________________
http://www.blue-giraffe.com

#5
Web Hosting Master
Join Date: Oct 2003
Posts: 566
Additionally, the script quoted above effectively strips all parameters passed to wget except the first one. And there's actually no need to upload your own wget binary, since the location of the "true" binary is easily viewable (the script has to be world-readable, because the bash interpreter, running with the user's rights, needs to parse it).

Calling the renaming of a binary a "security measure" is quite a perversion of the term, by the way.

Actually you would achieve quite the opposite: your users would be given the means to cripple your server by filling up your /usr partition with junk, since, as mentioned before, the log file would have to be world-writable.

Try

cp /dev/zero /usr/bin/wget.log

and wait a few minutes to see what I mean ;-).

By the way, deleting the file wouldn't be possible for users but nothing would prevent them from overwriting it with /dev/null as opposed to /dev/zero ;-).

If you found a way to let your script run as a different user, it would make things even worse, since there are no input checks. Passing commands in backticks would allow you to execute arbitrary commands with the script's full user rights.

Like

yourscript "`cd $HOME >/dev/null; ls -l >/tmp/foo.bar; chmod 666 /tmp/foo.bar >/dev/null`"

The "script" wouldn't even log anything in that case, since the above does not generate any output on STDOUT.

Edit: Just saw that the "script author" is actually offering hosting and is even displaying this so called "guide" on his page. Did I miss something? Is all of this part of some kind of kiddie host joke?


Last edited by aldee; 07-10-2005 at 01:27 PM.
#6
Engineer
Join Date: Jan 2005
Location: Scotland, UK
Posts: 2,533
Firstly, I am not a host.

Second, I clearly stated that it is not very good.

Instead of all your lame attempts at going on about what it actually does, I also clearly stated, and I quote:

Quote:
because most attacks nowadays are automated and do not put checks in place.
Where on earth does it say that running this will keep your server secure? Go on, tell me.

"Did I miss something? Is all of this part of some kind of kiddie host joke?"

Again, I state I am not a host, nor do I even offer any sort of web hosting. So get your facts right; you are clearly stupid if you thought it was a web host, to put it bluntly.

It was one of the ways to pick up the r0nin exploit especially quickly without checking Apache logs.

So you come into a thread and rant about things I never stated it would do, for what? To show that you know everything?

Quite frankly, I do not appreciate your comments, because I have never once stated it will do much; it is also clearly stated that it is for automated attempts.

So in future, read what was said before going on your idiotic rants.

-Scott


#7
Web Hosting Master
Join Date: Oct 2003
Posts: 566
The point is not that it doesn't secure your server. It is that it opens your server up to DoS attacks from local users (or compromised user accounts) by granting them the means to fill up your /usr partition, and that it cripples the use of wget for legitimate users, since $1 (as opposed to $*) only contains the first command line parameter passed to the script.
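The $1 complaint above can be illustrated in a few lines of bash (the function name and arguments here are made up for demonstration): "$1" is only the first argument, while "$@" carries every argument through intact.

```shell
# Demonstrate what the wrapper drops: everything after the first argument.
show_args() {
    echo "first-only: $1"
    echo "all: $@"
}

show_args --limit-rate=20k -q http://example.com/file.tar.gz
# prints:
# first-only: --limit-rate=20k
# all: --limit-rate=20k -q http://example.com/file.tar.gz
```

So any legitimate user calling wget with flags, or with more than one URL, gets silently broken behaviour from the wrapper.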

Posting something like this in a forum that might be read by people who could potentially make use of the script is grossly negligent.

And frankly, I did not comment on this thread to earn your appreciation.

EOD as far as I am concerned.

#8
Engineer
Join Date: Jan 2005
Location: Scotland, UK
Posts: 2,533
It does not open up permissions to the full /usr partition.

It clearly states one, and I repeat, one(!) log file.

It only logs the first string because, if you ever look at what passes through, 90% of the time it is one string, nothing more. You can change it to read all strings if you wish, but very rarely will you find automated attempts with switches and more than one URL.

-Scott


#9
Junior Guru Wannabe
Join Date: Jul 2005
Posts: 62
HG, his point is this:

cat /dev/zero >> /usr/bin/wget.log... that command, run by any user, would fill up the /usr partition. Depending on how your system is configured, it could fill up your entire drive. Watch:

[root@marvin jeff]# touch /usr/bin/wget.log
[root@marvin jeff]# chmod 777 /usr/bin/wget.log
[root@marvin jeff]# exit
[jeff@marvin ~]$ id
uid=500(jeff) gid=500(jeff) groups=500(jeff) context=user_u:system_r:unconfined_t
[jeff@marvin ~]$ date
Sun Jul 10 17:50:55 EDT 2005
[jeff@marvin ~]$ ls -al /usr/bin/wget.log
-rwxrwxrwx 1 root root 0 Jul 10 17:50 /usr/bin/wget.log
[jeff@marvin ~]$ cat /dev/zero > /usr/bin/wget.log

[jeff@marvin ~]$ ls -alh /usr/bin/wget.log
-rwxrwxrwx 1 root root 341M Jul 10 17:51 /usr/bin/wget.log
[jeff@marvin ~]$ date
Sun Jul 10 17:51:17 EDT 2005
[jeff@marvin ~]$

The point is that by adding the logging functionality, you're at the same time opening a bigger hole. While someone MIGHT take your machine offline with something they bring down over HTTP, they *WILL* take your machine offline if they can fill up your file systems.

By not accepting more than one argument, you almost guarantee that users will find your script. If a user passes a flag to wget and gets a strange error, they're going to explore that a bit.

I'm not attacking you, I just want to make sure it's clear that individuals following this tip might be setting themselves up for some pain without realizing it.

If execution logging is important to you, look into process logging.

Look, it's great that you want to help out a community. Many folks couldn't care less. But be aware that *many* people will follow what you've typed up verbatim and take what you say as gospel, especially when you run a site full of tips and tutorials.

-Jeff


#10
Engineer
Join Date: Jan 2005
Location: Scotland, UK
Posts: 2,533
I understand what you are talking about, but you are all still missing the basic point of it. It was something I wrote off the top of my head, just a rough idea I had at the time, like the rest of the guides I make.

It's for automated attempts. You say they can just fill up the partition; well, if someone has access, I'm afraid they are going to do a lot worse than that anyway. Not to mention, I have never ONCE seen anyone look at /usr/bin/wget or the relevant path.


#11
Junior Guru Wannabe
Join Date: Jul 2005
Posts: 62
Then, please, do everyone a favor and ensure you mention that there are inherent risks with doing this type of thing this way.


#12
Engineer
Join Date: Jan 2005
Location: Scotland, UK
Posts: 2,533
You say that now. Look at a lot of things around here; there are ALWAYS risks.

It just depends on how you deal with them.


#13
Junior Guru Wannabe
Join Date: Jul 2005
Posts: 62
I'm not going to bicker back and forth with you. I've made every effort to point out issues I've seen in your work with a very high level of respect. It wasn't me that replied in an aggressive manner.

The bottom line here is that you are doing a disservice to the hosting community by taking an authoritative stance on a topic and dispensing misleading information. If you're comfortable taking that approach, I'm comfortable letting you do it.


#14
Engineer
Join Date: Jan 2005
Location: Scotland, UK
Posts: 2,533
I have read over what I have been saying, and I do not mean to come across like that, although I have been like it all day for some reason.

I totally understand what you are saying, but I am too stubborn to agree with you 100%, because I have my own point of view.

I guess it's up to the person to decide.

-Scott


#15
Web Hosting Master
Join Date: Apr 2003
Location: UK
Posts: 2,560
A better, more foolproof way is to use a kernel-level tool such as grsec to log execution, and then run something like `grep wget messages` to get the wget commands from the log.
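grsec requires a patched kernel; on a stock kernel the Linux audit framework can log executions of a single binary in much the same way. The rule fragment below is a sketch of that alternative (the file path and key name are assumptions, not something from this thread):

```
# e.g. in /etc/audit/rules.d/wget.rules on a system running auditd:
# watch /usr/bin/wget for execute (x) access, tagged with a search key
-w /usr/bin/wget -p x -k wget-exec
```

After auditd reloads its rules, `ausearch -k wget-exec` lists each execution along with the uid that ran it, with the log kept root-owned rather than world-writable.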
