Better to upgrade Dedicated Server or Buy 2/3?
#16
02-07-2013, 07:07 PM
JacobN
Junior Guru Wannabe
Join Date: Jan 2013
Location: Virginia Beach, Va
Posts: 52
Quote:
Originally Posted by RRWH View Post
So, short answer is you do not need to upgrade at the moment.

Long answer is NEVER upgrade without knowing the reason for excessive resource usage.
Yes RRWH, salmanpasta doesn't look like he needs a server upgrade at the moment, since we were able to block a lot of the unnecessary bot requests he was getting.

I can't agree with you more that you shouldn't simply throw more hardware at an issue without a good understanding of what's really causing the excessive usage. If you run a server, you should get really comfortable with connecting via SSH and looking at the requests in your access logs. I can't even count the number of times, like in this case, when telling a few bots to stop slamming a server decreased the average usage dramatically.
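As a rough sketch of that kind of log check (the log path and the combined log format are assumptions; adjust both for your own server), counting requests per client IP in an access log looks like this:

```shell
# Sample access-log lines stand in for a real /home/*/logs/* file; in the
# combined log format, the client IP is the first whitespace-separated field.
LOG=$(mktemp)
printf '%s\n' \
  '10.0.0.1 - - [07/Feb/2013:19:00:00 -0500] "GET /index.php HTTP/1.1" 200 512' \
  '10.0.0.1 - - [07/Feb/2013:19:00:02 -0500] "GET /index.php HTTP/1.1" 200 512' \
  '10.0.0.2 - - [07/Feb/2013:19:00:05 -0500] "GET /about HTTP/1.1" 200 128' > "$LOG"

# Count requests per IP, busiest last; a few IPs dominating the log is the
# classic signature of a misbehaving bot.
awk '{print $1}' "$LOG" | sort | uniq -c | sort -n
```

On a live server you would point awk at the real access log path instead of the sample file.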

__________________
Jacob Nicholson
System Admin turned customer advocate
Need help with server usage or optimizing WordPress? I'm here to help!

#17
02-13-2013, 01:34 PM
salmanpasta
Newbie
Join Date: Nov 2012
Posts: 12
Alright, so the problem still exists, but it's better.

I've been using the top -c command in SSH and watching what's going on. Since I have 37 sites running, I see index.php being hit a lot. The other big consumers are my cron scripts (PHP files) and XML-RPC, which is included in about a quarter of the cron scripts.
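For what it's worth, the same kind of log pipeline can show which scripts are taking the hits (the sample data below stands in for a real access log; adjust the path and field positions for your own log format):

```shell
# Sample combined-log lines; field 7 is the requested path.
LOG=$(mktemp)
printf '%s\n' \
  '10.0.0.1 - - [13/Feb/2013:10:00:00 -0500] "GET /index.php HTTP/1.1" 200 512' \
  '10.0.0.2 - - [13/Feb/2013:10:00:01 -0500] "POST /xmlrpc.php HTTP/1.1" 200 256' \
  '10.0.0.3 - - [13/Feb/2013:10:00:02 -0500] "GET /index.php HTTP/1.1" 200 512' > "$LOG"

# Count hits per requested path, busiest last, to see which scripts
# (index.php, xmlrpc.php, cron endpoints, ...) dominate the traffic.
awk '{print $7}' "$LOG" | sort | uniq -c | sort -n
```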

I just installed the P3 Profiler, and here are my WordPress plugin stats. Can anyone recommend anything?

http://s11.postimage.org/r1ietwb2b/p...erformance.jpg

This is what it looks like without contact form 7

http://s9.postimage.org/qnwiovmdb/profiler_2.jpg

It looks like W3 Total Cache is making everything slower for me. I have the Page, Minify, Database, Object, and Browser caches all enabled using Disk. I know that APC would probably be a better option, but I've disabled APC because it was caching my cron scripts, which I couldn't afford.


Last edited by salmanpasta; 02-13-2013 at 01:41 PM.
#18
02-20-2013, 09:06 PM
salmanpasta
Newbie
Join Date: Nov 2012
Posts: 12
I just added deny rules for a whole bunch of IPs. If you look through them, they're all either Amazon AWS or random proxies from Germany and Sweden. I believe that should do the trick? I'm guessing Alexa's bot would identify itself as Alexa, and probably wouldn't use a Windows NT user-agent.
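For reference, deny rules like that can live in a .htaccess file. Here is a minimal Apache 2.2-style sketch (the addresses are placeholders, not the actual IPs from this thread):

```apache
# Block individual addresses and a whole range (placeholder addresses)
order allow,deny
deny from 203.0.113.10
deny from 198.51.100.0/24
allow from all
```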

#19
02-21-2013, 02:07 AM
JacobN
Junior Guru Wannabe
Join Date: Jan 2013
Location: Virginia Beach, Va
Posts: 52

Quote:
Originally Posted by salmanpasta View Post
Another great post Jacob! I tried running

PHP Code:
zgrep "20/Feb/2013:06:4" /home/*/logs/* | grep "Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9) Gecko/2008052906 Firefox/3.0" | awk '{print $2}' | sort -n | uniq -c | sort -n 
And it just shows me 2290 -

It doesn't show the IP addresses. However, when I ran

PHP Code:
zgrep "Windows NT 5.1; de" /home2/*/logs/*-Feb-2013.gz | sed 's#^.*gz:##' | awk '{print $1}' | sort -n | uniq -c | sort -n 
It showed me all of the IP addresses (Not all from Amazon). Should I just block out all of these IP addresses with more than 2 hits? I was wondering this whole time why my server was spiking every morning. I thought that the Amazon bot could have been Alexa?

I just added deny to a whole bunch of IPs. If you take a look through them, they're all either Amazon AWS or random proxies from Germany and Sweden. I believe that should do the trick? I'm guessing Alexa's bot would say Alexa, and it probably wouldn't be Windows NT bot.
Hey salmanpasta,

Sorry about that incorrect code snippet. While testing, I originally had it a bit different, using the sed command to show which domains were getting requests as well as the IPs; I chopped that bit out so I could show you how to grab all the unique IPs hitting your server globally, and it looks like somewhere along the line I pasted the wrong thing.

The original code I had was this:

Code:
zgrep "20/Feb/2013:06:4" /home/*/logs/* | grep "Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9) Gecko/2008052906 Firefox/3.0" | sed -e 's#/home/.*/logs/##' -e 's#-Feb-2013.gz:# #' | awk '{print $2}' | sort -n | uniq -c | sort -n
That one does correctly display the IPs. And here is what I was also using:

Code:
zgrep "20/Feb/2013:06:4" /home/*/logs/* | grep "Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9) Gecko/2008052906 Firefox/3.0" | sed -e 's#/home/.*/logs/##' -e 's#-Feb-2013.gz:# #' | awk '{print $1,$2}' | sort -nk2 -k1 | uniq -c | sort -n
Here is some dummy example output from that:

Code:
85 shop.example.com 125.125.125.125
85 example.com 127.127.127.127
97 example.com 124.124.124.124
187 example.com 123.123.123.123
Nice job figuring out how to find the IPs on your own! The way you did it is also really nice in this case, since it captures requests from all of February with that user-agent, instead of just the 10-minute window I was grabbing from.

In this case, because we noticed a particularly old, odd, foreign user-agent, and then noticed that all of the IPs seem to belong to an Amazon service that people trying to become the next Google can use to run their own crawlers, it would probably be safe to block any IPs with more than a handful of requests.

However, just for knowledge's sake, keep in mind that the IP you block may be a shared IP address on that Amazon server. One user on that server running a bad crawler that doesn't abide by robots.txt rules might be doing so without the consent or knowledge of the other users on that same server. So if someone also happened to use that server's IP at some point to do something legitimate on your website, that connection would be blocked. Sending a 403 Access Denied HTTP response based on the user-agent, by contrast, would still allow the good users on that server to communicate with yours, while blocking the bad.
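A user-agent based 403 of that kind can be sketched in .htaccess with SetEnvIf (Apache 2.2 syntax; the pattern below matches the spoofed user-agent string discussed in this thread and may need adjusting for other bots):

```apache
# Tag requests whose User-Agent matches the suspicious string, then deny
# only those, leaving other clients on the same IP unaffected.
SetEnvIfNoCase User-Agent "Windows NT 5\.1; de" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```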

Now, that said, in most cases I'd say go ahead and block the IPs of potential bots that could cause problems, especially if they hit your sites semi-regularly, to help ensure good server availability for human visitors. But in some cases IPs are going to be hard to keep up with, as a bot creator can simply jump from one server IP to another and keep hitting you, so blocking by user-agent, or by anything really unique in the type of requests they send your server, is always going to be the better long-term solution.

If you do consistently notice a trend of always getting hit by bots from a certain provider, you could even take some measures as drastic as blocking their entire IP ranges which I mention in my article about blocking a range of IP addresses.

If that's something you're interested in doing for Amazon's EC2 service you can find their IP ranges here.

Most legitimate crawlers like Alexa will not mention any kind of web browser or OS in their user-agent, and a user-agent that does is usually a sign of a bot trying to disguise itself as legitimate traffic. Most good bots will also tell you where to go to find out more about them; here's Alexa's full user-agent string:

Code:
"ia_archiver (+http://www.alexa.com/site/help/webmasters; crawler@alexa.com)"



Last edited by anon-e-mouse; 02-21-2013 at 06:53 PM.
#20
02-21-2013, 03:49 PM
salmanpasta
Newbie
Join Date: Nov 2012
Posts: 12
Quote:
Originally Posted by JacobN View Post
Hey salmanpasta, ...
Thanks again! These are great tools for anyone else looking.


Last edited by anon-e-mouse; 02-21-2013 at 06:53 PM.
#21
02-21-2013, 09:28 PM
pxlfoo
Junior Guru Wannabe
Join Date: Jun 2007
Posts: 36
If bots are the issue, you might also consider blocking all proxies. On all the websites I operate this is routine (I don't see why anyone should be accessing my sites through a proxy).

Consider doing this:
http://perishablepress.com/block-tough-proxies/

And using this service:
http://www.shroomery.org/ythan/proxyblock.php

Here's a simple script I wrote that you can use with them:

<?php

// Look up the visitor's IP against the proxy-check service.
$ip  = $_SERVER['REMOTE_ADDR'];
$uri = 'http://www.shroomery.org/ythan/proxycheck.php?ip=' . urlencode($ip);

$res = file_get_contents($uri);

// The service returns "Y" for a known proxy; file_get_contents() returns
// false on failure, so check for that before comparing.
if ($res !== false && trim($res) === 'Y') {

    // ADDRESS BELONGS TO A KNOWN PROXY - HANDLE THEM HERE

}

?>


What I do is log the address and send them to a static ban page; you could also just send a 404 response or whatever...

#22
05-21-2013, 05:40 AM
akyeame
Newbie
Join Date: May 2013
Posts: 11
Similar issue

Hello,

I had a similar experience with InMotion Hosting over CPU usage. However, rather than investigating what is going on with possible bots, they DEMAND I throw more $$$ and hardware at the issue. I'm on a VPS and they are forcing me to upgrade to dedicated. Well, rather than doing that, I'm moving. My highest-traffic site is low traffic compared to anything. I even saw this thread initially and asked them to have JacobN check if the issue is the same, and they say they'll only check AFTER I upgrade. Bye, InMotion Hosting.

#23
05-21-2013, 12:07 PM
JacobN
Junior Guru Wannabe
Join Date: Jan 2013
Location: Virginia Beach, Va
Posts: 52

Hello akyeame,

Sorry to hear that you were having CPU usage problems on your VPS with us. Did you have a chance to follow any of my resource usage guides mentioned in this thread, or in my signature, prior to these issues?

I apologize if our system administration team was forced to suspend your VPS and required a temporary upgrade to a dedicated server platform to investigate the issues. I'm assuming they must have been pretty large resource usage issues, affecting other customers on the VPS platform as well, and that's why it couldn't remain unsuspended while being investigated.

If you'd like to private message me any of your account details, such as your domain name or VPS number, I'd be glad to see if there's anything else I can do to pinpoint what's being problematic while ensuring your server's usage isn't affecting other users.


#24
05-21-2013, 03:25 PM
akyeame
Newbie
Join Date: May 2013
Posts: 11
Hello JacobN,

You are actually the InMotion Hosting admin I was hoping to hear from, as I saw you resolved the issues of someone else using your hosting services. I pointed Michael L. to this thread, saying you had resolved it for someone else and that since my site is (relatively) low traffic, the culprit may be bots. But the hard upsell was:

"Any testing of this would need to be done on a dedicated platform."

I had just read through this thread and your quote that
Quote:
Originally Posted by JacobN
I can't agree with you more that you shouldn't simply throw more hardware at an issue, without a good understanding of what's really causing the excessive usage.
However, here I was with Michael L. demanding that I upgrade to (at the very least) a $119.99 dedicated server that actually costs $149.99 unless I can cough up 12 months' worth of cash.

The positive side is that according to Michael S.

Quote:
Your account has already been noted that you are migrating to another host, so we will make every effort to avoid suspending your account so as not to interfere with that process.
I can usually get 2-3 months without any issue before the resource police come knocking. Lately the culprit has been CometChat, for which I've installed CometService. The last straw was that even after disabling CometChat altogether, I was still told my resource usage REQUIRED an upgrade to dedicated. This is on a site that gets 17,000 uniques PER MONTH, rather than per day as some of the good folks here at WHT claim they get on other VPS hosting platforms.

I can PM you my account info for you to take a look at it as you may be the only guy at IMH who isn't trying to force upgrades but at this point...

#25
05-21-2013, 03:48 PM
JacobN
Junior Guru Wannabe
Join Date: Jan 2013
Location: Virginia Beach, Va
Posts: 52

Hey akyeame,

Sure, shoot me a PM and I'd be glad to take a look for you. It sounds like the VPS might be unsuspended at the moment, so hopefully I can collect some good data that pinpoints where your main resource usage has been coming from.

- Jacob


#26
05-21-2013, 05:01 PM
JacobN
Junior Guru Wannabe
Join Date: Jan 2013
Location: Virginia Beach, Va
Posts: 52
Hi again akyeame,

So after taking a look at your VPS's usage history, it does look like there were multiple problems tied back to the usage of CometChat, and it also looks like you had some problematic requests going to another one of your websites, which eventually led to us requesting that you upgrade to a dedicated server.

It looks like you did work back and forth with our systems team over tickets to try to get your usage reduced, but unfortunately it couldn't be reduced enough to stop causing problems for other VPS customers sharing the same VPS node hardware.

Today, for instance, the VPS continues to spike over acceptable usage levels; the culprits behind the usage seem to be mainly your forum, and also slide show pro on your most highly visited website.

I'll send you an email from our system with more information on what I'm seeing, but ultimately it does look like your current usage would warrant a dedicated server, unless you decide to either optimize or disable your forum, which appears to be where the bulk of the usage is coming from.

If you read the articles from my server usage link in my signature, specifically the ones in the server tools to help with usage section, you can see, using commands like sar, just how heavy your VPS's CPU activity was.
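As a sketch of that kind of check (sar comes from the sysstat package and only has history to show if sample collection is enabled, so treat this as illustrative rather than guaranteed on any given box):

```shell
# With sysstat installed, historical CPU usage can be replayed, e.g.:
#   sar -u    # today's CPU samples; the %idle column is your headroom
#   sar -q    # run-queue length and load averages over the same period
# Even without sysstat, the kernel's load averages give a quick read on
# current pressure (Linux-specific path):
cat /proc/loadavg    # 1min 5min 15min runnable/total last-pid
```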

Unfortunately, in these types of scenarios, because the VPS platform is shared, it would be pretty hard to get the usage reduced without simply disabling your forum, and without affecting other customers. That seems to be why the system administration department recommended getting a dedicated server to continue hosting with us, after a few back-and-forth attempts at getting the usage down to acceptable levels.

You can always upgrade to a dedicated platform temporarily while you fully optimize your websites, and then request a downgrade back to the VPS platform once it's been proven on the dedicated server that the usage would no longer pose a threat to the other users on the VPS platform.

