  1. #1

    Is this bandwidth usage normal??

    Sitemeter is telling me the site gets an average of 38000 page views a day.

    The web host is telling us we're using 51gb of bandwidth a day on average.

    This is not a video site or a download site. It's an article site with JPG images. Our biggest page has about 2.4 MB of images on it; the average page is about 1.34 MB, counting the navigation images which repeat on every page.

    But don't users' browsers cache anything?

    Is that bandwidth usage normal?

  2. #2
    It should be normal usage for that many page views and that page size. Remember the total includes both incoming and outgoing bandwidth.

  3. #3
    Join Date
    May 2006
    Posts
    875
    Maybe it's because of hotlinking?
    Disable it and see how it goes.
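If hotlinking does turn out to be the culprit, a common way to block it is a mod_rewrite rule in .htaccess. A rough sketch, assuming Apache with mod_rewrite enabled; `mysite.com` is a placeholder for the real domain:

```apache
# Deny image requests whose Referer is set but is not our own site.
# Blank referers are allowed so browsers/proxies that strip the
# Referer header still get the images.
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?mysite\.com/ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]
```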

  4. #4
    There is no incoming traffic besides the requests. In other words, users can't upload files or anything.

  5. #5
    How many unique visitors? If you have heaps of unique visitors, there'd be less caching.

    Either way you look at it, that bandwidth does not match that number of page views. Do you need to put tracking code on the pages you want stats for? If so, you may be missing that code on some of the pages. What I'm getting at is that your stats page may be showing fewer page views than you're actually getting.

  6. #6
    Join Date
    Jun 2004
    Location
    Bay Area
    Posts
    1,320
    38,000 × 1.34 MB ≈ 51 GB
    It's exactly what you'd expect when caching is not used.

    You might want to check the headers sent by your web server; it could be that caching is somehow not allowed. An easy way to check is Firefox with the Live HTTP Headers plugin.
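That back-of-the-envelope math checks out; a quick sanity check using the thread's own numbers:

```python
# Daily bandwidth if every page view downloads the full average page,
# i.e. with no browser caching at all. Figures from the thread.
page_views = 38_000
avg_page_mb = 1.34
gb_per_day = page_views * avg_page_mb / 1024
print(f"~{gb_per_day:.1f} GB/day")  # ~49.7 GB/day, roughly the 51 GB reported
```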

  7. #7
    Join Date
    Mar 2007
    Location
    UK
    Posts
    852
    Also, is the site accessed through proxies or anything similar? A proxy that doesn't cache the site means the images get re-downloaded on every page view.

    Check your site stats and see where the bulk of the bandwidth is going; programs such as Webalizer let you see which file or image accounts for the largest share of it.

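If your stats package can't break bandwidth down per file, the same tally can be pulled straight from the Apache access log. A small sketch, assuming combined log format; the sample lines are made up for illustration:

```python
from collections import Counter

# Sum bytes served per URL from Apache combined-format log lines.
# These sample lines are hypothetical stand-ins for a real access log.
sample_log = [
    '1.2.3.4 - - [28/Jul/2007:17:46:11 +0000] "GET /images/navbar.gif HTTP/1.1" 200 529',
    '1.2.3.4 - - [28/Jul/2007:17:46:12 +0000] "GET /article/1 HTTP/1.1" 200 40210',
    '5.6.7.8 - - [28/Jul/2007:17:47:02 +0000] "GET /article/1 HTTP/1.1" 200 40210',
]

bytes_by_url = Counter()
for line in sample_log:
    parts = line.split('"')
    url = parts[1].split()[1]           # request line: METHOD URL PROTOCOL
    size = parts[2].split()[1]          # trailing fields: status, bytes
    if size.isdigit():
        bytes_by_url[url] += int(size)

# Biggest bandwidth consumers first
for url, total in bytes_by_url.most_common():
    print(url, total)
```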

  8. #8
    Here are headers for an image:

    Code:
    http://www.<mysite>.com/images/navbar.gif
    
    GET /images/navbar.gif HTTP/1.1
    Host: www.<mysite>.com
    User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.5) Gecko/20070713 Firefox/2.0.0.3 MEGAUPLOAD 1.0
    Accept: image/png,*/*;q=0.5
    Accept-Language: en-us,en;q=0.5
    Accept-Encoding: gzip,deflate
    Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
    Keep-Alive: 300
    Connection: keep-alive
    Referer: http://www.<mysite>.com/
    Cookie: PHPSESSID=802f06c2f629031bc83eddce62e9a45a
    
    HTTP/1.x 200 OK
    Date: Sat, 28 Jul 2007 17:46:11 GMT
    Server: Apache/2.0.52 (Red Hat)
    Last-Modified: Mon, 12 Mar 2007 00:42:35 GMT
    Etag: "c6434e-211-ca034c0"
    Accept-Ranges: bytes
    Content-Length: 529
    Keep-Alive: timeout=5, max=96
    Connection: Keep-Alive
    Content-Type: image/gif
    The headers for the HTML do have "Expires: Sun, 19 Nov 1978 05:00:00 GMT" and "Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0". But that's only on the initial HTML, not on any of the other pieces of the page. So that's not the problem, right?


    It's not served through a proxy..

    The site is entirely Drupal-based and the stat counter is in the template, so it's on every page.
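For what it's worth, the image response above can also be checked programmatically. A small sketch that parses those pasted response headers and flags which caching-related ones are present; note that Cache-Control and Expires are absent, leaving browsers to fall back on Last-Modified heuristics:

```python
from email.parser import Parser

# Response headers for navbar.gif as pasted above (status line omitted).
raw = """\
Date: Sat, 28 Jul 2007 17:46:11 GMT
Server: Apache/2.0.52 (Red Hat)
Last-Modified: Mon, 12 Mar 2007 00:42:35 GMT
Etag: "c6434e-211-ca034c0"
Accept-Ranges: bytes
Content-Length: 529
Content-Type: image/gif
"""
headers = Parser().parsestr(raw)
for name in ("Cache-Control", "Expires", "Last-Modified", "Etag"):
    print(f"{name}: {'present' if name in headers else 'MISSING'}")
```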

  9. #9
    Join Date
    Oct 2004
    Location
    Southwest UK
    Posts
    1,175
    It probably is. If the other pieces of the page are served with the same headers as the initial HTML, then yes, you are serving pages that tell browsers not to cache anything. Get rid of that and see what happens.

    And moved to Technical & Security.

  10. #10
    Wait, what? If the initial HTML page has no-cache headers, then everything else on the page isn't cached either? Are you sure? How do CDNs work then? Like Akamai and so on...

  11. #11
    Use Mark Nottingham's cacheability test; it will go through a page and mark the cacheability of each element.

    I think the site is mnot.org

    search for "cacheability test"

  12. #12
    OK thanks, I tried this:

    http://www.ircache.net/cgi-bin/cacheability.py

    Most of the images on the page say:

    This object doesn't have any explicit freshness information set, so a cache may use Last-Modified to determine how fresh it is with an adaptive TTL (at this time, it could be, depending on the adaptive percent used, considered fresh for: 3 weeks 6 days (20%), 9 weeks 6 days (50%), 19 weeks 6 days (100%)). It can be validated with Last-Modified.
    So stuff should be caching, right?
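Those adaptive-TTL figures come from heuristic freshness: with no explicit Expires or Cache-Control, a cache may treat an object as fresh for some percentage of its age since Last-Modified. The tool's numbers can be reproduced from the Date and Last-Modified headers in the navbar.gif response earlier in the thread:

```python
from datetime import datetime

# Date and Last-Modified from the navbar.gif response posted above.
last_modified = datetime(2007, 3, 12, 0, 42, 35)
response_date = datetime(2007, 7, 28, 17, 46, 11)
age = response_date - last_modified  # object is ~138 days old

# Freshness = age * adaptive percent, matching the tool's 20/50/100% tiers.
for pct in (0.2, 0.5, 1.0):
    fresh = age * pct
    weeks, days = divmod(fresh.days, 7)
    print(f"{int(pct * 100)}%: fresh for ~{weeks} weeks {days} days")
```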
