  1. #1

    Google Sitemap. XML, good for spiders but not for visitors?

    Hello,
I am trying to get my bearings with some SEO. I am using Google Webmaster Tools and I am working on a sitemap. I am going to use http://www.xml-sitemaps.com/ to create the XML file and then submit it.

    Now my question is: the XML file is not that great for actual human users. It looks like just a bunch of text (unless I am doing something wrong). So is there a way to create a Google-compliant XML sitemap that is usable by spiders AND users?

    Also, is it best practice to create a new sitemap each time I add pages/content and then resubmit?

    Thanks a lot

  2. #2
    Join Date
    Jul 2007
    Location
    Needham, MA
    Posts
    43
XML sitemaps help spiders know what pages exist. This is especially helpful if you have pages that are not organically accessible to the crawler via links. These XML-based sitemaps were never created with the end user in mind.

You'll have to have two strategies. One is your XML file, which no user will ever see; the other is an HTML file which could, for example, also serve as your 404 Not Found page, containing links to popular pages on your website. If you list EVERY page in this version of the sitemap, you'll be forced to update it constantly, and you may have too many pages for it to be useful to an end user. So just watch out for that.
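As a sketch, the human-facing version is just an ordinary page of links; the page names and URLs below are hypothetical:

```html
<!-- Minimal human-facing sitemap page; filenames and sections are hypothetical -->
<html>
<head><title>Site Map</title></head>
<body>
  <h1>Site Map</h1>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/products.html">Products</a></li>
    <li><a href="/articles.html">Articles</a></li>
    <li><a href="/contact.html">Contact</a></li>
  </ul>
</body>
</html>
```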

You SHOULD update your XML sitemap periodically and use the optional tags to let Google know if pages have been updated or modified, or if pages have been added or removed. Google automatically downloads your XML sitemap regularly and expects changes.
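For reference, a minimal sitemap in the sitemaps.org format Google consumes, with the optional tags shown; the URL and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-03-01</lastmod>      <!-- optional: when the page last changed -->
    <changefreq>weekly</changefreq>    <!-- optional: a hint, not a command -->
    <priority>0.8</priority>           <!-- optional: relative importance, 0.0 to 1.0 -->
  </url>
</urlset>
```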



    RatePoint Customer Feedback Platform and Site Seal
    Become a partner today!
    http://www.ratepoint.com/

  3. #3
    Join Date
    Feb 2004
    Location
    Fort Worth, TX
    Posts
    2,585
I would recommend a "human" sitemap and an "XML" sitemap.

    Create the human sitemap to look nice for your visitors, and the XML one just for Google. I would suggest resubmitting the XML sitemap every week or so, and in that week adding a couple of content pages.
    www.JGRoboMarketing.com / We Filter out the Bad Leads and Send you the Good ones!
    █ Office: (800) 959-0182 / Automated Lead Funnel Service

  4. #4
    Join Date
    Jan 2004
    Location
    Oztrayla Mate!
    Posts
    572
Sure, make an HTML sitemap for your visitors, but the XML one is pretty useless. It doesn't help you rank, and if you need it for spiders to crawl your site then you have bigger problems.

  5. #5
    Quote Originally Posted by 1boss1 View Post
Sure, make an HTML sitemap for your visitors, but the XML one is pretty useless. It doesn't help you rank, and if you need it for spiders to crawl your site then you have bigger problems.
So why do Google's Webmaster Tools ask for an XML sitemap?

  6. #6
    Join Date
    Jan 2004
    Location
    Oztrayla Mate!
    Posts
    572
They don't "request" that you add one; they have an option to let Google know if your site does have one.

    It does nothing to help you rank; Google themselves have stated this. If you actually "need" an XML sitemap for your site to get crawled properly, it indicates that your site's structure needs addressing.

    An HTML sitemap, on the other hand, assists visitors, provides internal backlinks with anchor text to your inner pages (which gives ranking gains), and provides an alternate route to your content for all search engines, not just Google.

    Implemented into your 404 page, it provides numerous options for all engines to keep crawling, as well as acting as a navigational aid to help your visitors find what they were looking for instead of bouncing.

    There are more technical reasons, but that's the basics of it.

  7. #7
    Join Date
    Mar 2008
    Location
    SEO cyberspace
    Posts
    423
    Quote Originally Posted by 1boss1 View Post
... but the XML one is pretty useless. It doesn't help you rank, and if you need it for spiders to crawl your site then you have bigger problems.
That may be true if you have a site of 25 pages with good PageRank, but if you have a new site of 10,000 or so pages, an XML sitemap submitted to all the major search engines is a great help, and it costs little to implement, so why pass up one more useful tool?

    You will still need other strategies to get all your pages indexed, and more still to get all those pages ranking, but don't discard ideas before you try them.

Example:
    A new site with 12,000+ pages was created on Jan 8 this year, and up until one week ago it had only 1,264 pages indexed in Google. A sitemap was submitted five days ago, and today the site has 4,490 pages indexed in Google.

    I like xml sitemaps.
    I plan to live forever - so far so good
    Expert SEO |Sash Windows London

  8. #8
    Join Date
    Jan 2004
    Location
    Oztrayla Mate!
    Posts
    572
    Quote Originally Posted by Melnel View Post
That may be true if you have a site of 25 pages with good PageRank, but if you have a new site of 10,000 or so pages, an XML sitemap submitted to all the major search engines is a great help, and it costs little to implement, so why pass up one more useful tool?

    You will still need other strategies to get all your pages indexed, and more still to get all those pages ranking, but don't discard ideas before you try them.

    Example:
    A new site with 12,000+ pages was created on Jan 8 this year, and up until one week ago it had only 1,264 pages indexed in Google. A sitemap was submitted five days ago, and today the site has 4,490 pages indexed in Google.
That's an indication you're doing something wrong with your internal link architecture, page copy and site promotion. I kicked off a site at the beginning of January, and by the end of the month it had 76,800 pages indexed. So to only have 4,490 indexed all year is rather worrying.

    I see your own SEO company's site has inherent problems, like canonical URLs, poor filename selection (dashes, underscores, and upper- and lower-case characters all in the one URL), an XML change frequency of "weekly" when your pages' Last-Modified headers show months ago, two instances of Analytics code on the one page, a W3C compliance badge displayed when your page has errors, no text link to your homepage with your primary term as anchor on your subpages, etc.

    So I'm starting to see an indication of why your other site is struggling to get indexed.

    XML being a "great tool", as you say, is a hindrance for me. You see, the major engines are link-based crawlers. Adding an XML sitemap destroys valuable optimization data by obscuring how crawlers navigate your site. You may have a whole subsection that Google can't internally access due to a JavaScript issue, an incorrect link structure or many other reasons.

    Would you rather know this and be able to internally correct and optimize it, or throw a band-aid over a gaping wound and lead Google in by the hand to find these pages with an XML sitemap?

    Of course you wouldn't; you want authority/PageRank to flow naturally through your sites to all the right places, complete with targeted anchor text.

    As for "don't knock something until you've tried it": I have most certainly done A/B split testing of XML vs. non-XML sitemap versions, and I own some big sites. Microsoft.com has just over a million pages; I have a site three times bigger than the Microsoft.com domain, and no, it doesn't have an XML sitemap and never will.

  9. #9
    Join Date
    Mar 2008
    Location
    SEO cyberspace
    Posts
    423
    Quote Originally Posted by 1boss1 View Post
That's an indication you're doing something wrong with your internal link architecture, page copy and site promotion. I kicked off a site at the beginning of January, and by the end of the month it had 76,800 pages indexed. So to only have 4,490 indexed all year is rather worrying.

    I see your own SEO company's site has inherent problems, like canonical URLs, poor filename selection (dashes, underscores, and upper- and lower-case characters all in the one URL), an XML change frequency of "weekly" when your pages' Last-Modified headers show months ago, two instances of Analytics code on the one page, a W3C compliance badge displayed when your page has errors, no text link to your homepage with your primary term as anchor on your subpages, etc.
Not sure what pages you are looking at, but it sure as hell is not http://theseoshop.com. But even if it were, there is absolutely no problem with dashes in URLs, upper- and lower-case letters in URLs, etc.

    This may be news to you, but it is no longer necessary to update your content every week in order to get indexed or to get rankings.


Quote Originally Posted by 1boss1 View Post
    So I'm starting to see an indication of why your other site is struggling to get indexed.

    XML being a "great tool", as you say, is a hindrance for me. You see, the major engines are link-based crawlers. Adding an XML sitemap destroys valuable optimization data by obscuring how crawlers navigate your site. You may have a whole subsection that Google can't internally access due to a JavaScript issue, an incorrect link structure or many other reasons.
I did not say it's struggling to get indexed, nor do you have any knowledge whatsoever about that site, its topic, how many links are pointing at it, etc., so IMO it's very unprofessional to make judgements with no information.

Quote Originally Posted by 1boss1 View Post
    Would you rather know this and be able to internally correct and optimize it, or throw a band-aid over a gaping wound and lead Google in by the hand to find these pages with an XML sitemap?

    Of course you wouldn't; you want authority/PageRank to flow naturally through your sites to all the right places, complete with targeted anchor text.
There you go again, assuming things out of thin air. Submitting a sitemap is in no way an admission of site structures that need correction, and I could give a damn about PageRank. If you had any information besides your suppositions to guide you, you might find that the site in question does just what it's supposed to do, in the way it's supposed to do it.

Quote Originally Posted by 1boss1 View Post
    As for "don't knock something until you've tried it": I have most certainly done A/B split testing of XML vs. non-XML sitemap versions, and I own some big sites. Microsoft.com has just over a million pages; I have a site three times bigger than the Microsoft.com domain, and no, it doesn't have an XML sitemap and never will.
Pray enlighten us poor unwashed SEOs (who may just have more years of experience than you, and may well have more sites ranking well than you have had hot dinners) about the results of your testing.

    I really don't care how big your site is, except to say that I have never seen any site that had a logical reason to have over a million pages.

  10. #10
    Join Date
    Mar 2008
    Location
    SEO cyberspace
    Posts
    423
    Quote Originally Posted by 1boss1 View Post
    ...

    Would you rather know this data and be able to internally correct and optimize it, or throw a band-aid over a gaping wound and lead Google in by the hand to find these pages with an XML sitemap?

    ....
    Let me correct one more of your erroneous assumptions if I may.

I don't know how you create large XML sitemaps, but I do it by spidering all the pages and correcting any mistakes found before the sitemap is submitted to Google.

    I also find that when I submit XML sitemaps to Google through Webmaster Tools, I get back information on any errors they find, once again giving me additional information which is not available elsewhere.
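The build step described here, turning a spidered page list into sitemap XML, can be sketched in Python. The crawl itself is stubbed out with a hard-coded list; all URLs and dates are hypothetical:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Render a list of (url, lastmod) pairs as sitemaps.org XML."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url, lastmod in urls:
        lines.append('  <url>')
        lines.append('    <loc>%s</loc>' % escape(url))  # escape &, <, > in URLs
        lines.append('    <lastmod>%s</lastmod>' % lastmod)
        lines.append('  </url>')
    lines.append('</urlset>')
    return '\n'.join(lines)

# In practice this list would come from spidering the site (and fixing any
# broken links found along the way); here it is hard-coded for illustration.
pages = [("http://www.example.com/", "2008-02-26"),
         ("http://www.example.com/about.html", "2008-01-15")]
print(build_sitemap(pages))
```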

I realize that you seem to think that your own opinions are God-given, while I have to struggle a bit to gain the information that I post. But since I have been gaining this information in the process of making a living doing SEO for 13 years, I do have a bit stored up, and I gain more every day as the moderator of a well-known SEO forum.

  11. #11
    Join Date
    Jan 2004
    Location
    Oztrayla Mate!
    Posts
    572
You really don't understand, do you?

    OK, carry on. I can see you're going great guns with your "Expert SEO" term, with just 469k results... you're about to break into the top 30.

  12. #12
    Join Date
    Mar 2008
    Location
    SEO cyberspace
    Posts
    423
Well, someone doesn't seem to understand.

    One of the first things you need to understand, Josh, is that I really don't need to optimize my own site. I have plenty of business, so your barbs seem kind of silly to me.

  13. #13
    Join Date
    Oct 2002
    Location
    Under Your Skin
    Posts
    5,875
    Quote Originally Posted by Melnel View Post
Well, someone doesn't seem to understand.

    One of the first things you need to understand, Josh, is that I really don't need to optimize my own site. I have plenty of business, so your barbs seem kind of silly to me.
Actually, I think the bigger point is: if your site is not set up correctly, why would (or SHOULD) anyone else trust you with their site? In addition, if you start struggling with a customer's site and they get a second opinion... I'm sure the second opinion will go straight to your site... and we know how that will turn out. lol

    I know you don't like it, but 1boss1 makes some good points.

    It was kinda funny reading you two go at it... I won't pick at the scab for fear of getting blood on my shoes... but it was funny.
    Last edited by hekwu; 03-16-2008 at 09:46 AM.
    Windows 10 to Linux and Mac OSX: I'm PARSECs better than you. Eat my dust!!!

  14. #14
    Join Date
    Mar 2008
    Location
    SEO cyberspace
    Posts
    423
What on earth makes you think my "site is not set up correctly"?

    The reason people trust me with their site is because I make it rank well, not because of what my site does or does not do. Perhaps if someday I have to "struggle" with a customer's site that will happen, but so far I do not have to struggle much, and so far (13 years) no one has had to go for a second opinion.

  15. #15
    Join Date
    Jan 2004
    Location
    Oztrayla Mate!
    Posts
    572
    Quote Originally Posted by Melnel View Post
Well, someone doesn't seem to understand.

    One of the first things you need to understand, Josh, is that I really don't need to optimize my own site. I have plenty of business, so your barbs seem kind of silly to me.
I don't know who Josh is, but anyway... So you are saying that as an "Expert SEO" you don't need to optimize your site, and that having it contain fundamental optimization errors is a great testament to your skill?

    Fair enough; I don't "need" to optimize my sites or pages either, because it occurs naturally with the creation of my documents, it being second nature.

    Quote Originally Posted by Melnel View Post
    What on earth makes you think my "site is not set up correctly"?
    See just some of the examples I listed before, everything from URL canonicalization to your great XML sitemaps configured with a "weekly" change frequency when your page hasn't been modified since 2005.

    As I said, and you just don't understand: by adding an XML sitemap you are destroying valuable data about the natural indexing of your site. What's the point of getting a poorly linked page into the index via an XML sitemap when it doesn't have a hope of ranking? Wouldn't you rather be able to identify areas of a site that are having issues with the natural crawling process, and be able to optimize those areas via deep links and structural alterations to achieve peak performance?

    XML sitemaps do NOT have any bearing on rankings; Google themselves have said this.

    Read some articles on XML sitemaps by people such as Rand from SEOmoz, Dave Naylor, Joost de Valk, etc., which reiterate exactly what I have said.

    BTW, when you hit the front page for the term "SEO", I might listen to your "13 years" of experience.

  16. #16
    Join Date
    Mar 2008
    Location
    SEO cyberspace
    Posts
    423
    Quote Originally Posted by 1boss1 View Post
I don't know who Josh is, but anyway... So you are saying that as an "Expert SEO" you don't need to optimize your site, and that having it contain fundamental optimization errors is a great testament to your skill?
I don't recall saying that my website was a great testament to my skill. Could you please refresh my memory as to where or when I said that?

    Please explain to me the fundamental "optimization errors" you are harping on about:
    1. Canonical URLs? While I do agree that this will split your PageRank, I could care less about PR, and it doesn't affect anything ranking-wise.
    2. Poor filename selection, such as dashes, underscores, and upper- and lower-case characters all in the one URL? While this might have upset my high-school grammar teacher, the only effect it will have on rankings is that some of those dashes in the URLs will help them be parsed.
    3. An XML change frequency of "weekly" when the pages' Last-Modified headers show months ago? Please explain just how this will have a negative effect on optimization.
    4. Two instances of Analytics code on the one page? Well, if I want two different accounts, I fail to see how that is going to impact my rankings, but perhaps you can explain that to me.
    5. Displaying a W3C compliance badge when the page has errors? Mea culpa, but (a) that will have no effect whatever on my rankings, and (b) faced with a choice of taking down either the meta tags that Google, MSN and Yahoo want for verification or the badge, I have opted for the latter, just to please you.
    6. No text link to my homepage with my primary term as anchor on my subpages? Now this could impact my site's optimization if it were true, but it's not.

Quote Originally Posted by 1boss1 View Post
    See just some of the examples I listed before, everything from URL canonicalization to your great XML sitemaps configured with a "weekly" change frequency when your page hasn't been modified since 2005.
    I think you might need some new specs, so I have just increased the font size so you can more easily read the bit in the lower left-hand corner that says "February 26, 2008", which is the date that page was last updated and which is placed on the page automatically whenever it is uploaded. As for my pages not having been updated since 2005, I find that kind of interesting in light of the fact that the site only went up last December. Perhaps you are thinking of the domain name, which is that old?

Quote Originally Posted by 1boss1 View Post
    As I said, and you just don't understand: by adding an XML sitemap you are destroying valuable data about the natural indexing of your site. What's the point of getting a poorly linked page into the index via an XML sitemap when it doesn't have a hope of ranking? Wouldn't you rather be able to identify areas of a site that are having issues with the natural crawling process, and be able to optimize those areas via deep links and structural alterations to achieve peak performance?
    Nonsense! Putting up an XML sitemap has no bearing whatever on the information on the page, but then perhaps you missed this bit:

Quote Originally Posted by melnel
    I don't know how you create large XML sitemaps, but I do it by spidering all the pages and correcting any mistakes found before the sitemap is submitted to Google.

    I also find that when I submit XML sitemaps to Google through Webmaster Tools, I get back information on any errors they find, once again giving me additional information which is not available elsewhere.
    IMO it's better to get the information before you submit your sites rather than after.

Quote Originally Posted by 1boss1 View Post
    XML sitemaps do NOT have any bearing on rankings; Google themselves have said this.
    At last, something we can agree on!


Quote Originally Posted by 1boss1 View Post
    BTW, when you hit the front page for the term "SEO", I might listen to your "13 years" of experience.
    Well, then you will have a long time to wait, since as I have explained before, I optimize sites for others and really don't have either the time or the need to chase rankings for "SEO".

    And I will listen to your erudite theories when you can back them up with examples or references.

  17. #17
    Join Date
    Aug 2007
    Posts
    184
IMO, it helps with easier navigation of your site, not just for the bots but also for some of your visitors. I personally look at the sitemap of a site (if it has one) to help find my way around, especially if a site has a lot of sub-sections and inner pages.
    OneIMS.com - Internet Marketing and SEO

  18. #18
    Quote Originally Posted by 1boss1 View Post
XML sitemaps do NOT have any bearing on rankings; Google themselves have said this.
I think you are forgetting a very simple thing here. While I agree that sitemaps do not directly have any bearing on rankings, they do get pages indexed that would not otherwise have been indexed, and those pages can provide targeted anchor-text links. So, like it or not, and whether or not Google states it, they can indirectly assist a website's rankings.

    Dashes, underscores, and capital letters in filenames affecting rankings? Partially true: dashes and underscores do. Google now properly parses dashes and underscores as word separators, while words run together get parsed as literal strings of characters, and as many other threads here detail, having keywords in your URL carries some ranking power due to links pointed at that page.

    Canonicalization issues are a thing of the past with an extremely simple 301 redirect from domain.com to www.domain.com (or vice versa), or by simply choosing how to display your results in Google's Webmaster Tools.

    W3C compliance has never mattered in the least to rankings, and as Mel already pointed out, the only three errors I find that render the page non-compliant are the search-engine meta tags for verifying the site as being owned by him.

    And I had to chuckle at this one: when did hidden JavaScript analytics code ever become a ranking factor?

    As far as I can see, Mel is offering SEO advice and services. SEO is the art of optimizing pages so they RANK, not so they are letter-perfect as far as the W3C is concerned; nor is it even about pleasing the visitor's eye, though I do feel that should be taken into account, and I always try my best to.
    Last edited by nuclei; 03-19-2008 at 09:09 AM.

  19. #19
    Join Date
    Mar 2008
    Location
    Fort Thomas, Kentucky
    Posts
    269
One simple rule: you don't need a sitemap at all to have success in the search engines. In fact, in some cases they can serve as a band-aid to mask bigger navigational issues that may exist on the website.
    Jaan Kanellis | iNET Interactive

  20. #20
    Join Date
    Mar 2008
    Location
    SEO cyberspace
    Posts
    423
That's one of those simple rules that I feel is so simple it should be ignored. I see sites which are equal in all other respects getting completely indexed faster when sitemaps are submitted, and I believe in giving my sites every chance possible.

    There may be cases where they can serve as "band-aids", but if your sitemaps are generated by spidering the site, that would be a very remote possibility.

    If you don't have a sitemap as a complete list of all the pages of a large site, how do you know when all pages are indexed?

    When all is said and done, it's again one of those simple things that can't hurt and just might be of some help.

  21. #21
    Join Date
    Jan 2004
    Location
    Oztrayla Mate!
    Posts
    572
    Quote Originally Posted by nuclei View Post
I think you are forgetting a very simple thing here. While I agree that sitemaps do not directly have any bearing on rankings, they do get pages indexed that would not otherwise have been indexed, and those pages can provide targeted anchor-text links. So, like it or not, and whether or not Google states it, they can indirectly assist a website's rankings.
Thanks for highlighting my point: if pages are not being indexed, there's an issue, and patching it up with a sitemap is very poor practice. Being able to see how authority is distributed throughout your website, and how spiders crawl your pages, is crucial data. Why ruin this data rather than optimize these areas of your site for peak performance?

    Patching the problem with a sitemap is pure amateur work, plain and simple. It is the step a non-SEO webmaster would take in a panic upon realizing half their site isn't indexed, not the work of a professional SEO.

    Quote Originally Posted by nuclei View Post
    Dashes, Underscores, and capital letters in filenames affecting rankings? Partially True.
    All ranking issues aside, tell me honestly whether you believe a professional SEO would craft URLs by mashing together mixtures of upper case, lower case, underscores, dashes and needless subdirectories, like domain.com/SEO-Services/SEO_services.htm.

    Of course not; it looks like something my neighbor knocked together with Dreamweaver in 15 minutes.

    Quote Originally Posted by nuclei View Post
    Canonicalization issues are a thing of the past with an extremely simple 301 redirect from domain.com to www.domain.com (or vice versa), or by simply choosing how to display your results in Google's Webmaster Tools.
    Never, and I mean never, use Google Webmaster Tools to mess with your canonical domains. Always use a server-side method such as .htaccess and a 301 redirect.

    It's not just www or http that needs to be taken care of; it's index.html, index.php, etc., not only in the root domain but in subdirectories. Also, what do Google's tools do for Yahoo, MSN, Ask, etc.? Nothing, that's what.
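For the record, a server-side canonicalization along those lines might look like this in an Apache .htaccess file. This is a sketch assuming mod_rewrite is enabled; example.com is a placeholder:

```apache
RewriteEngine On

# 301-redirect the bare domain to the www host
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# 301-redirect direct requests for index.html / index.php (in the root
# or in any subdirectory) back to the bare directory URL
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^?\ ]*/)?index\.(html|php)[?\ ] [NC]
RewriteRule ^(.*?)index\.(html|php)$ /$1 [R=301,L]
```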

Quote Originally Posted by nuclei View Post
    W3C compliance has never mattered in the least to rankings, and as Mel already pointed out, the only three errors I find that render the page non-compliant are the search-engine meta tags for verifying the site as being owned by him.
    I know it doesn't matter for rankings, but people who display a W3C compliance badge when their markup doesn't validate just look silly.

    Quote Originally Posted by nuclei View Post
    And I had to chuckle at this one: when did hidden JavaScript analytics code ever become a ranking factor?
    Where did I say that? I was just pointing out how sloppy the work is, putting two instances of the same analytics code on the page.

    Anyhow, carry on; good luck plugging the holes in your optimization work with XML sitemaps.

  22. #22
    Join Date
    Mar 2008
    Location
    Fort Thomas, Kentucky
    Posts
    269
Mel, we live in an SEO bubble. 99% of the websites out there don't know what Google Webmaster Tools is. Does that mean that, since we do, we will rank so much better than others? No, not at all. That being the case, other more important factors like backlinks, content and site architecture will FAR outweigh whether or not your website has a sitemap submitted through GWT.

  23. #23
    Join Date
    Mar 2008
    Location
    SEO cyberspace
    Posts
    423
You just don't get it, do you, 1boss1?

    Using sitemaps does not mean that your site is full of architectural blunders; why do you keep making silly statements like that? It's another tool in your bag, and you can use it or not.

    Now tell me, please: exactly what method would you use to determine whether all the pages of your 20,000-page site have been indexed, if you don't have a sitemap listing them all?

    And also please tell me how you would go about correcting mistakes in the site architecture just by looking at what pages Google has ranked.

    Maybe you have the sort of memory that can recall which of 20,000 pages have and have not been indexed, but I find it much more professional to create a list of them, and to do that by spidering the site, which incidentally will uncover any and all errors just as Google would. The only difference is that the spider will tell you what the problem is, while Google will leave you guessing.
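The bookkeeping being described here amounts to a set difference between the spidered page list and the pages known to be indexed. A toy Python sketch with hypothetical URLs (in practice the full list would come from the spider, and the indexed list from site: queries or Webmaster Tools):

```python
# Hypothetical data: the full page list from spidering the site, and the
# subset known to be indexed.
all_pages = {"http://www.example.com/",
             "http://www.example.com/a.html",
             "http://www.example.com/b.html"}
indexed = {"http://www.example.com/",
           "http://www.example.com/a.html"}

# The pages still waiting to be picked up are simply the difference.
not_yet_indexed = sorted(all_pages - indexed)
print(not_yet_indexed)
```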

I am not going to get into your mudslinging match. It is simply, IMO, unprofessional to sling mud when you can't make your point with facts.

  24. #24
    Join Date
    Mar 2008
    Location
    SEO cyberspace
    Posts
    423
    Quote Originally Posted by incrediblehelp View Post
Mel, we live in an SEO bubble. 99% of the websites out there don't know what Google Webmaster Tools is. Does that mean that, since we do, we will rank so much better than others? No, not at all. That being the case, other more important factors like backlinks, content and site architecture will FAR outweigh whether or not your website has a sitemap submitted through GWT.

Thank you, Jaan, you make my point for me (though I have never said that a sitemap will make your pages rank better): the process of creating a sitemap may well uncover errors in the site architecture (for those of us who are not perfect, that is), and fixing those errors may well help your site rank better.

    Of course there are more important factors in getting your site to rank better, but that does not mean you throw out any useful tool which is easily available.

    Now, there seem to be some who have perfected the art of web design and SEO such that their sites are perfect out of the box and will have all 20,000 pages indexed in the first month they are online, but for us mere mortals, IMO, we should use the best tools available.

  25. #25
    Join Date
    Dec 2007
    Location
    IN
    Posts
    115
XML is not user-friendly. Use XML for bots and HTML for humans.

  26. #26
    Join Date
    Mar 2008
    Location
    SEO cyberspace
    Posts
    423
Well, yes, but if your site architecture is set up properly, there should never be a need for a sitemap for humans.
