-
03-13-2008, 10:08 AM #1WHT Addict
- Join Date
- Jun 2003
- Posts
- 130
Google sitemap.xml: good for spiders but not for visitors?
Hello,
I am trying to get my bearings with some SEO. I am using Google Webmaster tools and I am working on a sitemap. I am going to use http://www.xml-sitemaps.com/ to create the XML file and then submit it.
Now my question is, the XML file is not that great for actual human users. It looks just like a bunch of text (unless I am doing something wrong). So is there a way to create a Google compliant XML sitemap that is usable by spiders AND users?
Also, is it best practice to create a new sitemap each time I add pages/content and then resubmit?
Thanks a lot
-
03-13-2008, 03:12 PM #2Junior Guru Wannabe
- Join Date
- Jul 2007
- Location
- Needham, MA
- Posts
- 43
XML sitemaps help spiders know what pages exist. This is especially helpful if you have pages that aren't organically accessible to the crawler via links. These XML-based sitemaps were never created with the end-user in mind.
You'll want two parallel strategies. One is your XML file, which no user will ever see; the other is an HTML file that could, for example, also serve as your 404 page and contain links to popular pages on your website. If you put EVERY page in this version of the sitemap, you'll be forced to update it constantly, and you may have too many pages for it to be useful to an end-user. So just watch out for that.
You SHOULD update your XML sitemap periodically and use the optional tags to let Google know when pages have been modified, added, or removed. Google automatically downloads your XML sitemap regularly and expects changes.
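For reference, a minimal sketch of what generating such a sitemap with the optional tags might look like (the URLs, dates, and frequencies here are made up for illustration):

```python
# Minimal sitemap.xml generator sketch; URLs/dates are invented examples.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """pages: iterable of (loc, lastmod, changefreq) tuples."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod, changefreq in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod        # optional tag
        ET.SubElement(url, "changefreq").text = changefreq  # optional tag
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap([
    ("http://www.example.com/", "2008-03-13", "weekly"),
    ("http://www.example.com/about.html", "2008-01-20", "monthly"),
])
```

Regenerating and resubmitting this file whenever pages change is what keeps the `lastmod` values meaningful.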
-
03-13-2008, 03:12 PM #3Business Consultant Manager
- Join Date
- Feb 2004
- Location
- Fort Worth, TX
- Posts
- 2,586
I would recommend both a "human" sitemap and an XML sitemap.
Create the human sitemap to look nice for your visitors, and the XML one just for Google. I would suggest resubmitting the XML sitemap every week or so, and during that week adding a couple of new content pages.
-
03-14-2008, 06:38 AM #4Web Hosting Master
- Join Date
- Jan 2004
- Location
- Oztrayla Mate!
- Posts
- 583
Sure, make an HTML sitemap for your visitors, but the XML one is pretty useless. It doesn't help you rank, and if you need it for spiders to crawl your site, then you have bigger problems.
-
03-14-2008, 10:01 AM #5WHT Addict
- Join Date
- Jun 2003
- Posts
- 130
-
03-14-2008, 02:24 PM #6Web Hosting Master
- Join Date
- Jan 2004
- Location
- Oztrayla Mate!
- Posts
- 583
They don't "request" that you add one; they have an option to let Google know if your site does have one.
It does nothing to help you rank; Google themselves have stated this. If you actually "need" an XML sitemap for your site to get crawled properly, it indicates your site's structure needs addressing.
An HTML sitemap, on the other hand, assists visitors and provides internal backlinks with anchor text to your inner pages, which gives ranking gains, plus it provides an alternate route to your content for all search engines, not just Google.
Implemented into your 404 page, it gives every engine numerous options to keep crawling, and it acts as a navigational aid to help your visitors find what they were looking for instead of bouncing.
There are more technical reasons, but that's the basics of it.
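An HTML sitemap of the kind described here is trivial to generate from a page list; a rough sketch (the page titles and URLs are invented):

```python
# Sketch: render a plain HTML sitemap page from a (title, url) list.
# The pages below are invented for illustration.
from html import escape

def html_sitemap(pages, heading="Site map"):
    """pages: list of (title, url) pairs. Returns a minimal HTML fragment."""
    items = "\n".join(
        f'<li><a href="{escape(url)}">{escape(title)}</a></li>'
        for title, url in pages
    )
    return f"<h1>{escape(heading)}</h1>\n<ul>\n{items}\n</ul>"

page = html_sitemap([("Home", "/"), ("Products", "/products.html")])
```

The same fragment could be embedded in a custom 404 template so lost visitors (and crawlers) get the link list instead of a dead end.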
-
03-15-2008, 03:00 AM #7Aspiring Evangelist
- Join Date
- Mar 2008
- Location
- SEO cyberspace
- Posts
- 423
That may be true if you have a site of 25 pages with good PageRank, but if you have a new site of 10,000 or so pages, an XML sitemap submitted to all the major search engines is a great help, and it costs little to implement, so why pass up one more useful tool?
You will still need other strategies to get all your pages indexed, and more still to get all those pages ranking, but don't discard ideas before you try them.
Example:
A new site with 12,000+ pages was created on Jan 8 this year, and up until one week ago it had only 1,264 pages indexed in Google. A sitemap was submitted five days ago, and today the site has 4,490 pages indexed in Google.
I like xml sitemaps.
-
03-16-2008, 12:39 AM #8Web Hosting Master
- Join Date
- Jan 2004
- Location
- Oztrayla Mate!
- Posts
- 583
That's an indication you're doing something wrong with your internal link architecture, page copy and site promotion. I kicked off a site at the beginning of January, and by the end of the month it had 76,800 pages indexed. So to have only 4,490 indexed is rather worrying.
I see your own SEO company's site has inherent problems: canonical URLs; poor filename selection (dashes, underscores, upper- and lower-case characters all in the one URL); an XML change frequency of "weekly" when your pages' Last-Modified headers show months ago; two instances of Analytics code on the one page; a W3C-compliance badge displayed when your page has errors; no text link to your homepage with your primary term as anchor on your subpages; etc.
So I'm starting to see why your other site is struggling to get indexed.
XML being a "great tool", as you say, is a hindrance for me; you see, the major engines are link-based crawlers. Adding an XML sitemap destroys valuable optimization data by obscuring how crawlers navigate your site. You may have a whole subsection that Google can't reach internally due to a JavaScript issue, incorrect link structure, or many other reasons.
Would you rather know this and be able to correct and optimize it internally, or throw a band-aid over a gaping wound and lead Google in by the hand with an XML sitemap?
Of course you don't; you want authority/PageRank to flow naturally through your site to all the right places, complete with targeted anchor text.
As for "don't knock something until you've tried it": I have most certainly done A/B split testing on XML/non-XML sitemap versions, and I own some big sites. Microsoft.com has just over a million pages; I have a site three times bigger than the Microsoft.com domain, and no, it doesn't have an XML sitemap and never will.
-
03-16-2008, 03:05 AM #9Aspiring Evangelist
- Join Date
- Mar 2008
- Location
- SEO cyberspace
- Posts
- 423
Not sure what pages you are looking at, but it sure as hell is not http://theseoshop.com. But even if it were, there is absolutely no problem with dashes in URLs, upper- and lower-case letters in URLs, etc.
This may be news to you, but it is no longer necessary to update your content every week in order to get indexed or to get rankings.
Originally Posted by 1boss1
So I'm starting to see why your other site is struggling to get indexed.
XML being a "great tool", as you say, is a hindrance for me; you see, the major engines are link-based crawlers. Adding an XML sitemap destroys valuable optimization data by obscuring how crawlers navigate your site. You may have a whole subsection that Google can't reach internally due to a JavaScript issue, incorrect link structure, or many other reasons.
Would you rather know this and be able to correct and optimize it internally, or throw a band-aid over a gaping wound and lead Google in by the hand with an XML sitemap?
Of course you don't; you want authority/PageRank to flow naturally through your site to all the right places, complete with targeted anchor text.
As for "don't knock something until you've tried it": I have most certainly done A/B split testing on XML/non-XML sitemap versions, and I own some big sites. Microsoft.com has just over a million pages; I have a site three times bigger than the Microsoft.com domain, and no, it doesn't have an XML sitemap and never will.
I really don't care how big your site is, except to say that I have never seen any site that had a logical reason to have over a million pages.
-
03-16-2008, 03:15 AM #10Aspiring Evangelist
- Join Date
- Mar 2008
- Location
- SEO cyberspace
- Posts
- 423
Let me correct one more of your erroneous assumptions if I may.
I don't know how you create large XML sitemaps, but I do it by spidering all the pages and correcting any mistakes found before the sitemap is submitted to Google.
I also find that when I submit XML sitemaps to Google through Webmaster Tools, I get back information on any errors they find, once again giving me information which is not available elsewhere.
I realize that you seem to think your own opinions are God-given, but I have to struggle a bit to gain the information that I post. Since I have been gaining this information in the process of making a living doing SEO for 13 years, I do have a bit stored up, and I gain more every day as the moderator of a well-known SEO forum.
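The spider-first workflow described here (crawl everything, flag problems, then build the sitemap) can be sketched roughly like this; the "site" is an in-memory dict standing in for real HTTP fetches, so the example stays self-contained:

```python
# Toy link spider: walks a site (an in-memory dict here, standing in for
# real HTTP fetches), collects every reachable page, and flags broken
# links before any sitemap would be generated from the results.
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def spider(site, start="/"):
    """site: dict mapping path -> HTML. Returns (reachable, broken)."""
    seen, broken, queue = set(), set(), [start]
    while queue:
        path = queue.pop()
        if path in seen:
            continue
        seen.add(path)
        parser = LinkParser()
        parser.feed(site[path])
        for href in parser.links:
            if href not in site:
                broken.add(href)       # a crawl error to fix pre-submission
            elif href not in seen:
                queue.append(href)
    return seen, broken

site = {
    "/": '<a href="/a.html">A</a> <a href="/b.html">B</a>',
    "/a.html": '<a href="/missing.html">dead link</a>',
    "/b.html": "no links here",
}
reachable, broken = spider(site)
```

Only after `broken` comes back empty would the `reachable` set be turned into the sitemap.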
-
03-16-2008, 07:25 AM #11Web Hosting Master
- Join Date
- Jan 2004
- Location
- Oztrayla Mate!
- Posts
- 583
You really don't understand, do you?
OK, carry on. I can see you're going great guns with your "Expert SEO" term with just 469k results; you're about to break into the top 30.
-
03-16-2008, 07:58 AM #12Aspiring Evangelist
- Join Date
- Mar 2008
- Location
- SEO cyberspace
- Posts
- 423
Well someone doesn't seem to understand.
One of the first things you need to understand, Josh, is that I really don't need to optimize my own site. I have plenty of business, so your barbs seem kind of silly to me.
-
03-16-2008, 09:42 AM #13Web Hosting Master Disaster
- Join Date
- Oct 2002
- Location
- Under Your Skin
- Posts
- 5,904
Actually, I think the bigger point is: if your site is not set up correctly, why would (SHOULD) anyone else trust you with their site? In addition, if you start struggling with a customer's site and they get a second opinion, I'm sure the second opinion will go to your site... and we know how that will turn out. lol
I know you don't like it, but 1boss1 makes some good points.
It was kinda funny reading you two go at it... I won't pick at the scab for fear of getting blood on my shoes... but it was funny.
Last edited by hekwu; 03-16-2008 at 09:46 AM.
-
03-16-2008, 10:34 AM #14Aspiring Evangelist
- Join Date
- Mar 2008
- Location
- SEO cyberspace
- Posts
- 423
What on earth makes you think my "site is not set up correctly"?
The reason people trust me with their site is because I make it rank well, not because of what my own site does or does not do. Perhaps if someday I have to "struggle" with a customer's site that will happen, but so far I do not have to struggle much, and so far (13 years) no one has had to go for a second opinion.
-
03-16-2008, 01:47 PM #15Web Hosting Master
- Join Date
- Jan 2004
- Location
- Oztrayla Mate!
- Posts
- 583
I don't know who Josh is, but anyway... So you are saying that as an "Expert SEO" you don't need to optimize your site, and having it contain fundamental optimization errors is a great testament to your skill?
Fair enough; I don't "need" to optimize my sites or pages either, because it occurs naturally in the creation of my documents, it being second nature.
See just some of the examples I listed before, everything from URL canonicalization to your great XML sitemaps configured with a "weekly" change frequency when your page hasn't been modified since 2005.
As I said, and you just don't understand: by adding an XML sitemap you are destroying valuable data about the natural indexing of your site. What's the point of getting a poorly linked page into the index via an XML sitemap when it doesn't have a hope of ranking? Wouldn't you rather be able to identify the areas of a site that have issues with the natural crawling process, and optimize them via deep links and structural alterations to achieve peak performance?
XML sitemaps do NOT have any bearing on rankings; Google themselves have said this.
Read some articles on XML sitemaps by people such as Rand from SEOmoz, Dave Naylor, Joost de Valk, etc.; they reiterate exactly what I have said.
BTW, when you hit the front page for the term "SEO", I might listen to your "13 years" of experience.
-
03-16-2008, 02:57 PM #16Aspiring Evangelist
- Join Date
- Mar 2008
- Location
- SEO cyberspace
- Posts
- 423
I don't recall saying that my website was a great testament to my skill. Could you please refresh my memory as to where or when I said that?
Please explain to me the fundamental "optimization errors" you are harping about:
- Canonical URLs? While I agree this will split your PageRank, I couldn't care less about PR, and it doesn't affect anything ranking-wise.
- Poor filename selection, such as dashes, underscores, upper- and lower-case characters all in the one URL? Well, while this might have upset my high school grammar teacher, the only effect it will have on rankings is that some of those dashes in the URLs will help them be parsed.
- My XML change frequency set to weekly when my pages' Last-Modified headers show months ago? Please explain just how this will have a negative effect on optimization.
- Two instances of Analytics code on the one page? Well, if I want two different accounts, I fail to see how that is going to impact my rankings, but perhaps you can explain that to me.
- Displaying a W3C-compliant badge when the page has errors? Mea culpa, but (a) that will have no effect whatever on my rankings, and (b) faced with a choice between taking down the meta tags that Google, MSN and Yahoo want for verification or taking down the badge, I have opted for the latter, just to please you.
- No text link to my homepage with my primary term as anchor on my subpages? Now this could impact my site's optimization if it were true, but it's not.
Originally Posted by 1boss1
See just some of the examples I listed before, everything from URL canonicalization to your great XML sitemaps configured with a "weekly" change frequency when your page hasn't been modified since 2005.
"February 26, 2008" is the date that page was last updated, and it is placed on the page automatically whenever the page is uploaded. As for my pages not having been updated since 2005, I find that kind of interesting in light of the fact that the site only went up last December; perhaps you are thinking of the domain name, which is that old?
Originally Posted by 1boss1
As I said, and you just don't understand: by adding an XML sitemap you are destroying valuable data about the natural indexing of your site. What's the point of getting a poorly linked page into the index via an XML sitemap when it doesn't have a hope of ranking? Wouldn't you rather be able to identify the areas of a site that have issues with the natural crawling process, and optimize them via deep links and structural alterations to achieve peak performance?
Originally Posted by 1boss1
XML sitemaps do NOT have any bearing on rankings; Google themselves have said this.
Originally Posted by 1boss1
BTW, when you hit the front page for the term "SEO", I might listen to your "13 years" of experience.
And I will listen to your erudite theories when you can back them up with examples or references.
-
03-19-2008, 07:48 AM #17Junior Guru
- Join Date
- Aug 2007
- Posts
- 185
IMO, it helps with easier navigation of your site, not just for the bots but also for some of your visitors. I personally look at a site's sitemap (if it has one) to help find my way around, especially if the site has a lot of sub-sections and inner pages.
-
03-19-2008, 08:55 AM #18WHT Addict
- Join Date
- Jan 2002
- Posts
- 159
I think you are forgetting a very simple thing here. While I agree that sitemaps do not directly have any bearing on rankings, they do get pages indexed that would not otherwise have been indexed, and those pages can provide targeted anchor-text links. So, like it or not, and whether or not Google states it, they can indirectly assist a website's rankings.
Dashes, underscores, and capital letters in filenames affecting rankings? Partially true: dashes and underscores do. Google now properly parses dashes and underscores as spaces, while without them the filename gets parsed as a literal string of characters, and as many other threads here detail, having keywords in your URL carries some ranking power due to links pointed at that page.
Canonicalization issues are a thing of the past with an extremely simple 301 redirect from domain.com to www.domain.com (or vice versa), or by simply choosing how to display your results in Google's Webmaster Tools.
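The 301 canonicalization fix can be sketched as plain decision logic; the hostnames below are invented, and in practice this would live in server configuration (e.g. mod_rewrite rules) rather than application code:

```python
# Sketch of non-www -> www canonical 301 logic (hostnames are invented).
def canonical_redirect(host, path, canonical_host="www.example.com"):
    """Return (status, location) when a 301 is needed, else None."""
    if host != canonical_host:
        return 301, f"http://{canonical_host}{path}"
    # Also collapse /index.html-style duplicates onto the directory URL.
    if path.endswith(("/index.html", "/index.php")):
        return 301, f"http://{canonical_host}{path.rsplit('index', 1)[0]}"
    return None
```

A request for `example.com/page.html` would 301 to `www.example.com/page.html`, so links pointing at either hostname consolidate onto one URL.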
W3C compliance has never mattered in the least to rankings, and as Mel already pointed out, the only 3 errors I find that render the page non-compliant are the search-engine meta tags verifying the site as being owned by him.
And I had to chuckle at this one: when did hidden JavaScript analytics code ever become a ranking factor?
As far as I can see, Mel is offering SEO advice and services. SEO is the art of optimizing pages so they RANK, not so they are letter-perfect as far as the W3C is concerned, nor is it even about pleasing the visitor's eye, though I do feel that should be taken into account and I always try my best to.
Last edited by nuclei; 03-19-2008 at 09:09 AM.
-
03-21-2008, 07:08 AM #19Web Hosting Guru
- Join Date
- Mar 2008
- Location
- Fort Thomas, Kentucky
- Posts
- 269
One simple rule: you don't need a sitemap at all to have success in the search engines. In fact, in some cases they serve as a band-aid to mask bigger navigational issues that may exist on the website.
Jaan Kanellis | iNET Interactive
-
03-21-2008, 07:28 AM #20Aspiring Evangelist
- Join Date
- Mar 2008
- Location
- SEO cyberspace
- Posts
- 423
That's one of those simple rules that I feel may be so simple it should be ignored. I see sites which are equal in all other respects getting completely indexed faster when sitemaps are submitted, and I believe in giving my sites every chance possible.
There may be cases where they serve as "band-aids", but if your sitemaps are generated by spidering the site, that would be a very remote possibility.
If you don't have a sitemap as a complete list of all the pages of a large site, how do you know when all pages are indexed?
When all is said and done, again, it's one of those simple things that can't hurt and just might be of some help.
-
03-21-2008, 07:32 AM #21Web Hosting Master
- Join Date
- Jan 2004
- Location
- Oztrayla Mate!
- Posts
- 583
Thanks for highlighting my point: if pages are not being indexed, there's an issue, and patching it up with a sitemap is very poor practice. Being able to see how authority is distributed throughout your website, and how spiders crawl your pages, is crucial data. Why ruin that data instead of optimizing those areas of your site for peak performance?
Patching the problem with a sitemap is amateur work, plain and simple; it's the step a non-SEO webmaster would take in a panic on realizing half their site isn't indexed, not the work of a professional SEO.
Originally Posted by nuclei
Dashes, underscores, and capital letters in filenames affecting rankings? Partially true.
Of course not, it looks like something my neighbor knocked together with Dreamweaver in 15 minutes.
Originally Posted by nuclei
Canonicalization issues are a thing of the past with an extremely simple 301 redirect from domain.com to www.domain.com (or vice versa), or by simply choosing how to display your results in Google's Webmaster Tools.
It's not just www or http that needs to be taken care of; it's index.html, index.php, etc., not only in the root domain but in subdirectories. Also, what do Google's tools do for Yahoo, MSN, Ask, etc.? Nothing, that's what.
Originally Posted by nuclei
W3C compliance has never mattered in the least to rankings, and as Mel already pointed out, the only 3 errors I find that render the page non-compliant are the search-engine meta tags verifying the site as being owned by him.
And I had to chuckle at this one: when did hidden JavaScript analytics code ever become a ranking factor?
Anyhow, carry on; good luck plugging the holes in your optimization work with XML sitemaps.
-
03-21-2008, 08:08 AM #22Web Hosting Guru
- Join Date
- Mar 2008
- Location
- Fort Thomas, Kentucky
- Posts
- 269
Mel, we live in an SEO bubble. 99% of the websites out there don't know what GWT (Google Webmaster Tools) is. Does that mean that since we do, we will rank so much better than others? No, not at all. That being the case, more important factors like backlinks, content and site architecture will FAR outweigh whether or not your website has a sitemap submitted through GWT.
-
03-21-2008, 08:19 AM #23Aspiring Evangelist
- Join Date
- Mar 2008
- Location
- SEO cyberspace
- Posts
- 423
You just don't get it, do you 1boss1?
Using sitemaps does not mean that your site is full of architectural blunders, so why do you keep making silly statements like that? It's another tool in your bag, and you can use it or not.
Now tell me, please, exactly what method you would use to determine whether all the pages of your 20,000-page site have been indexed if you don't have a sitemap listing them all.
And also please tell me how you would go about correcting mistakes in the site architecture just by looking at which pages Google has ranked.
Maybe you have the sort of memory that can recall which of 20,000 pages have and have not been indexed, but I find it much more professional to create a list of them, and to do that by spidering the site, which incidentally will uncover any and all errors just as Google would. The only difference is that the spider will tell you what the problem is, while Google will leave you guessing.
I am not going to get into your mudslinging match. It is simply, IMO, unprofessional to sling mud when you can't make your point with facts.
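The "keep a complete list, then diff it against what's indexed" idea argued for here might look like this in a rough sketch; both the sitemap and the indexed-URL set below are invented, and in practice the indexed set would come from site: queries or Webmaster Tools reports:

```python
# Sketch: diff a sitemap's URL list against the set of pages observed as
# indexed, to find pages still waiting for indexing. All data is invented.
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def unindexed_pages(sitemap_xml, indexed_urls):
    """Return the sitemap URLs missing from the indexed set, sorted."""
    root = ET.fromstring(sitemap_xml)
    all_urls = {loc.text for loc in root.iter(NS + "loc")}
    return sorted(all_urls - set(indexed_urls))

sitemap = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/a.html</loc></url>
  <url><loc>http://www.example.com/b.html</loc></url>
</urlset>"""

missing = unindexed_pages(sitemap, [
    "http://www.example.com/",
    "http://www.example.com/a.html",
])
```

With 20,000 pages, the `missing` list is the answer to "which pages still aren't indexed" that would otherwise rely on memory.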
-
03-21-2008, 08:27 AM #24Aspiring Evangelist
- Join Date
- Mar 2008
- Location
- SEO cyberspace
- Posts
- 423
Thank you, Jaan, you make my point for me (though I have never said that a sitemap will make your pages rank better): the process of creating a sitemap may well uncover errors in the site architecture (for those of us who are not perfect, that is), and fixing those errors may well help your site rank better.
Of course there are more important factors in getting your site to rank better, but that does not mean you throw out a useful tool which is easily available.
Now, there seem to be some who have perfected the art of web design and SEO such that their sites are perfect out of the box and will have all 20,000 pages indexed in the first month they are online, but for us mere mortals, IMO, we should use the best tools available.
-
04-15-2008, 12:36 AM #25WHT Addict
- Join Date
- Dec 2007
- Location
- IN
- Posts
- 146
XML is not user-friendly. Use XML for bots and HTML for humans.