Jul 24 2007

SEO Myth Series: Sitemaps and Robots.txt

Because I’m actively involved in the SEO community, I tend to hear a lot of misconceptions about the practice of SEO. Not only do I hear them from people I know personally, but the Internet is awash in incorrect information from people branding themselves as Search Engine Optimizers who really don’t qualify. What makes it worse is that this false information gets repeated constantly. What makes me so special as to know better than the people perpetuating these myths? I’m not going to pretend I know everything about SEO, but my knowledge is extensive, drawn from numerous industry-recognized sources and years of experience and proven success. My favorite place for SEO newbies? SEOmoz’s Top Ranking Factors.

So I’m dedicating an article series to myths I see and hear constantly, or at least often enough for me to remember. Granted, SEO is not an exact science; there’s no real definitive proof of what works and what doesn’t. But there are guidelines, best practices, and knowledge gained from experience that can tell us generally what has an impact on search engine placement and what doesn’t. So to kick things off, I give you this week’s SEO myth, straight from LinkedIn’s Q&A.

The question is simply: “Does Search Engine Optimization (SEO) Really work? If so, what are the best tools?”

Many of the replies were helpful and accurate, to a point. But one response, from Bonnie Burns at OnTheAvenues.com, caught my attention:

“As an SEO expert, I know of all the tools available… For example, it is known that having a Google xml site map is needed, as is having a sitemap for your site. as is a robots.txt. If you did not know that, there is no diagnostic tool that would tell you so…. ”

While a Google XML sitemap is valuable for sites that aren’t being properly indexed, for new sites, or as a way to ensure all of your pages get indexed, it is by no means necessary for SEO as long as your site is spidered correctly. The same holds true for a sitemap (an HTML sitemap, I assume) and a robots.txt. Don’t get me wrong: sitemaps and robots.txt can be very valuable in SEO efforts. But they are only required in specific scenarios, to ensure correct indexing of a site that isn’t already being indexed properly. You won’t be penalized if you don’t have them, and neither Google nor the other engines will rank your site higher if you add them (although people have found creative ways to use an HTML sitemap’s structure to assist their internal linking hierarchy). Everyone’s entitled to one mistake, so I gave her the benefit of the doubt. But taking a look at her website and blog, I saw other inaccuracies abound. Today’s not a two-for-one special, though, so you’ll have to wait until next week for more.
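For readers who haven’t seen one, here’s a minimal XML sitemap following the sitemaps.org protocol that Google supports; the URL and dates are hypothetical, and the optional tags (`lastmod`, `changefreq`, `priority`) are hints to crawlers, not commands:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the engines to know about -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-07-24</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

That’s all there is to it: a list of URLs the engines can use to supplement, not replace, ordinary crawling of your link structure.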

Kevin Eichelberger

Founder & CEO

Founded in 2008, Blue Acorn is the product of Kevin’s great passion and knowledge of all things eCommerce. Kevin’s data-driven approach has culminated in a strong, growing business whose success is closely tied to the success of its clients. When he’s not immersing himself in eCommerce, Kevin works toward expanding Charleston’s tech community by serving as a board member for the Charleston Digital Corridor Foundation, and is also a mentor and advisor to several startups. A business-savvy technologist, you can find Kevin evangelizing about data, optimization, and eCommerce.

3 Comments

Bonnie
Aug 28 2007

The world of SEO is covered with do, do not, maybe, could work, never works. There are many different ways to do SEO as well. But, if one wants to do all they can, having the information for all possible techniques is useful. For what works this month may be worthless in 3 months. What was a myth can become a factor and what is a factor can become useless. The use of robots and xml sitemaps and static sitemaps are all very useful when developing a web site and any seo worth their weight would always suggest or include them, as they have their own merits in helping a web site perform in some way. Letting an engine know what pages not to spider, what pages exist on a web site etc are all part of development.

As an SEO, we spend much time reading all the ‘info’ written in the field. We post many interesting factors and articles as well from other seo pros on the blog as this is still informative, at least for a time until something changes. SEO is not static, and that is why we love learning and reading all the ‘new’ facets discovered, no matter how short lived they may be.

http://www.bruceclay.com/newsletter/0705/robots.html

https://www.google.com/webmasters/tools/docs/en/protocol.html

Best
Bonnie

Blue Acorn
Sep 04 2007

Bonnie, thanks again for your comments. If you read my article carefully, I never disputed the value of sitemaps and robots.txt; they have their time and place. However, I do dispute your wording that “…it is known that having a Google xml site map is needed, as is having a sitemap for your site.” It is not needed. It may help, sure. But if your site has the proper link structure to begin with, you don’t need a sitemap (an XML sitemap, I’m referring to). And if your site needs all pages spidered, then a robots.txt is not necessary. Yes, these are SEO tools for sites that need them for very particular reasons; however, they are not “necessary” for SEO.
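To make that point concrete, the most permissive robots.txt simply tells every crawler that nothing is off limits, which is effectively the same as having no robots.txt at all:

```text
# Applies to all crawlers; an empty Disallow value blocks nothing,
# so this file changes nothing about how the site is spidered
User-agent: *
Disallow:
```

A robots.txt only starts doing real work when you list paths under Disallow to keep crawlers out of sections you don’t want indexed.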

Dori@Stretch Mark Cream Reviews
Feb 20 2013

I arrived at your site because I did a search for Bonnie Burns. I have worked with her in the past. Your comments back and forth intrigued me: 2 great seo minds going head to head over xml site maps. What do you have to say about free website templates such as Weebly that do not offer the option of adding a site map? Interested to know. Please email me if possible.
Dori

