I’m not much into SEO (search engine optimization). It’s a booming industry for some, but I still have this crazily naive idea that good content will just naturally rise to the top of search results. That said, there are some basic SEO guidelines that I try to follow on my site. When I recently discovered my site was disappearing from Google results, I learned a very important SEO lesson.
There are some pages on my site that I actually don’t want in Google search results. For example, perhaps counter-intuitively, my blog pages: The content there is constantly changing, so I don’t want it to appear as a search result, because it will quickly be out of date, and users will be confused when they arrive. Instead, I want people to find the individual blog post pages where the content will remain pretty constant. So I placed meta tags at the top of my blog pages to prevent them from being indexed:
<meta name="robots" content="noindex" />
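The idea is that the tag should only ever be emitted on blog listing pages, never on individual posts. A minimal sketch of that conditional logic might look like this (the `page_type` values and function name are illustrative, not the actual code from my site):

```python
def robots_meta(page_type: str) -> str:
    """Return the robots meta tag for a page, or an empty string.

    Blog listing pages get noindex because their content churns;
    individual posts stay indexable.
    """
    if page_type == "blog_listing":
        return '<meta name="robots" content="noindex" />'
    return ""

# Listing pages are excluded from the index; posts are not.
print(robots_meta("blog_listing"))
print(robots_meta("post"))
```

The crucial property is that the default case returns nothing at all: any page the condition doesn’t explicitly match stays indexable.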
Now fast forward to a few weeks ago, when I was idly going over some web site logs. According to my logs, the most popular pages on my site are “How To” posts, and the most requested page of all time is How To Rename a Windows Service. Out of curiosity, I ran a Google search to see how far down this one particular page would be in the results, thinking it should be pretty high if people keep finding it.
It was indeed on the first page of results… on krehbiel.blogspot.com. (I used to crosspost there.) But on thomaskrehbiel.com? It wasn’t there. Like, anywhere. Not on result page 1, 2, or even 20. I typed in the exact title of the page, “How To Rename A Windows Service by Thomas Krehbiel,” in quotes, and got nothing.
I checked Google Webmaster Tools and quickly found the reason. Of the 1221 pages in my sitemap, only 58 pages were indexed. Several days later, only 4 were indexed! Dubya Tee Eff?
Well, I think I found the cause. Some time ago, I made some code changes in the area of my blog that renders the <head> tags. A colossal blunder on my part allowed <meta name="robots" content="noindex" /> to go on not just the blog pages, but on every single page of the site.
I’ve corrected the problem and resubmitted my sitemap; now I just have to see how long it will take to get my pages back into the index.
So here’s the lesson for anyone looking to improve their SEO skills: Excluding your entire website from the Google index definitely does not improve your search result rankings. :) But seriously, make sure the meta robots tag is correct. (Also, it would probably help to check your logs and statistics more than once or twice a year.)
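One way to catch this sort of mistake early is to check the rendered HTML of a few pages for a noindex directive. Here’s a rough sketch using only Python’s standard-library HTML parser; fetching the pages (e.g. with urllib) is left out:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Scan HTML for a <meta name="robots"> tag whose content includes noindex."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        d = dict(attrs)
        if d.get("name", "").lower() == "robots" and \
           "noindex" in (d.get("content") or "").lower():
            self.noindex = True

def has_noindex(html: str) -> bool:
    finder = RobotsMetaFinder()
    finder.feed(html)
    return finder.noindex
```

Run `has_noindex()` over the HTML of a page that should be in the index, and if it comes back True, you’ve found your problem before Google does.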
(However, if you want to make a relatively private site, robots noindex is a very effective solution.)