If you care about how your site is found in organic search, you must spend some of your time thinking about search engine optimization (SEO). In the olden days (2005), certain kinds of content had no shot at showing up in the search index (and thus could never be found). But in recent years, more and more dynamic content has been showing up in Google’s search index, as Google makes its spider smarter and smarter. So, now there’s nothing to worry about with dynamic content, right? Not quite.
I don’t want to downplay the amazing strides the Googlebot has made. Google has worked tirelessly with Adobe to make Flash content indexable. If the Flash content is a video, there isn’t much text to index, but many Flash experiences are full of text, and Google can now index far more of it than ever before.
Similarly, dynamic content generated from databases is indexed better than it once was, so it is less important to hide dynamic URLs than in years past.
And then there was the tweet heard round the SEO world in November, when Google’s Matt Cutts confirmed that Facebook comments are now being indexed. That might sound like a small thing, but SEO gurus know that it is one more step in Google’s road to conquering a very difficult problem: understanding everything a developer can do with JavaScript. Just as a browser contains a JavaScript interpreter to render pages correctly, now Google’s spider contains some of that ability. Already, some are wondering how to take advantage of the new smarter Googlebot.
But it’s not smart to count on any of this dynamic content being indexed, for a few reasons:
- Better ain’t necessarily good. Sure, it works better than it did, but if it omits any of your content, you’re losing something. Tried-and-true techniques that avoid dynamic content get all of your content indexed, which still seems like the way to go.
- Google ain’t the only search engine. Sure, it’s nice that the Googlebot is getting so smart, but Bing runs 30% of U.S. searches and many other search engines grab market share around the world. Why hide your content from them?
- The negative effects can be bigger than you think. When the spider fails to identify dynamic content, you might lose a lot more than a few words on a page. If that content contains links, the spider might miss entire pages on your site, along with whatever pages THOSE pages link to.
So, I’m a technical guy, and I really love to see the spiders getting smarter. It would be great if any Web page that can be rendered properly in a browser could be crawled and indexed by all search engines. It would make SEO a lot simpler and would allow us to concentrate on content rather than technical mumbo-jumbo.
But we’re not there yet. So, make sure that you know what the spiders see (all of them) before you employ lots of dynamic content techniques.
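One simple way to see roughly what a non-JavaScript spider sees is to parse the raw HTML your server returns (before any scripts run) and list the text and links it actually contains, then compare that against what renders in your browser. Here’s a minimal sketch using only Python’s standard library; the page markup and element names are hypothetical, and a real check would fetch your live pages rather than a sample string.

```python
# Minimal sketch: approximate what a non-JavaScript crawler can index
# from raw HTML. The sample page below is hypothetical.
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Collects the visible text and href links in raw HTML --
    roughly the content available to a spider that doesn't run JavaScript."""
    def __init__(self):
        super().__init__()
        self.text = []
        self.links = []

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# The raw HTML as served. The reviews div is empty because (in this
# made-up example) reviews are injected later by JavaScript.
raw_html = """
<html><body>
  <h1>Product catalog</h1>
  <div id="reviews"></div>
  <a href="/about">About us</a>
</body></html>
"""

view = SpiderView()
view.feed(raw_html)

print("Indexable text:", view.text)
print("Crawlable links:", view.links)
print("Reviews visible to spider?",
      any("review" in t.lower() for t in view.text))
```

If the text or links your users rely on don’t show up in this kind of raw-HTML check, a less capable spider probably isn’t seeing them either.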