Web 2.0 hide and seek with search engines

Welcome to the world of Web 2.0, where the old maps of the Internet have been updated to help visitors find their way to new content! After reading this article you will understand the challenges that new dynamic technologies such as AJAX pose to search and content discovery, as well as how to avoid some common problems.


What do we mean by “the old maps of the Internet have been updated to help visitors find their way to new content”? In the early days of the Web, pages were static: pre-assembled content was served the same way to every visitor, and a Web site was often nothing more than a replacement for a brochure about a company or product. Then came the first leap, as Web servers became able to assemble a page dynamically, on the fly. Web sites became functional tools with which the visitor could interact through applications such as shopping carts, email, and others. Although a huge step forward, this approach still had a disadvantage: any time the visitor took an action, they had to wait for the entire page to reload before they could continue working with it.
Up to this point, the browser played the minor role of rendering pages that had been assembled by the Web server. But with the browser wars over, a new breed of browsers emerged with better support for JavaScript (a scripting language that runs in the browser), allowing Web pages to load and execute code in addition to content. With client-side scripting (also known as DHTML or AJAX), code is loaded as part of the page, powerful enough to manipulate the page and re-render specific parts of it without a reload. A more responsive experience becomes possible, where you can drag maps to explore your route, mouse over books and movies to preview them, and manipulate pictures and data as if they were part of a standalone program on your desktop.
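To make this concrete, here is a minimal sketch of the pattern in plain JavaScript; the fragment URL and element id are hypothetical, not from any particular site. The idea is simply to fetch a snippet of HTML from the server and render it into one part of the page, with no reload:

    // A minimal sketch of the AJAX pattern: fetch an HTML fragment and
    // render it into part of the page without a full reload. (Older
    // versions of Internet Explorer expose the request object through
    // ActiveXObject instead of XMLHttpRequest.)
    function loadFragment(url, targetId) {
      var request = new XMLHttpRequest();
      request.open("GET", url, true);
      request.onreadystatechange = function () {
        if (request.readyState === 4 && request.status === 200) {
          document.getElementById(targetId).innerHTML = request.responseText;
        }
      };
      request.send(null);
    }

    // Illustrative usage: load a hypothetical "/latest-reviews.html"
    // into <div id="reviews"> when the visitor asks for it.
    loadFragment("/latest-reviews.html", "reviews");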
Congratulations, your site is more responsive and reacts to the visitor’s input, making them happy. This is a Good Thing.

The Search for New Content

But wait. If some of your content is only rendered in response to user actions, will search engines be able to find it? Chances are they won’t, unless you have taken the necessary steps to optimize for search in this new environment.
At its core, the Web is a collection of different kinds of content, displayed in Web pages and contextually linked together. Users follow these links to load Web pages and discover other content, a behavior that search engines emulate, building an index, or map, of linked content as they go. Pages using client-side technologies, such as AJAX and DHTML, can break this paradigm, since content is no longer necessarily reachable through a specific URL.
Now the goal becomes ensuring that the search engines can still find your content, even though it may no longer be “on the map”. One way to find out what is no longer visible to a search engine is to simply disable JavaScript; any content you can’t see, a search engine won’t be able to see either.

Tips for Optimizing AJAX for Search

You can assess how well your site will fare by asking two questions: “How will my site degrade?” and “What are the content and context of my features?”
The first question tells you how well you have followed the principle of graceful degradation: when leveraging new technologies, provide an alternative method so that visitors whose browsers don’t support those technologies can still access your site, usually through a more basic experience or limited functionality.
Applying this principle makes business sense: otherwise you might not only be hiding your content from search engines, but also preventing visitors with older browsers or accessibility needs from using your site.
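As a sketch of what graceful degradation looks like in practice (the page name, element id, and the loadFragment() helper from the earlier sketch are all illustrative), a link can point to a real page for crawlers and script-less browsers, while script-enabled browsers intercept the click and load the same content in place:

    <!-- Without JavaScript the link is a normal page load; with it, the
         click is intercepted and the content loads in place instead. -->
    <a href="product-details.html"
       onclick="loadFragment('product-details.html', 'details'); return false;">View product details</a>
    <div id="details"></div>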
The next step is to understand the content and context of your features, which will help you decide what items are important when it comes to optimization. As a rule of thumb, features that pull in new content or that help with navigation need more attention than ones that merely enhance the base experience.
For example, a “look ahead” feature, where typing a few characters in a search box reveals a list of suggestions, doesn’t require optimization: it plays a supporting role on the page, degrades well, and won’t impact search crawlers since it doesn’t link directly to additional content.
On the other hand, AJAX is also commonly used to enhance a page with supplementary content, such as ads, product details, or related items, or simply to “view more” information by opening collapsed sections or moving through tabs; in these cases the extra content is loaded only when the user asks for it.
Consider instead loading all of the content with the page and using JavaScript to hide what you don’t want to show immediately; this way search engines and users with JavaScript disabled will see the complete information. And what if all the content can’t be loaded at once? You can create and link to extended pages that contain the extra information, and use JavaScript to hide those links or replace them with the nicer asynchronous experience.
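As a rough sketch of the first approach (the element id is illustrative), the supplementary content ships in the markup and a small script collapses it once the page has loaded, so crawlers and script-less visitors still see everything:

    <!-- All of the supplementary content is present in the markup. -->
    <div id="extra-details">
      Full product details, specifications, related items...
    </div>
    <script type="text/javascript">
      // Collapse the section only when scripting is available; crawlers
      // and script-less browsers continue to see the full content.
      document.getElementById("extra-details").style.display = "none";
    </script>

A “view more” control then only has to set the section’s display style back to “block”.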
By loading the right information with the page and providing real links that complex dynamic functionality can degrade to, you avoid hiding content from search engines and from clients that don’t support every technology you use. And on this last point, having a site map or content index is also recommended, in case everything else fails.
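For reference, the standard sitemaps.org XML format is simple; a minimal site map listing a couple of placeholder URLs looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/products.html</loc>
      </url>
      <url>
        <loc>http://www.example.com/product-details.html</loc>
      </url>
    </urlset>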
You can have a site optimized for organic search and for user responsiveness. Just make sure you know what you are doing.

Discussion

  1. Singapore SEO Consultant

    The same technique can be applied to Flash content. Although Flash can now be read by Google, image-only Flash content remains useless to the search engine robot. The easiest way is to anticipate bots and visitors that do not have Flash or JavaScript enabled and still allow them to view the actual content.
