How to Create Search-Friendly Website Architecture

All of the effort you've invested in building your website won't mean a thing if search engines can't access your content correctly. One important way to prevent this problem is to build search-friendly website architecture.

To understand this issue, you need to know how search engines build the indices from which they derive the website listings displayed on their results pages. Google and the other search engines don't have teams of people archiving every single page on the Web. Instead, they rely on programs called "spiders" -- automated robots that follow links from page to page and store the information they find in a site's code in their databases.

Making sure these spiders can access all of the content on your site is extremely important for SEO. Unfortunately, a number of website architecture mistakes can make large portions of your site unavailable to the search engines' spider programs.

Here are a few of the most common mistakes, as well as tips on how you can avoid them.

Related: 5 Tools to Take Your SEO to the Next Level

1. Overuse of content in image or script files.
Because they aren't living, breathing human readers, search engine spiders can read only the text-based content that's presented to them. If you store information in image files, Flash animations or JavaScript code -- such as your website's header graphic or introductory video -- the spiders may be unable to process the content appropriately.

The solution is to duplicate the information stored in these alternative formats with text versions. Try using a tool such as Webconf's Search Engine Spider Simulator to observe what the spiders see after arriving on your site. If you notice that chunks of content are missing, either provide the excluded information as text elsewhere on the page or use your site's robots.txt file -- the file that gives crawling instructions about your site to search engines -- to steer the spiders toward specially designed, text-based pages you've created to provide them with the same information.
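As a rough sketch of that second approach, a robots.txt file can block spiders from the script-heavy versions of pages and point them to a site map of text-based alternatives. The paths below are hypothetical placeholders, not recommendations for any particular site:

```
# Hypothetical robots.txt -- all paths here are illustrative examples
User-agent: *
# Keep spiders out of script-heavy pages that they can't read anyway
Disallow: /flash-intro/
Disallow: /scripts/
# Point crawlers at a text-based entry point to the same content
Sitemap: http://www.yoursite.com/sitemap.xml
```

Note that robots.txt lives at the root of your domain and applies to all compliant crawlers that request it.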

2. Deep vs. shallow navigation.
Many beginning webmasters run into trouble when a site's navigation becomes too deep. Because search engine spiders move between the pages of your site through the links you've created, it's important to make this movement as easy as possible for them. If your navigation structure is deep, meaning certain pages can be accessed only after a long string of sequential clicks, you run the risk that the spiders won't penetrate deeply enough into your site to index all of your pages appropriately.

The solution is to implement a "shallow" navigation structure, in which every single page on your website can be accessed by both visitors and search engine spiders within two to three clicks. You can accomplish this task by breaking up your navigation structure into sub-categories or incorporating additional internal links.
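One way to sanity-check navigation depth is to walk your own internal-link structure the way a spider would and see which pages fall within a given number of clicks. The sketch below uses a small, hand-written link graph as a stand-in for a real site; it is an illustration of the idea, not a working crawler, and all page paths are hypothetical:

```python
from collections import deque

def pages_within_depth(link_graph, start, max_depth=3):
    """Breadth-first walk of a site's internal-link graph, mimicking how a
    spider discovers pages, stopping after max_depth clicks from the start."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        page, depth = frontier.popleft()
        if depth == max_depth:
            continue  # don't follow links beyond the click budget
        for linked in link_graph.get(page, []):
            if linked not in seen:
                seen.add(linked)
                frontier.append((linked, depth + 1))
    return seen

# Hypothetical site: home links to two categories, each category links to
# pages, and one page is buried an extra level down.
site = {
    "/": ["/shoes/", "/hats/"],
    "/shoes/": ["/shoes/red", "/shoes/blue"],
    "/hats/": ["/hats/sale"],
    "/hats/sale": ["/hats/sale/clearance"],
}

reachable = pages_within_depth(site, "/", max_depth=2)
# "/hats/sale/clearance" is three clicks deep, so a two-click walk misses it
```

Any page that fails to show up within your chosen depth is a candidate for an additional internal link or a new sub-category.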

Related: 3 Ways to Create More Engaging 404 Pages

3. Inconsistent linking practices.
As you build these links, you'll want to be careful about how you name them. Again, because the search engines can't apply human judgment to see what you meant to do, their spider programs may index the URLs "www.yoursite.com/page1.html" and "yoursite.com/page1.html" as two separate pages -- even though both links direct visitors to the same location.

To prevent these indexing errors, be consistent in the way you build and name links. If you've made this mistake in the past, use 301 redirects to let the search engine spiders know that both the "www" and "non-www" versions of your URLs are the same.
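On an Apache server, one common way to set this up is a rewrite rule in your .htaccess file that 301-redirects the non-www hostname to the www version. This is a sketch, with "yoursite.com" as a placeholder domain, and assumes mod_rewrite is enabled:

```
# .htaccess sketch: 301-redirect non-www requests to the www hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]
```

The reverse (www to non-www) works just as well; what matters is picking one version and redirecting the other to it everywhere.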

4. Incorrect redirections.
When it comes to 301 redirects, any time you move the pages on your website -- whether you're simply renaming them or transferring your entire site to a new hosting account or URL -- you'll want to put the correct redirects into place. Failing to do so can result in future indexing errors and eliminate the benefits provided by the backlinks you've spent time acquiring, as these links no longer point to valid pages. Both of these issues can reduce the search engine rankings you've worked hard to develop.
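On Apache, page-level moves are often handled with mod_alias directives in .htaccess. The lines below are a sketch with made-up paths: the first maps a single renamed page, the second maps an entire retired directory to its new home:

```
# .htaccess sketch: hypothetical examples of mapping old URLs to new ones
Redirect 301 /old-page.html /new-page.html
RedirectMatch 301 ^/old-blog/(.*)$ http://www.yoursite.com/blog/$1
```

The 301 status code tells spiders the move is permanent, so the old URL's accumulated link value is passed along to the new one.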

Because this topic can get technical and can't be covered adequately in a few sentences, you might want to seek out online resources for additional reference, such as SEOmoz's guide to website redirections.

Related: Four Low-Cost Ways to Turbo-Charge Your Website

5. Failure to include a site map.
As you improve the accessibility features of your website's architecture, make sure you have a site map in place. This file provides the spiders with an accessible reference of all the pages on your site, allowing indexing to proceed correctly. For instance, here's Entrepreneur.com's site map.

If your site runs on WordPress, Joomla, Drupal, Magento or any other established platform, you should be able to install a plugin that will automatically generate a site map page for you. If not, creating a site map can be as simple as building a single HTML page with links to all of your other pages and submitting it to the search engines for consideration.
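In its simplest form, that hand-built site map is nothing more than an ordinary page of links. The sketch below uses placeholder page names to show the shape of such a page:

```
<!-- sitemap.html sketch: every page name here is a placeholder -->
<!DOCTYPE html>
<html>
  <head><title>Site Map</title></head>
  <body>
    <h1>Site Map</h1>
    <ul>
      <li><a href="/">Home</a></li>
      <li><a href="/about.html">About</a></li>
      <li><a href="/products/">Products</a></li>
      <li><a href="/contact.html">Contact</a></li>
    </ul>
  </body>
</html>
```

Linking to this page from your site's footer gives spiders a one-click route to every page you list on it.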

AJ Kumar is co-founder of Single Grain, a digital marketing agency based in San Francisco. Single Grain specializes in helping startups and larger companies with search engine optimization, pay-per-click, social media and various other marketing strategies.
