Optimization involves using the right keywords, in the right locations, at the right densities. It also involves writing source code that is intuitive to the search engine robots: displaying only what you want them to see by hiding programming and formatting clutter as much as possible.

First, we’ll discuss the Pre-Design Phase of the Optimization Layer, which involves using the keywords you uncovered in the Analyze Layer to help you develop “relevant content” (seen as relevant to the search engines and users). Second, we’ll look at the actual coding of your website and things you can do to clean up (Optimize) your web pages. It’s here that we’ll be analyzing keyword densities, Meta tags, link text, and other criteria for an optimized website.

Pre-Design Phase

This phase could consume you, so I highly recommend using MS Project or Excel to map out your strategy and tasks in terms of milestones and deadlines. Trust me, if I didn’t set deadlines, I could easily spend a month on a phase whose most important pieces could be finished in about a week.

Have all your applicable keywords in hand before starting the design. These are the keywords we dug up in the Analyze Layer. Using “assumed keywords” is the worst possible gamble you could make; take the time to uncover the most valuable keywords for your website.

Linking to external websites

Authority sites provide huge amounts of information, but without linking to other websites they can become a “spider trap”, meaning the webcrawlers cannot leave the site to continue their journey. Everything you can think of is analyzed by the search engines, including these external links, so I can’t emphasize enough how important it is to create one to three per page.

Explaining the Title and META Tag Guidelines

META Tags aren’t being used nearly as much as they once were. In fact, Google doesn’t give any weight to the Keywords META tag whatsoever. However, nearly all search engines use the Title and Description tags in determining relevance and as the text that will appear in the search engine results.

Misuse of these tags could result in ranking penalties; not using them at all could keep you out of the search results entirely. Therefore, the guidelines should be strictly followed.
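To make the guidelines concrete, here is a minimal sketch of a well-formed head section. The site name, description text, and keywords are placeholders for illustration, not recommendations for any real site:

```html
<head>
  <!-- Title: used as the clickable headline in search results -->
  <title>Example Widgets - Handmade Widgets and Widget Repair</title>

  <!-- Description: often shown as the snippet under the headline -->
  <meta name="description"
        content="Handmade widgets and widget repair, shipped nationwide.">

  <!-- Keywords: largely ignored by Google, but harmless if kept short -->
  <meta name="keywords" content="widgets, handmade widgets, widget repair">
</head>
```

Keep the Title and Description focused on the keywords you actually want the page to rank for, since those are the tags the engines still read.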

Keyword Density Explained

In my eyes there are really three different keyword densities I need to be concerned about: one-word, two-word, and three-word densities. You can measure keyword density in everything from the title to the content. Basically what we’re talking about here is how many times your keywords appear on the page in proportion to the other text used on the page.
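A short script makes the idea concrete. This is a sketch using one common formula (words in the phrase times occurrences, divided by total words on the page); the function name and the exact formula are my own choices, not a standard:

```python
import re
from collections import Counter


def keyword_densities(text, max_phrase_len=3):
    """Return {phrase: density percent} for one-, two-, and three-word phrases."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words)
    densities = {}
    for n in range(1, max_phrase_len + 1):
        # Build every n-word phrase that appears in the text
        phrases = [" ".join(words[i:i + n]) for i in range(total - n + 1)]
        for phrase, count in Counter(phrases).items():
            # Density: share of the page's words taken up by this phrase
            densities[phrase] = round(100.0 * count * n / total, 2)
    return densities
```

Run it against your page copy and compare the densities of your target phrases against the rest of the text; if a filler word outranks your money keywords, the copy needs work.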

The robots.txt File

This simple little file can get you listed or keep you from ever being listed. The robots.txt file tells the search engines whether you would like your site indexed and, if so, which pages on your website should be excluded.
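A typical robots.txt looks like the sketch below; the directory names are placeholders. It sits at the root of your domain and says, in effect, “all robots may crawl everything except these folders”:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
```

An empty `Disallow:` line would allow everything; `Disallow: /` would block the whole site, which is exactly the mistake that keeps some sites out of the results entirely.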

Using Google Sitemap Technology

Google is now offering a way to explicitly list every page you would like indexed in one file, called sitemap.xml. Pages not listed in the sitemap.xml can still be indexed during the Googlebot’s normal crawl; this is just a way to almost guarantee inclusion in the Google index.
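Here is a minimal sitemap.xml following the public Sitemap protocol; the URL and date are placeholders. Each `<url>` entry names one page, and the optional child tags hint at how fresh and how important the page is:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Add one `<url>` block per page you want indexed, then submit the file to Google.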

Performing Broken Links Tests

Broken links on a website can get you penalized, so be sure to run a broken link check anytime you make a change to your website. Search engine spiders hate broken links because they act as a sort of deceptive dead end; in other words, avoid them at all costs!
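If you would rather script the check than use a third-party tool, the sketch below shows one way to do it with the standard library: pull every link out of a page, then probe each one. The class and function names are my own, and a real checker would want retries and a polite crawl delay:

```python
import urllib.error
import urllib.request
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect every href found in <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links


def link_is_alive(url, timeout=10):
    """Return True if the URL answers without an HTTP or network error."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except (urllib.error.URLError, ValueError):
        return False
```

Feed `extract_links` the HTML of each page, run `link_is_alive` over the results, and fix anything that comes back False before the spiders find it.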

Find an Editor for Syntax and Readability

Have a third party analyze your text for you. Make sure they never hesitate to write “I don’t understand” next to anything that needs clarification. That way I know my sites are not only free of spelling errors, but understandable as well.

This sums up Optimize! Thanks again, my loyal followers and subscribers. Catch you on my next post.