Wednesday, August 17, 2011

What is your keyword targeting strategy?

Let me ask you a few questions. What is your keyword targeting strategy? Do you go for the more general and competitive terms or for the more specific, longer and less competitive ones? Do you prefer shorter or longer articles? Are you targeting Google or Yahoo and MSN?

Before I lay down my general keyword strategy, let me explain what the “keyword ladder” is, and then I will elaborate on why I believe in climbing it (a bottom-up approach).

Imagine a ladder where you put the most general and competitive keyphrases on the top rung and the least competitive keyphrases on the bottom rung.

As an example, let us consider the SEO industry. On the top rung we would have general queries like: “search engine optimization”, “search engine marketing”, “seo” etc. Somewhere below we would have less competitive and more specific keyphrases like: “google search engine optimization”, “yahoo search engine optimization”, “search engine optimization services” etc. Further down the ladder we arrive at queries like: “affordable seo specialist in New York”, “where can I learn seo for free” etc.

What are the general characteristics of the ladder? The top keyphrases are the most competitive, they have the highest traffic potential per keyphrase, they are fewer in number and they convert worst (they are too general).

Going down the ladder we come to keyphrases that are less competitive, easier to rank for, have lower search volume, but are higher in numbers and usually convert much better (because they are very specific).

The problem with very specific long queries is that they are hard to research with keyword tools. Depending on who you ask, about 20% to 50% of all daily queries are unique, never-searched-before ones. Add to these the loads of queries from the bottom of the ladder that are searched for only once in a while, and you arrive at the fact that most queries from the bottom of the ladder don’t show up in keyword research tools. That, of course, can’t stop us from targeting them.

I try to target the bottom half of the ladder for at least these reasons:
1. It is easier to rank higher for less competitive keyphrases

2. The traffic you can get from thousands of lower profile keyphrases is more diverse and stable since you don’t put your eggs in too few baskets (keywords)

3. The bottom half of the ladder can provide more total traffic, since the number of low profile keyphrases is higher (it is like getting 100 visitors from one general keyphrase vs getting 10 visitors from each of 10 less competitive queries)

4. Searchers are starting to use longer and more descriptive queries, which bumps up the traffic potential of the bottom half of the keyword ladder

5. The bottom half of the ladder converts better (more revenue for you)

6. Targeting many low-competition keywords at the same time is not that difficult on Google. If you still haven’t noticed, Google has introduced ranking scores which push up the overall rankings of pages and sites that rank well for a variety of keyphrases. In other words, when you have a lot of content, you rank well for a variety of less competitive queries, which pushes up your rankings on the more competitive queries from the top half of the ladder, which in turn pushes up the rankings of the even more competitive ones, and so on. To put it another way, with Google it is much easier to rank for the general, high-search-volume queries (top half of the ladder) when you have already conquered the bottom half of the ladder. That is what I call ‘climbing’ the keyword ladder. All you have to do is have loads of content.

Let me get back to the last point above: you must climb the keyword ladder by starting from less competitive queries and gradually increasing your rankings for more and more competitive keywords.

Here’s a snippet from Google’s patent “Information Retrieval Based On Historical Data”:

“Thus, the quantity or rate that a document moves in rankings over a period of time might be used to influence future scores assigned to that document. In one implementation, for each set of search results, a document may be weighted according to its position in the top N search results.”

Every time you rank a page in the top N results, you may get a little general ranking boost (for other queries). The more times you can rank pages in the top N search results, the greater ranking boost you get. You can rank a great number of keyphrases in the top N when you have a lot of content and when these keyphrases are from the less competitive bottom-half of the ladder.

Google keeps statistics of which queries you rank for and which results users select within the SERPs. You can see the top stats if you use Google Sitemaps.

The above “content is king” ranking factors are very different from how Yahoo and MSN operate. Let me give you a real world example.

Say you are writing an article that discusses how to optimize AdWords campaigns. You have written a very comprehensive and long article that covers the ins and outs of AdWords. Now you come to the point where you need to optimize the article for search engine traffic. You do some keyword research and it turns out there are, let’s say, 20 good keyphrases to target. How do you target all of these?

Since Yahoo and MSN (and also Google not so long ago) rely mostly on the anchor text of incoming links, link popularity and page titles, you need incoming links whose anchor text targets these 20 keyphrases. You also need to place as many of these keywords as possible in the page title. That seems like an impossible task.

Some webmasters may try to partition the content into multiple pages, trying to target each of these 20 keyphrases. But how do you get links to all these parts of the article? There will always be someone who outranks that approach by targeting a single page at one specific keyword from those 20.

What most webmasters do at this point is simply pick one or two of these 20 keywords to target. That is not bad, but it is unnatural. It is as if webmasters who offer essentially the same AdWords content share the traffic among themselves, with each one emphasizing a certain keyphrase.

Google is clearly trying to stop this traffic partitioning. Google will try to infer which is the best AdWords page and will try to send as much relevant traffic as possible (including these 20 keywords) to the top few authoritative pages.

How will Google do that? By boosting the rankings of sites that have real content (sites that rank for the bottom half of the ladder). Not so long ago, I was in the “links are king” camp. Now I stand at an equal distance between the “content is king” and “links are king” camps (maybe a bit closer to the “content is king” camp when I think long-term).

I don’t want to be misinterpreted here. Sites with almost no content can outrank sites with a lot of great content. The fact is, everything else being equal, a site with no content (one that hasn’t conquered the bottom half of the keyword ladder) will need more and higher quality links to outrank a site with great content. To frame it another way: having a lot of content decreases the number and quality of links you need.

Naturally, you should be able to get higher-quality links long term with a great content site. Your competitors will need to buy many links in order to compensate for their low-quality websites. And sites with no content will have a hard time overpowering your bottom-of-the-ladder traffic. They can only steal your general keyphrase traffic.

Let me take out my crystal ball and see into the future of search engines…Hmm, I see Yahoo and MSN copying Google… I see Yahoo and MSN becoming more Google-like…

Got the point? Sooner or later, Yahoo and MSN will find a way to give ranking boosts to genuine sites with loads of content and little over-optimization. We will also see Google doing this better and better. In a way, it supports the notion of “the rich becoming richer” – the SERPs dominated by a handful of the most authoritative websites.

Now let me lay out my keyword targeting strategy

Write Long Pages

When you write longer pages, you use more unique and repeated words per page. The repeated words increase the rankings of the queries they participate in. The unique words open the possibility of ranking for more keyphrases. Any way you look at it, the more words on a page, the more keyword phrases you target. It is that simple.

When your pages are longer, you are basically going for more keywords from the bottom of the ladder. You get these and Google boosts your rankings for the more competitive ones.

All else being equal, longer pages will outperform shorter ones.

Write naturally

Don’t over-repeat one or two phrases. Write naturally. Good content is written naturally. While search engines cannot understand the quality of content by interpreting its meaning, they can detect and devalue unnatural, over-repeated, stuffed content. When you write naturally, you usually use synonyms and related words, and that increases the number of potential phrases you target.

But wouldn’t that dilute your keyword density? Forget the nonsense of keyword density. Keyword density has never been used by Google or any other decent search engine (because it does not improve relevancy). That is one of the SEO myths trotted out by the SEO “experts”.

Optimize your page titles

When you do keyword research, remember that you cannot research the bottom of the keyword ladder. You research the top of the ladder – the competitive, high-traffic, most common search terms. Place the most general ones in the title of your home page and in the titles of the major sections. Each content page can target one or two general keyphrases in its page title.

Research and use the common keyword subphrases

Have you noticed that if you make a long list of queries related to your pages, you start to see common subphrases?

Let’s say you sell a weight loss ebook. You can target keyphrases like: weight loss book, weight loss ebook, weight loss program… but you can also target keyphrases like: diet book, diet ebook, diet program …

Here we see that a lot of keyphrases that are common to queries relevant to your page include subphrases like “weight loss”, “diet”, “diets”, “dieting”, “calories” etc.

It makes sense to think that most of the queries from the keyword ladder (top + bottom parts) will include the above subphrases as parts of the queries. These common subphrases are usually some of the more general queries (top of the keyword ladder).
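To make this concrete, here is a minimal sketch for surfacing common subphrases from a list of related queries. It is my illustration, not a tool from any keyword service, and the sample queries are made up; swap in your own keyword research export.

```python
# Count how often each short subphrase appears across a list of related queries.
# The query list below is hypothetical sample data.
from collections import Counter

queries = [
    "weight loss book",
    "weight loss ebook",
    "best weight loss program",
    "diet book for beginners",
    "low calorie diet program",
    "how to count calories for weight loss",
]

def subphrases(query, max_len=3):
    """Yield every contiguous word sequence of 1 to max_len words in a query."""
    words = query.lower().split()
    for size in range(1, max_len + 1):
        for start in range(len(words) - size + 1):
            yield " ".join(words[start:start + size])

counts = Counter()
for q in queries:
    # Count each subphrase once per query so long queries don't dominate.
    counts.update(set(subphrases(q)))

# Subphrases shared by several queries are the candidates worth reusing generously.
for phrase, hits in counts.most_common(15):
    if hits > 1:
        print(f"{phrase!r} appears in {hits} of {len(queries)} queries")
```

Run against a few hundred real queries, the phrases that bubble to the top are exactly the “weight loss” / “diet” style subphrases discussed above.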

When you have identified the common subphrases you need to do 4 things to greatly increase your chances of ranking for a lot of queries.

1. Use all these common subphrases generously in your long articles. Don’t focus on one subphrase. Use all of them, plus their stemmed variants. These are the ones that should be repeated more often. All the other text (the non-common subphrases) should be written naturally. To use as many of them as possible, you need to write longer articles.

2. When using the common subphrases within content, write them as parts of the longer queries. If you focus on “weight loss”, use it as “weight loss book”, next time as “weight loss program” or “weight loss failure” etc.

3. Place a few of these subphrases in the page title.

4. If you have control over the anchor text of the incoming links (as with home pages), try to inject as many subphrases as possible. Anchor text is still very important. Let’s say that the subphrase “weight loss” is a subphrase in 8000 potential queries you may target on your page. You need to inject it into as many incoming links as possible, and then all the other words on the page will combine nicely with “weight loss” to form longer queries and increase your rankings for a great variety of phrases. If you submit to directories, rotate the anchor text subphrases. In our case, rotate among “weight loss”, “diet”, “calories” etc. (see the sketch below).
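Here is a minimal sketch of the rotation idea from item 4. The subphrase list and the directory names are purely illustrative placeholders.

```python
# Rotate anchor text among the common subphrases when building links,
# e.g. for directory submissions. All names below are made-up examples.
from itertools import cycle

subphrases = ["weight loss", "diet", "dieting", "calories"]
directories = ["directory-a.example", "directory-b.example",
               "directory-c.example", "directory-d.example",
               "directory-e.example"]

rotation = cycle(subphrases)
for directory in directories:
    anchor = next(rotation)  # cycles back to the first subphrase when exhausted
    print(f"Submit to {directory} with anchor text: {anchor!r}")
```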

There are obviously a lot of other factors to consider. Pages are going up and down the ladder constantly. Have you noticed that after each update your traffic and rankings increase or decrease across most of your pages? In a way, all ranking factors together either made you climb the ladder (rank for more queries) or made you slide down the ladder (rank for fewer queries).

That is why I believe the most powerful factors for ranking on Google are domain-based – they act upon all pages on the domain. One of these domain-based factors is “how many queries do you rank for in the top N results”. Google’s patent on “Information Retrieval Based On Historical Data” clearly states that a “document” may mean a page, a site, or a part of a site (a subfolder).

Sunday, August 7, 2011

Domain Selection

There is a lot of debate regarding SEO and the use of keywords and domain extensions. Everyone agrees that having your keyword in the domain is good, but there are a lot of theories regarding .net, .com, numbers, dashes, length etc. We definitely have an opinion here that is based not on theory but on measurable results. Over the past five years we have developed a domain selection order of importance. This is crucial when forming an SEO Monopoly. Here it is:
#1 = keyword dot com
#2 = keyword dot net
#3 = keyword with additional word dot com (the additional word should be generic, like news, blog, info etc)
#4 = keyword dot org
#5 = keyword with dashes dot com
#6 = keyword with number dot com (the number should be a single digit in front of the domain, e.g. 4domain.com)
There is no question that in Google the keyword without dashes dot com gives you a large amount of leverage. However, it is rare that the keyword dot com is available. So, if #1 is taken then we go to #2, and so on and so forth.
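For clarity, here is a minimal sketch that encodes the preference order above as a fallback list. The availability check is a hypothetical placeholder; in practice you would plug in a real WHOIS or registrar lookup.

```python
# Walk the domain preference order described above and pick the first
# available candidate. is_available() is a hypothetical stand-in for a
# real registrar/WHOIS availability check.
def candidate_domains(keyword, generic_words=("news", "blog", "info")):
    """Yield candidate domains for a keyword, in the order of preference above."""
    kw = keyword.lower().replace(" ", "")
    dashed = keyword.lower().replace(" ", "-")
    yield f"{kw}.com"                    # 1. keyword dot com
    yield f"{kw}.net"                    # 2. keyword dot net
    for word in generic_words:           # 3. keyword + generic word dot com
        yield f"{kw}{word}.com"
    yield f"{kw}.org"                    # 4. keyword dot org
    yield f"{dashed}.com"                # 5. keyword with dashes dot com
    yield f"4{kw}.com"                   # 6. single digit in front, e.g. 4domain.com

def pick_domain(keyword, is_available):
    """Return the first available domain, walking down the preference list."""
    for domain in candidate_domains(keyword):
        if is_available(domain):
            return domain
    return None

# Example with a fake availability check (pretend only the .net is free):
print(pick_domain("green widgets", lambda d: d == "greenwidgets.net"))
```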

Tuesday, August 2, 2011

White Hat versus Black Hat SEO

SEO techniques are classified by some into two broad categories: techniques that search engines recommend as part of good design, and those techniques that search engines do not approve of and attempt to minimize the effect of, referred to as spamdexing. Some industry commentators classify these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO. White hats tend to produce results that last a long time, whereas black hats anticipate that their sites will eventually be banned once the search engines discover what they are doing.

An SEO tactic, technique or method is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see.

White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to game the algorithm. White hat SEO is in many ways similar to web development that promotes accessibility, although the two are not identical.

White Hat SEO is merely effective marketing, making efforts to deliver quality content to an audience that has requested the quality content. Traditional marketing means have allowed this through transparency and exposure. A search engine's algorithm takes this into account, such as Google's PageRank.

Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or involve deception. One black hat technique uses text that is hidden, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking.

Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One infamous example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.  Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.

Additionally, many professionals in the SEO industry refer to "gray hat" tactics that may skirt the lines of black and white hat tactics. Numerous references to gray hat techniques have been published, and these usually constitute practices that are not strictly disapproved by search engines, but may go against the spirit of the regulations that search engines have laid out.

Saturday, July 30, 2011

5 Tools to See a Website Like a Search Engine

SEO tools have come a long way, and it is easier than ever to view a website the same way a search engine does. Below are five tools that make this task quick and effective.

Google Cached Version

SEOs have long used the cached version of their website to glean how Google is viewing it. To view the cached version of any page, simply type “cache:www.yourdomain.com/” into the Google search box. Once the cached version of your webpage loads, click on “Text-only version”. This lets you quickly identify whether any of your text or links are not being crawled, as well as any hidden text.

SEO-Browser.com

While checking Google’s text-only cached version of a page can be very helpful, it is difficult to do this for an entire website. This is because clicking on a link to another page in the cached version will bring you to the live version, thus making it difficult to quickly examine the entire site. For this reason I often go to seo-browser.com. It is essentially the same as viewing the text-only cached version, but it allows you to obtain a better understanding of how the pages interact with each other. Simply type your URL into the search box on the homepage and click “Simple”.

Google Webmaster Tools

Google Webmaster Tools has two great features that help you see how Googlebot views your website. For a quick reference you can use the Site Preview Tool. It shows you the same thing as clicking on the magnifying glass next to your website in the SERPs. However, it is easier to use, since it compares your actual page and the previewed page side by side and doesn’t highlight snippets of text from your site.

For a more in-depth report, perform a “Fetch as Googlebot” within Webmaster Tools. This is similar to viewing your source code, but it goes one step further by showing you how Google views your source code.

Web Developer Toolbar

The Web Developer Toolbar is a well-known extension for Firefox and contains three essential features that let you see a website the way search engines do. These features include:

1. Turning off cookies (search engine crawlers don’t carry them)
2. Turning off JavaScript (search engines struggle to read it)
3. Turning off Meta redirects (search engines don’t blindly follow them)

By simply disabling cookies and JavaScript while browsing a website, you will gain a better understanding of what the search engines are seeing, while still maintaining the overall look and feel of the site as you browse.

Likewise, it is also helpful to turn off Meta redirects in your browser, since it is likely that the search engines will crawl a page with a Meta redirect before moving on, whereas some Meta redirects will send a visitor away so quickly they might not even realize an additional page exists.
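If you prefer the command line to a browser extension, a plain HTTP fetch gives a similar approximation: it sends no cookies and never executes JavaScript, so what comes back is roughly what a spider has to work with. The sketch below is my own illustration (not one of the five tools in this post), and the URL is a placeholder.

```python
# Fetch the raw HTML of a page (no cookies, no JavaScript execution) and
# report the title and the links visible in that raw markup.
import urllib.request
from html.parser import HTMLParser

class LinkAndTitleParser(HTMLParser):
    """Collect the <title> text and all href targets from raw HTML."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

url = "http://www.example.com/"  # placeholder; use your own page
html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")

parser = LinkAndTitleParser()
parser.feed(html)
print("Title:", parser.title.strip())
print("Links found in raw HTML:", len(parser.links))
```

If a link or block of text that you see in your browser is missing from this raw-HTML view, a crawler is unlikely to see it either.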

User Agent Switcher

A user agent switcher or spoofer allows you to change the user agent string, which tells the websites you visit what browser you are using. In other words, even if you are browsing the web in the latest version of Firefox, you can change the user agent to appear as if you are using IE6. My favorite is a Firefox extension from Chris Pederick, who created the Web Developer Toolbar mentioned above.

The best part about this extension is the ability to set the user agent string to any one of the major search engine bots. This will allow you to quickly determine if a website is cloaking by presenting the search engines with different content than it shows to users.
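The same check can be scripted. Here is a minimal sketch that fetches one URL twice, once as a regular browser and once as Googlebot, and compares the responses; the URL and user agent strings are illustrative placeholders, and dynamic pages will differ slightly on every request, so treat a mismatch as a prompt to look closer rather than proof of cloaking.

```python
# Compare the response a site serves to a browser user agent versus a
# Googlebot user agent, as a rough cloaking check.
import urllib.request

URL = "http://www.example.com/"  # placeholder; use the page you want to test
USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 6.1; rv:5.0) Gecko/20100101 Firefox/5.0",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

def fetch_as(url, user_agent):
    """Fetch a URL while presenting the given user agent string."""
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    return urllib.request.urlopen(request).read()

responses = {name: fetch_as(URL, ua) for name, ua in USER_AGENTS.items()}

if responses["browser"] == responses["googlebot"]:
    print("Identical responses: no obvious cloaking.")
else:
    # A size difference is only a rough signal; inspect the HTML by hand.
    print("Responses differ:",
          {name: len(body) for name, body in responses.items()})
```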