Monday, July 12, 2010

SEO-Friendly URLs

Without proper SEO services, a website stands a good chance of not getting indexed by search spiders, which in turn puts a question mark on its ranking in the SERPs and, of course, its conversion rate. The situation can often be resolved by performing some "cosmetic" operations on a site, and an integral part of these operations is URL rewriting. Although it is considered somewhat time-intensive, URL rewriting is very effective in the long run.
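
As an illustration of the idea, here is a minimal Python sketch of the kind of transformation URL rewriting performs, turning a page title into a clean, keyword-bearing path. The paths and product names are hypothetical, and real rewriting is usually done in the web server configuration rather than application code:

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into an SEO-friendly URL slug (illustrative sketch)."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse spaces/punctuation into hyphens
    return slug.strip("-")

# A dynamic URL like /product.php?id=42&name=Red+Widget could instead be served as:
print("/products/" + slugify("Red Widget"))  # -> /products/red-widget
```

The same slug can then be mapped back to the dynamic page by a rewrite rule on the server, so spiders and users only ever see the readable form.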

Secondary Search Engines

Secondary search engines target smaller, more specific audiences. They are basically the topical and regional search engines that can provide a user with tremendously narrow and focused search results when looking for specific information. However, the volume of traffic they generate is much smaller than that of the primary search engines.

Some of the most popular secondary search engines are Lycos, LookSmart, Miva, Ask.com, and Espotting. Although the current list of secondary search engines is fairly small, it is expected to grow with time.

Like primary or major search engines, these search engines also vary in the way they rank search results. Most focus on keywords and reciprocal links, whereas a few rely on other criteria such as meta tags or proprietary ranking factors. Experts suggest that although the amount of traffic they generate does not match that of the primary search engines, their results can still bring valuable traffic that should not be overlooked.

Reporting Search Engine Spam

The process used to legitimately eliminate perverse or unfair competition from the search results is termed "search engine spam reporting." The presence of such websites or content in a search engine devalues its significance as an effective resource. If a competitor is making frequent use of black hat SEO techniques or spam tactics to gain top positions on the search engines, it is in your best interest, and indeed the best interest of users who are directed to websites through search engine results, to report the site.
Normally, all black hat SEO services that artificially inflate the apparent relevancy of inferior web sites can be reported as spam.

SandBox Effect

According to industry experts, the sandbox is an algorithmic effect whereby Google suppresses the ranking of new domains on a temporary basis by placing them into what is termed the "sandbox". The step is believed to counter the tricks of search engine optimizers who attempt to influence Google's rankings through the massive creation of inbound links to a new website from other websites that they own. Google's sandbox filter is believed to have come into existence back in March of 2004.

Page Rank

PageRank is a link analysis algorithm developed by Google, expressed as a numerical weighting from 0 to 10. A PageRank is assigned to each element of a hyperlinked set of documents in order to measure its relative importance within the set. In general terms, it is described as Google's tool for weighing up every webpage on the grounds of the number and quality of its backlinks.
In simple words, Google performs an "election" in which each web page casts votes for the pages it hyperlinks to. Pages that are themselves important carry more weight, so a vote from them does more to improve the target's PageRank.
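
The voting idea can be made concrete with a small Python sketch that iterates the standard published PageRank formula, PR(p) = (1-d)/N + d * sum(PR(q)/outlinks(q)), with damping factor d = 0.85, over a hypothetical three-page graph (Google's production algorithm is, of course, far more elaborate):

```python
def pagerank(links, d=0.85, iterations=50):
    """Compute PageRank by simple iteration.

    `links` maps each page to the list of pages it links to.
    """
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Sum the shares of rank flowing in from pages that link to p.
            incoming = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * incoming
        pr = new
    return pr

# A links to B and C; B links to C; C links back to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # C, which receives votes from both A and B
```

Note how C ends up with the highest score: it collects a full vote from B plus half a vote from A, illustrating that rank depends on who links to you, not on your own content.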

Micro Websites

Micro websites are compact, informative websites used by organizations alongside their main websites. A micro website usually contains around five or six primary pages, including a home page, a product or services page, a service portfolio, and a contact page with an enquiry or feedback form. These websites are a great help when it comes to testing the initial online presence of your business or services and its possible outcome. In the initial phase of your business promotion, a micro website can be the best associate for marketing your product in the global marketplace, and later it helps convey the wider product range to buyers.

Mirror sites

As its name signifies, a mirror site is a nearly identical or duplicate version of an original website. In common terms, it is basically a slightly modified or tweaked replica of the original. Although the technique can be genuinely useful for accessibility, it is classed among Black Hat SEO tactics and is not supported by the search engines. It is quite easy to create a mirror, especially with software that automates the mirroring of entire sites.

Meta Tags

Meta tags are small packets of data about a web page, rendered for the web crawlers. These tags are written in HTML and placed in the source code to offer content and site information to search engine spiders. The meta elements are normally used to state the page description, keywords, the author of the document, the date it was last modified, and other metadata. These tags help you manage the way some search engines may index your site, using the keywords and descriptions you provide. Spiders therefore rely heavily on the information supplied by the site owner in the meta tags.
Typically, these snippets of informational code sit between the head tags of your page and are not displayed in the rendered document.
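
A rough sketch of how a spider might read those tags, using only Python's standard library; the page content and site name here are invented for the example:

```python
from html.parser import HTMLParser

class MetaTagReader(HTMLParser):
    """Collect name/content pairs from <meta> tags, much as a spider might."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if "name" in a and "content" in a:
                self.meta[a["name"].lower()] = a["content"]

page = """<html><head>
<title>Widget Shop</title>
<meta name="description" content="Affordable widgets, shipped worldwide.">
<meta name="keywords" content="widgets, cheap widgets">
</head><body>...</body></html>"""

reader = MetaTagReader()
reader.feed(page)
print(reader.meta["description"])  # Affordable widgets, shipped worldwide.
```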

LongTail SEO

Long tail SEO is an intelligent technique of bringing together lower-volume but higher-converting keywords for a major impact. The technique is specifically devised to gain relevant traffic from long search queries that are more focused and less frequently searched than the primary search terms.
For instance, the term "television" receives millions of internet searches every month. The keyword not only covers a wide range of searchers but also attracts the largest pool of competitors. Therefore, if you have a website that sells televisions, ranking high for "new LCD television sets" can bring you narrower but more relevant traffic. This is because "new LCD television sets" will not bring you visitors looking for television repair services and the like.
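
The selection step can be sketched in a few lines of Python. The search-volume numbers below are entirely made up for illustration; in practice they would come from a keyword research tool:

```python
def long_tail(keywords, max_volume=1000, min_words=3):
    """Pick lower-volume, more specific phrases from (phrase, monthly_volume) pairs."""
    return [kw for kw, vol in keywords
            if vol <= max_volume and len(kw.split()) >= min_words]

# Hypothetical keyword research data for a television retailer.
candidates = [
    ("television", 2_000_000),
    ("lcd television", 50_000),
    ("new lcd television sets", 800),
    ("buy 40 inch lcd television online", 300),
]
print(long_tail(candidates))
# -> ['new lcd television sets', 'buy 40 inch lcd television online']
```

The head term "television" is filtered out precisely because its volume (and competition) is enormous while its intent is vague.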

Link Farms

A link farm is a web page or collection of websites carrying an extensively large list of hyperlinks to different websites with no specific relationship in terms of groupings, categories, or any connection to the site's domain name. Although most link farms are the creation of automated programs and services, a few are also built by hand. It is a kind of spamming of a search engine's index, often termed spamdexing.

The Intent of Link Farms
Link farms are created to falsely inflate the count of incoming links to a website so that the site's ranking rises in the search engines. An illegitimate link farm is a form of search engine spam. It is purposely used to trick search engines into placing a website among the top positions in the search results for a specific keyword, on the strength of inbound links from the farm, higher than the website merits on its own. Such websites usually get penalized by search engines, which actively ban sites that use illicit means of building links and have no real significance; such sites typically display no PageRank. Adding a link to your website on a link farm may therefore result in your website being penalized by the search engines.

Keyword Stuffing

Keyword stuffing is a Black Hat SEO technique in which an unnecessarily large number of instances of the targeted keyword phrase is added to a web page, to present a higher level of relevancy to the search engines and gain top positions in the search results for those particular keywords. Such purposeful repetition of specific phrases in a page's content may produce temporary ranking benefits, but in the long run it can lead to a series of problems.
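
The usual way to quantify stuffing is keyword density: occurrences of a term per 100 words of text. A deliberately naive Python sketch (no stemming or punctuation handling), with invented sample copy:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of `keyword` per 100 words of `text` (naive whitespace split)."""
    words = text.lower().split()
    hits = words.count(keyword.lower())
    return 100.0 * hits / max(len(words), 1)

normal = "We sell widgets and ship them worldwide from our warehouse"
stuffed = "cheap widgets cheap widgets buy cheap widgets now cheap widgets"
print(keyword_density(normal, "widgets"))   # 10.0
print(keyword_density(stuffed, "widgets"))  # 40.0
```

A density around a few percent reads naturally; the 40% figure in the second sample is the sort of value that marks a page as stuffed.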

Doorway pages

Doorway pages are simple HTML pages built around a few particular keywords or phrases to attract specific search engines and their spiders. The core objective of doorway pages is to trick the search engines into granting the site top rankings, if only for a limited period of time. Doorway pages are intended specifically for search engine spiders, whereas for human visitors they are more or less redirect pages that lead to the "real" website. Doorway pages are often termed "entrance" or "bridge" pages.


Google's Take on Doorway Pages

According to Google's guidelines, doorway pages are created purposely for search engines, like other "cookie cutter" approaches such as thin affiliate pages, carry almost no original content, and should therefore be avoided entirely.

Crawler

A web crawler (search engine spider) is an automated, methodically managed program that browses the internet and creates a copy of the websites it visits. Search engines use these copies to index the downloaded pages and accelerate the search process. A crawler can index millions of pages per day, but because each engine uses different algorithms, they produce different search results.


How crawler works
During the process of crawling, a crawler visits not only the landing pages and primary pages of a website but also the links posted on those pages. Content and keywords on a website are also vital components that attract the crawler. Every search engine has a scheduler mechanism that tells the crawler how frequently to crawl and which documents to crawl next.

The crawler explores and indexes the web pages of your website so that whenever a user searches with a relevant key phrase, your website turns up in the search results. The process of crawling includes bringing together a very comprehensive list of keywords in the database. After indexing the website, the search engines apply a number of algorithms to determine the visibility of your website in the results.
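
The link-following step at the heart of a crawler can be sketched in a few lines of Python with the standard library. A real crawler would then fetch each discovered URL in turn, respecting robots.txt and a politeness delay; the page markup below is invented:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """The core of a crawler: pull every href out of a fetched page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = '<html><body><a href="/about">About</a> <a href="/products">Products</a></body></html>'
extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # ['/about', '/products']
```

Each extracted link goes into the scheduler's queue, which is how the crawl radiates outward from the pages it already knows.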

Black Hat SEO Tactics

Webmasters regularly try to "trick" the search engines into ranking sites and pages through illegitimate means. Tactics such as hidden text, interlinking, redirects, doorway pages, and keyword spamming are meant only to game the rankings. However, websites using black hat SEO tactics are likely to fall from those positions as fast as they rose.

Given below are some of the most frequently used black hat tactics. Many SEOs and webmasters modify these tactics in the hope that the new variant will work. Honestly, it may, but only for a short while.
Black-Hat SEO Tactics:

Keyword Stuffing
A highly abused form of search engine spam, keyword stuffing is one of the most frequently exploited black hat SEO tactics. Heavy use of related keywords makes the content read as irrelevant, and since this sort of content has no value to visitors, it is generally placed at the bottom of the web page in a very small font size.

Hidden Text

Hidden text is text that is present on a website only for the spiders to read, not for human visitors. It is written in the same color as the background, or something very close to it. The technique is best described as a blatant spam tactic, and websites using it are usually quickly reported to the search engines as spam by competitors and genuine searchers, so that they get blacklisted.
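
To see why this tactic is easy to catch, here is a deliberately naive Python sketch that flags inline-styled elements whose text color matches the page background. Real detectors must also resolve CSS classes, off-screen positioning, zero-size fonts, and so on; the markup is invented:

```python
import re

def flag_hidden_text(html: str, background: str = "#ffffff") -> list:
    """Flag text inside tags whose inline color matches the page background.

    A toy regex check, not a real CSS engine.
    """
    pattern = re.compile(
        r'<[^>]*style="[^"]*color:\s*' + re.escape(background) + r'[^"]*"[^>]*>([^<]*)<',
        re.IGNORECASE,
    )
    return [m.group(1) for m in pattern.finditer(html)]

page = ('<body style="background:#ffffff">'
        '<p>Visible copy.</p>'
        '<p style="color:#ffffff">cheap widgets cheap widgets</p>'
        '</body>')
print(flag_hidden_text(page))  # ['cheap widgets cheap widgets']
```

If a few lines of regex can spot white-on-white text, it is safe to assume a search engine's algorithm can too.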

Interlinking
As incoming links have become a major aspect of search engine ranking, many webmasters have found an easy and lucrative shortcut: build multiple websites and link them together to strengthen overall link popularity. Performing this trick is obviously much easier than earning links legitimately. Moreover, because such interlinked networks are difficult to detect, genuinely linked websites suffer, finding it harder to get placed amid the abundance of interlinking sites.

Redirects
The concept of redirects is complementary to that of doorway pages. Since doorway pages contain little or no substantial content, redirects are sometimes applied to send a visitor straight to the original page with genuine content, such as the homepage of the site. However, if search engines detect these redirects, or if they are reported as spam, the site can be penalized. Usually it is a genuine competitor or an unhappy searcher who reports such websites as spam.

Cloaking
Cloaking is a technique of presenting different information to the search engines than a human visitor would see. Although many techniques are marketed today under the name of cloaking, very few are actually detectable by the search engines. Hence, getting cloaking practices out of the market depends heavily on the reporting of search engine spam.

Duplicate Sites
With the popularity of affiliate programs, webmasters find it tempting to create a copy of the site they need to promote. With slight tweaking, these duplicate websites are put online in the hope that they will outrank the site they are promoting and capture its sales. However, since search engines demand unique content from each website, creating duplicate websites is completely banned, and search engines have introduced their own methods to identify and remove duplicate sites from their indices. Also, sites that are merely modified to escape automatic detection, for instance with hidden text, stand a good chance of being reported to the search engines and banned that way.

Doorway Pages
Doorway pages are basically normal web pages added to a website with the sole purpose of targeting a specific keyword phrase or phrases. Normally, these pages provide very little or no information to a visitor and exist purely to promote a specific phrase, in the hope that a visitor will land on them and be directed to the homepage of the website. This is one of the most popular black hat search engine optimization techniques, and most search engines have special tools to detect such pages. A report of this practice can get your website removed from the search engine results, taking down even pages that rank well through genuine SEO practices and unique content.

Reporting Your Competitors
It is entirely legitimate to report competitors who use black hat SEO tactics to gain top positions in the search engine results. If a competitor is using unfair tactics to rank higher than you, make sure to report them.

If you can spot websites that you believe are practicing illegitimate tactics to rank higher than you on the search engines, visit our Reporting Search Engine Spam page for proper information about reporting website spam. However, first ensure that your own website is not employing any such dubious practices.

Cloaking

Cloaking is a smart yet risky Black Hat SEO technique used to present different content to the search engines than a human visitor would see. A thoroughly deceptive search engine optimization technique, it cheats the search engines in order to rank well for desired keywords.

The Penalties for Cloaking
Cloaking is strictly outlawed in most search engine policies. Detection of any cloaking activity may result in the website being banned.



Google's Take on Cloaking
In order to maintain the high standard of accuracy and quality of its search results, Google may permanently ban from its index any sites or site authors found to be engaged in cloaking activities to change their search rankings.
In cases where the search engine is not able to automatically detect a website using cloaking, competitors can send a spam report to the major search engines. The report is then investigated and, if found true, action is taken to ban the cloaking site.
However, irrespective of these strict threats and warnings, experts suggest that of the many methods known today under the heading of cloaking, a good number still remain undetectable by the search engines.

Bad Neighbourhood

A bad neighbourhood is a website, or network of websites, that employs or has employed unethical practices such as link schemes, hosting spyware or malware, offensive material, or other illegitimate activities such as phishing. Don't link to these websites in order to gain a link and improve your search engine rankings; you would be doing your website more harm than good.

Search Algorithm

A search algorithm is a definite set of rules a search engine uses to measure the relevancy of web pages and sort the listings displayed in response to a search query. It is basically the mathematical formula a search engine employs to determine where a web page will rank in the search results for a particular key phrase. Each search engine follows its own algorithm to allot rankings. The algorithm also changes frequently, to eliminate the possibility of guesswork and speculation by clever webmasters.

Search engines frequently change their algorithms to stand against "spam". If people, or more precisely webmasters, could figure out the algorithm of a search engine, they could easily manipulate the ranking of their websites. This would produce irrelevant, manipulated results.

Wednesday, May 26, 2010

SEO Glossary

Algorithm
An algorithm is a mathematical formula or operational programming rule used to establish the position of a webpage in the search results.

‘Alt’ attribute
Provides an alternative text description for an image, shown when the image cannot be displayed and read by screen readers and search engine spiders.


Anchor text

The visible, clickable text of a hyperlink, which takes the user to another page.

AdWords
Cost-Per-Click based advertising introduced by Google.

Blog Farm
Collection or group of blogs purposely used for link building and populated by RSS-feed scripts.

Back Link
A link, or a number of links, on other web pages that point to the subject page.

Bad neighborhood
A bad neighborhood over the internet is a website that uses prohibited or controversial tactics to rank its position in the search engines.

Bid management
The process of placing a bid price that an advertiser is willing to pay on a PPC search engine.

Bot
Software that’s generally used by the search engines for web spidering and webpage indexing.

Cache
Last recorded view of a webpage stored in the search engine’s database

Cloaking
A Black hat SEO technique where the content provided to the search engine spider is different from the information displayed to the user’s browser.

Black Hat SEO Tactics
This is best described as the use of “unfair” or unethical SEO techniques to drive results in search engine result pages among other things. However, mostly the traffic or artificial clicks gained from the use of black hat methods are extremely temporary.

Crawler
A software program that browses websites in a very orderly and automated fashion. Crawlers also perform link analysis and HTML code validation.

Cross-linking
Term used when several sites are linked together in order to enhance the link popularity. Sometimes it’s also called interlinking.

CSS
Short for Cascading Style Sheets, most commonly used to format web pages written in HTML or XHTML.

Conversion rate
The rate at which visitors convert into customers.

Directory
A website that acts as a repository of other website listings where the websites are listed in various categories. These directories are either free or paid.

Doorway pages
Specially built standalone pages meant to draw visitors to your website, working as a bridge between the user and the site. Once a visitor or spider lands on a doorway page, it redirects to the real website.

Description
Refers to the meta description tag in the head section of a page, describing the primary purpose of the website. The term also refers to the full write-up of your website submitted during directory submission.

Delisting
The scenario in which pages of a website are removed from search engine indices or banned for some such reason.

Graphical Search inventory
Advertisement units such as banners, animations, and browser toolbars that can be synchronized with keywords are known as graphical search inventory. This non-text advertising medium depends on the manner in which content is displayed.

Google Bot
Google's search engine spider, which follows links across the internet to feed pages to the search engine's index. The more quality links your website has, the more frequently the spider visits it.

Head
The section of a page's code that holds the title of the webpage, the keyword meta tag, the description meta tag, and the robots tag. The section is invisible to general users.

Hidden Text
Text made hidden by setting its font color to the same color as the page background, rendering it invisible unless the user highlights it.

Indexing
The term indexing, or search engine indexing, refers to collecting, parsing, and storing data from the World Wide Web for accurate and easy retrieval.

Keyword
These are the words that describe the theme of the website or particular webpage. Keywords and keyword phrases are used in SEO process to optimize the web pages for high search engine rankings.

Keyword Density
Depicts how often a given keyword appears relative to the total word count of a text; one occurrence per 100 words equals a keyword density of 1%.


Keyword Stuffing
Refers to the controversial practice of adding extra keywords to a web page in order to make the page seem more relevant to search engines.

Link Farm
A group of interlinked websites set up intentionally to increase the link popularity of those sites. Engaging in link farming is prohibited by the search engines.

Link Building
Acquiring valuable links from external websites for web documents through request, referral or lease/purchase.

Link Popularity
It refers to the number of hyperlinks that point to a particular web page. This is a vital factor that determines the PageRank of the page.

Longtail SEO
Long tail SEO is an attempt to bring together lower-volume and higher-converting keywords so as to build a major impact. The concept is especially devised to obtain relevant traffic from long search queries. These queries are focused and less frequently used by searchers in comparison to other primary search terms.

Meta Tags
These are special HTML tags that store the information about a webpage. Meta tags are not displayed in the web browser. It provides the information about the theme of the page, keywords of the page and description of the page.

Mirror
A mirror site is an exact or nearly identical copy of another website. Mirror sites are used to provide multiple sources of the same information.

Macro SEO
A group level search engine optimization technique to promote a number of websites together for one or more keywords.

Micro Website
A compact, informative website used by an organization alongside its main website, typically containing five or six primary pages.

Offsite Optimization
The process of improving a website's search engine rankings primarily through off-page factors such as link building and link exchange.

Optimization
A very popular process to tune up the website(s) to make it more visible in search engines for achieving high search engine rankings.

Outbound Link
These are the links on your website that point to other websites

PageRank
This is Google’s tool to evaluate every webpage based on the number and quality of backlinks.

Pay per Click
A model used by prominent search engines in which advertisers pay for their published ads only when a user clicks on the ad link.

Reciprocal Links
Refers to the process of exchanging links between two sites to improve each website's ranking on a search engine results page (SERP).

Redirects
A tactic that sends a visitor to a different webpage than the one surfaced by the keyword search in the search engine. The practice is risky, as search engine spiders sometimes do not follow redirects.

Robots.txt
A set of instructions for spiders to allow or deny them indexing of website page(s).
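
Those instructions can be checked programmatically with Python's standard urllib.robotparser module; the robots.txt body and URLs below are illustrative (the parser can also fetch a live file, skipped here):

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body directly, without a network fetch.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)
print(rp.can_fetch("*", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "http://example.com/index.html"))         # True
```

Well-behaved spiders run exactly this kind of check before requesting any page from a site.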

Sandbox Effect
The effect whereby Google temporarily withholds rankings from new websites for competitive keywords and key phrases.

Search Engine Optimization
The practice of arranging appropriate title tags, meta tags, and keywords on a page in order to obtain higher search engine rankings.

Search Engine Ranking Report
The report evaluates web page rankings by analyzing a range of keyword(s) in the major search engines.

Search Engine Result Page
Displays the list of search results relevant to the keyword(s) or search phrases the user enters.

Search Engine Submission
A process to submit a website to search engines in order to allow them to list your website in their indices and search results

Slurp
Yahoo's crawler. It crawls the World Wide Web and feeds the content into the Yahoo search engine.

Spider
A spider is an automated program that systematically visits websites and reads their content for inclusion in search engine indices. In other terms, a spider is a robot used by search engines to list websites on the internet.

Spamming
In search engine terms, spamming is a tactic to repeat keywords or key phrases in the website to achieve higher ranking in the search engines.

Reporting search engine spam
The procedure for legitimately removing perverse or unfair competition from the search results. Searchers may report a website as spam if they find it irrelevant to the related theme.

Secondary search engines
Secondary search engines are the topical and regional engines that target smaller, more specific audiences. These search engines provide narrow and focused search results when a user is searching for specific information.

Title Tag
Found in the ‘Head’ section of a Web page, this HTML tag is used to define the text that acts as the title of the search listings

Traffic
The collective number of visitors a website receives.

URL
Short for 'Uniform Resource Locator', the address of a particular resource on the internet, typically starting with http://.

SEO Friendly URL
Creating SEO-friendly URLs is a time-consuming technique employed to open a website to a large amount of relevant traffic.

White Hat SEO Tactics
Any SEO technique that preserves the integrity of the website and follows the search engines' guidelines is seen as a white hat search engine optimization tactic.

Monday, March 15, 2010

Ban from Google

Banned from Google? Ouch. Anyone who’s been banned from Google before knows that it’s not a fun ordeal. As of June 22, 2009, a report shows that google.com is used for around 90% of all searches done on the internet (worldwide). And with the next closest search engines in the running being 85 points away (yahoo.com at 5.5% and the new bing.com at about 4.5%) that’s a pretty big chunk of your traffic being cut out when Google decides to ban your site.

How to Know if you are Banned from Google

Sometimes it’s not so easy to know if you are banned from Google. So I’ve started a “You Might be Banned from Google if…” list. Feel free to add to the list in the comments.

  • If yesterday you were ranking on the first page of Google and today your site is nowhere to be found in the SERPS, you might be banned from Google.
  • If you do a site search (site:yoursite.com) on Google, and it brings up no results, you might be banned from Google.
  • If yesterday your page rank was 5 and today it is 0, you might be banned from Google.

Common ways of getting banned from Google.

Now, of course, if you’ve been banned, you probably already know why it happened. But just in case you’re still in a daze, here are some common ways to get your site banned from the ever powerful Google search engine:

  • Hidden text or hidden links – when you think about how this is done (making the color of the text the same color as the background that it’s placed over), how hard would this really be for Google to detect with a small piece of code in their algorithm?
  • Use of cloaking or sneaky redirects – and yes, Google calls them “sneaky.”
  • Loading pages with irrelevant keywords – aka keyword stuffing.
  • Creating multiple pages, subdomains, or domains with substantially duplicate content.
  • Creating pages with malicious behavior, such as phishing or installing viruses, trojans, or other malware.
  • Producing “Doorway” pages created just for search engines with little or no original content. The key to this is “little or no original content.” If you’re actually adding new content to the web then this shouldn’t be a problem.

There are also some “back-door” ways of getting your site banned from Google, such as including several links to sites that are known for spamming, thus causing Google to draw the conclusion that you’re affiliated with them.

Should you feel the urge to read more on this topic, visit google.com for a few more tips on creating a “Google-friendly” site.

If anyone knows of any other sure-fire ways of getting banned from Google, let me know in the comments!

So what if you do get banned? Then what?

How to remove a ban from Google
The thing to remember with a Google ban is that it is not always permanent. In fact, most of the time, you can just change whatever it was that got you banned and then submit a reconsideration request. To resubmit to Google, visit this link, which will take you to a page within the Google Webmaster Tools. From there they will walk you through the process which can also include sending an email to Google regarding what you have changed and why you feel you should be included in the search engine once again. Remember when you compose this email that there will be a real person reading it on the other end. Be kind and business-like in your request and you will have much more of a chance of getting the ban lifted and once again being indexed in the largest and most used search engine on the web.


Courtesy

Google Caffeine – What You Need To Know!

As an SEO, I know that my industry is constantly changing and evolving. That is one of the reasons why I truly love marketing through the search engines. One upcoming change will be the release of what has been code-named “Google Caffeine,” which can be beta-tested here. It will be a new, more powerful version of Google’s search engine technology. You can read more about it from the Google Webmaster Central Blog.

Because I rely on search engine optimization to grow my client’s businesses and my personal websites, I pay attention anytime Google gives out little bits of information regarding the future of their search engine.

The following quote from Google tells us a lot about this new project’s role in the future of search:

“It’s the first step in a process that will let us push the envelope on size, indexing speed, accuracy, comprehensiveness, and other dimensions.”

That one sentence tells us just about everything we need to know about the future of Google search. It tells us where we need to be moving to stay ahead of the curve in our industry. Here is how we can do it:

  1. Site & Indexing Speed: Google is going to be crawling more sites, more pages on those sites, and adding them to their index much faster. We have known for years that Google and the other search engines love fresh content. The problem is that there is so much new content hitting the internet every single day that they needed to come up with a solution to keep their index up-to-date with the latest news and information. The key here is that you need to continue to add new content to your website often. As Google increases their indexing speed, your new pages will get picked up faster and you will rank better.
  2. Accuracy: The websites that continue to dominate the search engines will be the ones that properly match the keywords people search on with content that delivers what those people are actually looking for. I believe Google is interested not only in which sites people click on, but in how they behave once they arrive at a particular website. The key here is to take all the time you need to make sure you are targeting the proper keywords, that the content on your website is related to the search term, and that it is enticing enough to keep visitors interested. Conversion optimization will also play a huge role in turning more of your visitors into buying customers. One thing I always tell my clients is, “I can bring you traffic, but what good is the traffic if it doesn’t make you more money?” This applies both to targeting the wrong keywords and to having a poor web design that doesn’t convert the traffic.
  3. Comprehensiveness: Google will continue to favor “authority” sites. You know, those sites that do an excellent job of completely covering their niche. The key here is that if you want a website that ranks incredibly well for your main keywords, you better be ready to put in some blood, sweat, and tears. Google wants to see and rank sites that are thorough and comprehensive resources for the people who are looking for information about a particular topic. This is exactly why it is crucial that you continue to build new pages on your website that rank for each and every one of your relevant keywords. Remember, Google continues to show that they prefer larger, older websites that are loaded with useful information.

This information is nothing new. These are the things we are currently doing for our clients and that you should be doing now to increase your search engine rankings. However, there are many websites that are not doing these things and will fall far behind when Google finally releases its new search engine technology.

That is why you need to get started TODAY! If you do, you will be far ahead of your competitors who push aside this information and you will develop what is known as a competitive advantage.


Courtesy of Greg Shuey

7 Ways to Use Keyword Analytics to Your Advantage

You should be paying very close attention to which keywords are driving traffic to your site. If someone at your company isn’t digging into your keyword referral reports in your analytics tool, you are leaving money on the table. Here’s a list of seven ways to effectively leverage your keyword analytics (for both organic and paid search traffic).

1. Peek inside the minds of searchers
Often we as marketers think we know a lot about how people search. The truth is, there are a lot of different ways to search, and it varies by industry and from one individual to another. By analyzing the keywords and phrases that are driving traffic and sales to your website, you can find out how your customers search to find your site. What adjectives or other modifiers do potential customers search on? In what order do they type their search terms?
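One simple way to peek inside those queries is to count the modifier words around your main terms. Here is a minimal sketch in Python; the query list and head terms are hypothetical stand-ins for whatever your analytics tool exports:

```python
from collections import Counter

# Hypothetical keyword-referral rows exported from an analytics tool.
queries = [
    "cheap running shoes",
    "best running shoes for flat feet",
    "running shoes review",
    "buy trail running shoes online",
]

# Words we treat as the "head" terms; everything else is a modifier.
head_terms = {"running", "shoes"}

# Tally every non-head word across all referring queries.
modifier_counts = Counter(
    word
    for query in queries
    for word in query.split()
    if word not in head_terms
)

# The most common modifiers reveal how people qualify their searches.
print(modifier_counts.most_common(3))
```

Even on a few dozen rows, modifiers like "cheap," "best," or "review" jump out quickly and tell you which intent your content should serve.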

2. See which keywords are working for organic search
If your site is showing up on the first page for some of those keywords, how much traffic are you getting from those organic listings? More importantly, how many leads or sales are you getting from those keywords? You will sometimes be surprised at which keywords drive the most traffic. Often it’s not the keywords you think will be best, and that’s why you have to watch your keyword referral reports to see which keywords are working.

3. Determine which keywords are not driving traffic
If you’re on the first page of Google and you get zero clicks, it’s time to find some new keywords. Stick with the keywords that drive sales and ditch the keywords that don’t work. There is a huge difference in click through rates depending on the position your site is listed in, but if your site is anywhere on the first page of Google, you should expect some level of traffic, or you’re not targeting the right keyword.
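Filtering for those dead keywords is straightforward once you export your report. A rough sketch, assuming rows of (keyword, first-page position, clicks) pulled from your analytics tool; the data and threshold are illustrative:

```python
# Hypothetical report rows: (keyword, ranking position, clicks last 30 days).
report = [
    ("discount widgets", 3, 420),
    ("widget comparison", 7, 85),
    ("widget frobnicator", 5, 0),
    ("industrial widgets", 9, 0),
]

# Keywords ranking on page one (positions 1-10) but getting zero clicks
# are candidates to drop in favor of new keywords.
dead_keywords = [kw for kw, pos, clicks in report if pos <= 10 and clicks == 0]

print(dead_keywords)
```

Running this kind of check monthly keeps you from investing SEO effort in rankings nobody clicks on.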

4. Find keywords that work in PPC that can be used for SEO
The nice thing about PPC search advertising is that you can choose exactly which keywords your ad shows up for. The thing that sucks about PPC is that you have to pay for every click. So why not take what you’ve learned from your PPC campaign and make sure you’re focusing your SEO efforts on the right keywords? You’ll usually find that a first page organic listing for the same keyword will send a lot more traffic than a paid listing for the same phrase, and the price per click is way better ;-)

5. Find keywords that work for SEO that can be used for PPC campaigns
The same idea of taking PPC keywords into your SEO campaign works the other way, too. Organic search listings will bring people to your site for all kinds of different keywords–including tons of keyword combinations that you never would have thought to include in your PPC campaign. If you notice a unique organic search phrase that drives a lot of sales, you should try it out in your PPC ads. You’ll usually see a similar conversion rate from PPC on the same keyword, or maybe even better!

6. Identify keywords to add as negative matches
Negative matching with PPC campaigns is when you tell the search engines not to show your ad when certain words are included in the search query. This can come in handy when you’re doing broad matching on keywords that have multiple meanings or connotations. Negative matches can also help you eliminate keywords that are driving a lot of traffic without resulting in sales. By watching your conversion metrics on a keyword level, you can identify keywords that drive traffic without sales and add those keywords to your campaigns as negative matches. You can even save yourself some money by looking at irrelevant, under-performing keywords from your organic search that should be excluded from your PPC campaigns before you even spend a penny on PPC ads.
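The conversion-level check described above can be sketched in a few lines. This assumes hypothetical per-keyword metrics (clicks and sales) exported from your campaign reports; the minimum-clicks threshold is an arbitrary example, not a recommended value:

```python
# Hypothetical per-keyword PPC metrics: clicks spent and sales produced.
metrics = {
    "free widgets": {"clicks": 300, "sales": 0},
    "buy widgets": {"clicks": 120, "sales": 14},
    "widget jokes": {"clicks": 45, "sales": 0},
}

MIN_CLICKS = 40  # ignore keywords without enough data to judge fairly

# Keywords burning clicks without producing sales: negative-match candidates.
negative_candidates = sorted(
    kw
    for kw, m in metrics.items()
    if m["clicks"] >= MIN_CLICKS and m["sales"] == 0
)

print(negative_candidates)
```

The click threshold matters: a keyword with only a handful of clicks and no sales may simply not have had enough chances to convert yet.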

7. Get ideas for new content and products
You’ll start to notice that people find your site for all kinds of different, sometimes strange, keywords. Watch the keyword list for new ideas for topics you can write about on your blog or even a new product you can add to meet the needs of your customers. If you’re getting significant traffic on keywords that you don’t have content about, it’s a good indicator that traffic would flow to your site if you create content to match what people are looking for.
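Spotting those content gaps is just a matter of comparing your referring keywords against the topics your site already covers. A minimal sketch, with made-up keywords, topic names, and traffic threshold standing in for your real data:

```python
# Hypothetical organic keywords with visit counts, plus topics you cover.
keyword_visits = {
    "widget cleaning tips": 180,
    "widget installation": 95,
    "widget history": 12,
}
covered_topics = {"widget installation"}

MIN_VISITS = 50  # enough traffic to justify writing new content

# Keywords bringing real traffic to pages that don't address the topic
# are your strongest candidates for new blog posts or products.
content_gaps = [
    kw
    for kw, visits in keyword_visits.items()
    if visits >= MIN_VISITS and kw not in covered_topics
]

print(content_gaps)
```

In practice the matching between a keyword and your existing content is fuzzier than a set lookup, but even this crude version surfaces topics worth writing about.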

I find it very interesting to review the keyword referral data in website analytics reports. As you dig in you’ll find all kinds of hidden gems that you can apply to make your website better and more profitable!

Any other ideas of ways you’re using keyword analytics to grow your business?

