Category Archives: SEO

SEO for Mobile Sites

SEO for Mobiles

As anyone can see, the world is going mobile, with so many people using mobile phones on a daily basis, and a large user base searching on Google’s mobile search page. However, as a webmaster, running a mobile site and tapping into the mobile search audience isn’t easy.

Configure mobile sites so that they can be indexed accurately

Mobile sites not only use a different format from normal desktop sites, but the management methods and expertise required are also quite different. This results in a variety of new challenges. While many mobile sites were designed with mobile viewing in mind, they weren’t designed to be search friendly. Here are some SEO for mobile sites notes and troubleshooting tips to help ensure that your site is properly crawled and indexed:

Verify that your mobile site is indexed by Google

If your web site doesn’t show up in the results of a Google mobile search even using the site: operator, it may be that your site has one or both of the following issues:

1. Googlebot may not be able to find your site.

Googlebot must crawl your site before it can be included in a search index. If you just created the site, Google may not yet be aware of it. If that’s the case, create a Mobile Sitemap and submit it to Google to inform them of the site’s existence. A Mobile Sitemap can be submitted using Google Webmaster Tools, just like a standard Sitemap.
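
For reference, a Mobile Sitemap is an ordinary XML Sitemap whose entries carry an extra mobile tag. The short Python sketch below writes one for a couple of placeholder URLs; the mobile namespace shown follows Google's published mobile sitemap format, so verify it against the current documentation before relying on it.

    # Minimal sketch: write a Mobile Sitemap for a few placeholder URLs.
    # The xmlns:mobile namespace and the empty <mobile:mobile/> tag follow
    # Google's mobile sitemap format; the URLs below are hypothetical.
    mobile_urls = [
        "http://m.example.com/",
        "http://m.example.com/products",
    ]

    entries = "\n".join(
        "  <url>\n"
        "    <loc>{0}</loc>\n"
        "    <mobile:mobile/>\n"
        "  </url>".format(url)
        for url in mobile_urls
    )

    sitemap = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
        '        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">\n'
        + entries + "\n</urlset>\n"
    )

    with open("mobile_sitemap.xml", "w") as f:
        f.write(sitemap)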

2. Googlebot may not be able to access your site

Some mobile sites refuse access to anything but mobile phones, making it impossible for Googlebot to access the site, and therefore making the site unsearchable. Google’s crawler for mobile sites is “Googlebot-Mobile”. If you’d like your site crawled, please allow any User-agent including “Googlebot-Mobile” to access your site.

For example, in Apache you can allow requests whose User-agent contains “Googlebot-Mobile”:

    SetEnvIf User-Agent "Googlebot-Mobile" allow_ua
    Allow from env=allow_ua

You should also be aware that Google may change its User-agent information at any time without notice, so I don’t recommend checking whether the User-agent exactly matches “Googlebot-Mobile” (the current User-agent). Instead, check whether the User-agent header contains the string “Googlebot-Mobile”. You can also use DNS Lookups to verify Googlebot.
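
Here is a minimal Python sketch of both checks: the substring test on the User-agent header, and the double reverse/forward DNS lookup Google describes for verifying its crawlers (the IP address used in the example is a placeholder).

    import socket

    def is_googlebot_mobile_ua(user_agent):
        # Check for the substring rather than an exact match,
        # since the full User-agent string may change over time.
        return "Googlebot-Mobile" in user_agent

    def verify_googlebot_ip(ip_address):
        # Double lookup: reverse DNS on the IP, then forward DNS on the
        # resulting host name, confirming it resolves back to the same IP.
        try:
            host, _, _ = socket.gethostbyaddr(ip_address)
            if not (host.endswith(".googlebot.com") or host.endswith(".google.com")):
                return False
            return socket.gethostbyname(host) == ip_address
        except socket.error:
            return False

    # Example usage with placeholder values:
    print(is_googlebot_mobile_ua("SAMSUNG-SGH-E250/1.0 ... Googlebot-Mobile/2.1"))
    print(verify_googlebot_ip("66.249.66.1"))  # hypothetical crawler IP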

Verify that Google can recognize your mobile URLs

Once Googlebot-Mobile crawls your URLs, you should then check whether each URL is viewable on a mobile device. Pages that Google determines aren’t viewable on a mobile phone won’t be included in Google’s mobile site index (although they may be included in the regular web index).

This determination is based on a variety of factors, one of which is the “DTD (Document Type Definition)” declaration. Check that your mobile-friendly URLs’ DTD declaration is in an appropriate mobile format such as XHTML Mobile or Compact HTML. If it’s in a compatible format, the page is eligible for the mobile search index. Also avoid duplicate content. For more information, see the Mobile Webmaster Guidelines.

Running desktop and mobile versions of your site

One of the most common problems for webmasters who run both mobile and desktop versions of a site is that the mobile version of the site appears for users on a desktop computer, or that the desktop version of the site appears when someone accesses it on a mobile device. In dealing with this scenario, there are a couple of viable options:

1. Redirect mobile users to the correct version

When a mobile user or crawler (like Googlebot-Mobile) accesses the desktop version of a URL, you can redirect them to the corresponding mobile version of the same page. Google notices the relationship between the two versions of the URL and displays the standard version for searches from desktops and the mobile version for mobile searches.

If you redirect users, please make sure that the content on the corresponding mobile/desktop URL matches as closely as possible.

For example, if you run a shopping site and there’s an access from a mobile phone to a desktop-version URL, make sure that the user is redirected to the mobile version of the page for the same product, and not to the homepage of the mobile version of the site.
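
As a rough sketch of that mapping, here is a hypothetical request handler using Flask; the framework choice, the m.example.com URL scheme, and the mobile-detection heuristic are all assumptions for illustration, not a prescribed setup.

    from flask import Flask, redirect, request

    app = Flask(__name__)

    def is_mobile(user_agent):
        # Very rough heuristic for illustration only; real detection
        # would rely on a maintained device/User-agent database.
        return any(token in user_agent
                   for token in ("Googlebot-Mobile", "Mobile", "Android", "iPhone"))

    @app.route("/products/<product_id>")
    def desktop_product(product_id):
        ua = request.headers.get("User-Agent", "")
        if is_mobile(ua):
            # Send mobile users (and Googlebot-Mobile) to the matching
            # mobile product page, not to the mobile homepage.
            return redirect("http://m.example.com/products/{0}".format(product_id), code=302)
        return "Desktop product page for {0}".format(product_id)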

Google occasionally finds sites using this kind of redirect in an attempt to boost their search rankings, but this practice only results in a negative user experience, and so should be avoided at all costs.

On the other hand, when a mobile-version URL is accessed from a desktop browser or by Google’s web crawler, Googlebot, it’s not necessary to redirect them to the desktop version.

For instance, Google doesn’t automatically redirect desktop users from their mobile site to their desktop site; instead they include a link on the mobile- version page to the desktop version.

These links are especially helpful when a mobile site doesn’t provide the full functionality of the desktop version, so users can easily navigate to the desktop-version.

2. Switch content based on User-agent

Some sites serve both desktop and mobile content from the same URL, changing the format according to the User-agent. In other words, both mobile users and desktop users access the same URL (i.e. no redirects), but the content and format change according to the User-agent header.

In this case, the same URL will appear for both mobile search and desktop search, and desktop users can see a desktop version of the content while mobile users can see a mobile version of the content.
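
A minimal sketch of that approach: one URL, different markup per User-agent. Sending a Vary: User-Agent response header is a common way to signal that the response depends on the User-agent; the Flask-based setup below is, again, just an assumption for illustration.

    from flask import Flask, request, make_response

    app = Flask(__name__)

    @app.route("/products/widget")
    def widget():
        ua = request.headers.get("User-Agent", "")
        if "Googlebot-Mobile" in ua or "Mobile" in ua:
            body = "<html><body>Lightweight mobile version</body></html>"
        else:
            body = "<html><body>Full desktop version</body></html>"
        resp = make_response(body)
        # Tell caches (and crawlers) that the content varies by User-agent.
        resp.headers["Vary"] = "User-Agent"
        return resp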

Please note that if you fail to configure your site correctly, your site could be considered to be cloaking, which can lead to your site disappearing from Google search results. Cloaking refers to an attempt to boost search result rankings by serving different content to Googlebot than to regular users. This causes problems such as less relevant results (pages appear in search results even though their content is actually unrelated to what users see/want), so Google takes cloaking very seriously!

So what does “the page that the user sees” mean if you provide both versions at a single URL? As I stated in the previous post, Google uses “Googlebot” for web search and “Googlebot-Mobile” for mobile search.

To remain within Google guidelines, you should serve the same content to Googlebot as a typical desktop user would see, and the same content to Googlebot-Mobile as you would to the browser on a typical mobile device. It’s OK if the contents for Googlebot are different from those for Googlebot-Mobile.

One example of how you could be unintentionally detected as cloaking is if your site returns a message like “Please access from mobile phones” to desktop browsers, but then returns a full mobile version to both crawlers (so Googlebot receives the mobile version). In this case, the page which web search users see (i.e. “Please access from mobile phones”) is different from the page which Googlebot crawls (i.e. “Welcome to my site”). Again, Google detects cloaking because they want to serve users the same “relevant content” that Googlebot or Googlebot-Mobile crawled.

William

SEO Concept Mining

SEO Concept Mining is an activity that results in the extraction of concepts from artifacts. Solutions to the task typically involve aspects of artificial intelligence and statistics, such as data mining and text mining. Because artifacts are typically a loosely structured sequence of words and other symbols (rather than concepts), the problem is nontrivial, but it can provide powerful insights into the meaning, provenance and similarity of documents.

Methods

Traditionally, the conversion of words to concepts has been performed using a thesaurus, and for computational techniques the tendency is to do the same. The thesauri used are either specially created for the task, or a pre-existing language model, usually related to Princeton’s WordNet.

The mappings of words to concepts are often ambiguous. Typically each word in a given language will relate to several possible concepts. Humans use context to disambiguate the various meanings of a given piece of text, where available. Machine translation systems cannot easily infer context.
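
For example, with the NLTK interface to WordNet (assuming the nltk package and its WordNet corpus are installed), you can see how many candidate concepts a single word maps to:

    from nltk.corpus import wordnet as wn

    # A single word typically maps to several possible concepts (synsets).
    for synset in wn.synsets("bank"):
        print(synset.name(), "-", synset.definition())
    # e.g. bank.n.01 - sloping land (especially the slope beside a body of water)
    #      depository_financial_institution.n.01 - a financial institution ...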

For the purposes of concept mining however, these ambiguities tend to be less important than they are with machine translation, for in large documents the ambiguities tend to even out, much as is the case with text mining.

There are many techniques for disambiguation that may be used. Examples are linguistic analysis of the text and the use of word and concept association frequency information that may be inferred from large text corpora. Recently, techniques based on semantic similarity between the possible concepts and the context have appeared and gained interest in the scientific community.

Applications

Detecting and indexing similar documents in a large corpus

One of the spin-offs of calculating document statistics in the concept domain, rather than the word domain, is that concepts form natural tree structures based on hypernymy and meronymy. These structures can be used to produce simple tree membership statistics, that can be used to locate any document in a Euclidean concept space. If the size of a document is also considered as another dimension of this space then an extremely efficient indexing system can be created. This technique is currently in commercial use locating similar legal documents in a 2.5 million document corpus.
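
To make that concrete, here is a toy Python sketch of the general idea (not the commercial system mentioned above): map each document to counts of the WordNet hypernym concepts its words fall under, add document length as one more dimension, and compare documents by Euclidean distance. The sample documents and the use of only each word's first synset are simplifying assumptions.

    from collections import Counter
    import math

    from nltk.corpus import wordnet as wn

    def concept_vector(text):
        # Count every hypernym ancestor of the first synset of each word,
        # then add the document length as an extra dimension.
        counts = Counter()
        words = text.lower().split()
        for word in words:
            synsets = wn.synsets(word)
            if not synsets:
                continue
            for path in synsets[0].hypernym_paths():
                for ancestor in path:
                    counts[ancestor.name()] += 1
        counts["__doc_length__"] = len(words)
        return counts

    def euclidean(a, b):
        keys = set(a) | set(b)
        return math.sqrt(sum((a.get(k, 0) - b.get(k, 0)) ** 2 for k in keys))

    docs = {
        "contract": "the parties signed a contract about payment",
        "wildlife": "the river bank was home to fish and birds",
    }
    query = concept_vector("a lawyer reviewed the payment agreement")
    closest = min(docs, key=lambda name: euclidean(query, concept_vector(docs[name])))
    print("Most similar document:", closest)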

Clustering documents by topic

Standard numeric clustering techniques may be used in “concept space” as described above to locate and index documents by the inferred topic. These are numerically far more efficient than their text mining cousins, and tend to behave more intuitively, in that they map better to the similarity measures a human would generate.
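
As a sketch of how standard clustering applies in concept space, scikit-learn's KMeans can be run directly on concept-count vectors like those above; the tiny matrix here is made up purely for illustration.

    import numpy as np
    from sklearn.cluster import KMeans

    # Rows are documents, columns are concept counts (e.g. "legal_document",
    # "natural_object", "financial_institution") plus document length.
    concept_matrix = np.array([
        [5, 0, 2, 120],   # legal memo
        [4, 1, 3, 200],   # contract
        [0, 6, 0, 90],    # nature article
        [1, 5, 0, 150],   # wildlife report
    ])

    kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(concept_matrix)
    print("Cluster assignments:", kmeans.labels_)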

William

SEO Identifying Long Tail Keywords

Long Tail Keywords

One of the most important tasks in SEO is identifying long-tail keywords; the key to successful SEO is concentrating on them. Although these keywords get less traffic than more generic head terms, they are associated with more qualified traffic and with users who are most likely further down their path of intent. The good news is that choosing the right long-tail keywords for your website pages is actually a fairly simple process.

We have already spoken of “Relevance”, which is the key factor to consider when choosing the correct keywords for SEO.

Please note, the more specific you are, the better. For example, if you own a company that installs swimming pools, which keyword is more likely to attract qualified prospects for your business?

“Swimming pools” vs. “fiberglass in-ground pool installation”

Obviously, someone searching for “fiberglass in-ground pool installation” is in research mode: they are looking for information on installation or for someone to perform the installation. Keyword optimizing for “swimming pools” has its place, but there is no doubt that this keyword will attract a much more generic audience that may not be looking for what you are offering.

Another thing to consider when optimizing for the right keywords is location-based searches. When looking for contractors and services in their area, search engine users will usually include their location in the search. So, “fiberglass in-ground pool installation” becomes “fiberglass in-ground pool installation in Scottsdale, Az.”

If you operate in one geo-location, you may want to consider adding location-based keywords to all of your pages because traffic from other locations is not going to be that much help to you. If your business operates in several geo-locations, it is a good choice to create a separate webpage dedicated to each location, so you can make sure your brand is present when people in those locations are searching.

Determining where to start when it comes to keywords may seem challenging. Guessing is not a recommended practice for obvious reasons. However, there are many ways to research and find long-tail keywords that are right for your business.

Web Analytics

Web analytics tools like Google Analytics allow you to see what organic search keywords are already driving traffic to your website. These keywords will provide a good baseline of core keywords, and provide you with a list of keywords and performance which you can benchmark your future SEO efforts against.

Online Keyword Tools

Google has a few tools that make it easy to conduct keyword research. The Google AdWords Keyword Tool is a great place to start. You can insert one keyword, multiple keywords, or even your website address, and Google will then return a list of related keywords along with simple metrics to gauge how fierce the competition is around each one and how many searches it gets on both a global and local search level.

Another tool worth checking out is Google Insights for Search. This tool allows you to enter multiple keywords and filter by location, search history, and category. You are then given results that show how much web interest there is around a specific keyword, what caused the interest and where the traffic is coming from, as well as similar keywords.

HubSpot has its very good Keyword Grader tool, which helps you identify the best keywords for optimizing your site, and also tracks results from each one. This tracking feature allows you to see which keywords are actually driving traffic and leads, and allows you to continue optimizing your keywords over time, based on this information.

Keyword Search

Besides looking at your web analytics data or using a keyword research tool, there is a lot to be said for simply going on the search engines and conducting a few searches. Using the search engines can help you answer critical questions like:

How much competition is in the space? See how many search results there are. If there are hundreds of thousands or millions of results, ask yourself if it is really worth the time and effort to play in that space.

Where do your competitors rank? Pick a keyword you would like to optimize for and look at the top 20 results.

  • Are your competitors anywhere to be found?
  • Where do you rank?
  • Are you ranking at all? This information will guide you in making a decision to carve out a niche for yourself with keywords where your competitors are not playing, or you may find a keyword you think is worth picking a battle over.

Does Google provide other recommendations? When you type a keyword into Google, it will automatically populate suggestions and results as you type. This feature is called Google Instant. It is Google’s attempt at anticipating what you are searching for, based on previous search data. You can use this data to your advantage: simply start typing a keyword and see what keywords Google populates under your search box. This is a quick way to get keyword ideas.
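
If you want to collect these suggestions programmatically, browsers have historically used an unofficial Google suggest endpoint. It is undocumented, may change or disappear at any time, and its terms of use should be checked; the sketch below, using the requests library, is purely illustrative.

    import requests

    def autocomplete_suggestions(seed_keyword):
        # Unofficial, undocumented endpoint; treat this purely as an
        # illustration and verify it is still available and permitted.
        resp = requests.get(
            "https://suggestqueries.google.com/complete/search",
            params={"client": "firefox", "q": seed_keyword},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()[1]  # second element holds the suggestion list

    print(autocomplete_suggestions("fiberglass in-ground pool"))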

William

Search Engine Optimization Keywords

Keywords SEO

Why are Keywords Important?

The way people shop has changed in the age of search engines, which is what makes search engine optimization keywords so important. People are increasingly using search engines to help them find the products or services they are looking for. To do this, they type keywords into sites like Google, Yahoo, or Bing. The engines then rank sites related to these keywords based on relevance and authority.

Whether you know it or not, your website is already targeting certain keywords. Search engines extract these keywords from your on-page text, headers, page titles, inbound links and other factors. Moreover, you might not have made a conscious decision to target those keywords. Even if you did, you might not be monitoring your rankings or have a sense of how good your chances are at ranking well for those keywords.

Choosing the right keywords is often the difference between getting found in search or not!  As a consequence, keyword research is the foundation of an effective online marketing strategy. There are a number of variables that impact keyword selection. These variables can be divided into two groupings – primary selection variables and prioritization variables.

Selecting Keywords

It is important to understand what aspects of keywords make them important to your business. The different variables or characteristics of a keyword help determine whether the keywords are worth consideration in your SEO strategy. Only if keywords pass the primary selection tests can they be subjected to the prioritization variable tests. Considerations for primary keyword selection are:

  • Ensuring keyword terms/phrases have sufficient search volumes
  • Ensuring the chosen keyword terms are relevant
  • Assessing levels of relative competition

If a search term doesn’t satisfy the criterion of sufficient volume, then it is removed from the list. Likewise, if it does not satisfy the relevancy criterion, it should not be considered.

Keywords Prioritization

Two things to consider when prioritizing keywords are:

  • Competitive advantage for the product/services
  • Profitability of the products/services associated with the keywords

Prior to entering the vetting process, a Keyword Opportunity List should be generated.

How to Generate the Initial Keyword Opportunity List

The first phase of creating the initial Keyword Opportunity List involves brainstorming as many keyword ideas as possible.

a. Listing root brands and product/service names (e.g. lawyer)

b. Brainstorming variations of product and brand related keywords

c. Talking to clients to determine what terms they use in search

d. Studying competitors’ sites

e. Adding geographic variations (e.g. Miami lawyers, Dade county lawyers)

f. Adding descriptive variations (e.g. personal injury lawyers, Auto lawyers)

g. Taking all the variations and entering them into the Google AdWords Keyword Tool, which will suggest numerous other variations.

With this list in hand, now the keyword list can be vetted.
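
As a small illustration of steps (e) through (g), a sketch like the following can cross root terms with geographic and descriptive modifiers before the list is pasted into a keyword tool; every term below is a placeholder.

    from itertools import product

    roots = ["lawyer", "attorney"]
    descriptors = ["", "personal injury", "auto accident"]
    locations = ["", "Miami", "Dade county"]

    keyword_ideas = set()
    for location, descriptor, root in product(locations, descriptors, roots):
        phrase = " ".join(part for part in (location, descriptor, root) if part)
        keyword_ideas.add(phrase)

    for idea in sorted(keyword_ideas):
        print(idea)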

How to Choose Relevant Keyword Terms/Phrases

Once all keyword possibilities with sufficient search volumes are selected, keywords must then be filtered for relevancy. You don’t just want to pull in traffic; you want to ensure that your traffic is of high quality. Quality traffic helps you convert your visitors into customers at a higher rate.

Let’s demonstrate the importance of relevant traffic through an example.

If a small law office in Jacksonville, Florida were able to achieve a ranking for the generic term ‘lawyers,’ they would be inundated with irrelevant calls from people in New York City, Chicago, Miami and Los Angeles. Realistically, less than 1% of the queries from the term ‘lawyers’ would be potential clients from the Jacksonville area, meaning:

  • It would be a tremendous distraction for the staff taking these calls as well as filtering out the bad leads
  • It would eat up all the time and resources needed to nurture your more valuable leads in the Jacksonville area

How to Assess Keyword Competitiveness

People have a tendency to emphasize traffic over relevance. You need to make sure the search terms you’re targeting have sufficient traffic, but often you don’t want them to have too much either. More traffic usually correlates with high competition.

Let’s go back to the Jacksonville law firm. Let’s say they want to rank for the term ‘lawyer.’ This puts them up against almost all law firms in the English-speaking world, including larger and more powerful ones. As I’m writing this, there are 112 million Google results for ‘lawyer.’ Only 10 are on the first page of Google.

When picking keywords to target, you clearly need to choose your battles wisely. So how do you do that?

There are several free tools for assessing keyword competitiveness. One example is the SEO Chat Keyword Difficulty Check Tool. The higher the score the keyword gets, the more competitive the term, and the more difficult it is to rank for. Generally speaking, terms with a difficulty score over 60 will require much more than just on-page optimization if you want to rank on the first page of search results.

HubSpot Internet Marketing Software is a paid tool that includes a keyword monitoring component. In addition, it also helps you maintain a dashboard of relevant keywords, including their search volume, competition, your ranking and the number of visits from that keyword search.

How to Beat the Competition

After picking your niche, you need to figure out how to beat the competition in that niche. The way to do this is for your site to gain authority and relevance for those terms.

Authority

Authority is assessed by understanding the link profile of your site versus those other sites ranking for the keywords you are targeting. External links from other sites are the single most powerful ranking tool amongst the major search engines of today. The three most important elements of these linking factors are:

  • Number of links to a website (more is better)
  • Number of links to the specific page one hopes will rank for the term in question (again, more is typically better than less)
  • The anchor text of links to the specific page (see the upcoming link building chapter for more on this)

Links are the biggest factor in gaining authority and search engine rankings. HubSpot software allows companies to compare their own link profiles to those of their competitors.

In general, one’s site may compete for rankings (in the short term) with other sites with similar link profiles. Tackling sites with more powerful link profiles requires time and dedication. The bigger the gap, the more time, effort and budget is needed. When a large gap exists between two competing sites in the number of inbound links, it is very difficult for the site with fewer links to make up ground and compete for keyword opportunities.

Relevance

Relevance, on the other hand, means looking to see if the other sites are specifically trying to rank for the term(s) in question. On-page relevancy can be quickly assessed by looking at simple elements.

  • Keyword match in the title of a page
  • Keyword match in a site’s internal navigation
  • Keyword match in the domain name
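
These checks are easy to script. The sketch below covers only the title and domain checks (internal navigation takes more parsing); the URL and keyword are hypothetical, and a real check would parse the HTML more robustly than a regular expression.

    import re
    import requests
    from urllib.parse import urlparse

    def quick_relevance_check(url, keyword):
        keyword = keyword.lower()
        html = requests.get(url, timeout=10).text.lower()

        # Pull the page title with a simple (illustrative) regex.
        title_match = re.search(r"<title>(.*?)</title>", html, re.DOTALL)
        title = title_match.group(1) if title_match else ""

        return {
            "keyword_in_title": keyword in title,
            "keyword_in_domain": keyword.replace(" ", "") in urlparse(url).netloc,
        }

    # Hypothetical example:
    print(quick_relevance_check("https://www.example.com", "pool installation"))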

By considering both authority and relevancy, it’s a relatively simple process to determine opportunities. If rankings for a given keyword term are dominated by much more powerful sites obviously targeting the term with their on-page factors, then it’s likely best to look for another keyword opportunity. If, on the other hand, those same sites are powerful yet aren’t specifically targeting the terms (or vice versa), then potential does exist.

At the end of this process, you should have a list of keywords that have been vetted. Now, it becomes a process of prioritizing all the remaining keywords. While the same primary assessment variables can still be utilized to determine priorities, secondary assessment variables now can also be considered.

Additional Prioritization Variables

1. Competitive advantage – Does the firm have a distinct competitive advantage (in terms of price, quality, delivery time) that can be leveraged to increase the likelihood of sales?

2. Ability to scale or fulfill – Is inventory or ability to fulfill limited? If so, other products and services with more potential might be a better priority.

3. Profitability – How profitable is a product or service? More profitable items are often more desirable to promote.

4. Lifetime value of the client – If the sale of a given item is made, the current value of the sale is not the only consideration. One should also take into account the average lifetime value of the purchaser of the item in question.

Often, the keyword terms with relatively high search volumes and low competition are the best opportunities. Of course, relevance must be factored into this equation as well.

In addition to looking at volume vs. competition, it often helps to look at the additional prioritization variables.

William

SEO – What is Crawling and Indexing

Site Index

Search engines crawl and index billions of documents, pages, files, and news items; calculate relevancy and rankings; and serve results.

Imagine the World Wide Web as a network of stops in a big city subway system. Each stop is its own unique document (usually a web page, but sometimes a PDF, JPG or other type of file).

When you sit down at your computer and do a Google search, you’re almost instantly presented with a list of results from all over the web. How does Google find web pages matching your query, and determine the order of search results?

In the simplest terms, you could think of searching the web as looking in a very large book with an impressive index telling you exactly where everything is located. When you perform a Google search, Google’s programs check its index to determine the most relevant search results to be returned (“served”) to you.

The three key processes in delivering search results to you are:

  1. Crawling: Does Google know about your site and can they find it?
  2. Indexing: Can Google Index your site?
  3. Serving: Does the site have good and useful content that is relevant to the user’s search?

Crawling

Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index.

Google uses a huge set of computers to fetch (or “crawl”) billions of pages on the web. The program that does the fetching is called Googlebot (also known as a robot, bot, or spider). Googlebot uses an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site.

Google’s crawl process begins with a list of web page URLs, generated from previous crawl processes, and augmented with Sitemap data provided by webmasters. As Googlebot visits each of these websites it detects links on each page and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.

It should be noted: Google doesn’t accept payment to crawl a site more frequently, and they keep the search side of the business separate from their revenue-generating AdWords service.

Indexing

Googlebot processes each of the pages it crawls in order to compile a massive index of all the words it sees and their location on each page. In addition, Google processes information included in key content tags and attributes, such as Title tags and ALT attributes. Googlebot can process many, but not all, content types. For example, it cannot process the content of some rich media files or dynamic pages.

Once the engines find these pages, their next job is to parse the code from them and store selected pieces of the pages in massive hard drives, to be recalled when needed in a query. To accomplish the monumental task of holding billions of pages that can be accessed in a fraction of a second, the search engines have constructed massive datacenters in cities all over the world.

These monstrous storage facilities hold thousands of machines processing unimaginably large quantities of information. After all, when a person performs a search at any of the major engines, engines work hard to provide answers as fast as possible.

Relevancy is determined by over 200 factors, one of which is the PageRank for a given page. PageRank is the measure of the importance of a page based on the incoming links from other pages. In simple terms, each link to a page on your site from another site adds to your site’s PageRank. Not all links are equal: Google works hard to improve the user experience by identifying spam links and other practices that negatively impact search results. The best types of links are those that are given based on the quality of your content. In order for your site to rank well in search results pages, it’s important to make sure that Google can crawl and index your site correctly.
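
To make the PageRank idea concrete, here is a toy power-iteration calculation on a four-page link graph. It is a simplified illustration of the published algorithm, not Google's actual implementation, which combines it with the other factors mentioned above.

    # Toy PageRank by power iteration on a tiny link graph.
    links = {
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
        "D": ["C"],
    }
    pages = list(links)
    damping = 0.85
    rank = {page: 1.0 / len(pages) for page in pages}

    for _ in range(50):
        new_rank = {}
        for page in pages:
            # Each page passes its rank evenly across its outgoing links.
            incoming = sum(
                rank[other] / len(links[other])
                for other in pages
                if page in links[other]
            )
            new_rank[page] = (1 - damping) / len(pages) + damping * incoming
        rank = new_rank

    for page, score in sorted(rank.items(), key=lambda item: -item[1]):
        print(page, round(score, 3))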

Importance is an equally tough concept to quantify, but search engines must do their best. Typically, the engines interpret importance as popularity: the more popular a site, page or document, the more valuable the information contained therein is assumed to be. This assumption has proven fairly successful in practice, and the engines have continued to refine their algorithms, which, as noted before, are often comprised of hundreds of components.

Prediction Engines

Google’s “Did you mean” and Google Autocomplete features are designed to help users save time by displaying related terms, common misspellings, and popular queries. Like google.com search results, the keywords used by these features are automatically generated by the web crawlers and search algorithms. Google displays these predictions only when they think they might save the user time. If a site ranks well for a keyword, it’s because Google has algorithmically determined that its content is more relevant to the user’s query.

Hopefully understanding these concepts will help you to better understand how crawling and indexing function, so you can make use of keywords when writing articles that improve your website’s and blog’s rankings.

William

SEO – Long Tailed Keywords

In order to get your website’s content to rank on the search engines, you need to take the path of least resistance. Although trying to rank for highly trafficked keywords and terms may seem like a logical approach, it will most likely lead to a lot of frustration and wasted resources. Also, even if you end up getting traffic from these types of keywords, chances are the quality of the traffic will be low due to disinterest in what you specifically have to offer.

Think of every search query as being like people – they are all different. There are billions more unique search queries than there are generic ones.

In reality, if you were to add up all search engine traffic that comes from the most popular keywords, it would not even come close to the amount of traffic that comes from searches using more unique queries. This is called the theory of the long-tail keyword.

A critical component of SEO is choosing the right keywords for optimization. If you sell cars, you may want your website to rank for “car store,” (a head term), but chances are you are going to have some trouble there. However, if you optimize multiple pages on your website for each specific car that you sell, you are going to have much more success and it will be easier to rank on the SERP.

A keyword like “2011 red BMW convertible” (a long-tail keyword or term) is a good example. Sure, the number of people that search for this keyword will be much lower than the number that search for “car store,” but you can almost bet that those searchers are much farther down the sales funnel and may be ready to buy.

Why Long Tailed Keywords are Effective

This is why long-tail keywords are so effective. They target people who are looking to perform a specific action, for example, to buy something, or looking for a specific piece of information, like a how-to or a service that can solve their problem. By choosing to optimize with long tail keywords, you will find it easier to rank on the search engines, drive qualified traffic, and turn that traffic into leads and customers.

Content is the Key

We have all heard it – when it comes to SEO, content is the key. Without rich content, you will find it difficult to rank for specific keywords and drive traffic to your website. Additionally, if your content does not provide value or engage users, you will be far less likely to drive leads and customers.

It is impossible to predict how people will search for content and exactly what keywords they are going to use. The only way to combat this is to generate content, and lots of it. The more content and webpages you publish, the more chances you have at ranking on the search engines. Lotto tickets are a good analogy here. The more lotto tickets you have, the higher the odds are that you will win. Imagine that every webpage you create is a lotto ticket. The more webpages you have, the higher your chances are of ranking in the search engines.

As you already know, the search engines are smart. If you create multiple webpages about the same exact topic, you are wasting your time. You need to create lots of content that covers lots of topics. There are multiple ways you can use content to expand your online presence and increase your chances of ranking without being repetitive. Here are a few examples:

Homepage: Use your homepage to cover your overall value proposition and high-level messaging. If there was ever a place to optimize for more generic keywords, it is your homepage.

Product/Service Pages: If you offer products and/or services, create a unique webpage for each one of them.

Resource Center: Provide a webpage that offers links to other places on your website that cover education, advice, and tips.

Blog: Blogging is an incredible way to stay current and fresh while making it easy to generate tons of content. Blogging on a regular basis (once per week is ideal) can have a dramatic impact on SEO because every blog post is a new webpage.

While conducting SEO research, you may come across articles that discuss being mindful of keyword density (how often you mention a keyword on a page). Although following an approach like this may seem technically sound, it is not recommended. Remember: do not write content for the search engines. Write content for your audience and everything else will follow. Make sure each webpage has a clear objective and remains focused on one topic, and you will do just fine.

William

SEO – How Do Links Affect Your Website

Link Building Strategy

So, how do links affect your website? Anchor text is used when linking one of your internal webpages to another, and the use of anchor text when another website links to you can be extremely helpful in creating relevancy for certain keywords and phrases. If you have the option, always request keyword-rich anchor text for an inbound link. That said, if you have no other option, still take a link whose anchor text is simply your domain. All link juice is good.

One helpful practice in link building is link trading, or “I will put a link to your website on my website if you put a link to mine on yours.” These types of links are referred to as reciprocal links. Since all links are good, reciprocal links are not prohibited, but their value is certainly not as good as a one-way link to your website. There was most likely a time when reciprocal links were just as good as any other, but the search engines are always getting smarter in determining how much value a link should receive.

Like most other aspects of SEO, throwing money at link building is a bad idea. Paying others to link to you is strictly prohibited by the search engines. In fact, all paid links must include a tag, called a no-follow tag (the rel="nofollow" link attribute), which tells the search engines not to give those links credit. If you are caught with un-tagged paid links (as the linker or the linkee), your website could be suspended from the search engines or blacklisted for good.

Links to your website from advertisements are not counted as inbound links by the search engines. If they discover paid link relationships that are not classified as advertisements, you risk having your website suspended from being listed on the SERP, or even blacklisted if the instance is deemed severe enough.

If you don’t have the time to do link building, but do have some money, there are SEO firms that you can hire to perform this task. Some firms have questionable SEO practices at best, so it is best to do extensive research before signing any agreements or cutting a check.

Use Social Media for Link Building

Use of social networks like Facebook, Google+, Twitter, and LinkedIn has exploded over the last few years. In fact, the latest figures from ComScore suggest that 16% of all time spent online is spent on a social network. With hundreds of millions of users across these social networks sharing content they find online with their friends and followers, search engines have begun to take notice.

According to SEOMoz, the amount of social activity that a webpage has on social networks (shares, recommendations, likes, links, +1’s, etc.) is an important factor in that page’s ability to rank on the SERP. Simply put, search engines have realized that content shared on social networks is extremely influential, and should therefore rank higher. Beyond using social networks to engage new prospects, drive leads, and build brand awareness, businesses should consider all of the SEO benefits they miss out on by not having a brand presence.

In order to capitalize on the boost to your SERP rankings from social media, you need to make your content easy to share. Implementing social network buttons across your website is the easiest way to accomplish this. Installing the buttons is easy if you use a service like AddThis. Better yet, HubSpot’s blogging software automatically adds this functionality for you.

Use Email to get Links

Almost any business these days uses email to nurture relationships with their current leads and customers, and utilizes promotional email blasts to attract new ones. It is no surprise that with the death of direct mail over the past few years, email marketing has exploded. It has never been easier to set up an email program, upload your leads, and send them communication. Obviously, the extreme rate at which businesses have adopted email has deteriorated its effectiveness industry-wide. There is so much noise out there that you need to make every email send count.

Just like you need to make the content on your website easy to share in social media, you need to do the same for email. Aside from having a clear call-to-action in your emails to nurture your list, drive leads, and convert them to customers, you should also make it easy for your email readers to share the content with friends and post it to social networks. This will increase the reach of your website content and make it easier for you to get inbound links for SEO.

Crawling and Indexing using Backlinks

SEO Link Building

A common question is whether backlinks need to be indexed before they count. The first answer would be that they must be indexed; however, through experiments and tests we know that links that are not indexed still pass on a certain amount of link juice to your website. This makes sense, because Google and Yahoo would not want to expose all of their inbound linking data and information about a specific website; that would make rankings much easier to manipulate. Publicly available tools show only a portion of the link wheel that supports any domain and a rough average of its backlinks; no one knows the full picture unless they work at Google.

We know that Google does not show all of your backlinks in Webmaster Tools and will constantly re-evaluate, de-index and shift weights around on all your inbound links. This happens because a backlink gains popularity, or goes stale and is dropped from the SERPs, or undergoes any of the other changes that occur constantly in the search engines’ “environment” of data. There are a lot of opinions on how indexed backlinks may differ from merely crawled ones.

Indexed Backlinks:

  • In-bound links that are showing in Google webmaster tools, Yahoo Site Explorer and/or SEO software
  • Links that are showing in Google would be considered highest value
  • You know that you’re receiving maximum benefit from that backlink

Crawled Backlinks:

  • Inbound links that have been pinged and crawled by search engine spiders
  • Carry “some” weight, but no one is sure how much compared to indexed links
  • Usually these links do not have much “content” associated with the link; more content (when relevant) will help improve the association and worth of a backlink pointing toward your main site

SEO Tips for Getting your Backlinks Indexed:

  • Always ping – use Pingler.com, Feedshark, Pingfarm.com (mass list pings)
  • Set up an RSS feed with all your backlinks in it so spiders can crawl the feed regularly (see the sketch after this list)
  • Build a 2nd tier of backlinks pointing at your 1st tier links (the 2nd tier can be much lower quality)
  • Bookmarking Demon/Xrumer/Scrapebox are resources for building a variety of 2nd tier links to get your primary, high-quality 1st tier backlinks indexed
  • If you are setting up 2nd tier backlinks that point at your 1st tier, quality is not that important; don’t worry if your 1st tier properties get sandboxed, they will still pass on the link.
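
For the RSS tip above, here is a minimal sketch that writes a bare-bones RSS 2.0 feed listing backlink URLs so spiders have one place to find them; the feed details and URLs are placeholders.

    from datetime import datetime, timezone

    backlinks = [
        "http://example-directory.com/my-listing",
        "http://example-blog.com/guest-post",
    ]

    items = "\n".join(
        "    <item>\n"
        "      <title>Backlink {0}</title>\n"
        "      <link>{1}</link>\n"
        "    </item>".format(i + 1, url)
        for i, url in enumerate(backlinks)
    )

    feed = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<rss version="2.0">\n'
        '  <channel>\n'
        '    <title>Backlink feed</title>\n'
        '    <link>http://example.com/backlinks.xml</link>\n'
        '    <description>Links for crawlers to discover</description>\n'
        '    <lastBuildDate>{0}</lastBuildDate>\n'
        '{1}\n'
        '  </channel>\n'
        '</rss>\n'
    ).format(datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S +0000"), items)

    with open("backlinks.xml", "w") as f:
        f.write(feed)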

Please understand that having one backlink is better than none! Just make sure to keep your backlinks in good neighborhoods, and try to associate relevance and unique content with every link you make. Whether the relation is small, like a business directory or a geographic connection, just stay away from spam websites that will affect your trust and rank negatively.

If you are having trouble getting your Web 2.0 properties indexed, focus on building an RSS feed for them that can be pinged on a steady drip, and then look at starting and creating your 2nd tier links.

I would really recommend a tool like Bookmarking Demon or Scrapebox: you can blast thousands of blog comments, bookmark submissions, Pligg sites, etc., so that your main links are picked up, and this also distributes more links down the chain to your main site.

This has really become the standard in the SEO industry recently, especially after Google’s algorithm changes which have made it much more difficult to use poor quality/junk backlinks to rank for your keywords.  It’s better to build lower quantity, high quality link properties and then build a foundation of linking toward those.

William