Author Archives: Sylvia Patterson

SEO – Offer Quality Content and Services

SEO Content

Creating compelling and useful content will likely influence your website more than any of the other factors discussed here. Users know good content when they see it and will likely want to direct other users to it. This could be through blog posts, social media services, email, forums, or other means.

Organic or word-of-mouth buzz is what helps build your site’s reputation with both users and Google, and it rarely comes without quality content.

Interesting sites will increase their recognition on their own

  1. A blogger finds a piece of your content, likes it, and then references it in a blog post.
  2. The Google AdWords Keyword Tool can help you find relevant keywords on your site and the volume of those keywords.

Anticipate differences in users’ understanding of your topic and offer unique, exclusive content

Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a person looking for SEO information might search for [search engine optimization], while another person might use a shorter query like [SEO]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google AdWords provides a handy Keyword Tool that helps you discover new keyword variations and see the approximate search volume for each keyword (2). Also, Google Webmaster Tools provides you with the top search queries your site appears for and the ones that led the most users to your site.

Consider creating a new, useful service that no other site offers. You could also write an original piece of research, break an exciting news story, or leverage your unique user base. Other sites may lack the resources or expertise to do these things.

Write easy-to-read text

Users enjoy content that is well written and easy to follow.

Avoid:

  • Writing sloppy text with many spelling and grammatical mistakes
  • Embedding text in images for textual content, since users may want to copy and paste the text and search engines can’t read it

Stay organized around the topic

It’s always beneficial to organize your content so that visitors have a good sense of where one content topic begins and another ends. Breaking your content up into logical chunks or divisions helps users find the content they want faster.

Avoid:

  • Dumping large amounts of text on varying topics onto a page without paragraph, subheading, or layout separation

Create fresh, unique content

New content will not only keep your existing visitor base coming back, but also bring in new visitors.

Avoid:

  • Rehashing (or even copying) existing content that will bring little extra value to users
  • Having duplicate or near-duplicate versions of your content across your site – more on duplicate content

Create content primarily for your users, not search engines

Designing your site around your visitors’ needs while making sure your site is easily accessible to search engines usually produces positive results.

Avoid:

  • Inserting numerous unnecessary keywords aimed at search engines but annoying or nonsensical to users
  • Having blocks of text like “frequent misspellings used to reach this page” that add little value for users
  • Deceptively hiding text from users, but displaying it to search engines

Write better anchor text

Suitable anchor text makes it easy to convey the contents linked

Anchor text is the clickable text that users will see as a result of a link, and is placed within the anchor tag <a href="…"></a>.

This text tells users and Google something about the page you’re linking to. Links on your page may be internal, pointing to other pages on your site, or external, leading to content on other sites. In either case, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you’re linking to is about.

Choose descriptive text

The anchor text you use for a link should provide at least a basic idea of what the page linked to is about

Avoid:

  • Writing generic anchor text like “page”, “article”, or “click here”
  • Using text that is off-topic or has no relation to the content of the page linked to
  • Using the page’s URL as the anchor text in most cases – although there are certainly legitimate uses of this, such as promoting or referencing a new website’s address
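As a rough illustration of these guidelines, a small script can flag generic anchor text on a page. This is a minimal sketch, not a Google tool: the `GENERIC_PHRASES` list and the regex-based parsing are assumptions chosen for the example.

```python
import re

# Phrases that convey nothing about the linked page (illustrative list).
GENERIC_PHRASES = {"page", "article", "click here", "here", "read more"}

def flag_generic_anchors(html):
    """Return the anchor texts in `html` that look too generic."""
    anchors = re.findall(r'<a\s[^>]*href=[^>]*>(.*?)</a>', html,
                         re.IGNORECASE | re.DOTALL)
    return [text.strip() for text in anchors
            if text.strip().lower() in GENERIC_PHRASES]

html = '<a href="/seo">SEO starter guide</a> and <a href="/x">click here</a>'
print(flag_generic_anchors(html))  # ['click here']
```

A real site would parse the HTML properly rather than with a regex, but the check itself stays the same: descriptive anchor text passes, filler phrases get flagged.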

Write concise text

Aim for short but descriptive text – usually a few words or a short phrase.

Avoid:

  • Writing long anchor text, such as a lengthy sentence or short paragraph of text

Format links so they’re easy to spot

Make it easy for users to distinguish between regular text and the anchor text of your links. Your content becomes less useful if users miss the links or accidentally click them.

Avoid:

  • Using CSS or text styling that makes links look just like regular text

Think about anchor text for internal links too

You may usually think about linking in terms of pointing to outside websites, but paying more attention to the anchor text used for internal links can help users and Google navigate your site better.

Avoid:

  • Using excessively keyword-filled or lengthy anchor text just for search engines
  • Creating unnecessary links that don’t help with the user’s navigation of the site

SEO Website Navigation Basics

The navigation of a website is important in helping visitors quickly find the content they want. It can also help search engines understand what content the webmaster thinks is important. Although Google’s search results are provided at a page level, Google also likes to have a sense of what role a page plays in the bigger picture of the site.

Understanding URLs

Creating descriptive categories and filenames for the documents on your website can not only help you keep your site better organized, but it can also lead to better crawling of your documents by search engines. It also creates easier, “friendlier” URLs for those who want to link to your content. Visitors may be intimidated by extremely long and cryptic URLs that contain few recognizable words.

URLs can be confusing and unfriendly. Users would have a hard time reciting complex URLs from memory or creating a link to them. In addition, users may believe that a portion of the URL is unnecessary, especially if the URL shows many unrecognizable parameters. They might leave off a part, breaking the link.

Some users might link to your page using the URL of that page as the anchor text. If your URL contains relevant words, this provides users and search engines with more information about the page than an ID using numbers or an oddly named parameter would present.

URLs are displayed in search results

Finally, remember that the URL to a document is displayed as part of a search result in Google, below the document’s title and snippet. Like the title and snippet, words in the URL on the search result appear in bold if they appear in the user’s query. Note: a well constructed meta description can also influence the snippet Google displays beneath the title.

Google is very adept at crawling all types of URL structures, even if they’re quite complex, but spending the time to make your URLs as simple as possible for both users and search engines can help. Some webmasters try to achieve this by rewriting their dynamic URLs to static ones; while Google is fine with this, we’d like to note that this is an advanced procedure and if done incorrectly, could cause crawling issues with your site. To learn even more about good URL structure, we recommend this Webmaster Help Center page on creating Google-friendly URLs.

Use words in URLs

URLs that contain words which are relevant to your site’s content and structure provide a much friendlier environment for visitors navigating your site. Visitors remember them better and might be more willing to link to them.

Avoid:

  • Using lengthy URLs with unnecessary parameters and session IDs
  • Choosing generic page names like “page1.html”
  • Using excessive keywords like “football-cards-football-cards-footballcards.htm”
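The word-based URLs recommended above are often produced with a "slug" helper. Here is a minimal sketch; the exact rules (lowercase, hyphens for everything non-alphanumeric) are illustrative assumptions, not a Google requirement.

```python
import re

def slugify(title):
    """Turn a page title into a short, readable URL path segment."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of non-alphanumerics become one hyphen
    return slug.strip("-")

print(slugify("Fiberglass In-Ground Pool Installation"))
# fiberglass-in-ground-pool-installation
```

The result contains the recognizable words users and search engines can read, without session IDs or repeated keywords.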

Create a simple directory structure

Use a directory structure that organizes your content clearly and makes it easy for visitors to know where they’re at on your site. Try using your directory structure to indicate the type of content found at that URL.

Avoid:

  • Having deep nesting of sub-directories like “…/dir1/dir2/dir3/dir4/dir5/dir6/page.html”
  • Using directory names that have no relation to the content in them

Provide one version of a URL to reach a document

To prevent users from linking to one version of a URL and others linking to a different version (this could split the reputation of that content between the URLs), focus on using and referring to one URL in the structure and internal linking of your pages. If you do find that people are accessing the same content through multiple URLs, setting up a “301 redirect” from non-preferred URLs to the dominant URL is a good solution. If you cannot redirect, you can also use the rel="canonical" link element.

Avoid:

  • Having pages from subdomains and the root directory access the same content – e.g. “domain.com/page.htm” and “sub.domain.com/page.htm”
  • Using odd capitalization of URLs – many users expect lower-case URLs and remember them better
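The 301 approach above can be sketched as a small server-side decision. This is only an illustration: the preferred host and URL scheme are assumptions, and a real deployment would configure the redirect in the web server rather than in application code.

```python
from urllib.parse import urlsplit, urlunsplit

PREFERRED_HOST = "domain.com"  # assumption: the root domain is the preferred version

def canonical_redirect(url):
    """Return (status, location): a 301 to the preferred URL,
    or (200, url) if the URL is already canonical."""
    parts = urlsplit(url)
    if parts.netloc.lower() != PREFERRED_HOST:
        target = urlunsplit((parts.scheme, PREFERRED_HOST,
                             parts.path, parts.query, parts.fragment))
        return 301, target
    return 200, url

print(canonical_redirect("http://sub.domain.com/page.htm"))
# (301, 'http://domain.com/page.htm')
```

Every non-preferred variant then funnels its reputation to the single dominant URL.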

Summary

Website development is greatly enhanced when developers adhere to Google’s guidelines for URL structure.

William

SEO for Mobile Sites

SEO for Mobiles

As anyone can see, the world is going mobile, with so many people using mobile phones on a daily basis, and a large user base searching on Google’s mobile search page. However, as a webmaster, running a mobile site and tapping into the mobile search audience isn’t easy.

Configure mobile sites so that they can be indexed accurately

Mobile sites not only use a different format from normal desktop sites, but the management methods and expertise required are also quite different. This results in a variety of new challenges. While many mobile sites were designed with mobile viewing in mind, they weren’t designed to be search friendly. Here are some SEO for mobile sites notes and troubleshooting tips to help ensure that your site is properly crawled and indexed:

Verify that your mobile site is indexed by Google

If your website doesn’t show up in the results of a Google mobile search even when using the site: operator, your site may have one or both of the following issues:

1. Googlebot may not be able to find your site.

Googlebot must crawl your site before it can be included in a search index. If you just created the site, Google may not yet be aware of it. If that’s the case, create a Mobile Sitemap and submit it to Google to inform them of the site’s existence. A Mobile Sitemap can be submitted using Google Webmaster Tools, just like a standard Sitemap.

2. Googlebot may not be able to access your site

Some mobile sites refuse access to anything but mobile phones, making it impossible for Googlebot to access the site, and therefore making the site unsearchable. Google’s crawler for mobile sites is “Googlebot-Mobile”. If you’d like your site crawled, allow any User-agent including “Googlebot-Mobile” to access your site.

For example, in an Apache configuration:

SetEnvIf User-Agent "Googlebot-Mobile" allow_ua
Allow from env=allow_ua

You should also be aware that Google may change its User-agent information at any time without notice, so I don’t recommend checking whether the User-agent exactly matches “Googlebot-Mobile” (the current User-agent). Instead, check whether the User-agent header contains the string “Googlebot-Mobile”. You can also use DNS Lookups to verify Googlebot.
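The substring check described above can be sketched as:

```python
def is_googlebot_mobile(user_agent):
    """Check for the substring rather than an exact match, since the full
    User-agent string may change over time without notice."""
    return "Googlebot-Mobile" in user_agent

# Matches even though the header carries version details around the token.
print(is_googlebot_mobile(
    "(compatible; Googlebot-Mobile/2.1; +http://www.google.com/bot.html)"))
# True
```

Pair this with a reverse-DNS lookup if you need to verify that a request claiming to be Googlebot really comes from Google.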

Verify that Google can recognize your mobile URLs

Once Googlebot-Mobile crawls your URLs, you should then check whether each URL is viewable on a mobile device. Pages that Google determines aren’t viewable on a mobile phone won’t be included in Google’s mobile site index (although they may be included in the regular web index).

This determination is based on a variety of factors, one of which is the “DTD (Doc Type Definition)” declaration. Check that your mobile-friendly URLs’ DTD declaration is in an appropriate mobile format such as XHTML Mobile or Compact HTML. If it’s in a compatible format, the page is eligible for the mobile search index. Also avoid duplicate content. For more information, see the Mobile Webmaster Guidelines.

Running desktop and mobile versions of your site

One of the most common problems for webmasters who run both mobile and desktop versions of a site is that the mobile version of the site appears for users on a desktop computer, or that the desktop version of the site appears when someone accesses it on a mobile device. In dealing with this scenario, there are a couple of viable options:

1. Redirect mobile users to the correct version

When a mobile user or crawler (like Googlebot-Mobile) accesses the desktop version of a URL, you can redirect them to the corresponding mobile version of the same page. Google notices the relationship between the two versions of the URL and displays the standard version for searches from desktops and the mobile version for mobile searches.

If you redirect users, please make sure that the content on the corresponding mobile/desktop URL matches as closely as possible.

For example, if you run a shopping site and there’s an access from a mobile phone to a desktop-version URL, make sure that the user is redirected to the mobile version of the page for the same product, and not to the homepage of the mobile version of the site.
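That routing rule can be sketched as follows. The `m.example.com` hostname, the product paths, and the User-agent test are all assumptions for illustration; the point is that the redirect preserves the path, so a product page maps to the same product page.

```python
def mobile_redirect(path, user_agent):
    """If a mobile user agent (or Googlebot-Mobile) requests a desktop URL,
    redirect to the corresponding mobile page, never the mobile homepage."""
    is_mobile = "Googlebot-Mobile" in user_agent or "Mobile" in user_agent
    if is_mobile:
        # Same path on the mobile host => same product page.
        return 302, "http://m.example.com" + path
    return 200, path

print(mobile_redirect("/products/garden-pool", "Googlebot-Mobile/2.1"))
# (302, 'http://m.example.com/products/garden-pool')
```

The choice of a 302 here is also an assumption of the sketch; what matters for users is that the destination content matches the page they asked for.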

Google occasionally finds sites using this kind of redirect in an attempt to boost their search rankings, but this practice only results in a negative user experience, and so should be avoided at all costs.

On the other hand, when a mobile-version URL is accessed from a desktop browser or by Google’s web crawler, Googlebot, it’s not necessary to redirect to the desktop version.

For instance, Google doesn’t automatically redirect desktop users from its mobile site to its desktop site; instead it includes a link on the mobile-version page to the desktop version.

These links are especially helpful when a mobile site doesn’t provide the full functionality of the desktop version, so users can easily navigate to the desktop-version.

2. Switch content based on User-agent

Some sites use the same URL for both desktop and mobile content but change the format according to the User-agent. In other words, mobile users and desktop users access the same URL (i.e. no redirects), and the content/format changes slightly according to the User-agent.

In this case, the same URL will appear for both mobile search and desktop search, and desktop users can see a desktop version of the content while mobile users can see a mobile version of the content.

Please note that if you fail to configure your site correctly, your site could be considered to be cloaking, which can lead to your site disappearing from Google search results. Cloaking refers to an attempt to boost search result rankings by serving different content to Googlebot than to regular users. This causes problems such as less relevant results (pages appear in search results even though their content is actually unrelated to what users see/want), so Google takes cloaking very seriously!

So what does “the page that the user sees” mean if you provide both versions with a URL? As I stated in the previous post, Google uses “Googlebot” for web search and “Googlebot-Mobile” for mobile search.

To remain within Google guidelines, you should serve the same content to Googlebot as a typical desktop user would see, and the same content to Googlebot-Mobile as you would to the browser on a typical mobile device. It’s OK if the contents for Googlebot are different from those for Googlebot-Mobile.

One example of how you could unintentionally be detected as cloaking is if your site returns a message like “Please access from mobile phones” to desktop browsers, but returns a full mobile version to both crawlers (so Googlebot receives the mobile version). In this case, the page that web search users see (i.e. “Please access from mobile phones”) is different from the page that Googlebot crawls (i.e. the full mobile version). Again, Google detects cloaking because it wants to serve users the same “relevant content” that Googlebot or Googlebot-Mobile crawled.

William

SEO Concept Mining

SEO Concept Mining

SEO Concept Mining is an activity that results in the extraction of concepts from artifacts. Solutions to the task typically involve aspects of artificial intelligence and statistics, such as data mining and text mining. Because artifacts are typically a loosely structured sequence of words and other symbols (rather than concepts), the problem is nontrivial, but it can provide powerful insights into the meaning, provenance and similarity of documents.

Methods

Traditionally, the conversion of words to concepts has been performed using a thesaurus, and for computational techniques the tendency is to do the same. The thesauri used are either specially created for the task, or a pre-existing language model, usually related to Princeton’s WordNet.

The mappings of words to concepts are often ambiguous. Typically each word in a given language will relate to several possible concepts. Humans use context to disambiguate the various meanings of a given piece of text, where available. Machine translation systems cannot easily infer context.

For the purposes of concept mining however, these ambiguities tend to be less important than they are with machine translation, for in large documents the ambiguities tend to even out, much as is the case with text mining.

There are many techniques for disambiguation that may be used. Examples are linguistic analysis of the text and the use of word and concept association frequency information that may be inferred from large text corpora. Recently, techniques based on semantic similarity between the possible concepts and the context have appeared and gained interest in the scientific community.

Applications

Detecting and indexing similar documents in a large corpus

One of the spin-offs of calculating document statistics in the concept domain, rather than the word domain, is that concepts form natural tree structures based on hypernymy and meronymy. These structures can be used to produce simple tree membership statistics that can be used to locate any document in a Euclidean concept space. If the size of a document is also considered as another dimension of this space then an extremely efficient indexing system can be created. This technique is currently in commercial use locating similar legal documents in a 2.5 million document corpus.
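The idea of locating documents in a Euclidean concept space can be sketched with concept-count vectors. The concept inventory and the documents below are made up purely for illustration:

```python
import math

# Each document as counts over a small fixed concept inventory (illustrative).
CONCEPTS = ["contract", "liability", "injury"]

def distance(doc_a, doc_b):
    """Euclidean distance between two documents in concept space."""
    return math.sqrt(sum((doc_a[c] - doc_b[c]) ** 2 for c in CONCEPTS))

doc1 = {"contract": 4, "liability": 1, "injury": 0}
doc2 = {"contract": 3, "liability": 1, "injury": 0}
doc3 = {"contract": 0, "liability": 2, "injury": 5}

# doc1 is far closer to doc2 than to doc3: similar documents sit near
# each other in concept space, which is what makes indexing by
# concept-space position effective.
print(distance(doc1, doc2) < distance(doc1, doc3))  # True
```

A production system would derive the concept counts from a thesaurus mapping as described above, rather than hand-coding them.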

Clustering documents by topic

Standard numeric clustering techniques may be used in “concept space” as described above to locate and index documents by the inferred topic. These are numerically far more efficient than their text mining cousins, and tend to behave more intuitively, in that they map better to the similarity measures a human would generate.

William

SEO Identifying Long Tail Keywords

Long Tail Keywords

One of the most important functions in SEO is identifying long-tail keywords; concentrating on them is the key to successful SEO. Although these keywords get less traffic than more generic head terms, they are associated with more qualified traffic and users who are most likely further down their path of intent. The good news is that choosing the right long-tail keywords for your website pages is actually a fairly simple process.

We have already spoken of “Relevance”, which is the key factor to consider when choosing the correct keywords for SEO.

Please note, the more specific you are, the better. For example, if you own a company that installs swimming pools, which keyword is more likely to attract qualified prospects for your business?

“Swimming pools” vs. “fiberglass in-ground pool installation”

Obviously, if someone is searching for “fiberglass in-ground pool installation,” they are in research mode: looking for information on installation or for someone to perform it. Keyword optimizing for “swimming pools” has its place, but there is no doubt that this keyword will attract a much more generic audience that may not be looking for what you are offering.

Another thing to consider when optimizing for the right keywords is location-based searches. When looking for contractors and services in their area, search engine users will usually include their location in the search. So, “fiberglass in-ground pool installation” becomes “fiberglass in-ground pool installation in Scottsdale, AZ.”

If you operate in one geo-location, you may want to consider adding location-based keywords to all of your pages because traffic from other locations is not going to be that much help to you. If your business operates in several geo-locations, it is a good choice to create a separate webpage dedicated to each location, so you can make sure your brand is present when people in those locations are searching.
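Generating location-based variants of a core keyword is mechanical enough to sketch in a few lines; the service phrase and city names below are just the examples used above:

```python
def localize(keywords, locations):
    """Combine each core keyword with each service location."""
    return [f"{kw} in {loc}" for kw in keywords for loc in locations]

print(localize(["fiberglass in-ground pool installation"],
               ["Scottsdale, AZ"]))
# ['fiberglass in-ground pool installation in Scottsdale, AZ']
```

Each localized variant can then back a dedicated page for that service area.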

Determining where to start when it comes to keywords may seem challenging. Guessing is not a recommended practice for obvious reasons. However, there are many ways to research and find long-tail keywords that are right for your business.

Web Analytics

Web analytics tools like Google Analytics allow you to see which organic search keywords are already driving traffic to your website. These keywords provide a good baseline of core keywords, along with performance data against which you can benchmark your future SEO efforts.

Online Keyword Tools

Google has a few tools that make it easy to conduct keyword research. The Google Adwords Keyword Tool is a great place to start. You can insert one keyword, multiple keywords, or even your website address, and Google will then return a list of related keywords along with simple metrics to gauge how fierce the competition is around each one and how many searches it gets on both a global and local search level.

Another tool worth checking out is Google Insights for Search. This tool allows you to enter multiple keywords and filter by location, search history, and category. You are then given results that show how much web interest there is around a specific keyword, what caused the interest and where the traffic is coming from, as well as similar keywords.

HubSpot has its very good Keyword Grader tool, which helps you identify the best keywords for optimizing your site, and also tracks results from each one. This tracking feature allows you to see which keywords are actually driving traffic and leads, and allows you to continue optimizing your keywords over time, based on this information.

Keyword Search

Besides looking at your web analytics data or using a keyword research tool, there is a lot to be said for simply going on the search engines and conducting a few searches. Using the search engines can help you answer critical questions like:

How much competition is in the space? See how many search results there are. If there are hundreds of thousands or millions of results, ask yourself if it is really worth the time and effort to play in that space.

Where do your competitors rank? Pick a keyword you would like to optimize for and look at the top 20 results.

  • Are your competitors anywhere to be found?
  • Where do you rank?
  • Are you ranking at all? This information will guide you in making a decision to carve out a niche for yourself with keywords where your competitors are not playing, or you may find a keyword you think is worth picking a battle over.

Does Google provide other recommendations? When you type a keyword into Google, it will automatically populate the search results as you type. This feature is called Google Instant. It is Google’s attempt at anticipating what you are searching for, giving you results based on previous search data. You can use this data to your advantage: simply start typing a keyword and see what keywords Google populates beneath your search box. This is a quick way to get keyword ideas.

William

Search Engine Optimization Keywords

Keywords SEO

Search Engine Optimization Keywords

Why are Keywords Important?

The way people shop has changed in the age of search engines and in using Search Engine Optimization Keywords. People are increasingly using search engines to help them find the products or services they are looking for. To do this, they type keywords into sites like Google, Yahoo, or Bing. The engines then rank sites related to these keywords based on relevance and authority.

Whether you know it or not, your website is already targeting certain keywords. Search engines extract these keywords from your on-page text, headers, page titles, inbound links and other factors. Moreover, you might not have made a conscious decision to target those keywords. Even if you did, you might not be monitoring your rankings or have a sense of how good your chances are at ranking well for those keywords.

Choosing the right keywords is often the difference between getting found in search or not!  As a consequence, keyword research is the foundation of an effective online marketing strategy. There are a number of variables that impact keyword selection. These variables can be divided into two groupings – primary selection variables and prioritization variables.

Selecting Keywords

It is important to understand what aspects of keywords make them important to your business. The different variables or characteristics of a keyword help determine whether the keywords are worth consideration in your SEO strategy. Only if keywords pass the primary selection tests can they be subjected to the prioritization variable tests. Considerations for primary keyword selection are:

  • Ensuring keyword terms/phrases have sufficient search volumes
  • Ensuring the chosen keyword terms are relevant
  • Assessing levels of relative competition

If a search term doesn’t satisfy the criterion of sufficient volume, it is removed from the list. Likewise, if it does not satisfy the relevancy criterion, it should not be considered.
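The two-stage primary filter described above (drop keywords without sufficient volume, then drop irrelevant ones) can be sketched as follows; the threshold and sample data are assumptions for illustration:

```python
MIN_MONTHLY_SEARCHES = 100  # assumed volume threshold

def primary_filter(candidates, relevant_terms):
    """candidates: {keyword: monthly search volume}.
    Keep only keywords that pass both the volume and relevance tests."""
    return [kw for kw, volume in candidates.items()
            if volume >= MIN_MONTHLY_SEARCHES and kw in relevant_terms]

candidates = {"miami lawyers": 900,
              "dade county lawyers": 40,    # fails volume
              "miami weather": 5000}        # fails relevance
relevant = {"miami lawyers", "dade county lawyers"}
print(primary_filter(candidates, relevant))  # ['miami lawyers']
```

Only the survivors of this filter go on to the prioritization stage.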

Keywords Prioritization

Two things to consider when prioritizing keywords are:

  • Competitive advantage for the product/services
  • Profitability of the products/services associated with the keywords

Prior to entering the vetting process, a Keyword Opportunity List should be generated.

How to Generate the Initial Keyword Opportunity List

The first phase of creating the initial Keyword Opportunity List involves brainstorming as many keyword ideas as possible.

a. Listing root brands and product/service names (e.g. lawyer)

b. Brainstorming variations of product and brand related keywords

c. Talking to clients to determine what terms they use in search

d. Studying competitors’ sites

e. Adding geographic variations (e.g. Miami lawyers, Dade county lawyers)

f. Adding descriptive variations (e.g. personal injury lawyers, Auto lawyers)

g. Taking all the variations and entering them into the Google AdWords Keyword Tool, which will suggest numerous other variations.
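Steps e and f above — crossing root terms with geographic and descriptive variations — can be sketched with itertools; the sample terms come from the examples in the list:

```python
from itertools import product

roots = ["lawyer"]
descriptors = ["personal injury", "auto"]
locations = ["Miami", "Dade county"]

# Cross every root with every descriptive and geographic variation.
variations = [f"{loc} {desc} {root}"
              for root, desc, loc in product(roots, descriptors, locations)]
print(variations)
# ['Miami personal injury lawyer', 'Dade county personal injury lawyer',
#  'Miami auto lawyer', 'Dade county auto lawyer']
```

The combined list is what you would feed into the Keyword Tool in step g.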

With this list in hand, now the keyword list can be vetted.

How to Choose Relevant Keyword Terms/Phrases

Once all keyword possibilities with sufficient search volumes are selected, keywords must then be filtered for relevancy. You don’t just want to pull in traffic; you want to ensure that your traffic is of high quality. Quality traffic helps you convert your visitors into customers at a higher rate.

Let’s demonstrate the importance of relevant traffic through an example.

If a small law office in Jacksonville, Florida were able to achieve a ranking for the generic term ‘lawyers,’ they would be inundated with irrelevant calls from people in New York City, Chicago, Miami and Los Angeles. Realistically, less than 1% of the queries from the term ‘lawyers’ would be potential clients from the Jacksonville area, meaning:

  • It would be a tremendous distraction for the staff taking these calls as well as filtering out the bad leads
  • It would eat up all the time and resources needed to nurture your more valuable leads in the Jacksonville area

How to Assess Keyword Competitiveness

People have a tendency to emphasize traffic over relevance. You need to make sure the search terms you’re targeting have sufficient traffic, but often you don’t want them to have too much either. More traffic usually correlates with high competition.

Let’s go back to the Jacksonville law firm. Let’s say they want to rank for the term ‘lawyer.’ This puts them up against almost all law firms in the English-speaking world, including larger and more powerful ones. As I’m writing this, there are 112 million Google results for ‘lawyer.’ Only 10 are on the first page of Google.

When picking keywords to target, you clearly need to choose your battles wisely. So how do you do that?

There are several free tools for assessing keyword competitiveness. One example is the SEO Chat Keyword Difficulty Check Tool. The higher the score the keyword gets, the more competitive the term, and the more difficult it is to rank for. Generally speaking, terms with a difficulty score over 60 will require much more than just on-page optimization if you want to rank on the first page of search results.

HubSpot Internet Marketing Software is a paid tool that includes a keyword monitoring component. In addition, it also helps you maintain a dashboard of relevant keywords, including their search volume, competition, your ranking and the number of visits from that keyword search.

How to Beat the Competition

After picking your niche, you need to figure out how to beat the competition in that niche. The way to do this is for your site to gain authority and relevance for those terms.

Authority

Authority is assessed by understanding the link profile of your site versus those of the other sites ranking for the keywords you are targeting. External links from other sites are the single most powerful ranking factor among the major search engines today. The three most important elements of these linking factors are:

  • Number of links to a website (more is better)
  • Number of links to the specific page one hopes will rank for the term in question (again, more is typically better than less)
  • The anchor text of links to the specific page (see the upcoming link building chapter for more on this)

Links are the biggest factor in gaining authority and search engine rankings. HubSpot software allows companies to compare their own link profiles to those of their competitors.

In general, one’s site may compete for rankings (in the short term) with other sites with similar link profiles. Tackling sites with more powerful link profiles requires time and dedication. The bigger the gap, the more time, effort and budget is needed. When a large gap exists between two competing sites in the number of inbound links, it is very difficult for the site with fewer links to make up ground and compete for keyword opportunities.

Relevance

Relevance, on the other hand, means looking to see if the other sites are specifically trying to rank for the term(s) in question. On-page relevancy can be quickly assessed by looking at simple elements.

  • Keyword match in the title of a page
  • Keyword match in a site’s internal navigation
  • Keyword match in the domain name

By considering both authority and relevancy, it’s a relatively simple process to determine opportunities. If rankings for a given keyword term are dominated by much more powerful sites obviously targeting the term with their on-page factors, then it’s likely best to look for another keyword opportunity. If, on the other hand, those same sites are powerful yet aren’t specifically targeting the terms (or vice versa), then potential does exist.

At the end of this process, you should have a list of keywords that have been vetted. Now, it becomes a process of prioritizing all the remaining keywords. While the same primary assessment variables can still be utilized to determine priorities, secondary assessment variables now can also be considered.

Additional Prioritization Variables

1. Competitive advantage – Does the firm have a distinct competitive advantage (in terms of price, quality, delivery time) that can be leveraged to increase the likelihood of sales?

2. Ability to scale or fulfill – Is inventory or ability to fulfill limited? If so, other products and services with more potential might be a better priority.

3. Profitability – How profitable is a product or service? More profitable items are often more desirable to promote.

4. Lifetime value of item client – If the sale of a given item is made, the current value of the sale is not the only consideration. One should also take into account the average lifetime value of the purchaser of the item in question.

Often, the keyword terms with relatively high search volumes and low competition are the best opportunities. Of course, relevance must be factored into this equation as well.

In addition to looking at volume vs. competition, it often helps to look at the additional prioritization variables.

William

SEO – What is Crawling and Indexing

Site Index

Search engines crawl and index billions of documents, pages, files, and news items, calculate relevancy and rankings, and serve results.

Imagine the World Wide Web as a network of stops in a big city subway system. Each stop is its own unique document (usually a web page, but sometimes a PDF, JPG, or other type of file).

When you sit down at your computer and do a Google search, you’re almost instantly presented with a list of results from all over the web. How does Google find web pages matching your query, and determine the order of search results?

In the simplest terms, you could think of searching the web as looking in a very large book with an impressive index telling you exactly where everything is located. When you perform a Google search, Google’s programs check its index to determine the most relevant search results to be returned (“served”) to you.

The three key processes in delivering search results to you are:

  1. Crawling: Does Google know about your site, and can it find it?
  2. Indexing: Can Google index your site?
  3. Serving: Does the site have good, useful content that is relevant to the user’s search?

Crawling

Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index.

Google uses a huge set of computers to fetch (or “crawl”) billions of pages on the web. The program that does the fetching is called Googlebot (also known as a robot, bot, or spider). Googlebot uses an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site.

Google’s crawl process begins with a list of web page URLs, generated from previous crawl processes, and augmented with Sitemap data provided by webmasters. As Googlebot visits each of these websites it detects links on each page and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.
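The crawl loop described above (start from known URLs, fetch each page, detect its links, and queue new ones) can be sketched as a breadth-first traversal. The in-memory WEB dictionary below is a stand-in for real fetched pages; Googlebot’s actual scheduling is, of course, far more sophisticated than this sketch:

```python
from collections import deque

# Toy "web": each URL maps to the links found on that page.
# This in-memory graph stands in for real fetched HTML.
WEB = {
    "https://example.com/": ["https://example.com/a", "https://example.com/b"],
    "https://example.com/a": ["https://example.com/b"],
    "https://example.com/b": ["https://example.com/c"],
    "https://example.com/c": [],
}

def crawl(seed_urls):
    """Breadth-first crawl: start from known URLs (previous crawls or
    Sitemap data), visit each page, and queue newly discovered links."""
    frontier = deque(seed_urls)
    seen = set(seed_urls)
    crawled = []  # crawl order, i.e. pages handed off to the indexer
    while frontier:
        url = frontier.popleft()
        crawled.append(url)
        for link in WEB.get(url, []):  # links detected on the page
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return crawled
```

Starting from the homepage, this discovers all four pages even though only two were linked from the seed, which is how new pages enter the index without anyone submitting them.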

It should be noted: Google doesn’t accept payment to crawl a site more frequently, and they keep the search side of the business separate from their revenue-generating AdWords service.

Indexing

Googlebot processes each of the pages it crawls in order to compile a massive index of all the words it sees and their location on each page. In addition, Google processes information included in key content tags and attributes, such as Title tags and ALT attributes. Googlebot can process many, but not all, content types. For example, it cannot process the content of some rich media files or dynamic pages.

Once the engines find these pages, their next job is to parse the code from them and store selected pieces of the pages in massive hard drives, to be recalled when needed in a query. To accomplish the monumental task of holding billions of pages that can be accessed in a fraction of a second, the search engines have constructed massive datacenters in cities all over the world.
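The “massive index of all the words and their location on each page” is conventionally built as an inverted index: a map from each word to the documents (and positions) where it occurs, so a query can be answered without rescanning every page. A minimal sketch, with two made-up pages:

```python
import re
from collections import defaultdict

def build_index(pages):
    """Build an inverted index: each word maps to (page, position)
    pairs, which is what lets a query be answered by lookup rather
    than by rescanning every stored document."""
    index = defaultdict(list)
    for url, text in pages.items():
        for pos, word in enumerate(re.findall(r"[a-z0-9]+", text.lower())):
            index[word].append((url, pos))
    return index

# Two tiny made-up documents standing in for crawled pages.
pages = {
    "/a": "red BMW convertible",
    "/b": "used BMW for sale",
}
idx = build_index(pages)
```

A query for “BMW” then reduces to one dictionary lookup, returning both pages and the word’s position in each.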

These monstrous storage facilities hold thousands of machines processing unimaginably large quantities of information. After all, when a person performs a search at any of the major engines, the engines work hard to provide answers as fast as possible.

Relevancy is determined by over 200 factors, one of which is the PageRank for a given page. PageRank is the measure of the importance of a page based on the incoming links from other pages. In simple terms, each link to a page on your site from another site adds to your site’s PageRank. Not all links are equal: Google works hard to improve the user experience by identifying spam links and other practices that negatively impact search results. The best types of links are those that are given based on the quality of your content. In order for your site to rank well in search results pages, it’s important to make sure that Google can crawl and index your site correctly.
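The incoming-link idea behind PageRank can be illustrated with a tiny power-iteration sketch over a made-up three-page link graph. The damping factor of 0.85 is the commonly cited value, not something from this article, and real PageRank involves far more engineering:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Minimal power-iteration PageRank over a link graph
    (page -> list of pages it links to). Dangling pages are
    treated as linking to every page, a common simplification."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outgoing in links.items():
            targets = outgoing if outgoing else pages  # dangling page
            share = damping * rank[p] / len(targets)
            for t in targets:
                new[t] += share  # each inbound link adds to the target's rank
        rank = new
    return rank

# Page "a" receives links from both "b" and "c", so it ends up ranked highest.
graph = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
ranks = pagerank(graph)
```

Note how “a” outranks “c” purely because more rank flows into it through incoming links; this mirrors the statement above that each link to a page adds to its PageRank, with the caveat that real links are weighted, not equal.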

Importance is an equally tough concept to quantify, but search engines must do their best. The engines generally equate importance with popularity: the more popular a site, page, or document, the more valuable the information it contains is assumed to be. This assumption has proven fairly successful in practice, and the engines have continued to refine their algorithms, which, as we said before, are often comprised of hundreds of components.

Prediction Engines

Google’s “Did you mean” and Autocomplete features are designed to help users save time by displaying related terms, common misspellings, and popular queries. Like google.com search results, the suggestions these features display are generated automatically by Google’s web crawlers and search algorithms. Google displays these predictions only when they are likely to save the user time. If a site ranks well for a keyword, it is because Google has algorithmically determined that its content is more relevant to the user’s query.

Hopefully, these concepts will help you better understand how crawling and indexing function, so you can make use of keywords when writing articles that improve your website’s and blog’s rankings.

William

Search Engine Optimization Keywords

Keywords Research

Why are Keywords Important?

The way people shop has changed in the age of search engines, which is where Search Engine Optimization keywords come in. People increasingly use search engines to find the products or services they are looking for, typing keywords into sites like Google, Yahoo, or Bing. The engines then rank sites related to these keywords based on relevance and authority.

Whether you know it or not, your website is already targeting certain keywords. Search engines extract these keywords from your on-page text, headers, page titles, inbound links and other factors. However, you might not have made a conscious decision to target those keywords. Even if you did, you might not be monitoring your rankings or have a sense of how good your chances are at ranking well for those keywords.

Choosing the right keywords is often the difference between getting found in search or not!  As a consequence, keyword research is the foundation of an effective online marketing strategy. There are a number of variables that impact keyword selection. These variables can be divided into two groupings – primary selection variables and prioritization variables.

Selecting Keywords

It is important to understand what aspects of keywords make them important to your business. The different variables or characteristics of a keyword help determine whether the keywords are worth consideration in your SEO strategy. Only if keywords pass the primary selection tests can they be subjected to the prioritization variable tests. Considerations for primary keyword selection are:

  • Ensuring keyword terms/phrases have sufficient search volumes
  • Ensuring the chosen keyword terms are relevant
  • Assessing levels of relative competition

If a search term doesn’t satisfy the criterion of sufficient volume, then it is removed from the list. Likewise, if it does not satisfy the relevancy criterion, it should not be considered.
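The two-step vetting just described (drop terms without sufficient volume, then drop irrelevant ones) can be sketched as a simple filter. The volume threshold and the example candidates below are made up for illustration; real numbers would come from a keyword tool:

```python
def vet_keywords(candidates, min_volume=100):
    """First-pass vetting: drop candidates below a (hypothetical)
    monthly-search-volume threshold or flagged as irrelevant.
    Each candidate is (keyword, monthly_volume, is_relevant)."""
    vetted = []
    for keyword, volume, is_relevant in candidates:
        if volume < min_volume:
            continue  # fails the sufficient-volume criterion
        if not is_relevant:
            continue  # fails the relevancy criterion
        vetted.append(keyword)
    return vetted

# Illustrative candidates for a Jacksonville law firm.
candidates = [
    ("jacksonville personal injury lawyer", 720, True),
    ("lawyer", 110000, False),                  # high volume, wrong audience
    ("dade county maritime notary", 10, True),  # relevant, but too little volume
]
vetted = vet_keywords(candidates)
```

Only the term that passes both tests survives; the other two are removed from the list exactly as the criteria above dictate.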

Keywords Prioritization

Two things to consider when prioritizing keywords are:

  • Competitive advantage for the product/services
  • Profitability of the products/services associated with the keywords

Prior to entering the vetting process, a Keyword Opportunity List should be generated.

How to Generate the Initial Keyword Opportunity List

The first phase of creating the initial Keyword Opportunity List involves brainstorming as many keyword ideas as possible.

a. Listing root brands and product/service names (e.g. lawyer)

b. Brainstorming variations of product and brand related keywords

c. Talking to clients to determine what terms they use in search

d. Studying competitors’ sites

e. Adding geographic variations (e.g. Miami lawyers, Dade county lawyers)

f. Adding descriptive variations (e.g. personal injury lawyers, Auto lawyers)

g. Taking all the variations and entering them into the Google AdWords Keyword Tool, which will suggest numerous other variations.

With this list in hand, now the keyword list can be vetted.
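Steps (a) through (f) above largely amount to combining root terms with geographic and descriptive modifiers. A small brainstorming sketch using the lawyer example; the modifier lists are illustrative, and the output would then be fed to a keyword tool for volume data:

```python
from itertools import product

def expand_keywords(roots, geos, descriptors):
    """Combine root terms with geographic and descriptive
    variations to seed an initial Keyword Opportunity List."""
    variations = set(roots)
    for geo, root in product(geos, roots):
        variations.add(f"{geo} {root}")        # geographic variations
    for desc, root in product(descriptors, roots):
        variations.add(f"{desc} {root}")       # descriptive variations
    return sorted(variations)

terms = expand_keywords(
    roots=["lawyers"],
    geos=["Miami", "Dade county"],
    descriptors=["personal injury", "auto accident"],
)
```

Even this tiny input yields five candidate phrases; with realistic root, geography, and descriptor lists the combinations grow quickly, which is why the tool-assisted vetting step matters.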

How to Choose Relevant Keyword Terms/Phrases

Once all keyword possibilities with sufficient search volumes are selected, keywords must then be filtered for relevancy. You don’t just want to pull in traffic; you want to ensure that your traffic is of high quality. Quality traffic helps you convert your visitors into customers at a higher rate.

Let’s demonstrate the importance of relevant traffic through an example.

If a small law office in Jacksonville, Florida were able to achieve a ranking for the generic term ‘lawyers,’ they would be inundated with irrelevant calls from people in New York City, Chicago, Miami and Los Angeles. Realistically, less than 1% of the queries from the term ‘lawyers’ would be potential clients from the Jacksonville area, meaning:

  • It would be a tremendous distraction for the staff taking these calls as well as filtering out the bad leads
  • It would eat up all the time and resources needed to nurture your more valuable leads in the Jacksonville area

How to Assess Keyword Competitiveness

People have a tendency to emphasize traffic over relevance. You need to make sure the search terms you’re targeting have sufficient traffic, but often you don’t want them to have too much either. More traffic usually correlates with high competition.

Let’s go back to the Jacksonville law firm. Let’s say they want to rank for the term ‘lawyer.’ This puts them up against almost all law firms in the English-speaking world, including larger and more powerful ones. As I’m writing this, there are 112 million Google results for ‘lawyer.’ Only 10 are on the first page of Google.

When picking keywords to target, you clearly need to choose your battles wisely. So how do you do that?

There are several free tools for assessing keyword competitiveness. One example is the SEO Chat Keyword Difficulty Check Tool. The higher the score the keyword gets, the more competitive the term, and the more difficult it is to rank for. Generally speaking, terms with a difficulty score over 60 will require much more than just on-page optimization if you want to rank on the first page of search results.
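Putting the difficulty threshold together with search volume, a first-pass prioritization might look like the sketch below. The terms, volumes, and difficulty scores are hypothetical, not output from any real tool:

```python
def prioritize(keywords, max_difficulty=60):
    """Drop terms whose difficulty score exceeds the threshold
    (scores over ~60 need much more than on-page optimization),
    then order the rest by search volume."""
    viable = [k for k in keywords if k["difficulty"] <= max_difficulty]
    return sorted(viable, key=lambda k: k["volume"], reverse=True)

# Hypothetical numbers for the Jacksonville law firm example.
vetted = [
    {"term": "lawyer", "volume": 110000, "difficulty": 95},
    {"term": "jacksonville lawyer", "volume": 1900, "difficulty": 40},
    {"term": "jacksonville dui lawyer", "volume": 480, "difficulty": 22},
]
ordered = prioritize(vetted)
```

The generic head term “lawyer” is excluded despite its enormous volume, leaving the two winnable local terms ranked by volume, which is the “choose your battles” logic in miniature.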

HubSpot Internet Marketing Software is a paid tool that includes a keyword monitoring component. In addition, it also helps you maintain a dashboard of relevant keywords, including their search volume, competition, your ranking and the number of visits from that keyword search.

How to Beat the Competition

After picking your niche, you need to figure out how to beat the competition in that niche. The way to do this is for your site to gain authority and relevance for those terms.

Authority

Authority is assessed by comparing the link profile of your site against those of the other sites ranking for the keywords you are targeting. External links from other sites are the single most powerful ranking factor among today’s major search engines. The three most important elements of these linking factors are:

  • Number of links to a website (more is better)
  • Number of links to the specific page one hopes will rank for the term in question (again, more is typically better than less)
  • The anchor text of links to the specific page (see the upcoming link building chapter for more on this)

Links are the biggest factor in gaining authority and search engine rankings. HubSpot software allows companies to compare their own link profiles to those of their competitors.

In general, one’s site may compete for rankings (in the short term) with other sites that have similar link profiles. Tackling sites with more powerful link profiles requires time and dedication; the bigger the gap, the more time, effort, and budget is needed. When a large gap exists between two competing sites in the number of inbound links, it is very difficult for the site with fewer links to make up ground and compete for keyword opportunities.

Relevance

Relevance, on the other hand, means looking to see whether the other sites are specifically trying to rank for the term(s) in question. On-page relevancy can be quickly assessed by looking at a few simple elements:

  • Keyword match in the title of a page
  • Keyword match in a site’s internal navigation
  • Keyword match in the domain name

By considering both authority and relevancy, it’s a relatively simple process to determine opportunities. If rankings for a given keyword term are dominated by much more powerful sites obviously targeting the term with their on-page factors, then it’s likely best to look for another keyword opportunity. If, on the other hand, those same sites are powerful yet aren’t specifically targeting the terms (or vice versa), then potential does exist.
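The decision rule in the paragraph above reduces to a small truth table; here it is as a sketch, with the two inputs standing in for the authority and relevance assessments:

```python
def assess_opportunity(competitors_powerful, competitors_targeting):
    """Sketch of the authority-vs-relevance check: pass on terms where
    the dominant sites are both more powerful AND clearly targeting
    the term with their on-page factors; otherwise potential exists."""
    if competitors_powerful and competitors_targeting:
        return "look for another keyword"
    return "potential exists"
```

Only the combination of superior authority and deliberate on-page targeting rules a keyword out; power without targeting, or targeting without power, still leaves an opening.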

At the end of this process, you should have a list of keywords that have been vetted. Now, it becomes a process of prioritizing all the remaining keywords. While the same primary assessment variables can still be utilized to determine priorities, secondary assessment variables now can also be considered.

Additional Prioritization Variables

1. Competitive advantage – Does the firm have a distinct competitive advantage (in terms of price, quality, delivery time) that can be leveraged to increase the likelihood of sales?

2. Ability to scale or fulfill – Is inventory or ability to fulfill limited? If so, other products and services with more potential might be a better priority.

3. Profitability – How profitable is a product or service? More profitable items are often more desirable to promote.

4. Lifetime value of the client – If the sale of a given item is made, the current value of the sale is not the only consideration; one should also take into account the average lifetime value of the purchaser of the item in question.

Often, the keyword terms with relatively high search volumes and low competition are the best opportunities. Of course, relevance must be factored into this equation as well.

In addition to looking at volume vs. competition, it often helps to look at the additional prioritization variables.

William

SEO – Long Tailed Keywords

In order to get your website’s content to rank on the search engines, you need to take the path of least resistance. Although trying to rank for highly trafficked keywords and terms may seem like a logical approach, it will most likely lead to a lot of frustration and wasted resources. Even if you end up getting traffic from these keywords, chances are the quality of the traffic will be low, because those visitors are not interested in what you specifically have to offer.

Think of search queries as being like people: no two are exactly alike. There are billions more unique search queries than there are generic ones.

In fact, if you were to add up all the search engine traffic that comes from the most popular keywords, it would not come close to the amount of traffic that comes from more unique queries. This is the theory behind long-tail keywords.

A critical component of SEO is choosing the right keywords for optimization. If you sell cars, you may want your website to rank for “car store” (a head term), but chances are you are going to have some trouble there. However, if you optimize a separate page on your website for each specific car that you sell, you will have much more success, and it will be easier to rank on the SERPs.

A keyword like “2011 red BMW convertible” (a long-tail keyword or term) is a good example. Sure, the number of people that search for this keyword will be much lower than the number that search for “car store,” but you can almost bet that those searchers are much farther down the sales funnel and may be ready to buy.

Why Long-Tail Keywords are Effective

This is why long-tail keywords are so effective. They target people who are looking to perform a specific action, for example, to buy something, or looking for a specific piece of information, like a how-to or a service that can solve their problem. By choosing to optimize with long tail keywords, you will find it easier to rank on the search engines, drive qualified traffic, and turn that traffic into leads and customers.

Content is the Key

We have all heard it – when it comes to SEO, content is the key. Without rich content, you will find it difficult to rank for specific keywords and drive traffic to your website. Additionally, if your content does not provide value or engage users, you will be far less likely to drive leads and customers.

It is impossible to predict exactly how people will search for content and what keywords they are going to use. The only way to combat this is to generate content, and lots of it. The more content and webpages you publish, the more chances you have at ranking on the search engines. Lotto tickets are a good analogy here: the more lotto tickets you have, the higher the odds that you will win. Imagine that every webpage you create is a lotto ticket. The more webpages you have, the higher your chances of ranking in the search engines.

As you already know, the search engines are smart. If you create multiple webpages about the same exact topic, you are wasting your time. You need to create lots of content that covers lots of topics. There are multiple ways you can use content to expand your online presence and increase your chances of ranking without being repetitive. Here are a few examples:

Homepage: Use your homepage to cover your overall value proposition and high-level messaging. If there was ever a place to optimize for more generic keywords, it is your homepage.

Product/Service Pages: If you offer products and/or services, create a unique webpage for each one of them.

Resource Center: Provide a webpage that offers links to other places on your website that cover education, advice, and tips.

Blog: Blogging is an incredible way to stay current and fresh while making it easy to generate tons of content. Blogging on a regular basis (once per week is ideal) can have a dramatic impact on SEO because every blog post is a new webpage.

While conducting SEO research, you may come across articles that discuss being mindful of keyword density (how often you mention a keyword on a page). Although following an approach like this may seem technically sound, it is not recommended. Remember: do not write content for the search engines. Write content for your audience, and everything else will follow. Make sure each webpage has a clear objective and remains focused on one topic, and you will do just fine.
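For reference, the keyword-density metric mentioned above is just a ratio of occurrences to total words. A quick sketch of how it is computed (shown to define the metric, not as a recommendation to optimize for it):

```python
import re

def keyword_density(text, keyword):
    """Keyword density as commonly defined: occurrences of the
    keyword divided by the total word count of the page text."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) if words else 0.0
```

For example, in the six-word snippet “BMW sells a red BMW convertible”, the keyword “BMW” appears twice, giving a density of 2/6. As the paragraph above argues, chasing this number is the wrong goal; write for readers instead.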

William