The Complete Guide to Google Penalties (Both Manual and Algorithmic)



It’s your worst nightmare…

You wake up one morning and check your analytics. But something’s wrong…where’s all your traffic?

Whether you like it or not, websites in most niches rely on Google for a large percentage of their traffic.

If you get hit by a penalty, 10%, 20%, or even more of your business can be wiped out overnight. That’s a pretty scary thought.

There are two types of penalties that can hit you: manual penalties and algorithmic penalties.

Algorithms get most of the attention because those types of penalties affect tens of thousands of sites all at once.

However, there are over 400,000 manual penalties that are applied every month, according to Matt Cutts—that’s a lot. 

To be fair, many of the sites that get penalized are legitimately awful sites that consist of nothing but content spam. However, every day, hundreds of site owners who are trying to make the best sites they can get penalized too. It could even be you one day.

If you’ve been fortunate enough to avoid a penalty in the past, you might think reports of penalties are exaggerated. In most cases, they’re not.

While not all penalties will have the same effect on your traffic, some can wipe out 90% or more of it in an instant.

And penalties don’t discriminate either—they affect both small and large sites.

After the Panda 4.0 update (more on that later), eBay’s traffic was hit hard:


But that’s far from the only example of a big site being penalized.

Recently, another large company named Thumbtack was penalized.

Thumbtack, in case you didn’t know, is a company that Google invested $100 million into, and they still got penalized.

That being said, there is a difference between penalties for small and large sites. If you’re a very large site, where a penalty will garner a lot of press, you may be able to get prioritized support in fixing the penalty.

Thumbtack was able to get their penalty lifted in less than a week. If you have a lesser-known site, it’ll typically take a few weeks or months (at least) to correct the penalty.

I didn’t tell you all this to make you terrified of getting hit by a penalty. I did it so you recognize that avoiding penalties is ideal for your business.

If you understand all the different common penalties that Google hands out on a regular basis, you can take simple steps to reduce your chances of being hit by one by 99%.

In this article, I’m going to go over all the main types of penalties you can be hit by:

  • Panda
  • Penguin
  • Mobile-Friendly
  • Top Heavy
  • Payday
  • Pirate
  • Unnatural Links
  • Spam
  • Thin Content

For each of the penalties, I’ll let you know if you have the type of website that is at risk of being hit and what steps you can take to minimize your chances of being penalized in the future.

If you’ve already been hit by one of these penalties, check out my step-by-step guide to fixing any Google penalty.

Panda – This penalty chews up weak content

The Panda algorithm might be the most well-known algorithm.

It was one of the first updates that specifically penalized websites. The first Panda algorithm was run in 2011 and decimated the traffic of a lot of low-quality websites.

In the three years following its release, Panda was run about once per month. Now that the algorithm is more established, it only seems to be run a few times per year.

While this might seem like a good thing at first, it’s a double-edged sword. On the one hand, with fewer updates, there are fewer opportunities to get penalized.

However, Panda is an algorithmic penalty. This means that if you get hit, once you fix the underlying issue(s) that caused the penalty, you have to wait for the algorithm to be run again to get your rankings back.

That means you could be waiting several months to get the penalty lifted.

And if you’re unsuccessful fixing the issues, you’ll have to try again and wait for another iteration of the algorithm.

The basics – What is Panda? The amazing thing about Panda is that even though it’s been run several times over the past four years or so, we still don’t have an exact definition of what types of sites it affects (although we have a good idea).

Google’s search team keeps its algorithms as secret as possible. It doesn’t give much help to sites hit by algorithmic penalties, whereas it provides a lot of support for manual penalties.

As of now, we know that:

The purpose of the Panda algorithm update was and is to keep low-quality (“shallow”) content from showing up in search results.

Therefore, if you don’t have low-quality content on your site, you should be safe from the traffic-eating pandas.

Here is the problem, however: “low-quality” can mean many different things.

Google provided a list of over 20 questions to help alleviate the worries of webmasters, but most of these are open to interpretation:


Two different people could be asked these questions regarding the same site and come to different conclusions. I don’t think they are very helpful.

Over time, the SEO community has come together to analyze websites that were hit by Panda and arrived at the following conclusions about pages that get penalized:

  • The content is poorly written (perhaps “spun” using software)
  • The content is very short (“shallow” content that is too brief to be valuable)
  • The content is mostly duplicate content (copied from another page)
  • The content adds no real value

It’s no surprise that content farms, like most web 2.0 sites, were hit the most. They were heavily used by SEOs to create backlinks to content, but those links were placed in terribly written, short articles for the most part.

How do Panda penalties work? Google often patents its algorithms, and it did so for Panda. It was granted its Panda patent in 2014. While you’re free to read it, it’s pretty boring, so let me sum it up for you:

Google creates a site-wide modification factor based on the quality of all the pieces of content on the site. If the site falls below a certain quality threshold, the factor is applied to it (lowering the rankings of all the pages on the site).

In plain English, this means that if a site has a certain amount of low quality content on it, the entire site will be penalized.
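To make the patent’s idea concrete, here’s a minimal sketch of how a site-wide modification factor could work. The quality heuristic, the 0.5 threshold, and the 0.6 demotion factor are all invented for illustration; Google’s real signals are unknown:

```python
def content_quality(page_text):
    """Crude stand-in for Google's quality signals: longer, less
    repetitive text scores higher (0.0 to 1.0)."""
    words = page_text.split()
    if not words:
        return 0.0
    length_score = min(len(words) / 500, 1.0)    # reward depth
    uniqueness = len(set(words)) / len(words)    # penalize spun/repetitive text
    return (length_score + uniqueness) / 2

def site_modifier(pages, threshold=0.5, demotion=0.6):
    """If average page quality falls below the threshold, every page's
    ranking score gets multiplied by the demotion factor."""
    avg = sum(content_quality(p) for p in pages) / len(pages)
    return demotion if avg < threshold else 1.0

thin_site = ["buy cheap pills " * 10] * 5  # short, repetitive pages
print(site_modifier(thin_site))  # -> 0.6: the whole site is demoted
```

Notice that even if only some pages are thin, the modifier applies site-wide, which is why Panda traffic drops tend to be so dramatic.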

That’s why, when it comes to reports of Panda penalties, you usually see graphs like this one:


Panda penalties are rarely small—they decimate organic search traffic.

How do you know if you were hit by Panda? You don’t get any messages about algorithmic penalties. The only way to spot them is by observation.

If you get hit by a penalty that wipes out most of your traffic, chances are you’re not alone. Monitor SEO news sites such as Search Engine Land to get more information. If it’s a Panda update, it’ll likely get spotted quickly.

If you ever suspect you’ve been hit by a penalty, but it happened in the past, there are online tools that can help you.

One useful free tool is the Panguin Tool. Once you connect it to your Google Analytics account, it will overlay a graph of your traffic over timelines of past algorithms:


If you see that your traffic rapidly declined a few days before or after a major Panda update, you were likely penalized by it.

Remember that these algorithms are often run over long periods of time (weeks), so your traffic decline may not start on the exact day that the algorithm was reported.

Penguin – The bird that can’t fly but can detect your bad backlinks

Only in SEO would a panda and a penguin be so closely related.

Both have had a huge impact on the way SEOs approach their work.

While Panda focused mainly on on-page factors, Penguin was a huge step forward for identifying unnatural link profiles.

The first Penguin was released in 2012 and affected over 3% of all queries. Like Panda, it decimated the traffic of any site it penalized:


What Penguin looks for: Penguin was groundbreaking when it was first run and has become more sophisticated over time.

It looks for a variety of obvious unnatural backlink patterns.

Google will never release the full details of the algorithm (at least not any time soon), but we do know that there are three main backlink factors that can be used to identify unnatural link patterns:

  1. Link quality – A site that has obtained all of its links naturally will have links of both low and high quality. Sites made by blackhat SEOs often have a ton of just low quality links or only high authority links (like from a private blog network).
  2. Link velocity – Look at the backlink growth of any large site, and you will see that it gains links at an increased rate over time. Unnatural sites often get a lot of links in a short period, followed by a sudden decrease.
  3. Link diversity – Legitimate sites get links from all sources (contextual, blog comments, forums, etc.). However, bad SEOs often create a large portion of a site’s links from one source (like blog comments). In addition, links should have varied anchor text. Too many links with the same anchor text could trigger a Penguin penalty.
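As a toy example of the third factor, here’s how an anchor-text concentration check might look. The 30% threshold is a guess for illustration, not a number Google has published:

```python
from collections import Counter

def anchor_concentration(anchors):
    """Return the share of links held by the single most common anchor text."""
    counts = Counter(a.lower().strip() for a in anchors)
    return max(counts.values()) / len(anchors)

def looks_unnatural(anchors, max_share=0.30):
    """Flag a profile where one exact-match anchor dominates."""
    return anchor_concentration(anchors) > max_share

natural = ["click here", "quicksprout.com", "this guide", "Neil's post", "source"]
spammy = ["best payday loans"] * 8 + ["click here", "homepage"]

print(looks_unnatural(natural))  # False
print(looks_unnatural(spammy))   # True (8 of 10 anchors are identical)
```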

Complicated, right?

Penguin is one of the main reasons why most SEOs are “whitehat,” or at least “greyhat,” SEOs these days. If you want to manipulate Google, you’ll have to plan your link-building strategy very carefully to make sure that most of your links appear natural.

How Penguin penalizes sites: Penguin is not a site-wide penalty—it affects specific pages.

However, since it affects those pages that typically have the most backlinks pointing to them, you can still lose 80%+ of your traffic if those pages are responsible for most of your traffic.

If your site is flagged by Penguin, you’ll typically be penalized. In some rare cases, Penguin will discount the value of the unnatural links instead of penalizing you.

A tool such as Panguin (shown in the previous section) can confirm that your traffic drop was caused by a Penguin algorithm update.

If your traffic drop was relatively small, you were probably one of the lucky few who didn’t get penalized. The drop was most likely caused by those now-discounted links.

When you’re checking to see if you were hit by Penguin, you should know that it is an even bigger algorithm than Panda. It can take more than a few weeks to fully run.

Recovering from a Penguin penalty is possible but difficult. Not only will you have to try to fix the issue (which could be a number of different things), but you’ll also need to wait for the next algorithm refresh to see if it worked or not.

Mobilegeddon – Can Google force website owners into the future?

Google’s primary goal is to help users find the best content that satisfies their queries.

For the first decade of Internet search, most of the work done by Google was dedicated to finding and classifying content better.

But Google is pretty good at that now.

The biggest factor affecting the user experience when someone searches is now the content itself. In other words, the bottleneck has shifted: website owners aren’t improving their websites and content fast enough to keep up.

In early 2015, Google announced that it would start trying to help mobile users find useful results on mobile-friendly websites.

This announcement caused quite a stir in the SEO community. A mobile-friendly update was soon to come, and it sounded like something big.

Site owners scrambled to make their websites mobile-friendly—something that Google would be happy to see (better experience for mobile searchers).

The update finally came a few months later on April 20th.

Although it was called “Mobilegeddon” and “Mobilepocalypse,” it turned out to be much less significant than originally predicted.

There was definitely some movement in the search rankings, but only the worst mobile-offenders suffered traffic losses.


What does Google consider mobile-friendly? Mobile-friendly can mean many different things. This is probably why Google started by just demoting the worst offenders.

Right now, there’s no sliding scale. Your web pages are either friendly or not friendly.

You can see what Google thinks of your content by using the Mobile-Friendly Test tool. Enter a URL, click Analyze, and it will give you a green passing message or a red fail message.


It’s a good idea to check a few different pages such as your home page, a blog post, and any other pages with custom layouts or designs.

Another place to check if you have any major mobile issues is in Google Webmaster Tools (Search Console).

Navigate to “Search traffic > Mobile usability”, and you’ll see any errors that you should fix as soon as possible:


Finally, Google has also released a useful mobile SEO guide. In it, Google explains the most common mobile errors, such as blocking JavaScript or misconfiguring your mobile redirects.

On top of those mistakes, here are a few more general mobile-friendly principles to keep in mind:

  • Don’t use software that most mobile devices can’t render (e.g., Flash)
  • Resize text to match the screen (i.e., responsive design)
  • Use text that is easily readable on a small screen (typically 16px or more)
  • Don’t put links right beside each other (hard to tap the right one)
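One of these checks is easy to automate yourself. Here’s a minimal sketch that tests whether a page’s HTML declares a responsive viewport meta tag — just one of many signals Google’s real Mobile-Friendly Test looks at:

```python
import re

def has_responsive_viewport(html):
    """True if the HTML contains a viewport meta tag set to device-width."""
    tag = re.search(r'<meta[^>]+name=["\']viewport["\'][^>]*>', html, re.I)
    return bool(tag and "device-width" in tag.group(0))

responsive = '<head><meta name="viewport" content="width=device-width, initial-scale=1"></head>'
fixed_width = '<head><meta name="viewport" content="width=1024"></head>'

print(has_responsive_viewport(responsive))   # True
print(has_responsive_viewport(fixed_width))  # False
```

This kind of quick script is handy for scanning many pages at once, but always confirm the result with Google’s own tool before drawing conclusions.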

Mobilegeddon in the future: Just because the first mobile-friendly update wasn’t huge doesn’t mean you shouldn’t concern yourself with making your website as mobile-friendly as possible.

Google will likely make changes to the algorithm in the future as it further develops its requirements for what is and isn’t mobile-friendly.

Keep in mind that even if you get hit by a mobile “penalty,” your traffic likely won’t be decimated. This update primarily boosts the rankings of the most mobile-friendly sites, so they’ll just push down your unfriendly pages in the results.

Top Heavy – Balance is the key to any impression

When a searcher clicks on a result in Google, they are looking for an answer to their query.

If they can’t find it, they get frustrated.

So, it makes sense that Google would want to minimize these frustrations by not sending users to sites that make it difficult to find what they’re looking for.

The “Top Heavy” algorithm was first run in January 2012.

As the name implies, it specifically targets top heavy sites.

The best explanation comes from Google itself:

“We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away.

So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.

Such sites may not rank as highly going forward.”

How the Top Heavy penalty works: This is a site-based penalty. That means that either all of your content is penalized or none of it is.

Google clarified this after an article on Search Engine Land pointed out that Google’s results themselves could be seen as “top heavy.”


Google responded by saying that only sites where most pages are “top heavy” will be penalized.

If it’s only a few pages, don’t worry about this algorithm.

The final thing you need to know about this algorithmic penalty is that it is run very infrequently.

It was first run in January of 2012, then October of 2012, and most recently in February of 2014. If you get hit with this penalty, you’ll have to be patient to get it removed.

Avoiding a Top Heavy penalty: The algorithm is only run about once a year, which may seem unfair, but fortunately it’s fairly difficult to get hit by this penalty in the first place.

Here’s an example of a top heavy layout:


Unless you have multiple ads, all above the fold, you’re probably safe.

And really, these types of sites should be penalized. They’re extremely frustrating to the average searcher.

If your content is pushed below the fold, chances are your site visitors won’t bother trying to find it.

To avoid this penalty, just create a good user experience.

Payday – If you prey on hopeful readers, your Payday may be over

Anyone who has been in the Internet marketing industry for some time knows that shady industries can be very lucrative.

Most of the best blackhat SEOs compete against each other to rank for keywords in the gambling, loan, and supplement niches.

This algorithm—“Payday”—was named after some of the most lucrative, and therefore most competitive, search results it targeted: those for payday loans.

Combatting spammy results with the Payday algorithm: We’ve seen in the past few years how good Google is at catching blackhat SEOs.

It has repeatedly crushed large portions of their sites, mainly belonging to beginner and intermediate SEOs.

However, the best blackhat SEOs won’t go down easy.

There is a small group of SEOs who have the ability and will to manipulate Google. They are good enough to rank well in these high paying niches and make enough money to justify it before getting penalized.

The Payday algorithm was first run on June 11, 2013, and rolled out over a few months.

It specifically targeted queries containing keywords such as:

  • Payday loans
  • Casinos
  • Viagra
  • Garcinia cambogia
  • and more.


The second version of the algorithm was released on May 17th and 18th of 2014, and the 3.0 version was released soon after in June.

If you operate a site in any “spammy” niche, you need to be extra clean if you want to avoid being penalized. Otherwise, if you’re getting results with blackhat SEO, expect to be penalized eventually. If that happens, you’ll just have to move on to a new site.

If you have a legitimate site that was hit by this penalty (line up traffic drops with any of the algorithm dates), you can try to fix it. However, you’ll have to wait for the algorithm to be updated again for any positive changes to take effect.

Pirate – Outlaws be warned! The Google police are coming for you

Google almost always tries to show searchers the results they want.

However, Google has taken a strong stance on piracy.

Piracy, which is essentially stealing copyrighted content, is considered unethical by many and is illegal in some countries (although hard to enforce).

The “Pirate” algorithm was Google’s answer to the growing number of torrent sites (mainly used for pirating media and software) showing up in search results.

Based on the following graph of the traffic for some of the top torrent sites, I’d say it worked pretty well.


It didn’t knock them out of the search results altogether, but it reduced a large chunk of their traffic:


The reason they still attract organic traffic is that not all of their content is illegal material. In addition, this algorithm had no effect on branded searches.

Other sites that were purely made for pirating did lose most of their traffic. One such site, for example, lost 96% of its search visibility:


How the Pirate algorithm works: The main purpose of this algorithm wasn’t to eradicate torrent sites from the search results altogether, just for certain queries.

For example, if someone searched “Game of Thrones season 5 episode 6,” the searcher should not get torrent results. Before this update, torrent links to the episode would show up. But now, only reviews and legitimate ways to watch the show (HBO) are in the results:


The algorithm works based on copyright reports.

If a site has a lot of copyright violations, this algorithm will penalize it by lowering its rankings.

While new torrent sites can be made, they will be removed each time the algorithm is run if they have accumulated enough violations.

To get an idea of the scale on which copyright violations occur, consider this: Google receives requests to remove over 10 million URLs from search each week:


Not all of those are legitimate claims (Google always verifies first), but it’s still quite a bit.

If you want to avoid the Pirate penalty, it’s simple: don’t steal content (or I suppose don’t steal too much of it).

Unnatural links (manual) – Diversity is healthy

Manual penalties are a whole different beast when it comes to Google penalties.

They can be just as damaging to your traffic levels as algorithmic penalties are, but at least you’ll be able to see if you were hit by one.

As the name implies, manual penalties are given by Google employees and contractors who review your site against their quality guidelines and deem that you are violating one or more of them (most common ones are below):


One of the most influential ranking factors has been and still is backlinks. The more backlinks a page has, the better it ranks (in general).

Of course, SEOs started manipulating this as soon as they found out.

Manually reviewing backlink profiles for unnatural links is one of the ways Google combats this.

If the reviewer sees that a large portion of your links are paid links or part of a link scheme, you will be hit with this penalty.

Different forms of unnatural link penalties: Many different penalties include the phrase “unnatural links.” Some have more of an effect on your site than others.

If you log in to Webmaster Tools (Search Console), you can see whether you have any manual actions applied to your site:


The three most common actions are:

  1. “Unnatural links to your site—impacts links.” If you have unnatural links, but it doesn’t look like you had any part in creating them, you’ll get this manual action, which isn’t actually a penalty. The links will no longer factor into your rankings (so traffic might drop a bit), but there’s nothing you need to do to “recover.”
  2. “Unnatural links to your site.” If you just see this message, then you’ve been penalized. It means that the reviewer has concluded that you’re responsible for the shady links. Depending on the specific message, either specific pages will be penalized or your entire site could be.
  3. “Unnatural links from your site.” If you’re always linking to specific sites with exact anchor text (for a high volume keyword) or you have way too many links pointing out from your site, you could get hit with this. This penalty can affect either a portion or all of your site.

Fixing a manual penalty: While no penalty is good, manual penalties are better than algorithmic. Once you fix the issue, you can apply for reconsideration. If you truly fixed the problem, the manual action will be lifted.

Once again, you may need to refer to my step-by-step guide to fixing any Google penalty.

Spam (manual) – If you’re going to play around, at least do it carefully

While most SEOs believe that spam refers solely to blasting thousands of links to a site, it’s much more than that.

The term spam, at least when it comes to manual penalties, also includes things such as:

  • excessive or malicious cloaking
  • scraping content
  • automatically generated content
  • and more.

Just like in the case of unnatural links manual actions, there are many different spam-related messages that can show up as a result of a manual action. These are the most common:

  1. “Pure spam.” The majority of the site is clearly spam, or the backlinks to the site are all spammed. It’s next to impossible to recover from this manual action.
  2. “User-generated spam.” If you have a site that allows users to submit content, you could be penalized for it if they abuse it to create spam content or links. Most commonly, this penalty refers to spam in comments or forum posts/profiles. It can be fixed.
  3. “Spammy freehosts.” If you’re unlucky enough to have your site hosted by the same web host that provides service to a ton of spammers, your site might be lumped together with them. This is a good reason to stay away from very cheap or free hosting services.

Since these are manual penalties, they can be fixed. Recovery usually involves either cleaning up on-site spam or disavowing spammy links.

Thin content with no added value (manual) – No one likes hearing the same story over and over again

If Google doesn’t get you with Panda, it may get you with a manual review for having thin content.

Thin or duplicate content typically consists of information that can be found elsewhere, either on or off your site.

If a manual reviewer spots that most of your content is derived from other content, you can get hit with this penalty, and your traffic will take a tumble.

Here are the most common scenarios that represent “little or no added value”:

  • Automatically generated content
  • Thin affiliate pages
  • Content from other sources, e.g., scraped content or low-quality guest blog posts
  • Doorway pages

When you go to the Manual Actions section in Webmaster Tools (Search Console), you can see whether you’ve been hit by this penalty:


Pay close attention to whether it says that it’s a site-wide match or a partial match.

If it’s a site-wide match, that means the penalty applies to all your content until you fix it. If you just have a few pages of thin content, it’s possible that the penalty will only affect those. While you should still fix it, it won’t have a huge effect on your traffic.


Penalties are part of every SEO’s education.

Most are deserved, but some happen accidentally. Understanding the root causes of penalties is the first step to preventing them from occurring and to fixing them if you do get hit.

Once you have a good grasp on all the penalties, monitor Moz’s Google algorithm change log for any new ones so you can stay on top of them.

If you’ve discovered that you’ve been doing something that might get your website (or your client’s) penalized, stop it and correct it. Hopefully, you’ll catch it in time to avoid a penalty.


Advanced SEO: How To Easily Analyze Your Competitor’s Keywords

Written by Neil Patel on September 1, 2015

Competition can be scary, especially when it comes to SEO.

When trying to rank for a keyword, you might be going up against a junior intern, or you might be going up against an experienced SEO company.

Typically you will be competing with other SEOs around your own skill level, although that’s not always the case.

The good news is that, if you take the time to do comprehensive keyword competition analysis, you’ll be able to identify keywords that haven’t been targeted by the most skilled SEOs in your niche.

While it will take you a lot of time up front, it will save you much more in the long run.

Assuming you’re producing high quality content, you’ll have a much easier time getting it seen and getting it ranking highly on Google and other search engines.

You will need far fewer backlinks, which will save you a ton of time or money.

In truth, there are only a small number of low-competition keywords in a niche at any given time (new keywords pop up as others disappear), but that’s all you need.

Ranking for just a handful of low-volume but low-competition keywords will get your organic search traffic started. Traffic has a tendency to grow exponentially.


As you build your authority and trust with visitors and search engines, you’ll be able to start ranking for more competitive terms down the line without any major new investment.

I’m going to show you a comprehensive method of analyzing keywords to target. None of it is too complicated, but it is a lot of work. Be prepared to put in the effort now, and it will save you much more in the long term.

Why no competition analysis is perfect

Before we get going, there’s something important that you need to learn.

Keyword competition analysis is an estimate of the competition for a keyword, but it’s not a science.

Just because a keyword appears to be easy, or easier than another keyword to rank for, doesn’t mean it will be easy to rank for in all cases.

Yes, competition analysis is very useful. It will give you a good idea of what you’re up against and where your opportunities are.

However, remember that your results won’t always be accurate. There are 2 main reasons for this.

First, no one knows exactly what Google is thinking: While we understand, for the most part, which ranking factors are most important, we can’t quantify them precisely.

Your competition analysis is a reflection of how you think Google ranks sites. But even teams of very smart analysts have been unable to recreate the Google ranking algorithm.

This means that our competition analysis methodology isn’t 100% accurate.

In addition, Google constantly changes its algorithm (about 500 times per year).

So even if you’re able to perfectly predict the competition level for a keyword today, it might be a bit off in a few weeks or months.

Secondly, there’s a tradeoff: When doing keyword analysis, you will always face a tradeoff between efficiency and accuracy.

The more ranking factors that you try to take into account, the more resources (time and money) you will need to do your competition analysis.

If you only take 1 or 2 ranking factors into account, you can do competition analysis quickly, but it won’t be very accurate. As you add more ranking factors into the equation, you start to get more accurate results, but it takes more time.

It’s up to you to decide on a good balance between efficiency and accuracy for your situation.

The most important SEO factors to consider

The main strategy behind keyword competition analysis is to look at how the top rankings fare when it comes to the most important SEO factors.

For example, one of the factors we will look at is the number of backlinks. If a page has 0 backlinks, it’s likely to be easier to beat than a page with 100 or 1,000 links.

When we do this for several factors, we are able to see if a keyword is “low competition” or “high competition.” Then, you can decide whether or not to target that keyword.
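Putting that idea into a toy formula makes it concrete. Everything here — the weights, the caps, and the 0–100 scale — is an assumption for illustration; a real analysis would feed in numbers pulled from a tool like Ahrefs or Majestic:

```python
def page_difficulty(page_backlinks, domain_backlinks, domain_authority):
    """Combine three factors into a 0-100 difficulty estimate for one result.
    Weights and caps are illustrative guesses, not known ranking math."""
    link_score = min(page_backlinks / 100, 1.0)         # cap at 100 page-level links
    domain_score = min(domain_backlinks / 10_000, 1.0)  # cap at 10k domain-level links
    authority_score = domain_authority / 100            # assume a 0-100 authority metric
    return 100 * (0.5 * link_score + 0.2 * domain_score + 0.3 * authority_score)

def keyword_difficulty(serp):
    """Average the difficulty of the current top-ranking results."""
    return sum(page_difficulty(*result) for result in serp) / len(serp)

# (page links, domain links, domain authority) for a hypothetical top 3
easy_serp = [(2, 500, 20), (0, 150, 12), (5, 900, 25)]
hard_serp = [(232, 1_000_000, 80), (410, 600_000, 75), (150, 2_000_000, 88)]

print(keyword_difficulty(easy_serp))  # low score: worth targeting
print(keyword_difficulty(hard_serp))  # high score: very competitive
```

The point of the sketch is the structure, not the exact numbers: each factor is normalized, weighted, and averaged across the current top results.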

Our first step is to decide on which factors to consider in our competition analysis.

To do that, let’s turn to Searchmetrics’ 100-page report on the most important ranking factors.

This report’s data covers over 100,000 different search engine results pages (SERPs). The team looked at different potential ranking factors to see whether top-ranked sites tended to score higher on those factors than low-ranking sites.

Below is an excerpt of the factors with the highest correlations, which means that top-ranked sites tended to score highest on these factors:


One key thing to remember is that correlation does not equal causation.

What this means is that just because top-ranked sites had a lot of Facebook likes and engagement doesn’t mean that Facebook likes and engagement cause you to rank highly.

In fact, in that specific case, it’s more likely that it doesn’t. Sites that rank highly get more traffic, which likely leads to the boost in social sharing.

In addition, sites that have large social followings are usually large sites that already have a lot of domain authority, so they naturally rank higher.

Are any of those correlations useful?

Yes, because some are legitimately caused by the factor helping sites to rank better.

It’s our job, as SEOs, to test each factor individually and figure out which ones do and don’t help.

For years, we’ve known that backlinks help. However, Google has stated in the past that social signals do not affect rankings. Some case studies have seen temporary ranking increases from social signals, but it’s not a factor I would suggest focusing your attention on at this time.

For the purposes of this post, I’m going to look at the most important factors that are known to help pages rank. If you’d like to add more on top later in your own analysis, then you’re welcome to.

Factor #1 – Backlinks: When a site links to another site, it counts as a “vote” for the site being linked to. That’s nothing new. We know that backlinks are a key ranking factor, and they need to be part of any analysis.

However, we need to look at them on a few different levels, which is where it can get a bit tricky for beginners.

First, we need to consider that backlinks are important on both a page level (links to the exact page), and a domain level (total amount of links to all pages on the domain).

Second, we need to consider that not all links are created equally. Links can have different value based on which page they are located on.

To analyze backlinks, you need a backlink database tool. For a serious analysis, you’re going to need a paid plan for one of the best tools. I recommend Ahrefs or Majestic; they are by far the 2 most comprehensive backlink database tools.

When you want to analyze a specific page (from a SERP), you’ll simply plug it into the textbox on one of these tools.

As an example, let’s say that you saw my beginner’s guide to online marketing in a search result, and decided you wanted to see how hard it would be to outrank it.

Typing the URL into Majestic reveals that there are 232 domains that link to that specific page, and over 4,600 total backlinks from those domains.


That’s quite a bit for any single page.

In addition, you also need to check how strong the domain is in general. By switching the dropdown beside the URL, or changing the URL to the root domain, you can see all the links to Quick Sprout:


Just about 16,000 referring domains, and well over 1 million backlinks.

So that’s how you look at quantity, but how do you look at quality?

One high-quality link is worth hundreds or thousands of low-quality links, so it’s important not to just go by the numbers.

You could examine the quality of each link individually. A high-quality link is:

  • on a page that itself has many links pointing to it
  • relatively high up on the page
  • found naturally in the page’s text (surrounded by appropriate description)
  • on a relevant page
  • on a page without too many other links (link power is divided by the number of links)

In other words, quite a bit goes into it.
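As a very rough illustration, that checklist could be turned into a simple scoring function. Everything here (the field names, weights, and thresholds) is my own invention for the sketch, not data or an API from any real backlink tool:

```python
def link_quality_score(link):
    """Score a single backlink 0-5 against the manual checklist.

    `link` is a plain dict describing one backlink; all field
    names are hypothetical, not from any real tool's export.
    """
    score = 0
    if link["links_to_source_page"] > 10:   # the linking page has links itself
        score += 1
    if link["position_on_page"] < 0.3:      # relatively high up (top 30%)
        score += 1
    if link["in_body_text"]:                # found naturally in the text
        score += 1
    if link["topically_relevant"]:          # on a relevant page
        score += 1
    if link["outbound_links"] < 50:         # not diluted by too many links
        score += 1
    return score

example = {
    "links_to_source_page": 42,
    "position_on_page": 0.15,
    "in_body_text": True,
    "topically_relevant": True,
    "outbound_links": 12,
}
print(link_quality_score(example))  # 5
```

Even a toy version like this makes the point: checking five criteria by hand for thousands of links isn’t realistic.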

It would be impossible to evaluate this for every single link.

Luckily, link database tools have a pretty good solution for us. They algorithmically try to determine the quality of each link. It’s not perfect, but it’s pretty good.

On Ahrefs, every page and domain gets a score:

  • URL rating: A score that represents the overall quality and quantity of links pointing to that specific URL, on a scale from 0-100.
  • Domain rating: A score that represents the overall quality and quantity of links pointing to anywhere on the domain, also on a scale from 0-100.

Majestic is slightly different. It uses 2 metrics:

  • Trust flow: A score purely based on the quality of links to the page you enter.
  • Citation flow: A score based on the quantity of links to the page you enter.

In general, citation flow will be a bit higher than trust flow, but if it’s more than about 1.5 times higher, it’s likely that the page has a lot of low quality links.
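That 1.5x rule of thumb is easy to encode. Here’s a minimal sketch, assuming you’ve already pulled trust flow and citation flow numbers out of Majestic (the sample values are made up, and the cutoff is this article’s heuristic, not a Majestic-published threshold):

```python
def looks_low_quality(trust_flow, citation_flow, threshold=1.5):
    """Flag a link profile whose citation flow outpaces trust flow.

    Citation flow running more than ~1.5x above trust flow suggests
    a lot of low-quality links.
    """
    if trust_flow == 0:
        return True  # no trusted links at all
    return citation_flow / trust_flow > threshold

print(looks_low_quality(trust_flow=30, citation_flow=38))  # False, ratio ~1.27
print(looks_low_quality(trust_flow=12, citation_flow=40))  # True, ratio ~3.3
```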

Instead of manually checking the quality of every link to a page, we’ll be using these metrics (or similar) for a quick check.

Factor #2 – Relevance: When it comes to search results, relevance is the most important factor. When someone searches for “yellow tables”, they will be disappointed unless they find a page about yellow tables.

A long time ago, relevance was mainly determined by having the exact keyword in the domain, title, and body of the page.

However, Google is now great at picking up synonyms, along with user intent.

So now, if a searcher types in “yellow tables”, Google knows that they probably want results where they can buy a table, not a fluff article about what a yellow table is. That’s user intent.

In addition, Google would also include synonyms for either yellow or table in the results. So you might see results for “golden tables” or “yellow stands.”

This is a factor that you will have to assess manually. I have not come across a reliable automated tool for it.

We can look at basic keyword density, but it’s very difficult to understand user intent and include the right synonyms without manually looking at a page.

Let’s walk through a quick example.

Pretend you were searching for:

“content promotional tactics”


Google will bold synonyms in meta descriptions and URLs. In this case, we see that Google knows that “content promotion strategies” means essentially the same thing as what we searched.

If we’re assessing the top few results for relevance, our main question is: “How well does this satisfy someone who searches for the keyword?”

If the answer is “not very well,” you can probably outrank it.

So here’s the second result from that search:


This page has “50 promotional tactics”.

And while that’s fairly comprehensive, I feel that the quality could be improved. The picture above shows the first 2 tactics, which aren’t really promotional tactics at all.

They’re also not very data-driven, and there are no clear-cut examples or walkthroughs, which visitors would probably appreciate.

Overall, it is relevant, but I think it could be improved on significantly.

If you come across a keyword where the first 3 results are basically perfect answers for the query, don’t bother trying to outrank them; it’s going to be very difficult. However, that’s also a rare situation.

Factor #3 – User satisfaction: This is related to relevance, but there are a few differences in how we will evaluate it.

It also needs to be done manually, but we can look at a few different factors to determine how much users typically like the page.

Since we can’t see things like bounce rate and time on page, we need to rely on public information.

First, we can start with how many social shares it has. A page that everyone loves will have a decent amount of social shares. So if we see a page with few shares (in niches where people aren’t embarrassed to share), we know that it’s probably not fully satisfying searchers.

On most sites, you can see the share count displayed somewhere prominent:


But if you can’t find the share count, use a share count tally tool.

Just type in the URL when prompted and submit it:


It will pull up the number of shares from the most popular networks:


The second place that we’re going to look is the comment section (if there is one).

If people are saying things like:

  • “Amazing post!”
  • “This changed my life”
  • “This is the best post on (topic) I’ve ever read”

Then they’re likely satisfied. On the other hand, if there are a lot of complaints or suggestions, most visitors probably left the page unsatisfied and went back to the search results.

Look for both the number of comments, and what’s in the comments themselves.


Factor #4 – Do you consider all your visitors?: Google has made it clear lately that it wants you to optimize your website for your visitors.

With the recent mobile-friendly update, and the preference given to fast-loading sites in search results, it’s clear that Google wants mobile-friendly, fast pages in its results.

If the top results are not mobile-friendly and load slowly, it’s an indicator that Google has to rank a page it doesn’t really want to. In other words, there are currently no other pages of the same content quality that are also fast and responsive. You could fix that.

This is another manual check, so you won’t need to do it for every search result. But it can be used as a final check before you make a decision to target or not target a keyword.

First, check if the page is mobile friendly by using Google’s own mobile-friendly test:


Put in the URL of the page and click analyze. It’s a simple pass and fail test.

Next, check site speed using a site speed tool like Gtmetrix. Paste in the URL and click “analyze.”


After a quick scan, you’ll get a performance report for the site. Pay special attention to the “Page Details” box:


In this case, the page isn’t slow, but it’s not fast either.

In general, a fast-loading page loads in under 2 seconds and only has a few dozen requests. If you see a page that takes 4+ seconds to load, there’s a real opportunity to beat it.
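Those rules of thumb could be captured in a tiny helper. The buckets below come from the guideline above, not from GTmetrix’s own grading scale:

```python
def classify_speed(load_seconds, request_count):
    """Bucket a page by the rough speed heuristics:
    under 2s with a few dozen requests reads as fast,
    4s or more signals a real opportunity."""
    if load_seconds < 2 and request_count <= 60:
        return "fast"
    if load_seconds >= 4:
        return "slow - real opportunity"
    return "average"

print(classify_speed(1.4, 35))   # fast
print(classify_speed(3.1, 80))   # average
print(classify_speed(5.2, 120))  # slow - real opportunity
```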

Tools to help you do this faster

By now you might realize that it would be next to impossible to do a large scale keyword competition analysis without the help of some automation, which is where tools come in.

There are hundreds of tools out there to do this specific job, some better than others.

Most of them work similarly, so I’ll walk through a few so that you can understand how to use them, and how they work.

Tool #1 – Term Explorer: This tool can be used to find keywords, but also to analyze their competition, which is what I’ll focus on.

Your first option, once you create an account, is to run a bulk keyword job:


You enter one or more keywords, and it will give you a list of results anywhere from 1,000 to 90,000, depending on your account type and choice.


Once you run the job, you will get keyword results along with search volume data. You can easily filter the results according to keywords or search volume.


If you find a few keywords that you like (or many), you can check the box beside them, and then click the blue button at the top to send them to the keyword analyzer:


Alternatively, you can enter in keywords from other sources into the keyword analyzer directly.

You will get a similar report this time, but with an overall “difficulty” score from 1 to 10. It is based on the 3 categories to the right: relevancy, link strength, and trust.


A higher difficulty means that it’s more difficult to rank for the term. With this particular tool, a score under 3.5 or so is “easy”, while 3.5-5.0 is “normal”, and above that is difficult.
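Those cutoffs are easy to express in code. Here’s a minimal sketch using the approximate thresholds mentioned above (Term Explorer’s actual scoring model is proprietary; this only reproduces the labeling):

```python
def difficulty_bucket(score):
    """Map a Term Explorer-style 1-10 difficulty score to a label,
    using the approximate cutoffs of 3.5 and 5.0 described above."""
    if score < 3.5:
        return "easy"
    if score <= 5.0:
        return "normal"
    return "difficult"

print(difficulty_bucket(2.8))  # easy
print(difficulty_bucket(4.2))  # normal
print(difficulty_bucket(6.7))  # difficult
```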

Tool #2 – KWfinder: This is another tool similar to Term Explorer. To generate keyword ideas, enter a seed keyword into the only box on the page:


It will quickly generate a list of keywords (I believe based on Google’s Keyword Planner).

You’ll also note that each keyword will have an “SEO” score beside it:


The scores range from 0-100 and are color-coded (green is easy, red is hard). If a score isn’t shown by default, you will need to click the magnifying glass.

In addition, you can get a more detailed look at the competition behind any of the keywords.

If you click a keyword, the search results for that keyword will pop up on the right. It shows data from Majestic (trust flow, citation flow), along with an SEO score for each search result.


The average score is the one you initially saw.

This is important because average scores can be skewed by one or two results. If there are some very low scored pages at the bottom of the first page (maybe they’re temporary), they could make the keyword look much easier than it really is.
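You can see that skew with a quick mean-versus-median comparison. The per-result scores below are hypothetical, but the pattern is the one to watch for: a couple of weak pages at the bottom of the SERP drag the average down:

```python
from statistics import mean, median

# Hypothetical per-result SEO scores for one first page (0-100 scale).
# Two weak pages at the bottom pull the average well below
# what the competitive top of the page actually looks like.
serp_scores = [72, 68, 65, 61, 58, 55, 52, 50, 12, 8]

print(round(mean(serp_scores), 1))  # 50.1, looks beatable
print(median(serp_scores))          # 56.5, closer to what you'd face
```

The median is more robust here, which is why it’s worth clicking through to the per-result breakdown instead of trusting the single average.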

Tool #3 – Moz Keyword Difficulty tool: A third tool for keyword competition analysis that you can use comes from Moz.

It’s one of the many tools that comes with a Moz Pro subscription.

Type in any keywords you’d like to analyze into the tool’s main text box.


It will bring up a quick report that shows you the keyword difficulty as a percent (maximum 100).

To get more detail, you can click the “view” button under the basic SERP report column:


The bottom section shows the same information as the top section, just in a bit more detail.

Note that the page authority and domain authority here come from Moz’s Open Site Explorer, which isn’t as reliable or complete as the other link databases we’ve looked at.

Tool #4 – Ahrefs Toolbar: This tool isn’t an automated competition checker. However, it can be used to quickly assess the strength of an actual SERP.

When you search for a term in Google, the toolbar will load a small bar under each result:


It shows link information from the Ahrefs database on a page and domain level. In particular, you can look at the URL rank (“UR”) and the domain rank (“DR”).

Step #1: Gather your keywords

Now that you have a good idea of what competition analysis consists of, and some of the tools you can use to simplify the process, I want to walk you through, step by step, how to actually do it.

Before you analyze the competition of keywords, you’ll need to gather a rather large list of them. It’s more efficient to do competition analysis in bulk.

Keyword research is an important skill by itself, and it’s something that you should spend some time learning before doing competition analysis. Here are some guides that will help fill in any gaps in your knowledge:

Try not to take the easy way out when you’re doing keyword research.

If you just plug in a popular keyword like “content marketing,” you’re going to get the exact same keyword list of thousands of SEOs before you.

Using some of the creative methods in those guides above, you can find “hidden” keywords that fewer people are targeting.

When fewer people are targeting a set of keywords, you’re more likely to find some low-competition keywords ripe for the picking.

Step #2: Start filtering out keywords

Once you have a large list of keywords, it’s time to start assessing their competition.

Unless you have an experienced eye (and even then it’s difficult), you’re not going to be able to reliably pick out low-competition keywords without a thorough assessment.

This means you have 2 options:

  • a manual assessment: You could manually review all of your keywords. However, unless you have a few weeks of free time, this isn’t really feasible.
  • an automated/manual hybrid assessment (recommended): You can use tools to get started and find the highest competition keywords. Take these out of the results and then manually review the more promising keywords.

I would hope you’d pick the second option.

Run your keywords through a tool I showed you above (or similar tool) so that you can get an estimate of the competition:


It’s important to remember that this is just a quick estimate. There are many ways that the results can be skewed.

If you’re creating a ton of content, you might be able to just target all the lowest marked keywords, and then see which pages are ranking the easiest. However, most people need to be more selective.

You can typically check the box beside a keyword and save the selected ones to a private list. If not, save the most promising ones manually to a spreadsheet.

The idea here is to take out any keywords that are obviously going to be too difficult to target.
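As a sketch of that filtering pass, assume you’ve exported your keywords with tool-reported difficulty scores (the keywords, scores, and volumes below are all made up, and the 5.0 cutoff is just one reasonable choice):

```python
# Hypothetical tool export: (keyword, difficulty on a 1-10 scale,
# monthly search volume).
keywords = [
    ("content marketing", 8.9, 40500),
    ("content promotion tactics", 3.1, 720),
    ("promote blog posts", 4.4, 880),
    ("seo for photographers", 2.6, 590),
    ("best crm", 7.8, 22000),
]

# Drop anything above the "difficult" cutoff; what's left goes
# on to manual review.
shortlist = [kw for kw in keywords if kw[1] <= 5.0]

for kw, diff, vol in shortlist:
    print(f"{kw}: difficulty {diff}, volume {vol}")
```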

Step #3: Dig in deeper

By the end of that process, you should have a much smaller list of keywords. I recommend aiming for between 10 and 20% of what you initially started with.

Give the list a quick look over and take out any keywords that don’t make sense for your site.

Once you have a final list, the hard work begins.

Your tools have told you that every query you are left with is relatively easy to rank for based on the most common metrics, usually a combination of domain authority and trust metrics.

Now, you get to see how easy they are to rank for from a user’s perspective.

For each keyword, search it in Google, but you need to make sure it isn’t personalized.

Go to “google.com/ncr”, which is the global Google search (the “ncr” prevents redirection to your local Google). Make sure you’re in a private (or incognito) browser window and you aren’t logged in to any Google account.


You can choose how many results to look at. I recommend starting with the top 3, and if you’re not sure if the keyword is low enough competition, continue with the next 5 to 7.

For each of the results for each keyword, you want to evaluate the following things:

  • how many (and what quality) backlinks point to the page
  • how many (and what quality) backlinks point to the domain
  • if I was a user who searched this keyword, would I be fully satisfied with this result?
  • does it load quickly?
  • is it mobile friendly?

You can also add in any other Google ranking factors to your analysis as you’d like, but it will take you more time.
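One way to keep that manual review organized is a small record per result, mirroring the checklist above. The field names and the `beatable` heuristic here are my own invention, not any established formula; tune the cutoffs to your niche:

```python
from dataclasses import dataclass

@dataclass
class ResultCheck:
    """One row of the manual review, mirroring the checklist above."""
    url: str
    page_backlinks: int       # links to the exact page
    domain_backlinks: int     # links to the whole domain
    satisfies_intent: bool    # would a searcher be fully satisfied?
    loads_fast: bool
    mobile_friendly: bool

    def beatable(self):
        # A result with few page-level links, or one that fails
        # searchers on content, speed, or mobile, is a candidate
        # to outrank.
        weak_links = self.page_backlinks < 20
        poor_experience = not (self.satisfies_intent
                               and self.loads_fast
                               and self.mobile_friendly)
        return weak_links or poor_experience

# A hypothetical slow-loading competitor with a thin link profile:
r = ResultCheck("example.com/slow-page", 12, 3400, True, False, True)
print(r.beatable())  # True
```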

Step #4: Make a decision

After you analyze each keyword, you have to decide if it’s worth going after or not.

Unfortunately, there’s no magic metric or tool that will help you with this.

To make this decision, you need to first consider many factors:

  1. How strong is your domain? (How many links point to it, what’s its domain authority?)
  2. How easily can you get backlinks? (Do you have experience in SEO? Do you have a network of contacts to ask for links from?)
  3. What level of content can you produce? (Do you have the skills or financial situation to create the best content?)

At some point, the authority, trust, and relevance of the page you create for a particular keyword need to exceed those of all the other results you have examined.

The stronger your domain, the easier it is to rank.

The easier you can get quality backlinks, the easier it is to rank.

The bigger the budget you have for content, the easier it is to produce something that no one else can match. Not only does this make it easier to rank short term, it also makes it easier to stay there.

Based on these factors, you have to determine if a keyword is worth going after during your manual review.


Keyword competition analysis is not a science.

While you can get some information from a tool, you’re also going to have to apply some SEO expertise of your own to find low competition keywords.

This is going to take some practice and time. The good news is that you’ll save that time, and so much more, if you pick your keywords carefully.

The easier the keywords that you target are to rank for, the more consistent your results will be.


I-Want-to-Buy Moments: How Mobile Has Reshaped the Purchase Journey


Written by Allison Mooney, Brad Johnsmeyer

Published in May 2015

For today’s constantly connected consumers, shopping never sleeps. Whether making an everyday purchase or researching a big-ticket item, we reflexively turn to mobile. These I-want-to-buy moments are important for consumers, and they’re critical for brands. Are you winning these micro-moments?

Giana was in her local drugstore, trying to understand why two brands of the same type of treatment differed in price by $15. (The more expensive option coming in a smaller tube, no less.) “I thought there had to be a difference,” she recalls. So she pulled out her smartphone and searched for product reviews, right there in the store aisle. She ended up going with the higher-rated, yet higher-priced product. Looking back, Giana realizes that had she not had her smartphone, she probably would have bought the cheaper one.

Erica was at the airport, killing time before a flight, when she searched for a mortgage calculator. She wanted to figure out whether she and her husband could really afford a new home. That first smartphone search led to many more in stolen moments throughout her day as she researched the purchase step by step. “Being able to do it in the moment makes it more fun and less daunting,” says Erica. “I can space it out throughout my day.”

For today’s constantly connected consumers, shopping never sleeps. Whether we need to make an everyday purchase or research a big-ticket item, we reflexively turn to our devices. This happens in hundreds of micro-moments throughout the day when we’re making purchase decisions. (What’s the best choice? Can I afford to buy this? Is it worth it?) These I-want-to-buy moments are important moments for consumers, and they’re critical for brands. They’re opportunities to connect, especially on mobile: 93% of people who use a mobile device for research go on to make a purchase.1

Read more about Giana’s and Erica’s stories in our forthcoming micro-moments research.

Mobile: The new shopping assistant


We don’t go to a store without our wallets; many of us say the same thing about our smartphones. In stores, 82% of smartphone users turn to their devices to help them make a product decision.2 What they find online can influence their decisions right down to the very last minute before a purchase. After reading something on a smartphone, nearly one in four shoppers has changed his or her mind about buying something while in the checkout line.3

Some marketers might see this as a threat and wring their hands about “showrooming,” because they are concerned that consumers will end up buying products elsewhere. But savvy brands see it differently. “We think one of the biggest opportunities that we have in retail is for our customers to leverage their phones as a shopping assistant when they’re standing in the store,” says Sephora’s vice president of interactive media, Bridget Dolan. To assist shoppers in these moments, Sephora designed its app to pull up product ratings and reviews when an item is scanned. “Having access to this information is that perfect new moment for customers to find everything they’re looking for and get advice from Sephora.”

Dolan and her team also realized that people were searching for products on mobile before heading to a physical store. To reach shoppers in these critical moments, Sephora began using local inventory ads to let customers know when particular products, such as lipstick, eyeliner, or perfume, would be available at a nearby store. This helped drive shoppers into stores, and they often bought more once they arrived. “A client that really knows exactly what she’s buying—all the reviews and all her options—is actually a happier client and will come back and shop with you more often,” says Dolan.

From product review to purchase

When searching in the moment, we often rely on reviews. In one study, more than half of millennials surveyed said they check product reviews on their phones while shopping in a store4—and YouTube has become a top source for reviews.


There are more than 1 million YouTube channels with product reviews where creators, brands, and experts share their opinions about a range of products, from consumer electronics to cars.5 These reviews take many forms. For example, “first impression” videos feature creators, such as Lauren Curtis, opening a product and giving their immediate take. In “haul” videos, shoppers show off their new purchases (usually clothes and beauty products) on camera. And in “sneaker pickup” videos, sneakerheads share stories of scoring a prized pair of kicks. The audience is bigger than ever; views of product review videos have grown 50% year over year.6

These I-want-to-buy moments are important moments for consumers, and they’re critical for brands.

As I-want-to-buy moments are increasingly also becoming I-want-to-watch moments, brands are amping up their mobile video strategies by creating a range of helpful content. “We’re in a world today where people are on 24/7. They have more choices when it comes to what they look at and when they look at it,” says Alison Lewis, CMO at Johnson & Johnson Consumer Companies. Last year, the company’s CLEAN & CLEAR brand created more than 100 videos for its YouTube channel, many of which are answers to specific product questions. “It’s the content that consumers are already coming to you for,” says Kacey Dreby, group brand director at CLEAN & CLEAR®. And when people watch these videos, they’re further down in the marketing funnel and closer to the point of purchase, Dreby points out. Indeed, we see that after watching a YouTube video, people actively search for products. Across more than 800 campaigns studied, 65% of campaigns see a significant lift in brand interest after viewers watch their TrueView ad on YouTube.7

Growing Genres of Product Reviews on YouTube

Source: View count growth, Google Data, April 2015 vs. April 2014, U.S., classification was based on public data such as headlines, tags, etc., and may not account for every type of review video available on YouTube.

Big decisions on small screens

I-want-to-buy moments aren’t just important for low-consideration purchases. As we saw with Erica, they also happen when we’re making big decisions such as investing in a new home, booking a vacation, or buying a car. In the auto category, for example, searches on mobile are growing 51% YoY.8 “Today’s path to purchase is more dynamic than ever before,” says Dionne Colvin-Lovely, director of emerging and traditional media at Toyota. “Car shoppers leverage mobile at the beginning and middle of their purchase process and continue to research and shop online while at the dealership.”

This constant access to information means that immediacy and relevance are now table stakes for brands. “Mobile’s rapid evolution is changing the expectations of today’s car buyer,” says Colvin-Lovely. “We want to make it easy for our customers to discover the information they are seeking about Toyota as they search, read, and watch auto-related content. To do so, we leveraged a variety of mobile tactics—from high-impact sponsorships and takeovers to dynamic, hyper-targeted, location triggered placements—to ensure Toyota remains top of mind during these key moments,” she says.

Search Interest in “Reviews” and “Test Drives” in the Auto Category on YouTube

Source: Google Trends, United States, 2008–present.

How to win I-want-to-buy moments

Whether they’re in a parking lot, in a grocery store, or waiting patiently at the airport, shoppers are using smartphones to help them decide what to buy. Here are five ways brands can win these micro-moments:

1. Identify your consumers’ I-want-to-buy moments. Talk to them—in stores, through surveys, in focus groups and forums—to figure out when and how they’re researching and making purchase decisions.

2. Be there in these moments of need. Create a comprehensive strategy that works holistically across channels such as search, video, social, and display. Keep in mind that consumers may be at home, in store, or somewhere in between.

3. Deliver relevant messaging. Simply being there in these moments isn’t enough. Look at how people are searching—the questions they ask, the terms they use—and create ads and content that provide helpful answers.

4. Make it easy for them to make a purchase. The step from research to purchase should be a simple and seamless one. Give the consumer multiple ways to buy—whether that means driving them to your e-commerce site from a YouTube video or from a local inventory ad to a nearby store.

5. Measure every moment that matters. It’s no longer enough to simply measure the online conversion. With mobile, the path to purchase is now fragmented. As a result, advertisers need to measure results online, across devices, in apps, and even in stores.

1 Google/Nielsen, “Mobile Path to Purchase” study, November 2013, United States.
2 Google/Ipsos, “Consumers in the Micro-Moment” study, March 2015, United States.
3 Google Consumer Surveys, April 2015, United States, n=1130.
4 Google Consumer Surveys, April 2015, United States, n=365.
5 Google Data, April 2015, global.
6 Google Data, April 2014 vs. April 2015, United States.
7 Google Data 2015, United States, brand interest measured via search volume/activity on
8 Google Data, Q1 2014 vs. Q1 2015, United States.

20 Ways to Effectively Market Your Small Business


Marketing is an afterthought for most small business owners. Between trying to manage employees and keep customers happy, small business owners usually don’t have the time to create and implement a marketing campaign that drives brand engagement, generates leads, and boosts sales. Because we live in a digital age, we’re going to focus on the top online marketing strategies that you can use to promote and effectively market your small business.

Let’s dig in.

1. Start an Email Marketing List

Starting an email marketing list may sound difficult, but it’s really not. Think about all of the emails you receive from companies and brands, and then think about how they acquired that information from you—it was likely through an online promotion or a form on their website where they incentivized you to sign up. Hey, you’ve got to give a little to get a little.

2. Use that Email Marketing List

Email marketing is one of the quickest and most effective ways to drive business. Think about it for a second; you’re delivering targeted messages straight to your customers’ inboxes, and you’re reaching them on their mobile device if they’re on the go. More importantly, it’s a completely free strategy that can deliver big results if your emails are engaging and your call to action is strong.

3. Focus On User Experience

It doesn’t matter if you’re an eCommerce company selling trendy socks or a SaaS company trying to collect data and then cold call: if your website isn’t easy to navigate, you’ve already lost the business. Think about the user. Does your website make it easy to find a product or service and then purchase it or inquire about it? The user always comes first, and you need to put yourself in their shoes and make sure that your site is easy to maneuver.

4. Make Yourself Look the Part

While we’re on the subject of your website, it doesn’t look like it was built in the ’90s, does it? User experience is the priority, but design is a close second. Your website should feature full frame imagery and a progressive, clean look that makes your company look like the leader in your industry, even if you’re a startup.

5. Use Social to Communicate

A lot of people get caught up using social media to shove their products and services in front of their fan base, and that can be a definite turnoff. Sure, social is an effective way to drive sales, but you should focus on using your social networks to engage with your followers and communicate with them. Ask them questions, and then use that feedback to put out a better product and improve your process.

6. Don’t Let SEO Fall by the Wayside

There’s a lot to think about when you’re developing your marketing plan, but one thing you can’t afford to let slip is your search engine optimization (SEO). As Google continues to make its algorithm changes, it’s a necessity that you adhere to those changes and optimize your website accordingly so that you can obtain—and ultimately maintain—your high organic rankings.

7. Write Great Content

A big part of Google’s ranking algorithm for SEO is content. The search engine giant wants to see that your website is consistently publishing informational and educational content that’s meaningful. With everyone now putting more effort into content, you need to make sure that yours engages the reader and provides them with some sort of value. Don’t write good content—it has to be great.

8. Make Your Followers Feel Special

There’s nothing worse than being a consumer and feeling like you don’t matter to a brand, especially if you consider yourself loyal to that brand. If you want to keep your followers happy and engaged, reward them with sweepstakes, promotions, and giveaways to show them you care and that you’re not all about the sale (even if you are).

9. Use Paid Search

Paid search, also known as PPC, can be a daunting strategy for a lot of small business owners because there’s a lot to learn and understand. From CPCs and CTRs to conversion rates and quality score, PPC can be confusing, but if you work with a specialist or hire an agency, it can work wonders for your business. SEO can take months—even years—to get to the top of Google, but with paid search, you can be on top of Page 1 within a couple of hours.

10. Have an Editorial Calendar

We talked about the importance of putting out great content, but knowing when that content is going to be published is just as important. Having an editorial calendar will keep your content initiatives in order and will assure that you’re serving as the hub of information for your small business.

11. Deploy Remarketing

Remarketing, also called retargeting, is a form of paid search that helps you stay in front of potential customers and stay top of mind. In short, how it works is that you cookie your website visitors’ browsers (using a snippet of code pulled from your PPC platform), which then allows you to follow them around the Internet with targeted advertisements as they visit other sites on the Web. Sound creepy? Maybe a little, but you’ll probably change your mind once you see the data behind it.

12. Audit Your Competition

Look at your most successful competitor and examine what they’re doing across every channel: website, organic search, paid search, social media, email, etc. Use tools like SpyFu to analyze the competition, collect as much data as you can, and then use that information to mirror what they’re doing, but do it better.

13. Use Video

Did you know that YouTube is the second-largest search engine in the world? If you didn’t, now you do, and you should now realize how important video is to your business. Video can help you educate users better than you ever could through your blog content, and video is another signal that Google factors into its rankings. Use video for new product launches and walkthroughs, host that video on your website, and push it across your social channels.

14. List Your Business on Local Directories

Online local business directories like Google Plus Local and Bing/Yahoo Local will help people close to your business find you faster. Make sure you optimize those listings with a business description, accurate hours, and of course your address. Link the directory to your website and you’ll get the added benefit of referral traffic.

15. Pay to Play on Social Media

Your organic posts on social networks such as Facebook don’t reach as many of your fans and followers as they once did, which is why you need to put some dollars behind your social media efforts. You don’t need to pay to promote every post, but if you have an important announcement or an event coming up, put some money behind it and use customized targeting to reach your desired demographic. Platforms like Facebook and Twitter have powerful targeting options and analytics so that you can measure the effectiveness of your campaign.

16. Use Google Analytics

If you don’t have Google Analytics on your website, stop what you’re doing, sign up for Google Analytics, and get the tracking code on your website ASAP. Google Analytics is a digital marketer’s dream, showing you a wealth of data: where your website visitors are coming from, how long they’re staying on the website, which pages they’re visiting the most, and, most importantly, whether they’re purchasing your products or services. You, or someone from your company, should be looking at this data a few times per week, at a minimum.

17. Make Sure Your Website is Mobile Optimized

We’ve all heard how important mobile is, and if your website isn’t optimized for mobile devices, you’re missing the boat on a ton of traffic, and probably leaving a lot of business on the table. It used to be that having a mobile site was a luxury, but now it’s essentially a requirement, especially when you consider the fact that Google has publicly stated that not having a mobile optimized website will affect your SEO (dubbed “Mobilegeddon“).

18. Offer Discounts

Everyone loves knowing that they’re getting some sort of discount—even if it’s five percent. Make sure that you let it be known that you’re offering a discount by promoting it on your website, through social media, and via your email blasts. Again, you may have to give a little away to get something in return, but your margins should still be there.

19. Focus on Reviews

Reviews play a huge role in digital marketing. People love to read reviews and get other people’s opinions, so you should develop a strategy to solicit reviews from your past customers. Maybe it’s a personal email from the founder of the business or an incentive for leaving a positive review on platforms like Facebook, Google+, and Yelp, but either way, positive reviews need to find their way into your marketing strategy.

20. Listen to Your Customers

Sounds easy, right? Well, yes, but you’ve got to remember to do it. Your customers give you a different perspective, and they often have great feedback for small business owners on what they can do to improve. Make sure you’re monitoring the comments on your social media profiles and have a section of your website where visitors can email you comments/feedback. You’re never going to get it 100% right off the bat, so let your customers lend a hand.


Without some type of marketing strategy in place, you can have a great business, but people aren’t going to know about it. These are the current digital marketing strategies that small businesses are using to increase brand awareness and grow their customer base.

Why Do Sites Rank High on Google When They Aren’t Optimized?


by NEIL PATEL on APRIL 29, 2015


Have you ever wondered why some sites rank high on Google when they aren’t optimized for search engines? Or even worse, when they barely have any backlinks?

I’ve been asked this question a lot over the last few months, so I thought I would write a blog post explaining why that happens.

Here’s why some sites rank high when they aren’t optimized: 

Reason #1: Click-through rate

Part of Google’s algorithm looks at click-through rate (CTR). It’s calculated as a percentage: the number of clicks your listing receives divided by the total number of people searching for the phrase you rank for.

The higher the percentage, the more appealing your listing is compared to the competition. And if your click-through rate is higher than everyone else’s, Google will slowly start moving you up the search engine results page as this algorithm factor tells it that searchers prefer your listing.

Looking at the click-through rate isn’t enough, however, as people could create deceptive title tags and meta descriptions to increase their results. So Google also looks at your bounce rate.

It assesses the number of people who leave your page by hitting the back button to return to the search listing page. If Google sends 1,000 people to one of your web pages and each of those 1,000 people hit the back button within a few seconds, it tells Google your web page isn’t relevant.

A lot of the websites that are ranking well on Google that don’t seem to be optimized have a high click-through rate and a low bounce rate. And that helps maintain their rankings.
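To make the two metrics concrete, here is a minimal Python sketch of how CTR and the back-button bounce measure described above are calculated. The numbers are hypothetical, not pulled from any real Webmaster Tools account:

```python
# Hypothetical figures for illustration only.

def click_through_rate(clicks, impressions):
    """Percentage of searchers who clicked your listing."""
    return clicks / impressions * 100

def bounce_rate(quick_back_clicks, visits):
    """Percentage of visitors who hit the back button within seconds."""
    return quick_back_clicks / visits * 100

# A listing shown 1,000 times that earned 310 clicks:
print(f"CTR: {click_through_rate(310, 1000):.1f}%")
# Of 1,000 visitors Google sent, 400 bounced straight back:
print(f"Bounce rate: {bounce_rate(400, 1000):.1f}%")
```

A listing with a high CTR and a low bounce rate sends exactly the signal this section describes: searchers prefer it, and they stay.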

For example, if you look at this guide, you’ll see it ranks really high for the term “online marketing,” and the ranking very rarely fluctuates as my click-through rate according to Webmaster Tools is 31%.

Here’s another example. This post ranks well for “best times to post on social media.” It would be hard to outrank this listing as my click-through rate is currently 52%.


If you want to see your click-through rates, log into Webmaster Tools, and click on your site profile. If you don’t have a site profile, that means you need to add your site to Webmaster Tools and wait a few days.

Once you are viewing your site in Webmaster Tools, click on the navigational option “search traffic,” and then click on “search queries.”

If you need help increasing your click-through rates, read this post as I walk you through the steps you need to take.

Reason #2: Age

One of the big factors that cause some sites to rank well is their age. Most of the sites that rank high are at least a few years old.

Sure, most of these older sites have more backlinks and content as they have been around for longer, but not all of them.

What I’ve noticed is that if you take a brand new website, build tons of relevant links, and add high quality content, you still won’t get as much search traffic as older sites will.

There is not much you can do here other than just give it time. The older your site gets, the more search traffic you will generally receive, assuming you are continually trying to improve upon it.

Reason #3: Backlinks

Google doesn’t just look at the sheer number of backlinks a site has—it also looks at relevancy and authority.

Many of these non-optimized sites that rank well have a few high quality backlinks pointing to the right internal pages. For example, if you have only a few links, but they come from .edu and .gov domains, your site can rank extremely well.

In addition to having the right backlinks, those sites also have spot-on anchor text for those links. Most SEOs think you need keyword-rich anchor text to rank well, but the reality is you don’t.

Google is able to look at the web page that is linking to you and analyze the text around the link as well as the text on the page. This helps Google determine whether the link is relevant to your site and what you should potentially rank for.

Reason #4: Cross-linking

Even if you don’t have the best on-page SEO and a ton of backlinks, you can rank well from an overall site perspective if you cross-link your pages.

And it’s important not just from a navigational or breadcrumb perspective, but from an in-content perspective. If you can add in-content links throughout your site and cross-link your pages, you’ll find that they all will increase in rankings.

On the flip side, if you aren’t cross-linking your pages within your content, you’ll find that some of your web pages will rank extremely well, while others won’t. It’s because you are not distributing link juice and authority throughout your whole site.
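The idea of distributing link juice through cross-links can be illustrated with a simplified PageRank-style iteration. This is a sketch of the general principle only, not Google’s actual formula; the page names and damping factor are invented for the example:

```python
# Simplified PageRank-style sketch (not Google's real algorithm):
# each page repeatedly passes a share of its "equity" to the pages
# it links to, so well-cross-linked pages all accumulate authority.

def distribute_equity(links, iterations=50, damping=0.85):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue  # a page with no outgoing links passes nothing on
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new[target] += share
        rank = new
    return rank

# A small site where every page cross-links to the others in-content:
cross_linked = {
    "home":   ["post-a", "post-b"],
    "post-a": ["post-b", "home"],
    "post-b": ["home", "post-a"],
}
print(distribute_equity(cross_linked))
```

In a graph like this, every page ends up with a meaningful share of the site’s authority; remove the in-content links and the isolated pages’ scores collapse, which is the pattern the paragraph above describes.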

Reason #5: Content quality

Since its Panda update, Google has been able to assess the content quality of websites. For example, it can determine whether a site is too thin or has duplicate content, allowing for a much better analysis of quality than before.

A lot of these well-ranking older sites have extremely high quality content. You may not think so, but Google does.


Why? Google doesn’t just look at the content on one site; it compares that content to other websites within the same space. So if you have higher quality content than all of your competitors, you are much more likely to outrank them in the long run.

Reason #6: Competition

The beautiful part about some keywords is that they’re low in competition, which makes them easier to rank for. And some of these low-competition terms don’t get searched often.

From what I’ve seen, Google doesn’t refresh the results pages for these low-competition key phrases as often as it does for the more competitive terms. Why? Because more people are viewing the competitive terms.

If you were Google, wouldn’t you focus your resources on ensuring that popular terms and results pages are updated more frequently than phrases that aren’t searched for very often?

Reason #7: Growth rate

What should you do if you want to rank really high for a keyword? Build a ton of relevant backlinks and write a lot of high quality content, right?

Although that’s true, many webmasters grow their link counts a bit too fast…so fast that it seems unnatural. And chances are it is.

Google is smart enough to know this as it has data on a lot of sites within your space. For this reason, you see a lot of older sites ranking well as they are growing at a “natural” pace versus one that seems manufactured.


There are a lot of reasons why sites that don’t seem well-optimized rank well. The seven I listed above are the main reasons I’ve seen over the years.

So the next time you’re trying to figure out why a certain site ranks well when it shouldn’t, chances are it’s because of one or more of the reasons on this list.

As a website owner, you shouldn’t focus too much on your competition; instead, you should focus on improving your website. In the long run, the company with the best product or service tends to win.