Friday, January 18, 2013

SEO Issues - is it Penguin? Is it Panda? or is it me?

The following story has been several months in the making. It's one I have lived through one too many times as an SEO, and one I am sure other SEOs have faced. I wrestled with writing this for fear that someone from the company might read it and get angry that the story is being told. But it's something I think people out there can learn from, and it speaks to so many others in this industry to show them that they are not alone.

It's long, it's a bit technical (I tried to keep it simple), and it has some personal frustrations laid out in words. My only hope is that you get as much value out of reading this as living it gave me in becoming a better person (or, well, a better SEO).

It Begins


I started working on this website's SEO in May 2012, at which time I was told the site's traffic was declining due to Panda updates. In February of 2012, traffic from SEO was the best they had ever seen, but soon after that there was a steady decline.
Traffic from February 2012 - May 2012
Before digging into any possible SEO issues, I first checked Google Trends to ensure that the decline wasn't searcher related. Oftentimes a drop in traffic just means that users aren't searching for the terms the website ranks for as much as they were in the past.

Top Key Terms in Google Trends
Looking at the same time frame as the traffic data, I noticed an increase in searches for the top three terms the website ranked for, along with a dip from March to April that roughly mirrored the traffic. But the website's traffic dropped in April from the 23rd to the 24th, and then significantly on the 25th. The website already had two SEOs working on it: an agency and a consultant. Both had done a great deal of research and some work to get the website on track, and both were stressing that the drop in traffic was due to Google's Panda updates. I looked at SEOmoz's Google Algorithm Change History and found an update to Panda on April 19th and an update to Penguin on April 24th. Given that the traffic dropped significantly on the 24th, my best guess was that it was Penguin related, but it still needed further exploration.

Figuring Out What Was Hit by Penguin


The site is/was broken up into sections by keyword focus. I could tell that at one point someone really had a good head on their shoulders for SEO, but the strategy used was outdated. Perhaps the site had originally been optimized several years before and just needed some cleanup to bring it up to 2012's optimization standards. So, understanding Penguin and identifying which part of the site was driving the bulk of the organic traffic was my next step in solving this mystery. Once I understood why and where, I could start to establish what to do to solve the problem.

I broke the site's traffic report out by section as best I could in Google Analytics. It was a bit of a struggle, as all of the pages of the site sat directly on the main domain. Without a hierarchy in place, breaking out the sections had to be accomplished with a custom report and head matching on landing pages. I hadn't had to do this before, so the agency already working with the site helped build the first report, and I built out the other reports from there.
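To give a rough idea of how the sections were separated (the actual page names aren't shown here, so these paths are hypothetical), each custom report filtered on the landing page dimension with a pattern along these lines:

Filter: Landing Page matches regex ^/widget-reviews-.* (Section 1)
Filter: Landing Page matches regex ^/widget-guides-.* (Section 2)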
Section 1 over 72% of traffic

Focusing on just April and May, I created a dashboard in Google Analytics looking at organic traffic and identifying the sections of the site. Section 1 was the bulk of the traffic with over 72%, with Section 2 coming in second at just over 15%. Subpages of Section 3 and other one-off pages made up the difference.

Both Section 1 and Section 2 dropped off after the April 24th date, so clearly they were the bulk of what was pulling the overall traffic numbers down. Since Section 1 was the majority of the traffic, I proposed to the executive responsible for the site that we address any issues with that section first.

Actual screenshot of Section 1 presented
I took all of the research from the agency and consultant and we quickly reworked the pages to represent a hierarchy in the URL structure, and cleaned up any issues from the outdated optimization that was done.

Soon after Section 1 was addressed, we did the same with Section 2, then worked on Section 3 (and its subpages, rolling them up into a solid section), and then added a few pages to grab any new opportunities.

Not Quite As Easy as It Looks


The projects were launched in increments - first the URL hierarchy fix to Section 1, and then the page redesign. Next was a full launch of URL fixes and the page redesign for Section 2, and then lastly Section 3 and the new Section 4.
Section 1 - Section 2 - Section 3 Launch Dates and Organic Traffic
Soon after Section 1 was launched, traffic started declining rapidly. I was asked several times why traffic was getting worse, and I started digging some more. Every time I looked at the impressions for the new URLs from Section 1, they weren't getting any traction, but the previous URLs still were. I began looking at the history of the website, trying to find out why it had done so well at one point but wasn't doing well now. One of the things I noticed was a lack of priority linking to these pages, even though at some point some of them had been linked to individually from the homepage. Google matches the hierarchy of a site's pages to the directory structure and the links presented on the site. This site had every page on the first level and linked to those pages from the homepage, which was telling Google that every page was the most important page. It worked at one time, but as Google rolled out its 2012 updates these pages were getting hit, and those links on the homepage weren't there anymore.

Before the launch of Section 2, I had them put links on the homepage to the main directory for each section. The links would tell the search engines that these are important pages of the website, without being so obnoxious as to put a dozen or more links on the homepage that would discourage users (avoiding the appearance of spamminess).
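To make the hierarchy change concrete (the page names here are made up, since I can't show the real ones), the pages went from flat URLs like:

www.mysite.com/blue-widget-reviews.html
www.mysite.com/red-widget-reviews.html

to URLs rolled up under a section directory that the homepage links to:

www.mysite.com/widget-reviews/blue.html
www.mysite.com/widget-reviews/red.html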

But even after adding the links to the homepage, the traffic to those pages was still declining. Pressure was put on me to figure out what was wrong, and accusations were flying that I had single-handedly ruined the SEO for the site. I spent every waking hour looking at reports and trying to figure out what was going on. I consulted friends in the industry and read every article I could find to figure out which Panda or Penguin updates were affecting these pages.

Then it hit me - just as the links to these sections would help them get recognized as important pages, so would the other pages being linked to from the homepage. In fact, a set of those homepage links pointed to the website's search results with queries attached, mimicking pages but showing search results. On those search results pages, there were over 200 links with multiple (we're talking hundreds, possibly thousands of) combinations of parameters. The bots were coming to the homepage, following the links to the search results pages, and then getting stuck in this vortex of links and parameter-generated URLs - leaving no crawl time for the pages that once were getting rankings. This also explained why the new URLs weren't showing very many impressions in the Webmaster Tools data - those pages just weren't getting crawled.
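To picture the trap (the parameters here are made up), every search results page linked out to hundreds of URLs like these, and each of those pages linked to hundreds more combinations:

www.mysite.com/search?q=widgets&color=blue&sort=price
www.mysite.com/search?q=widgets&color=blue&sort=price&page=2
www.mysite.com/search?q=widgets&sort=price&color=blue

The bots could spend their entire crawl budget cycling through those combinations without ever reaching the real content pages.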

There was a project underway that would address the many links on the search pages, and there was also talk of using Ajax to show the results. Once that project launched, the bots would follow the URL from the homepage but would then essentially go no further. With the project a few months out, I made the case to block the search page in robots.txt so the bots would recognize the Sections as important pages. After several weeks of attempting to convince the powers that be, the URL was eventually added to the robots.txt file.
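For anyone looking to do the same, the robots.txt entry itself is only a couple of lines (the /search path here is hypothetical - use whatever path your search results actually live on):

User-agent: *
Disallow: /search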

Immediately after the search page was added to the robots.txt file, Google Webmaster Tools presented me with a warning:
Warning in Webmaster Tools
A warning from Google should never be taken lightly, but in this case it was exactly what I wanted. In fact, it proved to me that my theory was correct and that the site was hopefully headed down the right path.


Panic, Questioning, and a Third Party


As with every up in the SEO world, there must be a down. Soon after the search result page was added to the robots.txt file, the organic traffic to the site dropped, and continued to drop. Throughout those grueling three months there were several Google Panda and Penguin updates. I documented each and every one of them in Google Analytics, and continued answering questions, gathering data, and dealing with close scrutiny and suggestions that the work I was doing was complete BS.
Organic Traffic from September 2012 - November 2012
I sat in numerous meetings, some of which I walked out of crying (I'm not afraid to admit it), being questioned about the road I had taken and why we weren't seeing results. There were people within the company recommending that they roll the pages back to where they were before, and even change the URLs. I fought hard to keep them from touching a thing. I sent around an article posted on Search Engine Land by Barry Schwartz citing Google's patent that "tricks" search spammers.

The patent states:

When a spammer tries to positively influence a document’s rank through rank-modifying spamming, the spammer may be perplexed by the rank assigned by a rank transition function consistent with the principles of the invention, such as the ones described above. For example, the initial response to the spammer’s changes may cause the document’s rank to be negatively influenced rather than positively influenced. Unexpected results are bound to elicit a response from a spammer, particularly if their client is upset with the results. In response to negative results, the spammer may remove the changes and, thereby render the long-term impact on the document’s rank zero. Alternatively or additionally, it may take an unknown (possibly variable) amount of time to see positive (or expected) results in response to the spammer’s changes. In response to delayed results, the spammer may perform additional changes in an attempt to positively (or more positively) influence the document’s rank. In either event, these further spammer-initiated changes may assist in identifying signs of rank-modifying spamming.
But the article and my pleas fell on deaf ears...

Things had gotten so heated, and there was such fear that nothing was being done while traffic significantly declined, that the company brought in yet another SEO consultant to look at the site objectively.

Just as the consultant was starting his audit, and the traffic hit the lowest point I ever thought it could possibly go, the very next day traffic went up. The last week in November (roughly three months after we blocked the search result page), I saw an increase in traffic to Section 1 in Google Analytics:
Section 1 Organic Traffic
I quickly pulled up my report to check the Section's impressions from the Webmaster Tools data, and there was a significant increase as well:
Section 1 Impressions from Webmaster Tools Data
On December 3, 2012 I logged into Webmaster Tools and saw that the warning had gone away:
It was the "hallelujah" moment that every SEO dreams of, and very few get. All the work I had done, the fighting for what I believed in, it all finally paid off.

To this day traffic continues to increase - we can now focus on some of the cleanup still left to do, and then move on to projects that will attract new opportunities.
Organic Traffic from November 2012 - January 17, 2013 (day before this post is written)
Quick Note: 
I forgot to mention a post I wrote months ago while going through all of this - SEO - Panda and the Penguins. It gives a bit of perspective on some of the linking stuff I didn't get into in this post.

Monday, December 3, 2012

SEO Buzzwords - don't get sucked into the hype

I am often asked by people wanting to get into the SEO business where to get training. There are a lot of online resources available: articles, blog posts, videos, and even downloadable presentations. It's difficult to know what to believe, who to pay attention to, and what will work for any particular website. Most experienced SEOs will tell you to learn as much as you can and then simply start optimizing and learn from trial and error. But who has the time? Let alone the willingness to risk a website losing rankings or, even worse, getting banned for using the wrong techniques? This industry is fickle and always changing - what works for one website may not work for another, and the big company that dictates how we should be doing our job changes its mind often.

I have sat back and watched how the industry began, grew, and developed throughout the years. On one hand, it's been fun to be part of something big - one company's idea and development that spawned an entire industry. On the other hand, because it is still a very young industry, and that industry is dictated by the company that sparked it, we are all still developing standards and strategies and learning every day. In fact, just the other day I saw a post on Facebook for a workshop on how to use the Google Disavow Tool. It scares me to see SEOs already taking advantage of a strategy that is not to be taken lightly and making money off of "teaching" people how to do it themselves. It's like a surgeon trying to teach a child how to patch up a kidney. One seriously wrong move and the patient could die; one slightly overlooked part of the process and the kidney could fail over time, with no way to know whether it was the surgery or the fact that the patient drinks vodka all day long.

In trying to learn and keep up with the latest in this capricious industry, we often find ourselves having to look up and research what the "experts" are talking about - those SEO buzzwords - coming across contradictory opinions and quite frequently second-guessing ourselves (even highly experienced SEO experts second-guess themselves). I have too often seen people trying to do what they think is right and completely messing up their own sites, and even clients' websites, because of all the hype and misinformation out there.

The truth is that it is all viable; it's all in how you approach it. Of course, hearing that probably doesn't help, so the following covers some of the most common strategies and SEO buzzwords and will hopefully clear up any confusion you might have.

Link Building


Yeah, I started with the most common, yet most controversial, buzzword of all. The term "link building" began with the birth of the almighty Google itself. What was once a very simple and quick way to get a website ranking in the most popular search engine is slowly becoming an art form in itself. The basic idea is that a link from one site pointing to another site is counted as a "vote". The more links pointing from other sites to one site, the more votes, and thus the higher the rankings. With such an easy strategy to implement, and the growing popularity of the search engine that uses the algorithm, more and more spammers began to take advantage. Paying website owners for a link pointing to a website (a.k.a. purchasing links), asking a website to link to a site in exchange for a link back (a.k.a. link exchanging), submitting a website to directories (a.k.a. directory submissions), commenting on blog posts (a.k.a. commenting), and even submitting articles with links in them to article distribution sites - all of these means of obtaining links tricked the search engines into ranking websites that might not otherwise have deserved the positions they were given. In December of 2007, Google began cracking down on such strategies, not only with new algorithms that catch sites that might be purchasing links, but by allowing webmasters to report one another manually. In the years since, we have seen a dramatic increase in the quality of the websites appearing in search results.


SEO Buzzwords from Link Building:

Text Links
Links that point to a page and contain a descriptive keyword or phrase. Many SEOs have used this strategy in the past because such links give context and relevance to a link. This means the search engines can read and index a page with all of its text links and assign a ranking based on the quality of the content, the links, and the destinations of the links.

With Google's latest updates, the search engine no longer looks at the text within the link itself, but rather at the words and relevance around the link. By recognizing that a page on a dog breed website linking to a pet-related website contains terms like "puppy", "hound", "paws", and other pet-related terms, the search engine can tell that the dog site pointing to the pet site is, in fact, related. By contrast, an automotive website with an article on candy that contains one text link for "chocolate bon bons" pointing to a chocolate website just isn't going to count (believe me, I've seen it). In fact, it will hurt the website's rankings.

Link Bait
The idea behind 'link bait' is to encourage people to bookmark or link to your website from theirs. Personal blogs, social media sites, and other communities will usually link to a site if the site offers something useful. Because of this, the search engines place a high value on the link. The best way to obtain these types of links is to write articles or white papers, a very valuable blog post, or any sort of information your audience will find relevant. The more people share, the better the website ranks. The trick is not to force it - don't go out hiring people to share your posts; just let the links happen naturally.

Link Juice
The 'search equity' that is passed from one page to another is called "link juice". The more relevant a page is, the more often it has been shared, and the more times it is visited, the more value the search engines place on it. The pages that page (or website) links to will also gain extra value, because the original content is deemed useful to users.

Internal Linking
Almost self-explanatory, yet most people tend to overlook the importance of linking within their own website. In fact, in most cases, a link to a page from the homepage can be just as valuable as an external link, if not more so. This, of course, does not mean you should go adding a link to every page of your website from your homepage; nor does it mean you should link to a few pages and then rotate them so that every page gets a chance at a high vote. It means that the pages that are most relevant to your users, and make the most sense to continue to from the homepage, are the ones you should link to - they become the second most valuable pages (next to your homepage) that the search engines will rank.

Taxonomy
Categorizing a website with a hierarchy, with pages linking to one another internally, is one of the best ways to show the search engines which pages are most important and where they should rank. If a website is about cupcakes and selling supplies, the site should be organized by types of cupcakes (perhaps flavors) and then categories of supplies, with pages placed within whichever category makes sense. From there, pages should link to one another where relevant to show the search engines that this is category X with its set of pages, and this is category Y with its set of pages.
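Sticking with the cupcake example, a simple hierarchy might look something like this (the page names are just for illustration):

www.mysite.com/flavors/
www.mysite.com/flavors/red-velvet.html
www.mysite.com/flavors/lemon.html
www.mysite.com/supplies/
www.mysite.com/supplies/liners.html
www.mysite.com/supplies/piping-bags.html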

Internal Optimization


Internal optimization is often overlooked by agencies, simply because so many clients will hire an agency to "optimize" their site only to say in the end that they don't have the resources to make the suggested changes, or that they simply can't make changes (whether because of design, usability, business reasons, etc.). Unfortunately, this leaves agencies in the predicament of having to please the client and do what they were hired to do (get the website rankings and increase traffic) with no choice left but to start link building. But a good SEO knows that internal optimization is really the heart and soul of obtaining legitimate rankings that will stick through all of the spam algorithm updates like Panda and Penguin. Below is a quick list of internal optimization items with brief explanations.

Metatags


Title Tag - This often shows up as the title of your search engine result. The title tag should never be more than 70 characters, and should contain only the broadest term that describes your website.

Description Tag - The description tag will often appear in the search result as the description text if the key term searched is within the tag. If not, the search engine will pull from the content on the page itself where the key term is located. If a page on your site is specific to a certain term, this is a good place to get that term into the description.

Keyword Tag - The keyword meta tag was once the main way search engines determined which site would show up for which search. It isn't as relevant now, but it is still used by some metacrawler search engines (not Google, but Excite, and often Bing). List out a few of your target terms for the page you are optimizing to help you focus on what you want the page to rank for, and just in case a search engine is paying attention.
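Pulled together, the three tags sit in the <head> of a page and look something like this (the cupcake site is just an example):

<head>
<title>Cupcake Supplies and Recipes | MySite</title>
<meta name="description" content="Shop cupcake liners, piping bags, and decorating supplies, plus recipes for every flavor.">
<meta name="keywords" content="cupcake supplies, cupcake liners, cupcake recipes">
</head>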

Content


Keyword density in document text - Simply put, search engines look at how often a term shows up within the content of a page. If a word is mentioned 10 times within 300 words on a page (over 3% density), the page won't get very good rankings. If a word is mentioned 10 times within 1,200 words (under 1%), spread out once or perhaps twice per paragraph, then that page is more likely to rank better. A quick way to check density is to put the content of a page into Microsoft Word, do a search within the document (Find), type in the word, and click "Highlight All". It's a great visual for seeing where a term is placed.

Content around the anchor text - As mentioned earlier, the words and context around an internal link are representative of the relevance of that page. The more terms a page has that are similar in context to the term you are optimizing for, the better.

Unique content - Any content borrowed, rented, or just plain stolen is considered a felony in the SEO world. There are algorithms in place that look not only for content that exists elsewhere within a site, but also for content that exists on other sites. A quick way to check whether your site has unique content is by searching on copyscape.com. Content on your site that also exists on other pages (or every page) of your site simply won't get counted (it's more or less overlooked by the search engine), so any key terms within duplicate content on your site won't count. Duplicate content outside of your website is another story. If another website has content that you copied (in other words, they had it first), then your site will get penalized. If your site had the content first and someone copied you, then they would get penalized.

Frequency of content change - Search engines don't inherently know the difference between a blog, a news publication, or a brochure-ware site that remains static. The best way they have developed to tell a cutting-edge news site from a static site is how often new content is generated. The more often a new page is created with a robust amount of text, the more the search engine will come back and index, and therefore the higher the priority those new pages will get. If your site is updated often and generates new content regularly, the search engines will adjust accordingly. If your site is static, then don't worry; let it be, and the age of the pages will determine where they belong in the world of rankings (mentioned later).

Anchor text with key term(s) in links - What was a solid strategy for obtaining rankings for key terms in the past is now less relevant, and even considered bad SEO. It's more about keyword "essence" and the relevance of the terms around the anchor text than the anchor text itself (as mentioned above). Some of the more experienced SEOs are even finding that linking the words "more" or "click here" helps their rankings more than putting the key term within the anchor text.


Duplicating content - As mentioned in the "Unique content" item above, duplicating content within a site, or from another site, is a very bad technique.

Invisible text - Nope, don't use white text on a white background with a bunch of keywords in it that only the search engine can see. Even 1-pixel-high divs with overflow hidden set in the stylesheet are a bad thing. Not only will you not get rankings, but your site will get penalized for it.
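Just so it's clear what to avoid, this is the kind of markup that will get a site penalized (don't do this):

<div style="height:1px; overflow:hidden;">
cheap cupcakes best cupcakes buy cupcakes cupcake deals cupcake supplies
</div>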

Overall Website


Age of website - The older a domain (or website) is, the higher the priority it will get within search rankings. A typical spam strategy is to buy a new domain and optimize it as much as possible to obtain quick rankings. Because of this, search engines will tend to ignore a website until it has been around for a few weeks, sometimes even months or years. If you have an older domain, don't go thinking you should change it because it's "stale" - its age is actually a good thing.

Poor coding and design - Search engines can't tell what good design is directly, but they can infer it from the popularity of the website. Social sharing, articles, blog posts, and all of the buzz about a website only happen when a website is easy for the visitor to use and gives all of the value a user is looking for. So, make sure your website is easy on the eyes, gives a clear and concise value proposition with a call to action, and is easy to navigate.

Exact Match Domain - Many spammers create websites with a descriptive key term in the domain in an attempt to get rankings. Google announced in October of 2012 that they were rolling out an algorithm update to weed out exact match domains. For example: http://www.compareinterestrates.com/ or http://www.best-interest-mortgage-rates.com/

Keyword-rich URLs and filenames - Just as exact match domains are taking a hit in the recent updates, the keyword-rich URL and filename strategy is as well. SEOs used to put their keywords within the URL with dashes between words in order to obtain rankings for long tail terms.
Site Accessibility - It's not talked about often, but designing your website with accessibility in mind can be beneficial. Someone who has poor vision, is hard of hearing, or has trouble clicking links and buttons is going to have trouble with most websites. If your website's audience contains users that might need some extra help, keep this in mind. Search engines can tell, and it could help you rank over competition that hasn't bothered.

Website size - Big or small, size doesn't matter. Some SEOs stress that a website needs to have millions upon millions of pages, but I have often personally witnessed websites get penalized for having too many pages. Don't let this happen to your site - keep the pages down to a manageable and reasonable number. If your site is a publication with thousands or even hundreds of thousands of pages of unique content, you should be fine. Just watch your Webmaster Tools notifications. Most of the websites that trigger the warnings are ecommerce websites with masses of pages for each product. If you find your site is showing this kind of error, it's best to seek out an experienced professional to help you get your pages under control and managed properly.

Domains versus subdomains - A subdomain is a subset of a main domain. Often used as a place to store images, or for other purposes, a subdomain looks something like images.mysite.com. Too often, websites will put the highly valuable unique content of their blog on a subdomain. Unfortunately, search engines haven't historically known the difference between a main domain and a subdomain, so they have treated each one as a separate entity. In the past, SEOs took advantage of this and tried to get multiple rankings for one page by using multiple subdomains. Just this year (2012), Matt Cutts announced that they no longer treat subdomains separately for rankings, but rather as an extension of the main domain. Because of this, subdomains not only won't see separate rankings, but their content still isn't counted as part of the main domain. When setting up a blog, or any section of your website, it's best to simply add a new directory (ex: www.mysite.com/blog) so that the content within that directory supports the domain as a whole.

Hyphens in URLs - When creating URLs for your website, it's still considered best practice to separate each word with a hyphen rather than a space or an underscore. For example, if you write a blog post or article titled "The ten best puppies everyone should own", the URL could be "www.mysite.com/the-ten-best-puppies-everyone-should-own.html", or, to avoid getting pegged for keyword-rich URLs and to keep a set hierarchy, "www.mysite.com/puppies/ten-best.html".

URL length - A URL that is too long is a red flag for a keyword-rich URL. Try to keep your URLs simple, and keep that site hierarchy.

IP address - The IP address is the unique identifying number (like a phone number) of the server that hosts your website, tied to where that server is located. If you are targeting a local audience, or even just focusing on one country, be aware of where your website is hosted. A website that targets users searching in Canada but is hosted in the U.S. will have an IP that resides within the U.S. In that case, search engines will rank the site for U.S. searchers, and not for the Canadian searchers it's targeting. If you aren't worried about focusing your SEO by location, then don't worry about your IP.

robots.txt - The robots.txt file is a very simple text file (editable in something like Notepad) that resides at the root of your domain. The main case in which you need a robots.txt is when you want to block certain sections of your website from being crawled. Search engines also allow you to include a link to your XML sitemap in it for better indexing. For more information on setting up your robots.txt, you can visit robotstxt.org.
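A bare-bones robots.txt that blocks one section and points to the sitemap looks like this (the blocked path is just an example):

User-agent: *
Disallow: /private/
Sitemap: http://www.mysite.com/sitemap.xml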

XML Sitemap - Sitemaps are an easy way to let search engines know about all of the pages within your website that you would like to see indexed.
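A minimal sitemap.xml with a single page entry looks like this (swap in your own URLs and dates):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mysite.com/puppies/ten-best.html</loc>
    <lastmod>2012-12-01</lastmod>
  </url>
</urlset>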

Redirects (301 and 302) or Status Codes - 404, 301, 302... each of these numbers has a different meaning to a search engine. The most common is a 404, or "page not found"; it basically means that the URL existed and now it doesn't. In the SEO world, the 301 is another code that is mentioned often. A 301 lets the search engine know that a URL has permanently moved, by redirecting the old URL to the new URL. My favorite explanation of these codes is from a dear friend of mine, Lindsay Wassell at SEOmoz, in which she uses pictures to explain the different codes and what they mean.
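As a concrete example, on an Apache server a 301 from an old URL to a new one can be a single line in the .htaccess file (the paths here are made up):

Redirect 301 /old-page.html http://www.mysite.com/new-page.html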

Some basic SEO buzzwords


Long Tail - "Long tail" is what most SEOs use to refer to a term of three to five (or more) words. When a user looking to buy a computer begins their search with the word "computers", they will often get more specific as they search, focusing on specifics like "500 GB laptop computer". That is a long tail key term - the more specifically you can target your audience, the more likely they will be to convert as they find what they are looking for.

Indexed - Indexing is the term SEOs use when a search engine has crawled a website and its pages and then starts to display them within the search results. This doesn't affect rankings, but merely means that a page is within the database and recognized by the search engine. A quick and easy way to see if your website is indexed is to search with site: before your domain. For example, search for "site:oceansofpets.com".

SERP - Simply meaning the "search engine results page", this one rolls off the tongues of SEOs quite often. Pronounced just as it looks (serp), the search engine results page is the page the user sees after completing a search.

Snippet - A search snippet is what SEOs call the title and description a search engine displays on the search results page.


I think that should just about do it to get you started. With SEO there is no standard way of doing things. There is no true right and no true wrong, there is only what we try, fail or succeed, and try again.

Please feel free to add anything I might have missed in the comments below. I'm hoping this will become a pretty comprehensive list that newbie SEOs can get started with.

Tuesday, October 16, 2012

Official Google Announcement: A new tool to disavow links

As I was watching the Pubcon Twitter stream today for any news on the updates Google has made over the past few weeks, I witnessed a large number of tweets flying through with excitement about the new Disavow Link Tool from Google. At 1:18 pm PST I saw three tweets come through saying there would be an announcement on the Google Webmaster Central Blog, so I quickly opened up the page and continued to hit refresh every 10 minutes.


Then suddenly - there it was:

Official Google Webmaster Central Blog: A new tool to disavow links: Webmaster level: Advanced Today we’re introducing a tool that enables you to disavow links to your site. If you’ve been notified of a manua...

To reiterate what Matt and the Google team are stressing - Most sites DO NOT need this tool. They are serious, and cannot say it enough.

That being said - here's a quick note about links, why you would want to disavow them, and how to do it...

Bad Links

A long, long time ago, when SEOs could tweak and play with sites and watch the rankings change almost instantly, a strategy called "link building" was provided by agencies and consultants as a quick means to obtain instant rankings. By submitting a website URL to directories, submitting articles with links to article distribution sites, and sending out press releases with links to the URL surrounded by key terms and anchor text with long tail keywords, websites began to see nearly instant rankings. But as all spammy tactics must come to an end, so did the link building strategies.

Unnatural Links

In Google Webmaster Tools, some sites are seeing a warning in their messages about "unnatural links". This is due to Google seeing evidence of paid links, link exchanges, or other link schemes that violate their quality guidelines - more specifically, participating in "link schemes", as Google puts it.

A few examples of what a link scheme could entail:
Buying or selling links that pass PageRank (ex: paying for a link from a site with a high pagerank)
Excessive link exchanging (ex: asking a site to link to you if you link to them)
Linking to web spammers or unrelated sites with the intent to manipulate PageRank
Building partner pages exclusively for the sake of cross-linking
Using automated programs or services to create links to your site

Removing Bad or Unnatural Links

The absolute best way to remove these unnatural links is to find out where the links are coming from and contact the website owner or webmaster directly. I myself have used our agency to contact such site owners individually and receive a weekly report of how many of the low quality links have been removed. In addition, I have contacted a few of the site owners myself. I have found I either get no response at all - or a nice email asking for more details about where the link is located and what I would like removed. It's that simple...
Of course, I have heard stories from others in the industry with not as much luck or ease in this process. Some site owners have clued into these requests and actually ask for money for removal of the link.
But you really have to put the time in to find the link, contact the website admin and ask away.
A quick tool to help you determine which links are good and which are bad is SEOmoz's Open Site Explorer. You can plug in your URL, and a full report of incoming links, page authority, and domain authority will help you decide which links need to be removed.

I Tried - But I Still See Unnatural Links

OK, so you did all you could, but you're still seeing bad or unnatural links pointing to your site. This is where the Disavow Link Tool comes in handy. But remember, don't get carried away with submitting a large number of links. Also, take note that whatever link you disavow, you cannot reavow and get the credit you once had (according to Matt Cutts in this video).
1) Head to the Disavow Links Tool page:
2) You will be asked which domain you would like to use the disavow link tool for:
3) You will be prompted to upload a file with the links you want to disavow:



Disavow Text File

The file you will upload is a simple .txt file. You can easily create this in Notepad on Windows, just as you would your robots.txt file.

Within the .txt file, you can add comments after a # sign noting the domain and what you have done to try to get the link(s) removed.
Example:

# Contacted owner of webdomain.com on 10/1/2012 to

# ask for link removal but didn't get a response
In the case of links from a whole domain that need to be disavowed, you can add a line starting with "domain:" followed by the domain.
Example:
domain:webdomain.com
If you have worked with a webmaster who removed some links but not all, and you want the rest disavowed, you can add a comment with the details of the request and the date, then add a list of the individual links you want disavowed.
Example:


# Owner of otherdomain.com removed most links, but missed these
http://www.otherdomain.com/sampleA.html
http://www.otherdomain.com/sampleB.html
http://www.otherdomain.com/sampleC.html

Right now Google only supports one disavow file for each domain, so choose and create the file wisely. The naming of the file doesn't matter, as long as the file extension is .txt. Perhaps calling it simply "disavow.txt" would be the safest route in case Google doesn't accept "-", "_", "=", or spaces.

For more information visit these links:
Google Webmaster Central Blog: Disavow Link Tool
Google Webmaster Tools: Disavow Links
Matt Cutts on Youtube - Disavow Links
Matt Cutts' PPT from Pubcon
Lisa Barone's Pubcon Live Blogging:  Google Announces New & Improved Disavow Link Tool
SEOmoz post by Dr. Pete Google's Disavow Tool - Take a Deep Breath

The Dos and Don’ts for Google’s New Disavow Links Tool

Direct Link to Disavow Links Tool