Friday, January 18, 2013

SEO Issues - Is it Penguin? Is it Panda? Or is it me?

The following story has been several months in the making. It's one I have lived through one too many times as an SEO, and one I am sure other SEOs have faced. I wrestled with the thought of writing this for fear that someone from the company might read it and get angry that the story is being told. But it's something I think people out there can learn from, and it speaks to so many others in this industry, showing them that they are not alone.

It's long, it's a bit technical (I tried to keep it simple), and it has some personal frustrations laid out in words. My only hope is that you get as much value out of reading this as living it gave me in becoming a better person (or, well, a better SEO).

It Begins


I started working on this website's SEO in May 2012, at which time I was told the site's traffic was declining due to Panda updates. In February 2012, traffic from SEO was the best they had ever seen, but soon after that there was a steady decline.
Traffic from February 2012 - May 2012
Before digging into any possible SEO issues, I first checked Google Trends to ensure the decline wasn't searcher related. Oftentimes a drop in traffic simply means that users aren't searching for the terms the website ranks for as much as they were in the past.

Top Key Terms in Google Trends
Looking at the same time frame as the traffic data, I saw that searches for the top 3 terms the website ranked for had increased overall, with a dip from March to April that roughly mirrored the traffic. But the website's traffic dropped from April 23rd to the 24th, and then significantly on the 25th. The website I was working on already had two SEOs on it: an agency and a consultant. Both had done a great deal of research and some work to get the website on track, and both were stressing that the drop in traffic was due to Google's Panda updates. I looked at SEOmoz's Google Algorithm Change History and found a Panda update on April 19th and a Penguin update on April 24th. Given that traffic dropped significantly on the 24th, my best guess was that it was Penguin related, but it still needed further exploration.

Figuring Out What Was Hit by Penguin


The site is/was broken up into sections by keyword focus. At one point, someone clearly had a good head on their shoulders for SEO, but the strategy used was outdated. Perhaps the site was originally optimized several years before, and it just needed some cleanup to bring it up to 2012's optimization standards. So, understanding Penguin and identifying which part of the site was driving the bulk of the organic traffic was my next step in solving this mystery. Once I understood why, and where, I could start to establish what to do to solve the problem.

I broke the site's traffic report out by sections as best I could in Google Analytics. It was a bit of a struggle, as all of the pages of the site resided at the top level of the main domain. Without a hierarchy in place, breaking out the sections had to be accomplished with a custom report and head-match filters on landing pages. I hadn't had to do this before, so the agency already working with the site helped build the first report, and I built out the other reports from there.
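For anyone who needs to do the same, the custom report came down to one landing-page filter per section. The paths below are hypothetical (the real page names aren't in this post), but the shape of the head-match filters was roughly:

```
Section 1 report: filter on Landing Page, head match: /section-one-topic
Section 2 report: filter on Landing Page, head match: /section-two-topic
Section 3 report: filter on Landing Page, head match: /section-three-topic
```

With flat URLs, the trick is finding a common prefix in the page names for each section; if there isn't one, the filter becomes a list of the individual pages instead.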
Click to View Larger
Section 1 over 72% of traffic

Focusing on just April and May, I created a dashboard in Google Analytics for organic traffic, broken out by the sections of the site. Section 1 accounted for the bulk of the traffic at over 72%, with Section 2 coming in second at just over 15%. Subsections of Section 3 and other one-off pages made up the difference.

Both Section 1 and Section 2 dropped off after April 24th, so they were clearly the bulk of what was pulling the overall traffic numbers down. Since Section 1 was the majority of the traffic, I recommended to the executive responsible for the site that we address any issues with that section first.

Actual screenshot of Section 1 presented
I took all of the research from the agency and the consultant, and we quickly reworked the pages to introduce a hierarchy in the URL structure and cleaned up any issues left over from the outdated optimization.

Soon after Section 1 was addressed, we did the same with Section 2, then worked on Section 3 (rolling its sub-pages up into one solid section), and then added a few pages to capture any new opportunity.

Not Quite as Easy as it Looks


The projects were launched in increments - first URL hierarchy fix to Section 1 and then the page redesign. Next was a full launch of URL fixes and page redesign to Section 2, and then lastly Section 3 and the new Section 4.
Section 1 - Section 2- Section 3 Launch Dates and Organic Traffic
Soon after Section 1 launched, traffic started declining rapidly. I was asked several times why traffic was getting worse, and I started digging some more. Every time I looked at the impressions for the new Section 1 URLs, they weren't getting any traction, but the previous URLs still were.

I began looking at the history of the website, trying to find out why it had done so well at one point but wasn't doing well anymore. One of the things I noticed was a lack of priority linking to these pages; at some point, some of them had been linked individually from the homepage. Google infers a hierarchy of pages from a site's directory structure and from where links to those pages appear. This site had every page on the first level and linked to those pages from the homepage, which told Google that every page was the most important page. That worked at one time, but as Google rolled out its 2012 updates these pages were getting hit, and those links on the homepage weren't there anymore.

Before the launch of Section 2, I had them put links on the homepage to the main directory for each section. The links would tell the search engines that these are important pages of the website, without being so obnoxious as to have a dozen or more links on the homepage discouraging users (and giving the appearance of spamminess).

But - even after adding the links to the homepage, traffic to those pages was still declining. Pressure was put on me to figure out what was wrong, and accusations were flying that I had single-handedly ruined the SEO for the site. I spent every waking hour looking at reports, trying to figure out what was going on. I consulted friends in the industry and read every article I could find about which Panda or Penguin updates might be affecting these pages.

Then it hit me: just as the links to these sections would help them get recognized as important pages, so would the other pages being linked to from the homepage. In fact, a set of them linked to the website's search results with queries attached, mimicking pages but showing search results. On those search results pages there were over 200 links, with multiple (we're talking hundreds, possibly thousands of) combinations of parameters. The bots were coming to the homepage, following the links to the search results pages, and then getting stuck in this vortex of links and parameter-generated URLs, leaving no crawl time for the pages that once had rankings. This also explained why the new URLs were showing so few impressions in the Webmaster Tools data: those pages just weren't getting crawled.

There was a project underway that would address the many links on the search pages, and there was also talk of using Ajax to show the results. Once that project launched, the bots would reach the URL from the homepage but then essentially go no further. With the project a few months out, I made the case to add the search page to robots.txt so the bots could instead recognize the Sections as important pages. After several weeks of attempting to convince the powers that be, the URL was eventually added to the robots.txt file.
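For reference, blocking a search results page in robots.txt takes only a couple of lines; something like this, where /search is a stand-in for the site's actual search URL (not named in this post):

```
# Keep crawlers out of internal search results and their parameter combinations
User-agent: *
Disallow: /search
```

Every URL beginning with the disallowed path, parameters and all, is covered by that one rule.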

Immediately after the search page was added to the robots.txt Google Webmaster tools presented me with a warning:
Warning in Webmaster Tools
In most cases, a warning from Google should never be taken lightly, but in this case it was exactly what I wanted. In fact, it proved to me that my theory was correct, and that the site was hopefully headed down the right path.


Panic, Questioning, and a Third Party


As with every up in the SEO world, there must be a down. Soon after the search results page was added to the robots.txt, organic traffic to the site dropped, and continued to drop. Throughout those grueling three months there were several Google Panda and Penguin updates. I documented each and every one of them in Google Analytics, and continued answering questions, gathering data, and dealing with close scrutiny suggesting that the work I was doing was complete BS.
Organic Traffic from September 2012 - November 2012
I sat in numerous meetings, some of which I walked out of crying (I'm not afraid to admit it), being questioned about the road I had taken and why we weren't seeing results. People within the company were recommending rolling the pages back to where they had been, and even changing the URLs. I fought hard for them not to touch a thing, and sent around an article posted on Search Engine Land by Barry Schwartz citing Google's patent that "tricks" search spammers.

The patent states:

When a spammer tries to positively influence a document’s rank through rank-modifying spamming, the spammer may be perplexed by the rank assigned by a rank transition function consistent with the principles of the invention, such as the ones described above. For example, the initial response to the spammer’s changes may cause the document’s rank to be negatively influenced rather than positively influenced. Unexpected results are bound to elicit a response from a spammer, particularly if their client is upset with the results. In response to negative results, the spammer may remove the changes and, thereby render the long-term impact on the document’s rank zero. Alternatively or additionally, it may take an unknown (possibly variable) amount of time to see positive (or expected) results in response to the spammer’s changes. In response to delayed results, the spammer may perform additional changes in an attempt to positively (or more positively) influence the document’s rank. In either event, these further spammer-initiated changes may assist in identifying signs of rank-modifying spamming.
But the article and my pleas fell on deaf ears...

Things had gotten so heated, and there was such fear that nothing was being done while traffic significantly declined, that the company brought in yet another SEO consultant to look at the site objectively.

Just as the consultant was starting his audit, and traffic hit the lowest I ever thought it could possibly go, the very next day it went up. In the last week of November (roughly three months after we blocked the search results page), I saw an increase in traffic to Section 1 in Google Analytics:
Section 1 Organic Traffic
I quickly pulled up my report to check the section's impressions in the Webmaster Tools data, and there was a significant increase as well:
Section 1 Impressions from Webmaster Tools Data
On December 3, 2012 I logged into Webmaster Tools and saw that the warning had gone away:
It was the "hallelujah" moment that every SEO dreams of, and very few get. All the work I had done, all the fighting for what I believed in, finally paid off.

To this day traffic continues to increase. We can now focus on some of the cleanup still left to do, and then on projects that will attract new opportunities.
Organic Traffic from November 2012 - January 17, 2013 (day before this post is written)
Quick Note: 
I forgot to mention a post I wrote months ago while going through all of this: SEO - Panda and the Penguins. It gives a bit of perspective on some of the linking issues I didn't get into in this post.

Monday, December 3, 2012

SEO Buzzwords - don't get sucked into the hype

I am often asked by people wanting to get into the SEO business where to get training. There are a lot of online resources available: articles, blog posts, videos, and even downloadable presentations. It's difficult to know what to believe, who to pay attention to, and what will work for any particular website. Most experienced SEOs will tell you to learn as much as you can and then simply start optimizing, learning from trial and error. But who has the time? Let alone the willingness to risk a website losing rankings or, even worse, getting banned for using the wrong techniques? This industry is very fickle and always changing: what works for one website may not work for another, and the big company that dictates how we do our jobs changes its mind often.

I have sat back and watched how the industry began, grew, and developed throughout the years. On one hand, it's been fun to be part of something big: an entire industry that grew out of one company's idea and development. On the other hand, because it is still a very young industry, and one dictated by the company that sparked it, we are all still developing standards and strategies and learning every day. In fact, just the other day I saw a post on Facebook for a workshop on how to use the Google Disavow Tool. It scares me to see SEOs already taking a strategy that is not to be taken lightly and making money off of "teaching" people how to do it themselves. It's like a surgeon trying to teach a child how to patch up a kidney: any seriously wrong move and the patient could die, and if any part of the process is slightly overlooked, the kidney could fail over time, with no way of knowing whether it was the surgery or the fact that the patient drinks vodka all day long.

In trying to learn and keep up with the latest in this capricious industry, we often find ourselves having to look up and research what the "experts" are talking about (those SEO buzzwords), coming across contradictory opinions and quite frequently second-guessing ourselves (even highly experienced SEO experts second-guess themselves). I have too often seen people trying to do what they think is right and completely messing up their own site, or even clients' websites, because of all the hype and misinformation out there.

The truth is that it is all viable; it's all in how you approach it. Of course, hearing that probably doesn't help, so the following covers some of the most common strategies and SEO buzzwords, to hopefully clear up any confusion you might have.

Link Building


Yeah, I started with the most common, yet most controversial, buzzword of all. The term "link building" began with the birth of the almighty Google itself. What was once a very simple and quick way to get rankings on the most popular search engine is slowly becoming an art form in itself. The basic idea is that a link from one site pointing to another is counted as a "vote". The more links pointing from other sites to a site, the more votes, and thus the higher the rankings. With such an easy strategy to implement, and with the growing popularity of the search engine using the algorithm, more and more spammers began to take advantage. Paying website owners for a link (a.k.a. purchasing links), asking a website to link to a site in exchange for a link back (a.k.a. link exchanging), submitting a website to directories (a.k.a. directory submissions), commenting on blog posts (a.k.a. commenting), and even submitting articles with links in them to article distribution sites: all of these means of obtaining links tricked the search engines into ranking websites that might not otherwise have deserved the positions they were given. In December 2007, Google began cracking down on such strategies, not only with increasingly sophisticated algorithms that catch sites that might be purchasing links, but also by allowing webmasters to report one another manually. In the years since, we have seen a dramatic increase in the quality of the websites appearing in search results.


SEO Buzzwords from Link Building:

Text Links
Links that point to a page and contain a descriptive keyword or phrase. Many SEOs have used this strategy in the past because text links give context and relevance to a link. This means the search engines can read and index a page with all of its text links and assign a ranking based on the quality of the content, the links, and the destinations of those links.

With Google's latest updates, the search engine no longer looks only at the text within the link itself, but rather at the words and relevance around the link. If a page on a dog breed website linking to a pet-related website contains terms like "puppy", "hound", and "paws", the search engine can recognize that the dog site and the pet site are, in fact, related. On the other hand, an automotive website with an article on candy that contains one text link for "chocolate bon bons" pointing to a chocolate website just isn't going to count (believe me, I've seen it). In fact, it will hurt the website's rankings.

Link Bait
The idea behind ‘link bait’ is to encourage people to bookmark or link to your website from theirs. Personal blogs, social media sites, and other communities will usually link to a site if it offers something useful, so the search engines place a high value on these links. The best way to obtain them is to write articles or white papers, a very valuable blog post, or any sort of information your audience will find relevant. The more they share, the better the website ranks. The trick is to not force it: don't go out hiring people to share your posts; just let it happen naturally.

Link Juice
The ‘search equity’ that is passed from one page to another is called "link juice". The more relevant a page is, the more often it has been shared, and the more times it is visited, the higher the value the search engines place on it. The pages it links to will also gain extra value, because the original content is deemed useful to users.

Internal Linking
Almost self-explanatory, yet most people tend to overlook the importance of linking within their own website. In fact, in most cases, a link to a page from the homepage can be just as valuable as an external link, if not more so. This does not mean you should add a link to every page of your website from your homepage; nor does it mean you should link to a few pages and then rotate them so that every page gets a chance at a high vote. It means the pages that are most relevant to your users, the ones that make the most sense to continue to from the homepage, are the ones you should link to, and they become the second most valuable pages (next to your homepage) in the search engines' eyes.

Taxonomy
Categorizing a website with a hierarchy, and linking pages to one another internally, is one of the best ways to show the search engines which pages are most important and where they should rank. If a website is about cupcakes and selling supplies, the site should be organized by types of cupcakes (perhaps flavors) and then by categories of supplies, with pages placed within whichever category makes sense. From there, pages should link to one another where relevant, to show the search engines that this is category X with its set of pages, and this is category Y with its set of pages.
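To make the cupcake example concrete, a structure along these lines (the URLs are hypothetical) shows the engines which pages belong together:

```
www.mysite.com/
    flavors/
        chocolate.html
        red-velvet.html
    supplies/
        liners.html
        frosting-tips.html
```

Pages under flavors/ link to each other and to related supplies/ pages, so each category reads as a coherent set.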

Internal Optimization


Internal optimization is often overlooked by agencies, simply because so many clients hire an agency to "optimize" their site only to say in the end that they don't have the resources to make the suggested changes, or that they simply can't make changes (whether for design, usability, or business reasons). Unfortunately, this leaves agencies in the predicament of having to please the client and do what they were hired to do (get the website rankings and increase traffic) with no choice left but to start link building. But a good SEO knows that internal optimization is the real heart and soul of obtaining legitimate rankings that will stick through all of the spam algorithm updates like Panda and Penguin. Below is a quick list with brief explanations of internal optimization factors.

Metatags


Title Tag - This often shows up as the title of your search engine result. The title tag should never be more than 70 characters, and should contain only the broadest term that describes your website.

Description Tag - The description tag will often appear in the search result as the description text if the key term searched for is within the tag. If not, the search engine will pull from the content on the page itself, wherever the key term is located. If a page on your site is specific to a certain term, this is a good place to include that term.

Keyword Tag - The keyword meta tag was once the main signal search engines used to determine which site would show up for which search. It isn't as relevant now, but it is still used by some metacrawler search engines (not Google, but Excite, and sometimes Bing). List a few of your target terms for the page you are optimizing, to help you focus on what you want the page to rank for, and just in case a search engine is paying attention.
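Putting the three tags together, a page's head section might look like this (the site and wording are made up for illustration):

```html
<head>
  <!-- Title: under 70 characters, the broad term that describes the page -->
  <title>Cupcake Supplies | My Site</title>
  <!-- Description: often shown as the snippet when it contains the searched term -->
  <meta name="description" content="Shop cupcake liners, frosting tips, and baking supplies for every flavor.">
  <!-- Keywords: largely ignored by Google, but a useful focusing exercise -->
  <meta name="keywords" content="cupcake supplies, cupcake liners, frosting tips">
</head>
```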

Content


Keyword density in document text - Simply put, search engines look at how often a term shows up within the content of a page. If a word is mentioned 10 times within 300 words on a page, the page won't get very good rankings. If a word is mentioned 10 times within 1,200 words, spread out once or perhaps twice per paragraph, that page is more likely to rank well. A quick way to check density is to paste the content of a page into Microsoft Word, do a search within the document (Find), type in the word, and click "Highlight All". It's a great visual for seeing where a term is placed.
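If you'd rather not eyeball it in Word, a small script can do the counting. This is a rough sketch (single-word terms only, with a simplistic tokenizer), not how any search engine actually computes things:

```python
import re

def keyword_density(text, term):
    """Count a single-word term in a page's text and return
    (occurrences, total_words, density_percent)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    count = sum(1 for word in words if word == term.lower())
    total = len(words)
    density = (count / total * 100) if total else 0.0
    return count, total, density

# A 12-word snippet mentioning "puppies" twice works out to ~16.7%.
count, total, density = keyword_density(
    "Puppies are great. Everyone loves puppies, and these twelve words prove it.",
    "puppies",
)
print(count, total, round(density, 1))  # → 2 12 16.7
```

By the rule of thumb above, anything in double digits is a sign the term is stuffed too densely.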

Content around the anchor text - As mentioned earlier, the words and context around an internal link represent the relevance of that page. The more a page contains terms similar in context to the term you are optimizing for, the better.

Unique content - Any content borrowed, rented, or just plain stolen is considered a felony in the SEO world. There are algorithms that look not only for content repeated within a site, but for content that exists on other sites as well. A quick way to check whether your site's content is unique is to search on copyscape.com. Content on your site that also exists on other pages (or every page) of your site simply won't get counted (it's essentially overlooked by the search engine), so any key terms within that duplicate content won't count. Duplicate content outside of your website is another story: if another website had the content first and you copied it, your site will get penalized. If your site had the content first and someone copied you, they get penalized.

Frequency of content change - Search engines don't inherently know the difference between a blog, a news publication, or a brochure-ware site that remains static. The best signal they have developed for telling a cutting-edge news site from a static site is how often new content is generated. The more often a new page is created with a robust amount of text, the more often the search engine will come back and index, and the higher the priority those new pages will get. If your site is updated often and generates new content regularly, the search engines will adjust accordingly. If your site is static, don't worry; let it be, and the age of the pages will determine where they belong in the world of rankings (mentioned later).

Anchor text has key term(s) in links - What was a solid strategy for obtaining rankings in the past is now less relevant, and even considered bad SEO. It's more about keyword "essence" and the relevance of the terms around the anchor text than the anchor text itself (as mentioned above). Some of the more experienced SEOs are even finding that linking the words "more" or "click here" helps their rankings more than putting the key term within the anchor text.


Duplicating content - As mentioned in the "Unique content" item above, duplicating content within a site, or from another site, is a very bad technique.

Invisible text - Nope, don't use white text on a white background with a bunch of keywords in it that only the search engine can see. Even 1-pixel-high divs with overflow hidden set in the stylesheet are a bad thing. Not only will you not get rankings, but your site will get penalized.

Overall Website


Age of website - The older a domain (or website) is, the higher the priority it will get in search rankings. A typical spam strategy is to buy a new domain and optimize it as much as possible for quick rankings. Because of this, search engines tend to ignore a website until it has been around for a few weeks, sometimes even months or years. If you have an older domain, don't go thinking you should change it because it's "stale"; it's actually a good thing.

Poor coding and design - Search engines can't tell what good design is, but they can tell from the popularity of the website. Social sharing, articles, blog posts, and all the buzz about a website only happen when it is easy for visitors to use and gives them the value they're looking for. So make sure your website is easy on the eyes, gives a clear and concise value proposition with a call to action, and is easy to navigate.

Exact Match Domain - Many spammers create websites with a descriptive key term in the domain in an attempt to get rankings. Google announced in October 2012 that they were rolling out an algorithm update to weed out exact match domains. For example: http://www.compareinterestrates.com/ or http://www.best-interest-mortgage-rates.com/

Keyword-rich URLs and filenames - Just as exact match domains have taken a hit in the recent updates, so has the keyword-rich URL and filename strategy. SEOs used to put keywords in the URL, with dashes between words, in order to obtain rankings for long tail terms.
Site Accessibility - It's not talked about often, but it can be beneficial to design your website with accessibility in mind. Someone who has poor vision, is hard of hearing, or has trouble clicking links and buttons is going to struggle with most websites. If your website's audience includes users who might need some extra help, keep this in mind. Search engines notice, and it could help you rank over competition that hasn't.

Website size - Big or small, size doesn't matter. Some SEOs stress that a website needs millions upon millions of pages, but I have personally witnessed websites get penalized for having too many. Don't let this happen to your site; keep the page count manageable and reasonable. If your site is a publication with thousands or even hundreds of thousands of pages of unique content, you should be fine; just watch your Webmaster Tools notifications. Most of the websites that trigger warnings are ecommerce sites with masses of pages for each product. If your site shows this kind of error, it's best to seek out an experienced professional to help you get your pages under control and managed properly.

Domains versus subdomains - A subdomain is a subset of a main domain. Often used as a place to store images, or for other purposes, a subdomain looks something like images.mysite.com. Too often, websites put the highly valuable unique content of their blog on a subdomain. Unfortunately, search engines have historically treated the main domain and a subdomain as separate entities, and in the past SEOs took advantage of this to try to get multiple rankings on one results page with multiple subdomains. Just this year (2012), Matt Cutts announced that Google no longer treats them separately for separate rankings, but rather as extensions of the main domain. Because of this, subdomains not only won't see separate rankings, but their content is still not counted as part of the main domain. When setting up a blog, or any section of your website, it's best to simply add a new directory (ex: www.mysite.com/blog) so that the content within that directory supports the domain as a whole.

Hyphens in URLs - When creating URLs for your website, it's still considered best practice to separate words with a hyphen rather than a space or an underscore. For example, if you write a blog post or article titled "The ten best puppies everyone should own", the URL could be "www.mysite.com/the-ten-best-puppies-everyone-should-own.html", or, to avoid getting pegged for keyword-rich URLs and to keep a set hierarchy, "www.mysite.com/puppies/ten-best.html".
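A title-to-URL conversion like the one above is easy to automate. Here's a minimal sketch (the function name and rules are my own, not from any particular CMS):

```python
import re

def slugify(title):
    """Lowercase a title and separate words with hyphens, per the
    hyphen-not-underscore convention described above."""
    slug = title.lower()
    # Spaces, underscores, and punctuation all collapse into single hyphens.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("The ten best puppies everyone should own"))
# → the-ten-best-puppies-everyone-should-own
```

For the hierarchy-friendly version, you would prepend the category directory and slugify only a short title, e.g. "puppies/" + slugify("Ten best").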

URL length - A URL that is too long is a red flag for a keyword-rich URL. Try to keep your URLs simple, and keep that site hierarchy.

IP address - The IP address is the unique identifying number (like a phone number) of the server that hosts your website. If you are targeting a local audience, or focusing on just one country, be aware of where your website is hosted. A website that targets users searching in Canada but is hosted in the U.S. will have an IP that resides within the U.S. In that case, search engines may rank the site for U.S. searchers rather than Canadian ones. If you aren't worried about focusing your SEO by location, don't worry about your IP.

robots.txt - The robots.txt file is a very simple text file (editable in something like Notepad) that resides at the root of your server. The main case in which you need a robots.txt is when you want to block certain sections of your website from crawlers. Some search engines will also let you list links to your XML sitemap in it for better indexing. For more information on setting up your robots.txt, visit robotstxt.org.

XML Sitemap - Sitemaps are an easy way to let search engines know about all of the pages within your website that you would like to see indexed.
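A minimal sitemap file is just a list of URLs in a standard XML wrapper; something like this (the domain and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mysite.com/puppies/ten-best.html</loc>
    <lastmod>2012-12-01</lastmod>
  </url>
  <url>
    <loc>http://www.mysite.com/puppies/adoption-tips.html</loc>
  </url>
</urlset>
```

Upload it to your server, then reference it from robots.txt or submit it in Webmaster Tools.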

Redirects (301 and 302) or Status Codes - 404, 301, 302... each of these numbers means something different to a search engine. The most common is the 404, or "page not found": it basically means the URL existed, and now it doesn't. In the SEO world, the 301 is another code mentioned often. A 301 lets the search engine know that a URL has moved, which we signal by redirecting the old URL to the new one. My favorite explanation of these codes is from a dear friend of mine, Lindsay Wassell, at SEOmoz, in which she uses pictures to explain the different codes and what they mean.
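On an Apache server, for example, a 301 can be as simple as one line in an .htaccess file (the paths here are hypothetical, reusing the puppy example from above):

```
# Permanently move an old flat URL into the new section hierarchy
Redirect 301 /the-ten-best-puppies-everyone-should-own.html http://www.mysite.com/puppies/ten-best.html
```

Anything still linking to the old URL then passes its value along to the new one.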

Some basic SEO buzzwords


Long Tail - A long tail term is what most SEOs refer to when talking about a phrase of 3-5 or more words. When a user is looking to buy a computer, they may begin their search with the word "computers", then get more specific as they search, focusing on details like "500 GB laptop computer". That is a long tail key term: the more specifically you can target your audience, the more likely they are to convert when they find what they're looking for.

Indexed - Indexing is the term SEOs use when a search engine has crawled a website and its pages and started to display them within the search results. This doesn't affect rankings; it merely means a page is within the database and recognized by the search engine. A quick and easy way to see whether your website is indexed is to search with site: before your domain. For example: search for "site:oceansofpets.com".

SERP - Simply meaning "search engine results page", a term that rolls off the tongue of SEO's quite often. Pronounced just as it looks (serp), it is the page the user sees after completing a search.

Snippet - A search snippet is what SEO's use to describe the title and description a search engine displays on the search results page.
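The snippet is built largely from your page's title tag and meta description, so a page's head section feeds it roughly like this (example values only - and keep in mind engines sometimes rewrite the description):

```html
<head>
  <!-- Becomes the clickable headline in the search snippet -->
  <title>Oceans of Pets | Aquarium Supplies</title>
  <!-- Often used as the grey descriptive text under the headline -->
  <meta name="description"
        content="Everything you need for a healthy aquarium, from filters to fish food.">
</head>
```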


I think that should just about do it to get you started. With SEO there is no standard way of doing things. There is no true right and no true wrong; there is only what we try, fail or succeed at, and try again.

Please feel free to add anything I might have missed in the comments below. I'm hoping this will become a pretty comprehensive list that newbie SEO's can get started with.

Tuesday, October 16, 2012

Official Google Announcement: A new tool to disavow links

As I have been watching the Pubcon Twitter stream today for any news on Google's updates of the past few weeks, I witnessed a large number of tweets flying through with excitement about the new Disavow Link Tool from Google. At 1:18 pm PST I saw three tweets come through saying there would be an announcement on the Google Webmaster Central Blog, so I quickly opened up the page and continued to hit refresh every 10 minutes.


Then suddenly - there it was:

Official Google Webmaster Central Blog: A new tool to disavow links: Webmaster level: Advanced Today we’re introducing a tool that enables you to disavow links to your site. If you’ve been notified of a manua...

To reiterate what Matt and the Google team are stressing - Most sites DO NOT need this tool. They are serious, and cannot say it enough.

That being said - here's a quick note about links, why you would want to disavow them, and how to do it...

Bad Links

A long, long time ago, when SEO's could tweak a site and watch the rankings change almost instantly, a strategy called "Link Building" was sold by agencies and consultants as a quick means to obtain instant rankings. By submitting a website's URL to directories, submitting articles with links to article distribution sites, and pushing out press releases with links surrounded by key terms and anchor text stuffed with long tail keywords, websites began to see nearly instant rankings. But as all spammy tactics must come to an end, so did the link building strategies.

Unnatural Links

In Google Webmaster Tools, some sites are seeing a warning in their messages about "unnatural links". This means Google has seen evidence of paid links, link exchanges, or other schemes that violate its quality guidelines - more specifically, participation in "link schemes" as Google puts it.

A few examples of what a link scheme could entail:
Buying or selling links that pass PageRank (ex: paying for a link from a site with a high pagerank)
Excessive link exchanging (ex: asking a site to link to you if you link to them)
Linking to web spammers or unrelated sites with the intent to manipulate PageRank
Building partner pages exclusively for the sake of cross-linking
Using automated programs or services to create links to your site

Removing Bad or Unnatural Links

The absolute best way to remove these unnatural links is to find out where the links are coming from and contact the website owner or webmaster directly. I myself have used our agency to contact such site owners individually, and I receive a weekly report of how many of the low quality links have been removed. In addition, I have contacted a few of the site owners myself. I either get no response at all - or a nice email asking for more details about where the link is located and what I would like removed. It's that simple...
Of course, I have heard stories from others in the industry with not as much luck or ease in this process. Some site owners have clued into these requests and actually ask for money for removal of the link.
But you really have to put the time in to find the link, contact the website admin and ask away.
A quick tool to help you determine what are good links, and what are bad links is the SEOmoz Open Site Explorer tool. You can plug in your URL and a full report of incoming links, page authority and domain authority will help you decide which links need to be removed.

I Tried - But I Still See Unnatural Links

Ok, so you did all you can, but you're still seeing bad or unnatural links pointing to your site. This is where the Disavow Link Tool will come in handy. But remember, don't get carried away with submitting a large number of links. Also take note that whatever link you disavow, you cannot reavow and get the credit you once had (according to Matt Cutts in this video).
1) Head to the Disavow Links Tool
2) You will be asked which domain you would like to use the disavow link tool for
3) You will be prompted to upload a file with the links you want to disavow



Disavow Text File

The file you will upload is a simple .txt file. You can easily create this in Notepad on Windows, just as you would your robots.txt file.

Within the txt file, any comments must follow a # sign - use comments to note the domain and what you have done to try to get the link(s) removed.
Example:

# Contacted owner of webdomain.com on 10/1/2012 to

# ask for link removal but didn't get a response
In the case of links from a whole domain that need to be disavowed, you can add a line starting with "domain:" followed by the domain itself.
Example:
  domain:webdomain.com
If you have worked with a webmaster who has removed some links, but not all, and you wish more be disavowed, then comment with the details and date of the request, then add a list of the individual links you want disavowed.
Example:


# Owner of otherdomain.com removed most links, but missed these
http://www.otherdomain.com/sampleA.html
http://www.otherdomain.com/sampleB.html
http://www.otherdomain.com/sampleC.html

Right now Google only supports one disavow file per domain, so create the file wisely. The name of the file doesn't matter as long as the file extension is .txt - though simply calling it "disavow.txt" is probably the safest route in case Google doesn't accept "-", "_", "=", or spaces.
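Putting the pieces above together, a complete disavow.txt (the domains and URLs are invented for the example) could read:

```text
# Contacted owner of webdomain.com on 10/1/2012 to
# ask for link removal but didn't get a response
domain:webdomain.com

# Owner of otherdomain.com removed most links, but missed these
http://www.otherdomain.com/sampleA.html
http://www.otherdomain.com/sampleB.html
http://www.otherdomain.com/sampleC.html
```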

For more information visit these links:
Google Webmaster Central Blog: Disavow Link Tool
Google Webmaster Tools: Disavow Links
Matt Cutts on Youtube - Disavow Links
Matt Cutts' PPT from Pubcon
Lisa Barone's Pubcon Live Blogging:  Google Announces New & Improved Disavow Link Tool
SEOmoz post by Dr. Pete Google's Disavow Tool - Take a Deep Breath

The Dos and Don’ts for Google’s New Disavow Links Tool

Direct Link to Disavow Links Tool


Tuesday, August 21, 2012

SEO - Panda and the Penguins

I am now two weeks into my position in San Francisco and find myself needing to express my deepest concerns with how the SEO industry has been representing itself. During my first week on the job I found several questionable SEO strategies implemented on the main site I was hired to work on. In addition, I found very aggressive link building techniques provided by the agency hired roughly a year ago. I'm not going to name the people or agency involved, to spare them, but I will go into some detail about what was done so that we can all learn from the experience.

Panda and Penguin

As a preface to what I am about to discuss: there have been some fairly recent, massive updates to Google's algorithms that have caused cries heard around the world from SEO's. Those updates are what are now known as Panda and Penguin. To get a better idea of what Panda and Penguin mean for SEO (no, they aren't the cute bear or bird we know from the Discovery Channel), check out the High-quality sites algorithm goes global, incorporates user feedback post from April 11, 2011 by Amit Singhal on the Google Webmaster Blog, and Another step to reward high-quality sites posted by Matt Cutts on April 24, 2012 (nearly one year later). In general, the updates target websites that "optimize" content for the sole purpose of getting rankings and that apply link building techniques focused on links alone rather than on quality content referring to their website.

Findings

The company I work for has hired the esteemed Laura Lippay (in addition to the agency) to bring another level of expertise to the very important SEO that drives a very large percentage of revenue to the site. In Laura's evaluation she found many odd links pointing to the website - more specifically, a large number pointing to lower level pages within the site. There were also numerous links from one page to the next, and content that seemed fairly cookie cutter and didn't quite make sense to the user. Laura had asked about the links in particular several times, with little to no response from the agency. Given that there was a lot more to be done with the site, Laura decided to focus on more important efforts.

Enter SEOGoddess...

Laura showed me what she had found in her early days working with the company, and I began to dig a bit further. As she was showing me, I noticed that the content linking from random sites that had absolutely nothing to do with the site they were linking to (for example: a hair advice website linking to a data processing website - note: the example has nothing to do with the site I am working on, but it is a similar scenario) all seemed to be very much the same. Each and every one of the pages of content also had a standard last paragraph with a couple of sentences containing 2 links with the most aggressive key terms in them. Hmmmm, I thought.

Laura copied the first couple of sentences and searched for them in Google. One of the hundreds of results looked like an article submission domain. We clicked through and looked around. Lo and behold, it was an article submission site, and the article that was submitted (among several others) was represented by our company's website brand. The articles were submitted around the time the agency was hired to work on SEO. Laura grabbed the screenshots for me to email to the agency.

While Laura was talking with another staff member at the company, I went to the agency's website to see if I could get a better idea of how they approach SEO. From the home page I clicked "What We Do" in their navigation and proceeded to the "Link Building" section of their website. There they listed an article, "Five Surefire Ways to Build Links and Increase Traffic to Your Website or Blog" - and in the article, the #1 surefire way was "Article Marketing", which then listed the websites they submit articles to, including the one Laura and I had found. I went to my desk and emailed the agency. I didn't ask if they knew anything about the article submission; I simply showed them what I found, how I found it, and then cited the article on their own website, leaving them no room to deny it or to not respond.

Getting Help from a Friend

While drafting the email to the agency, I wanted to make sure I was approaching the issue the best way possible. I knew Bruce Clay had talked about the link cleanup he has done for his clients at my Search and Social Hawaii conference back in September 2011. Bruce had advised one of our attendees, in an open forum discussion, that his staff would ask websites to remove the links. In some cases he was having to pay them to remove the links, and some were just outright difficult to deal with. Since I am in the process of restructuring the website, I thought, "Why not 404 the URLs the sites are linking to while I change the URL structure?" But before I went down that road (not so easy, considering some of the pages have good quality links, and determining which ones could stay and which would go would be a chore all its own), I asked Bruce what he thought our best strategy would be. He replied, of course, recommending that we evaluate the links pointing to the site, and to which pages.

His words exactly:
How important is the target landing page?
If home page you may be stuck.
If a sub-page, then an article in our newsletter on link pruning... http://www.bruceclay.com/newsletter/volume102/link-pruning-procedure.htm
It is hard, tedious work, especially since many junk sites are easily offended by de-link requests.

After my response, he mentioned that we could add the pages that we no longer cared about to the robots.txt and Google would then make the appropriate adjustments (in short). Perhaps I should write another blog post on link removal strategy and best practices.

Inspired Blog Posts and Conversations

Bruce Clay

Bruce Clay published a blog post on May 24, 2012, inspired by the email discussion he and I had as a result of the work the agency had done. After he shared it to the SEO Group on Facebook, it sparked quite a lot of comments about how our industry should police itself to use better strategies.
Bruce mentions in his blog post:
I believe that SEOs who openly engage in a practice that was always doomed are intentionally harming their clients, and this is grossly unethical. What’s even more unfortunate for site owners is that the repair on their sites with paid links usually costs more than the original damage. If a site owner spent $200 per month to buy links, they’ll likely spend 10 times that to correct damage bad links have caused, plus the loss of business until it’s fixed. The easy way out has led to a long, hard road to repair. The demand for an SEO penalty audit service is very high now, and the cost is significant – perhaps hundreds of hours (we happen to know; we offer a cleanup service ourselves). For example, if you are a well-known brand and your site has sufficient value, the repair can begin with just link pruning. However, if your site was based solely on paid links, it’s also very likely you won’t have enough quality content to rank even after fixing the link damage.
The Facebook post by Bruce sparked many 'Like's and over 75 comments. The discussion even started questioning and debating SEMPO's involvement in ensuring that agencies stay on the up-and-up.

Laura Lippay

On June 1, 2012 Laura Lippay posted her blog post "Is Your SEO Putting Your Company at Risk of Losing Everything? Take the quiz." in response to the article posting we had found. In her post she listed out a few questions to ask your agency to ensure they are on the up-and-up:
Does your SEO:
  1. Build content for the purpose of attracting search engines? [yes] [no]
  2. Report primarily on keyword rankings? [yes] [no] 
  3. Seem to always only have good news (but are eerily quiet when search traffic dips)? [yes] [no]
  4. Engage in massive linking campaigns, building gobs of meaningless links to your content from splogs and directories that no one really visits (especially your target audiences)? [yes] [no]
  5. Sending traffic to your pages by any means necessary? [yes] [no] [no idea (most people don't)]
If you said yes to any of those I’m going to ask you to think real hard about this for a second: Would you rather fire your agency or lay off your staff?

I had shared Laura's post to the same SEO Group on Facebook that Bruce had posted to, and once again the comments flew. This time the discussion went in a different direction: instead of going after SEMPO, it turned to name dropping and a bit of back and forth banter between Adam John Humphreys and John S. Britsios. The comments about what group members are seeing from SEO's are almost discouraging (though at this point I'm nearly beyond shock).

Keep checking the group to see what SEO's are saying, or even post a comment yourself.

In conclusion

What the agency did for our website wasn't entirely a bad thing at the time. Of course it is now, with the Panda and Penguin updates knocking all these linking strategy SEO's off their perch, but if you look at any agency I guarantee you will find that each and every one of them has "Link Building" (or the like) listed as one of their services. In fact, if I do a search on Google right now (not logged in), I see a list of paid search results in which the first links to an agency that lists "Link Building" right there on the landing page. The second is another agency that lists "Directory Listings", "Link Acquisitions", "Press Releases", and "Link Bait" as part of the strategy they provide for SEO.
In fact SEOmoz (a trusted resource for all SEO's) even recommends directory submissions as a way to obtain links in their "Professional Guide to Link Building", copyrighted in 2010 (just 2 years ago), stating:
Directories
Directories can be a great way to obtain links. There are a large number of directories out there, and they may or may not require money in order to obtain a listing. Examples of high quality directories that are free include:
   Librarian's Internet Index
   Open Directory Project (aka DMOZ)
Examples of high quality directories which require a fee are:
   Yahoo! Directory
   Business.com
   Best of the Web
A more comprehensive list of directories is available from Strongest Links.

As I explained to my boss and our General Manager, the agency was doing what they felt was good SEO strategy. They were acting on what every SEO agency and respected leader has done, and on common practices our industry has allowed for many years. They hadn't really done anything wrong.
I had even told the agency that I don't care who did it or why it was done, what I care about is that we get it cleaned up and move forward from here.

A Shift in Thinking

What we need to do now is shift our thinking back to what SEO was originally about, before SEO's started finding ways to get quick results. Create a website that is user friendly and present your product or service to your audience in a way that makes sense to them. Give them what they are searching for, and follow the rules that ensure the site will get crawled: make your key terms available in an easily digestible manner, and organize and structure your website so that every page is easily crawled. (Of course there is so much more to it than that - but you get the general idea.)
When people ask me, "Can I do X to my site to generate rankings?" my answer is more often than not: "If you are thinking of doing something with the sole purpose of generating rankings... then don't do it."

The Future of SEO

As a result of the Panda and Penguin updates, I am sure we will start to see agencies list "Link Cleanup" as a service they provide. Heck, Bruce Clay himself offers an "SEO Penalty Assessment Service" in which he evaluates and offers link cleanup for clients (which is why I asked him for advice in the first place).
My hope is that we start to see less spamming of comments to our blogs, less random directories showing up in search results, and more quality content when we search for that special item we want to buy, or continue to do research online.