
Monday, August 4, 2014

SEO

That's right, I am an SEO. So, what does that mean? It means that I optimize websites so that they show up in the search engines for certain terms, usually the terms your key audience is searching for.

Some Stats About SEO:
  • 93% of online experiences begin with a search on Google, Bing, or Yahoo!
  • Google owns 65%-70% of the market share.
  • 70% of users will click on SEO results over paid.
  • 70-80% of users ignore the paid ads, focusing on the SEO results.
  • 75% of users stay on the first page of search results (1-10th position).
  • SEO beats social media by more than 300% in traffic for most content sites.
  • Traffic from SEO has a 14% close rate, while outbound leads (such as direct mail or print advertising) have a 1.5% close rate.
  • On Google, roughly 18% of organic clicks go to the 1st position, and about 10% go to the 2nd and 3rd positions.
I primarily work as an in-house SEO, which means I work inside large companies rather than at an agency or as a consultant. I have helped some companies out as an SEO consultant in the past, but if it takes time away from my job and career I will generally recommend someone else to consult.

Most of my SEO consulting consists of reviewing the website and any issues the company is seeing. I review the analytics, looking at SEO traffic as well as traffic from other sources, and I look at Google's Webmaster Tools data to gauge how the current SEO is doing for the site and how the impressions for key terms compare to clicks. Sometimes a simple change to the meta tags, a different title and description, can increase the click-through rate from SEO and therefore increase traffic. In other cases, a complete restructuring of the site along with basic SEO implementation is needed in order to increase rankings.

After I review the site, I come up with a list of recommendations, each with an estimate of the effort required and the expected impact. The report also compares potential traffic against current traffic from SEO so that the client can see where the biggest gaps are. At times the reporting I send over can be pretty technical, but rest assured I spend time making sure all the data is easy to understand and that a clear direction is not only explained but spelled out in detail in the final recommendations. From there it is up to the client to decide whether to do the work themselves, have their current employees do it, hire an agency or another consultant, or have me do the work. Since I have a background in design and development, any work needed for SEO, or simply to increase conversion rates from SEO traffic, is fairly easy for me to do and can happen pretty quickly. It all depends on how much I have on my plate with my full-time job at the time the work needs to get done.

If you're not sure you want to have me, or someone else, optimize your site for SEO, it's no problem. Most people can pick up the basics of SEO themselves, and I always like to see clients having some understanding of SEO before I work with them. If they don't have time to learn, that's perfectly acceptable, as I can explain how things work in ways most people understand and pick up quickly. The following is a checklist I have come up with that will help anyone understand SEO and get started quickly and easily. Of course, search engines use a huge number of signals to determine which site gets to show up for a given term, but this will at least get you on your way to understanding the basics.
  1. Keywords – you can't do anything with SEO until you know what keywords you are optimizing for. Once you have your basic list, structuring your site, and any work you do on the site, around them will all fall into place. I usually recommend one or two broad terms that describe the website. These terms should usually be one word, and very rarely more than two. From there, a few two- to three-word terms that might describe a sub-category will help you structure your plan and organize for SEO. Your longtail terms (as SEOs put it) or exact match terms (as paid search people call them) are the phrases that are more specific. These phrases tend to be the biggest payoff for SEO since they represent searches where users really know what they want and are ready to buy, so they tend to convert faster and at a higher rate. I talk more about this in depth in my workshops, and in my book titled “Search and Social” that is currently in the works. So stay tuned for the book that helps you really understand SEO on a very detailed level.

    Keywords should appear in the following places for SEO:
    • Keywords in title tag - The title tag is what shows up at the top of the browser. It is also what search engines use for the title in the “snippet” that displays in the results after a search has been completed. Having your keyword in the title tag not only helps SEO, but will aid the click-through, as users will recognize the word they searched for within your title, encouraging them to click your result over the others on the page.
    • Keywords in URL – Getting the keywords in the URL is very important for SEO. Start with the broad terms in the domain if possible. If not, then put the category terms (the 2-3 word terms mentioned before) in a sub-directory, and the exact match longtail terms in the name (or as part of the name) of the file. Your URL hierarchy is very important for SEO, and having those keywords in there even more so.
    • Keyword density in document text – Listing out your keywords over and over again in a short paragraph will harm your SEO more than do any good. A good way to think about density is to picture a page with 3 paragraphs of about 150 words each. Let's say you need to mention your keyword 9 times to get rankings; that's 9 mentions in 450 words, or about 2%. If you mention your keyword 9 times in the first paragraph and not at all in the others, that's bad. The trick is to distribute the keyword evenly: mention it 3 times in each paragraph, spread throughout each one.
    • Keywords in anchor text – The anchor text is the text a user clicks on within a page's content to get to another page. The text that links back to your website should include the main broad keyword that describes the site. The trick is to make sure that the page, and the whole site, linking to your site is relevant to the word in the anchor text. If the linking site isn't relevant, that will actually get your site in trouble, and too many irrelevant links will cause you to lose rankings.
    • Keywords in alt tags - The alt tag is the alternative text that displays in the rare case that an image doesn't show up. It's a simple attribute that goes in the HTML that displays the image. For SEO purposes, having the keyword in the alt tag is important and will actually help rankings. Be sure to stick with only the words relevant to that page, and don't list all of the keywords out with commas. That will get a site in trouble.
    • Keywords in metatags – Be sure to get your keyword into your SEO meta tags: the title, description, and keyword tags that reside in the background of the HTML.
  2. Metatags – meta tags are (as explained above) the lines of code within the HTML that describe your page for SEO. This includes the title, description, and keyword tags (see the first example after this checklist).
    • meta description tag - The description tag should be no more than 150 characters, and include your keyword(s). Try to describe the page as much as possible for SEO while keeping in mind that the user will see this in the search results.
    • meta keyword tag - Some SEOs will say that keyword meta tags don't make a difference. Google doesn't really pay attention to them, but the meta-driven search engines will, and there are thousands of other search engines aside from Google. So, for SEO purposes, and to help keep the focus of the page, I recommend listing out the keywords in the keyword tag with the broad terms first, then the category terms, then the longtail. You never know, it might actually help SEO.
    • meta language - If the site is in English, adding the language meta tag will help the search engines know which language the site is in. If you have versions in other languages, make sure each one declares its language in the meta tag. In some cases it can really benefit SEO.
  3. Links – Internal – Linking internally to other pages of the site that are related to the page you are optimizing can be quite important for SEO. Almost as important (if not more so) as external links coming in.
  4. Anchor text has key term(s) in links – Just as important as the internal links themselves is having the keyword in the anchor text. I cannot stress enough how important it is for SEO to have those internal links, and the keywords in the anchor text.
  5. Content around the anchor text is relevant - If a section of pages is relevant to other pages, cross-linking them with a paragraph of related text before and after the link is very helpful for SEO.
  6. Content – content, content, and more content is the key to optimizing a site for SEO. Pages don't have to have large chunks of paragraphs; they can have words here and there throughout the page. Too many SEOs will put big blocks of content on the homepage of a website thinking that it will benefit it. Sure, it helps for SEO, but it looks horrible and users don't fall for it. A 10-word paragraph at the top describing the site, and then perhaps another block of text highlighting the value proposition of the product or service, placed in blocks around the page, is just fine. The trick is to search the term you are trying to rank for, look at the first few pages or sites ranking, and then look at how many words they have on their page and how many times they mention the keyword. Then, simply do a little more. Once you have that content in place, you're on your way to rankings.
  7. Unique content – Unique content is key to making sure your SEO is in place. Not only do you need to watch out for other sites having the same content you have, but look at the other pages of your own site. If a block of content is repeated on more than one page, that content just won't be counted towards SEO. If a page has less than 15% unique content, it can even work against SEO and get a site penalized. Sites that use tracking tags, parameters, or have issues with validating URLs can often run into duplicate content, and really harm the work they have done for SEO without even realizing it.
  8. Frequency of content change – A site that is recognized as a publication and pushes content several days a week (or even several times a day) will train the search engines to visit and see updates regularly. In this case SEO will work to their benefit with fresh content getting recognized and ranked quickly.
  9. Age of document - If a site is a brochureware site that doesn't update content often, the search engines will visit less, but give more value to the pages the longer they stick around. Pages that are years old will rank better than newly added ones. So keep this in mind for your SEO and your site. Are you a publication that pushes out content frequently and needs to get rankings fast, or are you a site that holds true with valuable content that gets better with age?
  10. File size – A page that takes a long time to load, or is extraordinarily large, can be quite detrimental to SEO. So be wary of how big that file is that you are creating.
  11. Content separation – As mentioned before, breaking up your content throughout your page is more beneficial to SEO than blocking out whole paragraphs. If the site is a known publication, or the section is a blog or article section of a site, then whole blocks of content are perfectly acceptable for SEO. But pages that go up and stick around a while with the purpose of providing marketing information should have content broken up throughout for SEO.
  12. Poor coding and design – This one gets overlooked a lot. Sure, search engines can't tell good design from poor design, but your users sure can. If a user comes to your site from Google and then immediately bounces, Google will mark down the value of that page for SEO, affecting your rankings. So pay attention to design, look at your bounce rate data in Google Analytics, and improve it as much as possible.
  13. Duplicating Content = NO – DO NOT DUPLICATE CONTENT… Just as mentioned earlier, this is very bad for SEO. If a page shares more than 80% of its content with any other page on the site, it can harm your SEO. So to be safe, make sure the content on every page of the site is more than 80% unique.
  14. Invisible text = NO – Invisible text is content a site hides from users while still letting the search engines see it. A div that is only one pixel high with overflow hidden, or white text on a white background (both leaving the text visible to search engines in the code), is a huge no-no for SEO and can actually get your site penalized. So don't do it!
  15. Domains & URLs – Check your domain and URLs often. Look for your keywords, check that the hierarchy is clear and set properly for SEO, and make sure there are no funky issues like stray parameters, URLs that can be changed by typing anything in, or redirects to some odd URL. Check your trailing slash and file extensions as well to make sure a wrong URL either 404s or resolves to the correct one.
  16. Keyword-rich URLs and filenames – Watch for those keywords in URLs and filenames. Long URLs that mention more than one keyword will cause issues, so always check and double check the URL for SEO before going live.
  17. Site Accessibility – Making your site accessible to those with disabilities, whether sight or, if you have video, hearing, can actually help your SEO.
  18. Sitemap – Creating a page that links to all of your pages can ensure that all of your pages are getting crawled for SEO. Many times I have seen websites with pages they don't link to, and they wonder why those pages aren't getting rankings. If search engines can't crawl a page, they don't know to rank it. You can also create an .xml file for Google, Bing, and Yahoo! site submission (see the sitemap example after this checklist). But do remember that an XML sitemap alone just won't cut it; you have to have links pointing to pages from multiple locations. Otherwise it just won't do any good for SEO.
  19. Website size - Keep an eye on the size of your website. Large corporate sites like Amazon.com and MSN.com are expected to have thousands if not millions of pages. If yours is a small to medium-sized company and website, yet the search engines somehow crawl millions of pages, then you need to take another look at your SEO. Check your parameters, or other issues that might be generating more pages than your site should have.
  20. Website/Domain age – The older the website the better. A brand new site that is loaded with pages and pages of content all in one day will get added to the "sandbox," as we SEOs call it. It will sit there for a few months before the search engines even give it the time of day. The reason is that search engines want to make sure the site is legitimate and not just a spam site there to grab rankings. To keep your site from falling under this category, having an older domain is key. If you have a new domain, roll out your pages slowly. Push a section one week, wait a few weeks, and push out another section. Having a blog is also good for SEO, as you can add posts with content encouraging search engines to keep coming back regularly and learn that this site has something interesting and unique. Of course, the more traffic you can get in those first few months the better, so get your social media and advertising up and going.
  21. File Location on Site – This falls under the URL hierarchy category. Watch out for where pages and files are located on the site. For SEO and for your users, the structure and location should make sense.
  22. Domains versus subdomains, separate domains – Watch out for the use of sub-domains on your site. Too many websites will put their blog on a sub-domain rather than in a directory. This won't hurt your SEO, but it won't help either. What happens is that the search engines count the subdomain as its very own website and don't link the content with the rest of the site. It is more beneficial for SEO to have all of your content, no matter what it is, on your main domain in a directory. Keep it out of a sub-domain unless absolutely necessary.
  23. Top-level domains (TLDs) – Strictly speaking, the top-level domain is the ".com" part; yoursite.com is your main domain, and even the "www" in www.yoursite.com is technically a sub-domain. Yes, a sub-domain… So try to use http://yoursite.com if you can. If the search engines already recognize your www.domain.com, leave it alone, and let Google know that you prefer the www version over the bare domain. You can do this in your Webmaster Tools (and see the redirect example after this checklist for enforcing one version).
  24. Hyphens in URLs – For SEO, it is recommended that you use "-" in your URL rather than "_" or even just a space (which ends up rendering as %20). Search engines just happen to prefer the hyphen to the underscore or space.
  25. URL length – For SEO purposes try to keep your URL under 2000 characters, but really the shorter the better. Pay attention not to have more than 3-4 parameters, or a URL that has a really long sentence.
  26. IP address – Your IP address should reside in the country your website is ranking in. A US English site should have an IP located in the US; a French-Canadian site should have an IP in Canada.
  27. robots.txt – Blocking irrelevant content in the robots.txt will really make a difference for your SEO. It was recommended in the past to block external CSS and image directories, but Google has since said they would like to crawl them. Search engines are getting sophisticated enough to decipher all of the code and really get a good idea of what the whole website is about. So only block pages and content you really don't want search engines to crawl.
  28. Redirects (301 and 302) – For SEO, redirecting an old URL to a new URL will usually pass the old URL's value to the new one (see the redirect example after this checklist). But be careful to use 301 redirects sparingly. I personally have witnessed and dealt with sites where too many 301 redirects caused rankings to drop.
  29. Social Actions – Social actions like Facebook 'likes', tweets, shares, Google +1s, and so on will really add a lot of value for SEO. Any action a user takes to show they see value in the page signals to the search engines that the page is relevant and valuable, increasing your rankings.
    • Google+ – Yes, Google loves their social media site, and providing a way for users to +1 your page and site will drive up rankings in Google.
    • Facebook 'Like' or 'Recommend' – The action of 'Liking' a page on Facebook will sometimes help with Google, but really helps with Bing more than anything. Microsoft and Facebook have a very close relationship, allowing Bing to use social actions that happen in Facebook to help drive rankings for sites.
    • Facebook comments – If you can, try to pull comments about your site and the page that happen on Facebook into the page itself. It not only adds more unique content, but shows Bing and other search engines that the content on the page is valuable to users, driving up your SEO.
    • Twitter "tweet" - A simple tweet with your page's URL will always be counted as a “vote” for your page and website. The more you can get, the better for SEO.
    • OGP - Open Graph Protocol – OGP was developed and adopted by Facebook as a way to manage how a page or website looks when shared in social channels. Twitter and other social sites have followed suit, and my prediction is that Google will start to pay attention to OGP soon. So be sure to spend the time and make sure your basic OGP tags are set on all of your pages (see the OGP example after this checklist). It could really help your SEO.
  30. Links – External – Links pointing to your site are important. As mentioned earlier, tread very carefully with your link building. Make sure that the page(s) linking to your site and pages are relevant to your site. Do not use directory submission websites, don't buy links, and be wary of link exchange requests. Just as external links can benefit SEO, they can also harm it if not done properly. Keep the following in mind for your SEO:
    • Quality of source of inbound links
    • Links from similar sites
    • Links from .edu and .gov sites
    • Age of inbound links
    • Links from directories
    • Links from Social Media
    • Links on pages that include social actions
  31. Schema – Google places a high emphasis on schema tags and information. In the past they said that if you can get it in there, great. Now they look at schema information to help drive rankings for SEO. Not to mention that you can manage what is displayed in your snippet, from star reviews to author information, embedded video, and more (see the schema example below).
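
To make the keyword and metatag items above concrete, here is a minimal HTML sketch pulling together the on-page pieces from the checklist: the title tag, meta description, keyword and language tags, an alt attribute, and an internal link with the key term in the anchor text. The domain, URLs, and keywords are invented placeholders, not a real site.

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Title tag: keyword first; shows at the top of the browser and in the search snippet -->
  <title>Used Cars for Sale in Seattle, WA | ExampleSite</title>
  <!-- Description: under ~150 characters, written for the searcher who sees it in the results -->
  <meta name="description" content="Browse used cars for sale in Seattle, WA. Compare prices, photos, and dealer listings.">
  <!-- Keyword tag: broad terms first, then category, then longtail -->
  <meta name="keywords" content="used cars, used cars seattle, used cars for sale in seattle wa">
  <!-- Language tag so engines know which language the page is in -->
  <meta http-equiv="content-language" content="en-US">
</head>
<body>
  <h1>Used Cars for Sale in Seattle, WA</h1>
  <!-- Alt text: only the keyword relevant to this page, no comma-stuffed lists -->
  <img src="/images/seattle-lot.jpg" alt="Used cars for sale in Seattle">
  <!-- Internal link with the key term in the anchor text -->
  <p>Compare prices on <a href="/used-cars/seattle-wa/">used cars in Seattle</a> before you buy.</p>
</body>
</html>
```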
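For the sitemap item (#18), the XML version is just a list of URLs in the sitemaps.org format that you can submit to Google, Bing, and Yahoo!. A minimal sketch, with a hypothetical URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; lastmod and changefreq are optional hints -->
  <url>
    <loc>http://www.example.com/used-cars/seattle-wa/</loc>
    <lastmod>2014-08-04</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

Again, the XML file supplements real links pointing at your pages; it doesn't replace them.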
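For the redirect items (#23 and #28), here is one common way to set up 301s, assuming an Apache server with mod_rewrite enabled; example.com and the paths are stand-ins. If your preferred version is the bare domain rather than www, flip the rule around.

```apache
RewriteEngine On

# Send the bare domain to the www version with a single 301,
# so only one canonical hostname gets indexed
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Map a retired URL to its replacement so the old URL's value passes on
Redirect 301 /old-page.html /new-section/new-page/
```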
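For the OGP bullet under item 29, the basic set of Open Graph tags goes in the head of the page and looks like this; all of the values are placeholders:

```html
<!-- Basic Open Graph tags; og:image controls the thumbnail shown on a share -->
<meta property="og:title" content="Used Cars for Sale in Seattle, WA">
<meta property="og:type" content="website">
<meta property="og:url" content="http://www.example.com/used-cars/seattle-wa/">
<meta property="og:image" content="http://www.example.com/images/seattle-lot.jpg">
<meta property="og:description" content="Browse used cars for sale in Seattle, WA.">
```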
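And for the schema item (#31), a small sketch using schema.org microdata, here marking up an aggregate review rating, one of the pieces that can put stars in your snippet. The product and numbers are invented for illustration:

```html
<!-- Microdata from schema.org; AggregateRating is what drives review stars in the snippet -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">2011 Example Sedan</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span> out of 5
    based on <span itemprop="reviewCount">27</span> reviews.
  </div>
</div>
```
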
Of course there are countless signals (and constant algorithm updates) that search engines use to determine rankings, leaving the list I gave you here a small set of what really goes into optimizing a site. In all of my years optimizing websites, I try to write blog posts when I come across issues or get into deep-level discussions with my peers on SEO topics. But again, there is so much involved, and sites are all different from one another. I have been teaching workshops since 2007 and have seen thousands of individuals try to learn SEO and optimize their own sites, only to find that they still need the help of an expert.

What I suggest is that you learn the basics, as much as you can, and start optimizing your site yourself. If you have a site that is older and hasn't been touched in years, go through and see if there are sections and pages you can add with some unique content to build on what you already have. If your site is larger and the traffic just isn't where it should be, then look at what you can do to restructure it to reflect the categories and longtail terms you found in your keyword analysis.

If you want to see how your different categories of terms are performing, you can use this handy template I created, along with instructions on how to grab the traffic you are seeing. For some clients, I have used the template to show the estimated traffic from the keyword analysis compared to actual current traffic, to show what is missing. I will use the top few terms in the keyword analysis to see how aggressive the category terms will need to be to get rankings during the competitive report for SEO. The categories with the most potential, the largest gaps, and the least aggressive competition are the ones I recommend tackling first. The competitive report will also help determine what needs to get done to generate rankings. Is it just one page with a bunch of content and the word mentioned several times, or is it a whole directory of files and filenames that include a myriad of terms for SEO and all link to one another?

For usedcars.com, the location pages where we generated rankings for the “used cars in”… city and state searches were fairly easy for SEO. The content has a few lines of text seeded with the city and state from the database (also known as templatized content). Content for the page also came from inventory (car listings) provided by the database: a block of normal listings in that city, and a block of deals, where a back-end calculation takes the price of the car, looks up that VIN and price against the Kelley Blue Book value, and returns the percentage difference, surfacing cars that are priced under value and are a good deal. Users love those listings. There is also a large map showing dealerships in the usedcars.com system that are located in that area. The map is generated from Google and helps those pages get rankings for that location.
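
As a rough illustration of how that templatized content works (a hypothetical sketch, not the actual usedcars.com template), the markup seeds the city, state, and listing data from the database into the headings and copy:

```html
<!-- {{city}}, {{state}}, and {{listing_count}} are placeholders filled from the database -->
<h1>Used Cars in {{city}}, {{state}}</h1>
<p>Browse {{listing_count}} used cars for sale in {{city}}, {{state}},
   including deals priced below their Kelley Blue Book value.</p>
```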

Those pages were pretty easy to get ranking (after a lot of the mess was cleaned up), and have held rankings, providing close to 50% of the traffic from SEO for that site.

A more complex project for usedcars.com, one that required more pages and more aggressive SEO, is what we called the Make/Model project. The goal was to get SEO rankings for the brands of cars and for the searches that include the year. We found that users who search “year make model” know exactly what they are looking for and are more likely to purchase. So ranking for all of those year, make, and model combinations was highly valuable to the business. The problem is that all the other car sites know the same strategy and have been very aggressive with their SEO. A set of rules for syndicated and dynamic content was set in place, along with a plan to roll out pages and content in phases. When I left in May of 2014 the project was still underway, but the pages were already seeing some traction. You can see how the pages were developed at http://www.usedcars.com/car/ - assuming they are still intact and working on the pages as specified in the project.

I'm always happy to talk SEO with anyone, anytime. You can find me on Skype (as SEOGoddess) or fill out the contact form on my site here with any questions. I'm usually pretty quick to respond, and can help with any quick SEO issues or questions as you try to optimize on your own. I have even been known to look at a website while an agency is working on the SEO, just to make the site owner or boss feel comfortable that their agency really knows what they are doing.

There are also many resources other than myself or this blog, and plenty of SEOs with a lot of great experience. Ian Lurie is one of my favorite people in the world, and has a very successful agency with a lot of great SEOs he has taken under his wing and turned into skilled professionals. His company Portent can also help with website design, social media, and paid search marketing. Give them a glance and see if they fit your needs. Bruce Clay is also a very close friend and someone I go to regularly myself for help. He works with very large corporations on a large scale, including AT&T, CNN.com, Edmunds, and more. He is what some of us in the SEO industry call the “Godfather of SEO,” since he was one of the original SEOs who set the standards for quality in optimizing.

I do have a larger list of SEOs I know and trust, so feel free to contact me and ask me for someone in your area, or who might specialize in a site that is much like yours.

Either way, SEO can be fun and you can really learn a lot quickly if you want. You can know enough to be dangerous, but if you stick with the general rule of “don't trick the search engines” you should, for the most part, be just fine.

In the end, a site with increased traffic from SEO is a site that is generating a lot of money, and that's just good for business.

Friday, January 18, 2013

SEO Issues - is it Penguin? Is it Panda? or is it me?

The following story is one that has been several months in the making. It's one that I have lived through one too many times as an SEO, and I am sure other SEOs have faced it too. I fought with the thought of writing this for fear that someone from the company might read it and get angry that the story is told. But it's something I think people out there can learn from, and it speaks to so many others in this industry, showing them that they are not alone.

It's long, it's a bit technical (I tried to keep it simple), and it has some personal frustrations laid out in words. My only hope is that you get value out of reading this as much as living it has made me a better person (or well, a better SEO).

It Begins


I started working on this website's SEO in May 2012, at which time I was told the site's traffic was declining due to Panda updates. In February of 2012 the traffic from SEO was the best they had ever seen, but soon after that there was a steady decline.
Traffic from February 2012 - May 2012
Before digging into any possible SEO issues, I first checked Google Trends to ensure that the decline wasn't searcher related. Oftentimes a drop in traffic just means that users aren't searching for the terms the website ranks for as much as they were in the past.

Top Key Terms in Google Trends
Looking at the same time frame as the traffic data, I saw that searches for the top 3 terms the website ranked for were up overall, with a dip from March to April that roughly matched the traffic. But the website's traffic dropped from April 23rd to the 24th, and then significantly on the 25th. The website I was working on had two SEOs already working on it: an agency and a consultant. Both had already done a great deal of research and some work to get the website on track, and both were stressing that the drop in traffic was due to Google's Panda updates. I looked at SEOmoz's Google Algorithm Change History and found an update to Panda on April 19th and an update to Penguin on April 24th. Given that the traffic dropped significantly on the 24th, my best guess was that it was Penguin related, but it still needed further exploration.

Figuring Out What Was Hit by Penguin.


The site is/was broken up into sections by keyword focus. I could tell that at one point someone with a good head on their shoulders for SEO had worked on it, but the strategy used was outdated. Perhaps the site was originally optimized several years before and just needed some cleanup to bring it up to 2012's optimization standards. So, understanding Penguin and identifying which part of the site was driving the bulk of the organic traffic was my next step in solving this mystery. Once I understood why, and where, I could start to establish what to do to solve the problem.

I broke the site traffic report out by sections as best I could in Google Analytics. It was a bit of a struggle, as all of the pages of the site resided at the top level of the main domain. Without a hierarchy in place, breaking out the sections had to be done with a custom report and head-match filters on landing pages. I hadn't had to do this before, so the agency already working with the site helped build the first report, and I built out the other reports from there.
Section 1 over 72% of traffic

Focusing on just April and May, I created a dashboard in Google Analytics for organic traffic, broken out by the sections of the site. Section 1 was the bulk of the traffic with over 72%, and Section 2 came in second with just over 15%. Sub-pages of Section 3 and other one-off pages make up the difference.

Both Section 1 and Section 2 dropped off after the April 24th date, so clearly they were the bulk of what was pulling the overall traffic numbers down. Since Section 1 was the majority of the traffic, I proposed to the executive responsible for the site that we address any issues with that section first.

Actual screenshot of Section 1 presented
I took all of the research from the agency and consultant and we quickly reworked the pages to represent a hierarchy in the URL structure, and cleaned up any issues from the outdated optimization that was done.

Soon after Section 1 was addressed, we did the same with Section 2, and then worked on Section 3 (and sub pages, rolling them up into a solid section) and then added a few pages to grab any new opportunity.

Not Quite as Easy as It Looks


The projects were launched in increments: first the URL hierarchy fix for Section 1 and then its page redesign, next a full launch of URL fixes and page redesign for Section 2, and lastly Section 3 and the new Section 4.
Section 1 - Section 2 - Section 3 Launch Dates and Organic Traffic
Soon after Section 1 launched, traffic started declining rapidly. I was asked several times why traffic was getting worse, and I started digging some more. Every time I looked at impressions, the new URLs from Section 1 weren't getting any traction, but the previous URLs still were. I began looking at the history of the website, trying to find out why it had done so well at one point but wasn't doing well now. One thing I noticed was a lack of priority linking to these pages, though at some point there had been links to some of them individually from the homepage. Google infers a hierarchy of pages from the directory structure and from where links appear on a site. This site had every page on the first level and linked to those pages from the homepage, which told Google that every page was the most important page. That worked at one time, but as Google rolled out their 2012 updates these pages were getting hit, and those links on the homepage weren't there anymore. Before the launch of Section 2, I had them put links to the main directory for each section on the homepage. The links would tell the search engines that these are important pages of the website, without being so obnoxious as to have a dozen or more links on the homepage discouraging users (avoiding the appearance of spamminess).

But even after adding the links to the homepage, the traffic to those pages was still declining. Pressure was put on me to figure out what was wrong. With accusations flying that I had single-handedly ruined the SEO for the site, I spent every waking hour looking at reports and trying to figure out what was going on. I consulted friends in the industry and read every article I could find to figure out which Panda or Penguin updates were affecting these pages.

Then it hit me: just as the links to these sections would help them get recognized as important pages, so would every other page linked from the homepage. In fact, a set of homepage links pointed to the website's search results with queries attached to them, mimicking pages but showing search results. On those search results pages there were over 200 links with multiple (we're talking hundreds, possibly thousands of) combinations of parameters. The bots were coming to the homepage, following the links to the search results pages, and then getting stuck in this vortex of links and parameter-generated URL combinations, leaving no crawl time for the pages that once had rankings. This also explains why the new URLs weren't showing very many impressions in the Webmaster Tools data: those pages just weren't getting crawled.

There was a project underway that would solve the many links on the search pages, and there was also talk of using Ajax to show the results. Once that project launched, the bots would go to the URL from the homepage but then essentially not go much further. With the project a few months out, I made the case to add the search page to robots.txt so the bots could get back to recognizing the Sections as important pages. After several weeks of attempting to convince the powers that be, the URL was eventually added to the robots.txt file.
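
For reference, blocking a search-results page in robots.txt looks something like the snippet below; the /search path is a stand-in, as the actual URL on that site was different:

```
User-agent: *
# Keep crawlers out of the parameter-generating search results
Disallow: /search
```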

Immediately after the search page was added to the robots.txt, Google Webmaster Tools presented me with a warning:
Warning in Webmaster Tools
A warning from Google should never be taken lightly, but in this case it was exactly what I wanted. In fact, it proved to me that my theory was correct and that the site was, hopefully, headed down the right path.


Panic, Questioning, and a Third Party


As with every up in the SEO world, there must be a down. Soon after the search result page was added to the robots.txt, the organic traffic to the site dropped, and continued to drop. Throughout those grueling three months there were several Google Panda and Penguin updates. I documented each and every one of them in Google Analytics, and continued answering questions, gathering data, and dealing with close scrutiny from people who suspected the work I was doing was complete BS.
Organic Traffic from September 2012 - November 2012
I sat in numerous meetings, some of which I walked out of crying (I'm not afraid to admit it), being questioned about the road I had taken and why we weren't seeing results. There were people within the company recommending that they roll the pages back to where they were before, and even change the URLs. I fought hard to keep them from touching a thing. I sent around an article posted on Search Engine Land by Barry Schwartz citing Google's patent that "tricks" search spammers.

The patent states:

When a spammer tries to positively influence a document’s rank through rank-modifying spamming, the spammer may be perplexed by the rank assigned by a rank transition function consistent with the principles of the invention, such as the ones described above. For example, the initial response to the spammer’s changes may cause the document’s rank to be negatively influenced rather than positively influenced. Unexpected results are bound to elicit a response from a spammer, particularly if their client is upset with the results. In response to negative results, the spammer may remove the changes and, thereby render the long-term impact on the document’s rank zero. Alternatively or additionally, it may take an unknown (possibly variable) amount of time to see positive (or expected) results in response to the spammer’s changes. In response to delayed results, the spammer may perform additional changes in an attempt to positively (or more positively) influence the document’s rank. In either event, these further spammer-initiated changes may assist in identifying signs of rank-modifying spamming.
But the article and my pleas fell on deaf ears...

Things had gotten so heated, and there was such fear that nothing was being done while traffic significantly declined, that the company brought in yet another SEO consultant to look at the site objectively.

Just as the consultant was starting his audit, and traffic hit the lowest I ever thought it could possibly go, the tide turned: the next day traffic went up. In the last week of November (roughly 3 months after we blocked the search result page) I saw an increase in traffic to Section 1 in Google Analytics:
Section 1 Organic Traffic
I quickly pulled up my report to check the Section's impressions from the Webmaster Tools data, and there was a significant increase as well:
Section 1 Impressions from Webmaster Tools Data
On December 3, 2012 I logged into Webmaster Tools and saw that the warning had gone away:
It was the "hallelujah" moment that every SEO dreams of, and very few get. All the work I had done, the fighting for what I believed in, finally paid off.

To this day traffic continues to increase. We can now focus on some of the cleanup still left to do, and then move on to projects that will attract new opportunity.
Organic Traffic from November 2012 - January 17, 2013 (the day before this post was written)
Quick Note: 
I forgot to mention a post I wrote months ago while going through all of this - SEO - Panda and the Penguins. It helps to give a bit of perspective of some of the linking stuff I didn't get into in this post. 

Wednesday, October 26, 2011

Why the Google Changes? Ooh Ooh - I get it!

Stripping out "+" in searches and not providing keyword referrals in analytics
Last week I posted about Google's announcement to stop reporting on referring key terms in Analytics, and I have been keeping up to date as much as I can with all the news around it since then.

Two days ago Barry Schwartz posted an article to SEL about the changes in how we search on Google: Google has removed the ability to use the "+" in advanced search.

Google themselves said:
"We're streamlining the ways you can tell Google to search for the exact keywords you type, whether it's an exact phrase or a single word, by focusing on the functionality of the quotation marks operator. So, if in the past you would have searched for [magazine +latina], you should now search for [magazine "latina"] to get the same results."

So it hit me this morning...

Since the launch of Google+ several months ago, as an SEO, I have at times found it difficult to search for "Google+" or even the "+1 button". I am sure that since the launch, Google themselves have had trouble seeing referring key terms. In the past, the referring URL would have a "+" in between the terms. So if someone searched "Google+", the referring URL would strip out the "+", and those monitoring the referring terms for Google+ would just see "Google" as the referring term. Their question would be: did people search "Google" or "Google+"?
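
To make the ambiguity concrete, here are two hypothetical referrer strings. In Google's q parameter a "+" is the standard encoding for a space, so once the literal "+" a searcher typed is stripped, the two searches look alike:

```
# User searched: google analytics
http://www.google.com/search?q=google+analytics

# User searched: Google+  (the literal "+" gets stripped)
http://www.google.com/search?q=google
```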

With Google+ itself under a microscope after the deaths of Google Wave and Google Buzz, I can see someone telling the powers that be that this needed to get fixed. Otherwise they couldn't accurately judge whether Google+ was going to succeed.

The next step in this process is to strip the referring URLs of their "+" in between key terms. This unfortunately directly affects analytics as companies won't be able to accurately see referring search terms anymore.

So now Google just needs to fix the tracking of referring terms somehow. Google doesn't want to miss out on that data any more than we do. So be patient, it will come back again...

Tuesday, October 18, 2011

Google Secure Search and what it means for SEO's

I was editing videos from Search and Social Hawaii diligently and getting ready for my talk on SEO this Thursday when I saw a post update from the Google Analytics Blog addressing the announcement that Google is going to make search more secure. So I came out of my hole for a moment to draft up a quick blog post to clear up any questions.

Google says:

"As search becomes an increasingly customized experience, we recognize the growing importance of protecting the personalized search results we deliver. As a result, we’re enhancing our default search experience for signed-in users. Over the next few weeks, many of you will find yourselves redirected to https://www.google.com (note the extra “s”) when you’re signed in to your Google Account. This change encrypts your search queries and Google’s results page. This is especially important when you’re using an unsecured Internet connection, such as a WiFi hotspot in an Internet cafe. You can also navigate tohttps://www.google.com directly if you’re signed out or if you don’t have a Google Account."


What does this mean for SEO's?

Rankings:
It doesn't directly affect rankings as a whole, but it does affect the individual user's results, as they will see a more personalized list of websites in their search results. In all honesty, that doesn't change much from what we SEOs have been working towards for several years now, ever since Google set up Gmail and a login feature for their products and searches. It just means that instead of a few users seeing personalized results, more users will start to see personalized results. So don't focus on whether or not your target audience is going to be logged in and what they might see in their personalized results; assume that all of them are.

Analytics:
Now here's where it gets tricky. The Google Analytics team is working very closely with the rest of the company to ensure that data is being passed showing the referring URL and terms from paid and natural search results.

"How will this change impact Google Analytics users?
When a signed in user visits your site from an organic Google search, all web analytics services, including Google Analytics, will continue to recognize the visit as Google “organic” search, but will no longer report the query terms that the user searched on to reach your site. Keep in mind that the change will affect only a minority of your traffic. You will continue to see aggregate query data with no change, including visits from users who aren’t signed in and visits from Google "cpc"."


What is Google Analytics doing about it?
We are still measuring all SEO traffic. You will still be able to see your conversion rates, segmentations, and more.

Which is great for those of you who have Google Analytics installed on your websites, but what about Omniture or Webtrends?
In 2005, when I was working with Omniture to start showing referring key terms for SEO and streamlining our PPC within the system, they were able to crack the code and get us the robust tracking we see today. While I don't see a blog post on either the Omniture or Webtrends blogs (I'll add the links in a comment as soon as I see something), I can assure you that Google doesn't want to hide anything from us SEOs and marketers. They want us to be able to see the referring traffic and assess what is working and what isn't. If we can't see it, we can't create a better user experience, and that would go against all of Google's ideals.

For those of you who rely solely on Omniture (or have clients that do), take a moment and consider adding Google Analytics tracking to the website (or talk to your clients about it) for the time being.

So never fear: rankings will still go on as usual, and the ability to track in Google Analytics won't be affected in any way...

Carry on...