

Monday, February 19, 2018

Keyword Categorization

What is Keyword Categorization?

Simply put, keyword categorization is the process of organizing search phrases by the words they have in common. Organizing your keywords helps you structure your website so that it captures every phase of the buying process.

In my 2013 post "Categorizing Keywords" I covered the importance of categorizing keywords, along with a quick way to do it in Excel. That's a great strategy when working with a smaller set of terms, but what happens when you need to categorize much larger sets, even millions of terms, that will most likely break your Excel document?
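If Excel is going to choke on the volume, the same tagging idea is easy to script. Here's a rough Python sketch; the category word lists and sample rows are completely made up for illustration:

```python
from collections import defaultdict

# Hypothetical category lists -- in practice these would come from your
# "Lookup" criteria (makes, models, intent words, etc.).
categories = {
    "make": ["ford", "toyota", "honda"],
    "intent": ["for sale", "used", "priced under"],
}

def categorize(term):
    """Return every category with at least one word found inside the term."""
    hits = [name for name, words in categories.items()
            if any(w in term.lower() for w in words)]
    return hits or ["uncategorized"]

# Tally impressions per category -- the same job as the SUMIF lookup,
# but streaming row by row, so millions of terms won't blow up memory.
totals = defaultdict(int)
sample_rows = [("used ford f-150 for sale", 1200), ("toyota camry review", 800)]
for term, impressions in sample_rows:
    for cat in categorize(term):
        totals[cat] += impressions

print(dict(totals))
```

In practice you'd stream the rows out of your Search Console CSV with the csv module instead of hard-coding them.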

There are tools like Conductor's Searchlight and Keylime that are great for dynamically tagging your terms with common words so that you can quickly categorize them. As new terms show up in your search data, the dynamic tagging picks them up without any additional effort from you. So easy!

Not every business has thousands of dollars a month to shell out for these awesome tools, though, so you'll need to tag terms manually or set up your own means of tagging. Using Excel, you can export your terms from Google Search Console, and a simple SUMIF formula will pull in the totals for your terms.

For this lesson I am going to use my keyword list around auto make, make-model, and year-make-model searches, categorizing it into those three distinct categories, with additional tagging to understand how many terms are reviews and how many are high-intent long-tail terms, such as "for sale" phrases that are likely to convert.

We'll start with our exported CSV file, which you'll pull from Google Search Console, Keylime, or Keyword Planner. It should look something like this when you first open it up:
Downloaded CSV File from Google Search Console
Downloaded CSV File from Keyword Planner
As you can see, you will have three to five important columns. The Keyword column holds the list of terms you're going to set your lookup against. The data that follows falls under Impressions and Clicks, and additionally Click Through Rate and Average Position. We're going to focus on Impressions and Clicks for this exercise and use a formula to calculate the CTR from there.

Start by saving your document as an Excel Workbook. Formulas won't work in CSV.
Saving CSV as Excel Workbook
Next up, you'll want to create a new tab. I usually name the tab "Lookup" so that I know this is where all the criteria I want to look up are located.
Adding a new tab in Excel called "Lookup"
To make it easier to type in formulas and manage them, I like to name the ranges of criteria I will be using. In this case we are naming A2 through A55 "Make", then doing the same with "MakeModel" and "YearMakeModel". I additionally name the "All" tab columns "Term" for the keyword list, "Impressions", and "Clicks".
Excel naming feature

How to Sum Cells by List of Criteria

This formula and method only looks for exact matches between the criteria phrases and the keyword phrases in the cells, then adds them all up into a total.

The formula will look like this:

=SUMPRODUCT(SUMIF(Term, Make, Impressions))

How the formula works: SUMIF, given the range (Term), the criteria (Make), and the sum range (Impressions), returns an array with one subtotal per criterion in the Make list. SUMPRODUCT then adds those subtotals together into a single total. Only the terms that exactly match an entry in the criteria list are added up, or "summed".
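If it helps to see the moving parts, here is the same calculation sketched in Python; the lists and numbers are invented:

```python
# terms and impressions stand in for the named ranges on the "All" tab;
# make stands in for the criteria list on the "Lookup" tab.
terms = ["ford", "toyota", "ford focus", "toyota"]
impressions = [500, 300, 200, 100]
make = ["ford", "toyota"]

# SUMIF(Term, Make, Impressions) produces one subtotal per criterion
# (exact matches only)...
subtotals = [sum(i for t, i in zip(terms, impressions) if t == crit)
             for crit in make]

# ...and SUMPRODUCT adds the subtotals into a single grand total.
total = sum(subtotals)
print(subtotals, total)
```

Note that "ford focus" contributes nothing here, because the match has to be exact; the wildcard version below is what catches it.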

Excel Sum cells by list of criteria

Excel Sum Cells by List of Criteria and is Included (not Exact)

In this case we want to add up all the keywords that contain a few specific words somewhere within them, so we need to use the SUM formula to add up a SUMIF for each of the words in our Lookup. Here we have set three terms that we believe signal a high user intent to purchase: "for sale", "used", and "priced under".

A normal SUMIF sets the range we want to look up, the criteria of what to look for, and then what we want totaled. In this case the criterion is a word located somewhere in the phrase, so we will use a wildcard before and after it. Ex: SUMIF(Term,"*for sale*",Impressions).

Because we are pulling the criteria from a cell, we need to concatenate the asterisks around the cell reference to complete the formula. Ex: "*"&Lookup!E2&"*"

Since the Intent list just includes three words we'll SUM all three as a total.

=SUM(SUMIF(Term,"*"&Lookup!E2&"*",Impressions),SUMIF(Term,"*"&Lookup!E3&"*",Impressions),SUMIF(Term,"*"&Lookup!E4&"*",Impressions))
Excel Sumif list within terms in cell
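For comparison, here's the "contains" version sketched in Python, with the CTR step thrown in; the terms and numbers are invented. One small difference from the SUM of SUMIFs: a term containing two intent words gets counted twice by the Excel formula, but only once here:

```python
terms = ["used ford focus", "ford focus for sale", "ford focus review"]
impressions = [400, 250, 150]
clicks = [20, 30, 3]

# The "*" wildcards in "*"&Lookup!E2&"*" amount to a substring test.
intent = ["for sale", "used", "priced under"]

def matches(term):
    """True when any intent word appears anywhere in the term."""
    return any(w in term for w in intent)

total_impr = sum(i for t, i in zip(terms, impressions) if matches(t))
total_clicks = sum(c for t, c in zip(terms, clicks) if matches(t))

# The =C3/B3 step: clicks divided by impressions, shown as a percentage.
ctr = total_clicks / total_impr
print(total_impr, total_clicks, f"{ctr:.1%}")
```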
To calculate the Click Through Rate we'll just divide the Clicks total by the Impressions total and format it as a percentage.
=C3/B3

I have saved the Excel document I used for this example in my Google Drive so you can download it and play with it. The numbers for the terms are completely falsified, so don't use them to optimize your auto site.

Depending on what data you need to look at for your terms, you can use these formulas to quickly add up totals to help you understand how your SEO is working, and determine what you will need to build out to capture new opportunities.

Sunday, February 23, 2014

Update: SEO Issues - is it Penguin? Is it Panda? or is it me?

It was a little over a year ago that I posted "SEO Issues - is it Penguin? Is it Panda? or is it me?", in which I detailed the struggles I had faced recovering a website that had been hit hard by the Penguin and Panda updates. I had first identified which section of the site was hit the most, and presented my findings to the head of the corporation's division.
In the months that followed, projects designed to fix the site and recover the lost traffic were quickly rolled out. After months of the site losing traffic and continuing to be penalized, there was finally a light at the end of the tunnel: traffic recovered just before I wrote and published the blog post.
I am sure a lot of you who have read the article are wondering where the site is today. Sure, a quick hockey stick of recovered traffic is great, but did that traffic hold throughout the year? 

I am happy to report that traffic has held:

Traffic continued to grow through January 2013 (just after the jump in traffic in November 2012) and through February. Every year, the industry the site is in sees an overall rise in traffic and consumer purchase interest from the last week of December through February, with a downtrend through the rest of the year. While the site followed this trend, in July we saw a boost in traffic. After careful research, it was determined that the Google Panda update confirmed on July 24th gave us the boost we needed. Ironically, what had taken the site down in the past was now helping the site improve, as we continued to clean up the remaining issues that were causing the hits from Panda and Penguin.

Most important issues identified:


  1. Too many on-page links
  2. Top level pages
  3. No site hierarchy
  4. An extremely high number of URLs
  5. Duplicate content
  6. Not enough unique content


In general, the site was not following the core of Google's Webmaster Guidelines, which are the Bible of SEO. If a site isn't following the core of what Google says, then there is virtually no hope for it.

As 2012 was focused on recovering from the Panda and Penguin hits, 2013 was focused on cleaning up the remaining issues: parameters, thin content, duplicate content, finalizing the last pages of the hierarchy, and so on. I am happy to report that 96% of all identified issues are now resolved, and the first of many projects designed to grab new opportunities for the site are underway. Traffic saw increases in August and October, and another boost in December, with continued growth as we are in the peak of our season again.

Courage is Grace Under Pressure


That was the quote in the image I had added to my post, along with the description of the pressures I had faced in 2012.
Throughout those grueling three months there were several Google Panda and Penguin updates. I documented each and every one of them in Google Analytics, and continued answering questions, gathering data, and dealing with the close scrutiny of people who thought the work I was doing was complete BS.
I sat in numerous meetings, some of which I walked out of crying (I'm not afraid to admit it), being questioned about the road I had taken and why we weren't seeing results. There were people within the company recommending that we roll the pages back to where they were before, or even change the URLs. I fought hard to keep anyone from touching a thing. I sent around an article posted on Search Engine Land by Barry Schwartz citing Google's patent that "tricks" search spammers.
But the article and my pleas fell on deaf ears...
Things had gotten so heated, and there was such fear that nothing was being done while traffic significantly declined, that the company brought in yet another SEO consultant to look at the site objectively.
It was a lot to bear, and a lot to live through. Sadly, what I experienced in the organization was not unlike the experience many SEOs face. We are constantly questioned and second-guessed. Many coworkers and bosses will even take it upon themselves to learn as much as they can, only to find that there is a lot of arguing among the experts in the industry as to what is the "right" way to optimize and what is "wrong". But for SEO there is no right or wrong way, only what works. My philosophy behind optimizing is to always follow the rules. Start with the guidelines, organize the website, provide information for the user, and present it in a way that makes sense. No shortcuts, trickery, or "optimizing". It is a philosophy I fight for every day (and boy do I fight).

Today I am happy to say the tune toward me and the SEO work getting done has completely changed. My Boss now gives me rave reviews, the company believes in the work done for SEO, and the team (product and engineering) is extremely supportive with any issue or request needed for SEO. We are a cohesive and supportive team, and the results from the work we are doing are paying off.

Many people have told me to look for work elsewhere. That included not only people in the industry and some of my supportive coworkers, but even the person my Boss reports to. I will admit I interviewed at a couple of places. What discouraged me was that I knew the work done up until that point would eventually pay off. If I had left, the credit would have gone to the next SEO hired, or worse, it would all be ruined by someone focused on getting instant results. Most importantly, leaving a team that is now supportive was just not what I wanted. I love working for this company (the benefits are good), and everyone I work with is like family. I call us the "Scranton branch of Dunder Mifflin," since we are a small office within a very large organization. We are a close family that supports one another, and they all seem to welcome my quirkiness.

Until Next Year

So until next year, I will continue my job and the work for SEO. I have big plans for the website, and will continue to fight for what I believe in for SEO. I will also try to get more informational posts on here as we finish the last bit of cleanup on the site and continue to see results. I am even contemplating writing a book (as I have been told I should for a few years now), but am still reluctant with all those great SEO books already on the shelves - most of which were written by some very dear friends. Until then - keep optimizing, and keep fighting for what you believe in, SEOs.

Sunday, August 4, 2013

SEO Analysis for Good!

Just a week ago I saw a post come through the Seattle Digital Eve mailing list from someone asking for help on behalf of a friend.

The post:
I have a friend who is looking to improve the SEO on their ecommerce site.  They are a small business, and the SEO people they've talked to want to sell them a big, complex package, when what they really need is some coding improvements on the site, consolidation of 2 sites into one while preserving link juice, better keyword/metadata, etc.  
Anyone out there with some decent SEO expertise?




Offering to Help


I emailed the person that posted to the group:
If it's a small project I can probably spend some time on it.
Even though I have a ton of experience, I can keep the fees down since I have a day job.

My email was quickly forwarded on, and the business owner (Martin) emailed me right away with a simple "Hey Jenn, I am interested in your SEO skills. Please call me...". I called him on my way home, stuck in traffic that evening. Martin described the two websites for me: how they used to get rankings and traffic, and now don't. He said he had done some work to one of them, and traffic kicked up. He asked me to take a look at them and see whether it was something to do with the websites, whether either had been hit by Panda or Penguin, or whether people just aren't searching for rugs online as much as they used to. I quickly thought: how odd that this is almost exactly what we have been going through with the websites at ADP, with me trying to figure out whether the ups and downs were a general loss of interest, a change in searcher behavior, or a penalty or hit from the regular Panda and Penguin updates. While Martin's sites are a small version of what I deal with on a daily basis, this could not only be fun, but should be fairly easy to figure out.

Martin asked what my hourly rate is, but I immediately responded with a giggle (at $250/hr, I'm sure it's more than he was expecting to pay) and told him that for the initial conversation, and if it's something quick, I wouldn't charge him. I then asked if he has a Google Analytics account attached to the websites, to which he responded "Yes". I told him that in the Webmaster Tools data in Google Analytics he could compare impressions vs. average position, and then impressions vs. click-through rate. If the average position and CTR stay fairly steady but impressions drop, that is a quick way to see if interest has gone down. Another way to back that up is to look at how many terms get impressions one week compared to the next. If those numbers drop, then there is a good chance that rankings are dropping and there is a penalty. He was a bit overwhelmed, and asked if I would take a look for him if he gave me the login. I excitedly said "Sure" (most people don't like to give out their login, so it's easier to tell them how to do it themselves, but this was the next best thing). I told him the next steps: I would spend some time that evening, see what I could find, then come up with a diagnosis and a plan of attack going forward, which he could either carry out himself or, if he has the budget, hire me to help him with.
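For anyone who wants to run the same check outside of GA, the week-over-week test boils down to two numbers per week: total impressions and how many distinct terms got any impressions at all. A quick sketch with invented data:

```python
from collections import defaultdict

# Toy weekly extracts: (week, term, impressions, avg_position).
rows = [
    ("2013-W20", "area rugs", 1000, 4.2),
    ("2013-W20", "wool rugs", 400, 6.0),
    ("2013-W21", "area rugs", 450, 4.3),
]

weekly_impr = defaultdict(int)
weekly_terms = defaultdict(set)
for week, term, impressions, _pos in rows:
    weekly_impr[week] += impressions
    weekly_terms[week].add(term)

# Fewer distinct terms getting impressions, alongside an impressions drop,
# points at lost rankings (a penalty or algorithm hit) rather than a
# seasonal dip in searcher interest.
for week in sorted(weekly_impr):
    print(week, weekly_impr[week], len(weekly_terms[week]))
```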

The Evaluation


That evening I settled in, opened up the computer while watching Hulu on the TV (we don't have cable), looked through both websites, and then logged into GA to dig through the analytics.

Checking Behavior vs. Penalty/Panda/Penguin


I started by going back as far as I could in the analytics account, looking at organic traffic only. The month both sites did the best was just a few months back this year. I also checked the timeline against Penguin and Panda updates via the Moz.com Google Algorithm Change history, noting any clear drops that synced up with an update. There was a slight drop with one of the updates that had also hit one of the sites I work on at my day job, which is built similarly to his.

I first pulled Impressions against Average Position to see if the position stayed the same while impressions went down. Though given that there was a slight drop after an algorithm update, I figured there would be a drop in position, impressions, and the number of terms getting impressions.

My First Chart:

Site #1
Impressions vs. Avg Pos.
Note: I took out numbers to protect the client even though he gave me permission.
Site #2
Impressions vs. Avg Pos
As you can see, there is a drop in impressions but the average position stays fairly consistent, even more so for Site #2. Both improved in positions and impressions over the last few weeks.

My Second Chart:
Site #1
The number of keywords showing impressions week over week for site #1
Site #2
The number of keywords showing impressions week over week for site #2
I noticed that the number of keywords getting impressions (meaning how many terms show up when a person searches, regardless of position) drops when the impressions drop in the earlier charts, and then goes up when impressions go up.

This clearly shows a penalty, and given that there was a Penguin update just before that drop, it is pretty clear that the site took a hit from that update, then saw an improvement when he completed the little bit of work he did.

What Got Hit?


Now knowing for sure that both sites took a hit, the next step was to figure out exactly what was hit. With the site I manage at my day job, I usually run a category report to find out which terms were affected so that we can evaluate and establish a recovery plan. I didn't have time to set up categories of terms to run that report (it takes days to categorize terms; since that has already been done for the site I manage, it takes me just a few minutes now), so this time I grabbed the number of words in each term. If a phrase has just 1-2 words, it's safe to say it is a broad term; if it has 3-5 words, those are more exact. Penguin tends to hit sites that have optimized for long-tail terms, and less so the broad terms, so this is a faster way to get a similar understanding.
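The word-count shortcut is easy to reproduce anywhere; here's the idea in Python, with invented terms and visit counts:

```python
from collections import Counter

# Invented terms and visit counts, bucketed by how many words each term has:
# 1-2 words reads as broad, 3-5 words as long tail.
visits = {
    "rugs": 900,
    "area rugs": 700,
    "red wool area rug": 120,
    "8x10 area rugs for sale": 60,
}

visits_by_word_count = Counter()
for term, v in visits.items():
    visits_by_word_count[len(term.split())] += v

for n in sorted(visits_by_word_count):
    print(f"{n}-word terms: {visits_by_word_count[n]} visits")
```

Run this once on the high month and once on the low month, and a disproportionate drop in the 3-5 word buckets is the long-tail signature described above.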

I ran a comparison of his big traffic month against the recent months where there was a drop. I took 4 weeks in the high-traffic month, from a Saturday through the last Sunday, and compared it to the last 4 weeks, Saturday through the most recent Sunday. This gave me an exact day-of-week to day-of-week comparison, referencing the beginning of a month against the end of a month. Ideally you would also compare against the same time of year to reflect searcher behavior for the day of the week, time of month, and time of year, but in this case matching the day of week and time of month was going to be good enough.

Example of Keyword Data with Count
Note: Terms and traffic are not representative - I changed them to protect the client.
The table above is what the data looks like. Do note, I changed the top keywords and numbers to protect the client, but this gives you an idea of what I was working with. From there, I created a pivot table and played around with the data to get more insight into what was going on: number of words in a phrase, visits, pages/visit, etc. It all helped me understand what was happening before and after the update.

Keyword Count - showing how many words in a phrase were driving traffic from high month compared to low month.
Note: numbers and details have been removed/changed to protect the client.
The 3-5 word terms dropped from the high month to the recent months. This shows that the longer tail terms were hit, which is pretty indicative of Penguin.

Looking at the Sites


Having spent the first hour of my time running the reports and pulling charts, I spent the rest of the time looking through the sites, now that I knew what to look for. The sites were once optimized for long-tail terms, but something happened that lost them that traffic. As I dug through, I found both sites had categories for the two-word terms (such as "area rugs") with links to individual pages for each item that fit in that category. The first thing I noticed is that there is a URL hierarchy (something the website I work on lacked). So he was good there...

As I was looking through both websites, I started noticing that they were structured exactly the same way. The navigation on the left was the same, linking to different URLs, but the content appeared very similar across the pages. I grabbed a couple of the URLs that were focused on the long-tail terms and pasted them into Copyscape. The report not only kept pulling up the other site as the first match, but also other sites that sold the same products. This is a very common issue with eCommerce websites: since they don't have the time to write their own copy for each product, they tend to pull it in dynamically through syndication. Without enough unique content on those pages, the site appears to be duplicating what all those other sites have. It's not a seriously major issue, as Google tends to understand syndicated content, but if a site doesn't support the content with something unique, it just won't rank as well as the ones that do.
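Copyscape does the heavy lifting here, but the underlying idea, how much of one page's copy reappears on another, can be roughed out with word shingles. This is just a toy sketch with invented product blurbs, not Copyscape's actual method:

```python
def shingles(text, n=3):
    """The set of n-word shingles in the text -- a crude copy fingerprint."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    """Jaccard overlap of the two pages' shingle sets, from 0 to 1."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Two invented product blurbs that differ only in the last couple of words.
page1 = "hand woven wool area rug in rich red tones for any living room"
page2 = "hand woven wool area rug in rich red tones for any modern home"
score = similarity(page1, page2)
print(round(score, 2))
```

A score near 1.0 means the two pages are near-duplicates; syndicated product copy pasted onto two sites will light this up immediately.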

Martin's sites had a bit more of an issue, though, since he has two sites with the exact same content and the exact same structure. When I compared the terms that both sites got visits from during the high month, I noticed that not only were a lot of the terms the same, but a lot of the terms included site #2's domain and name.

Compare Terms from both sites with visits from organic traffic.
Note: the domain name has been changed, and so have the numbers to protect the client.
I think at this point it was very safe to say that the sites took a hit on the long-tail terms from the Penguin update due to both sites duplicating each other.

My Email and Recommendation


After spending a couple of hours on the sites, I drafted this email and attached the Excel document I used to analyze them (note: the email below has been changed slightly to protect the client):
Hey Martin -
So I dug into both sites and the Google Analytics to see what’s going on. I’m attaching my excel doc if you want to see my work, but it looks like both sites definitely took a hit of some sort.
Moz.com keeps a list of when updates happen so you can keep an eye on things: http://moz.com/google-algorithm-change. There was an update at the end of January, then another big update in March, that may have led to you losing your rankings. I've seen this drop in other sites that are built very similar to yours – so I dug into the analytics to make sure that my assumptions are correct.
What I found: I first compared visits from organic traffic (SEO) against your average position for the past few months (Webmaster Tools only goes back 90 days, so I couldn't go back to January, unfortunately).
Site #1 definitely saw a decrease in traffic along with the drop in conversions (pasting the charts below for you to look at). With Site #2 there was a drop in traffic, but the average position didn't drop as much. Usually this would be a sign that people aren't searching as much, so I wanted to check your keyword count and impressions week over week. If the number of terms seeing impressions drops from one week to the next, it is usually an indicator of a penalization or a hit by a Panda or Penguin update. I'm not pasting those charts in here since they are really raw, but you can see them in the excel sheet. The terms that have 2-5 words took a huge hit, while the one-word terms and the longest tails appear to be sticking around. I toggled from keyword count to visits from the keywords, and those sets stay pretty consistent in dropping.
What this means is that you most definitely took a hit in rankings from the updates. Not just rankings dropping, but a bulk of your 3-5 word terms dropped out of the index completely. Those 3-5 word terms are also the bulk of where your visitors come from – those long tails are higher-converting terms and can really affect revenue if they drop off. It looks like both sites are built very similarly and have a lot of the same content. I compared the top referring terms both sites saw in your highest traffic month, and both refer traffic for "your domain", which isn't good. They both get traffic from "broad term", but site #2 has site #1 beat there. They also both have several long tail terms that are the same.
When I run a report on copyscape.com to check for duplicate content – site #2 along with a few others comes up (included link directly to Copyscape). The "product" rugs page on both sites is exactly the same – almost word for word.
It's kinda fun to have two sites show up for the same terms, since you could get double the traffic. In fact that's what my company does – and what I manage. We have dozens of "portal" sites to grab leads to sell to car dealers. But if Google has any idea that both sites are connected, then both sites get penalized. I think this might be what has happened here.
Your first solution would be to get unique copy on all of the pages of each site. I know it's tough writing for all of those pages, and copywriters can be expensive. There is an alternative called TextBroker (http://textbroker.com). They have writers that bust out copy pretty quickly (2-3 day turnaround) at a pretty reasonable rate.
I would recommend getting an account set up and asking them to start writing for your pages. Even your homepage content – while there is a lot of it, it looks to be pieced together from other content on the web.
I would have them work in order of the pages that had the most traffic in your highest month, and then work down from there.
Once you get them going on that – I can do a full keyword analysis, check to see where the opportunity might lie, and get you a complete plan.
The excel doc is attached – let me know if you have any questions.
Hopefully this was helpful :)
All in all, it took me just a couple of hours, since this is what I do for our executives regularly, so I didn't charge Martin for the work.

SEO for Good!


Martin was so excited and appreciative of the work I had done, and what I had found that he asked me if he could pay me in some way - "..if anything to help the school in Nicaragua". I gave Martin the link to donate to the school, and he did.

The money immediately went to buy supplies for my Husband's students who are coming with us. Since they had to pay for their immunizations out of their own paychecks, don't have sleeping bags (we are loaning bags to them), and pay for anything else they need themselves, I wanted to help so they could focus on helping build the school and not stress about having everything they need for the trip.

In the end, I helped Martin with his websites because I like to help small businesses succeed, and Martin returned the kindness by helping the students with their supplies, so that they, in turn, could help build the school for children in Nicaragua.

Everybody wins!







Tuesday, July 30, 2013

Anatomy of the URL and Stuff

I'm sure you are looking at the URL above and thinking to yourself, "Wow, I never realized all that stuff meant something." Oddly enough, it actually does... As the world wide web has changed into a search-friendly, user-interactivity playground, the formation and meaning of the URL has evolved considerably into a very significant factor, not only in search engine compliance but in how people use websites. Lately I have been helping clients understand how their websites are structured and how servers, browsers, and users work together. It's something we search optimizers view as so simple, yet it can be so complex to someone who doesn't understand how it all works. So here is the URL, broken down piece by piece and explained.

First - What is a URL? 
A Uniform Resource Locator is a website address that holds very important information between each "." and "/", much like the address of your home contains a house number, city, state, country, etc. This allows the browser to connect to a specific website, directory/path, and/or file so the user sees the page you want them to see. A URL consists of the following:
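If you want to see the pieces programmatically, Python's standard library will split a URL into the very parts described below (the URL itself is made up):

```python
from urllib.parse import urlparse, parse_qs

url = "http://www.example.com/browse/mn-24/oslo-163.aspx?sort=price&page=2"
parts = urlparse(url)

print(parts.scheme)   # the protocol: http
print(parts.netloc)   # sub-domain + domain + extension: www.example.com
print(parts.path)     # directories and file: /browse/mn-24/oslo-163.aspx
print(parse_qs(parts.query))  # parameters: {'sort': ['price'], 'page': ['2']}
```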


Hypertext Transfer Protocol (HTTP)
Established by English computer scientist Tim Berners-Lee in 1990, the hypertext transfer protocol is a request/response standard in which the client is the application (a user on a web browser such as IE, Firefox, Safari, etc.) and the server hosts the web site itself. The client submitting HTTP requests is typically referred to as a user-agent (or user), and the responding server, which stores or creates resources such as files (.html, .asp, .php, .css, etc.) and images, is referred to as the origin server.*

WWW (World Wide Web) or "Sub-domain"
The WWW is typically placed before the main domain of your website URL, referencing the World Wide Web. Remember the game you played in elementary school where you could start your home address with your house number, street, city, and state, and then go as far out as your country, continent, and even Earth? The WWW is the address starting with "Earth". In some cases, what we call a "sub-domain" can replace the WWW in your URL, referencing a whole new website within your existing domain. Search optimizers can use this as a way to target certain key terms. For example, a real estate agent targeting a specific city will use http://city.domain.com and thus have a leg up when ranking for anything within that city. In most cases the sub-domains will link to the main domain and, since they are treated by most search engines as a domain all their own, each link counts as external link credit, boosting the rankings for the main domain they link to. It is highly recommended that you avoid this technique, as it is only tricking the search engines and in the end will hurt your rankings rather than help.

Domain Name System (DNS)
The Domain Name System was established so that the common user can understand, in simple terms, the location of a web site. A web site's files are usually stored on a server that points to a specific IP address (much like a phone number directs someone's call to your phone). In order for the general public to locate a certain website and its files, the domain name resolves to that particular IP address. In addition, the Domain Name System also stores other types of information, such as the list of mail servers that accept email for a given domain (such as you@yourdomain.com).

Top-level Domain Extension 
The domain extension originally consisted of the generic gov, edu, com, mil, and org. With the growth of the internet, country extensions and other categories have come into play. The most recognized extension is of course .com. If you are optimizing for a specific country and language, then the best route is to register your domain with that country's extension. This helps the search engines recognize that you are targeting that particular audience, and they will rank the site accordingly. Be sure that your country-specific site is in the native language for that country to avoid any duplicate content issues. Also be careful about linking from that domain to your main domain, as once again the site could be penalized.

Directories and Files 
Here's where the fun stuff comes into play. Just as your computer organizes your Word, Excel, and other files into folders, a server structures your website files the same way. A "directory" or "path" is much like a "folder" on your computer. In standard (old school) HTML development, before the days of dynamic websites powered by databases and user interactivity, a file would be created and named "index.html" or "default.html" and placed either in the main domain folder (which the DNS resolves to on the server) or in a named folder (to help the webmaster organize the site's files). As ways to develop database-driven, interactive websites advanced, the structure stayed pretty much the same, with the addition of "parameters" that reference a part of the database and return content on a page based on those parameters. (Have I lost you yet?) Let's go back to the basic structure of static HTML files and go from there...

A Dynamic website is one that has a few static pages (in other words, the pages are coded and are only editable by a developer) with parameters that will pull in content or trigger specific actions from a database. At its most basic, a dynamic page pulls words, images, etc. from a database, and can create multiple pages with different content from one basic page. A more complex dynamic page (or site) is something like Facebook or Twitter, which recognizes whether or not you are signed in with a username and password and will show you either your profile page (if you are signed in) or a "please sign up" page (if you are not signed in or don't have an established username).
In order to help understand this, let's talk about how a database works. A database is essentially similar to an Excel spreadsheet or a table in a Word document: it has a unique identifier for each line (or row) and holds different content for each line item. Example:
Username  | Email               | First Name | Last Name
Sujo234   | bob@bobsemail.com   | Bob        | Sujo
Forjill23 | jill@jillsemail.com | Jill       | Forman
In this example the username is the unique identifier with the email, first name, and last name as different parameters for that username.
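For illustration, the table above can be sketched as a simple key-value structure in Python, with the username as the unique identifier (the field names are made up for this sketch):

```python
# The table above as a key-value lookup: the username is the unique
# identifier, and each record holds the other fields for that user.
users = {
    "Sujo234": {"email": "bob@bobsemail.com", "first": "Bob", "last": "Sujo"},
    "Forjill23": {"email": "jill@jillsemail.com", "first": "Jill", "last": "Forman"},
}

# A dynamic page takes the identifier (e.g. from the URL or a login)
# and pulls the matching record to fill in the page content.
record = users["Sujo234"]
print(record["first"], record["last"])  # Bob Sujo
```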

The content will be different on each page. With dynamic content the possibilities are endless as far as how many pages you can create from developing and designing just one file. A great example of how a dynamic page is created for search optimization purposes is on usedcars.com - if you search for "used cars in oslo mn" you see the "UsedCars.com Oslo MN" page in the results. Look at the URL in the address bar when you go to that particular page - http://www.usedcars.com/browse/mn-24/oslo-163.aspx. In this case the page is pulling in the unique IDs "oslo-163" and "mn-24", just as the username is the unique ID in the table above.

SEO Friendly URL 
In order to make your dynamic URL friendly for search engines you must use a rewrite. A great resource for rewriting a URL is the Apache Rewriting Guide. Some open source content management systems (such as Wordpress, Drupal, etc.) already do the rewriting for you, and all you have to do is enter what you want the URL to be (be sure to separate your key terms with dashes "-" and not underscores "_" for search happiness). Who would have thought a URL could be so complicated? But when it comes to search optimization and understanding basic website development, it is very important to understand how the URL works, how it is structured, and how to make sure your site is URL and search engine compliant. *http://en.wikipedia.org/wiki/Http_protocol
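To give a feel for what such a rewrite can look like, here is a hypothetical Apache mod_rewrite rule (the paths and parameter names are invented for illustration, not taken from any real site):

```apache
# Hypothetical rule: serve a keyword-friendly URL like
#   /used-cars/mn/oslo
# from the real dynamic page and its parameters behind the scenes.
RewriteEngine On
RewriteRule ^used-cars/([a-z]{2})/([a-z-]+)/?$ /browse.aspx?state=$1&city=$2 [L,QSA]
```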


Tuesday, July 23, 2013

Categorizing Keywords

For those of you SEOs who manage very large sites and map your keyword categories to sections of your website - you know how difficult it is to categorize your terms and track their performance. Well, I have to say that after searching, asking, and digging around for a tool that does exactly what I am talking about, I finally came up with a solution. It's a bit of a workaround in Excel - but it's the best I can do until someone comes up with a tool that categorizes keywords for SEO.

Know Your Keywords and Categories


Before you get started categorizing the terms that come to your site, you should know what keywords you are targeting, and the combinations of terms as well. I'm going to use a flower shop's website as an example for this particular blog post. Categorizing is something you can do with any website. At the very least, you can categorize terms into "Broad" and "Branded", to get you started.

Most keyword tools can help you establish what categories to target. Google's Keyword Tool or WordTracker are just a couple of the many tools available on the web.

Another way to figure out which terms fit in which categories is by grabbing search data (referring terms in Google Analytics) from your site for the past few months or year. I personally spent some time categorizing keywords in Excel by using filters to show all terms containing "anniversary" for the "anniversary flowers" category. It takes a lot of work and time, but in the long run you will have a more accurate account of the terms you will need to do the Lookup against.
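If Excel's filters start to strain under the volume, the same "contains" tagging can be sketched in a few lines of Python (the trigger words below are made up for the flower-shop example):

```python
# Tag each referring term with every category whose trigger word it
# contains - the same idea as filtering for "anniversary" in Excel.
CATEGORY_WORDS = {  # made-up trigger words for the flower-shop example
    "Anniversary": ["anniversary"],
    "Birthday": ["birthday"],
    "Wedding": ["wedding"],
}

def tag(term):
    """Return every category whose trigger words appear in the term."""
    return [cat for cat, words in CATEGORY_WORDS.items()
            if any(w in term.lower() for w in words)]

print(tag("cheap anniversary flowers"))  # ['Anniversary']
```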

Setting Up Your Template

Download the Template

Now that you have all the terms possible in all of your categories it's time to start setting up your template. You are going to want to Download the template I have set up in Excel. You can start from a fresh Excel document if you want, but the template has directions (in case you lose this blog post somehow) and the Lookup formula is in there.

Once you have downloaded the template it's time to get it set up to work for your keywords.

In the following steps - I am going to walk you through setting up the template and then categorizing the terms. If you don't have terms that you can use already, I have a zip file you can download and walk through the example with me to get familiar with how this works.

Copy your first set of categorized terms and paste them into the first Tab marked "Broad". Since every site usually has a "Broad" category of terms, I figure that's probably the best to get started with. In the case of this example "flower shop", "online flower shop", and "best flower shop" terms are the ones that fit under the Broad category.

If you have the .zip folder downloaded, open up the "Terms" Excel doc and you will see the words already categorized for you. There are "Broad", "Branded", "Birthday", "Anniversary", and "Wedding". Click the Drop Down next to "Category", click "Select All" (to deselect all), and then click "Broad". You will see all of the terms sorted by just that "Broad" category.

Next select all of the terms in the "Keyword" list - copy and paste them into the "Broad" Tab.
We will then sort the terms in alphabetical order so the Tabs are easy to scan and spot-check. (An exact-match Lookup doesn't strictly require sorted data, but sorted lists make mistakes much easier to catch.)


Highlight the Column with your keywords
Click "data" > "sort"
Select "My data has headers"
Select under "sort by" the column your keywords are under (should be column A)
Click OK

Double click the Tab and rename it with the one word name of your category.
Highlight all of your keywords in the column (just the cells that have words, not any blank cells).
Type the name of the category (stick to one word naming) into the upper left field (the Name Box). You have now named your table.

Do this for "Branded" and the other categories as well. You are going to have to create a new tab in the template to fit all the categories.

If you have not downloaded the .zip file and are working off of your own terms, creating new tabs and naming them is probably going to be something you will need to do. But don't worry, the template will still work.

Now that you have all of your keywords in your Template's Tabs with names and sorted it's time to set up your Lookup string.

Setting up Your Lookup


The way the Lookup works in this case is we are going to ask Excel to look at one Keyword (one cell) and match it up to one of the terms in the Tabs we have set up. If it matches one of those terms then we tell Excel to place the word into that Cell. If it doesn't, then we just leave that cell blank.

The string looks like this:
=IF(ISNA(VLOOKUP(B2,Broad,1,FALSE)),"","Broad")
  • B2 is the cell of the keyword we want to look for.
  • the first "Broad" is the name of the Table we want to look for that keyword in (the Table lives in the "Broad" Tab, covering the range Broad!A$2:A$999999).
  • 1 tells the Lookup to return the first (and only) column of that Table.
  • FALSE is telling the Lookup to do an exact match. TRUE would settle for the closest match in a sorted list rather than an exact one, so in this case it won't work.
  • We leave the ,"", as a blank - but you can put "not categorized" or "misc" to show that it isn't in a category. Though for our purposes here, we keep it blank.
  • ,"Broad" is telling Excel to put the word "Broad" in the cell if the keyword matches one of those in the Broad Table or Tab.


See - it's that easy...

What you are going to do next is replace the word "Broad" or "Cat1" with the name of your table, Tab, and category. This is why we name the Table, the Tab, and the Category the same so that our life is much easier when setting this string up.
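And if your keyword list is too large for Excel, the same exact-match lookup the formula performs can be sketched in Python (the category term lists here are made up for the flower-shop example):

```python
# Exact-match categorization - the same job the Lookup does in Excel:
# label a keyword with the category whose term list contains it,
# or leave it blank if no category matches.
CATEGORIES = {  # made-up term lists for the flower-shop example
    "Broad": {"flower shop", "online flower shop", "best flower shop"},
    "Branded": {"bobs flower shop", "bobsflowers.com"},
}

def categorize(keyword):
    for name, terms in CATEGORIES.items():
        if keyword in terms:  # exact match, like FALSE in the formula
            return name
    return ""  # blank, like the "" in the formula

print(categorize("online flower shop"))  # Broad
```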

Now your template is ready for you to paste some keywords with data and grab some numbers.

Gathering Your Data


Open up your Google Analytics account - if you don't have Google Analytics, pretty much any tracking tool that has a list of referring terms with some sort of data is fine. You can expand and contract the columns to the right of the terms as you wish. The template you will download will have the columns set up just for the purpose of exporting referring terms with visits and such from Google Analytics though.

Log into your Google Analytics account.
Click "Traffic Sources" > "Sources" > "Search" > "Organic"
Select the date range you would like to report on.
Scroll to the bottom of the report and show 5,000 rows.
Scroll back to the top and click "Export", then select "CSV".
After the file has downloaded, open the Excel file.
Highlight JUST the cells that include the keywords and your data (ignore the first few rows at the top with date information, and the summary rows at the bottom).
Copy those cells, and paste into your "Master" Tab.

Note: If you have multiple dates you would like to track, you can export the different date ranges, and then add which keywords go with what date in the Master Tab. This will allow you to see trends of categories.
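The trend the note describes boils down to summing visits per date-and-category pair. A minimal sketch with made-up numbers:

```python
from collections import defaultdict

# Rows as they would sit in the Master Tab once dates are added:
# (date, keyword, category, visits) - the numbers are made up.
rows = [
    ("2013-07-01", "flower shop", "Broad", 120),
    ("2013-07-01", "anniversary flowers", "Anniversary", 80),
    ("2013-07-02", "flower shop", "Broad", 150),
    ("2013-07-02", "anniversary flowers", "Anniversary", 60),
]

# One trend point per date-and-category pair.
totals = defaultdict(int)
for date, _keyword, category, visits in rows:
    totals[(date, category)] += visits

print(totals[("2013-07-02", "Broad")])  # 150
```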

I added an Excel doc called "Analytics Organic Search Traffic" with some terms and fake data that you can play with. There are three tabs that I added dates for each day's data. Start with just the one day and play with that to get familiar with percentages. From there you can play with all three dates and work on your trends to see what categories are trending up and down.

Completing Your Lookup


Now that you have copied and pasted the keywords into the "Master" Tab it's time to get all of those terms categorized.

Select the top row with your categories and your "All Categories" cell
Copy just those cells in the top row
Highlight the next row down (the same cells, one row below) and hold down the "shift" key
Scroll down to the last keyword record
Still holding the shift key, select the last cell under the "All Categories" column - this highlights all of the cells that need the Lookup formulas for those categories
Hit "CTRL+V" on your keyboard (this quickly pastes the Lookup formulas for each line)
Be patient, as it may take a while for your Lookup to complete (depending on how many keywords, and records you have)
The "Master" Tab should look something like this:

Playing With Your Data

The most efficient way to gather information from your data is to copy the entire "Master" Tab and paste as values into a new Excel sheet.  This way you won't have to wait for the Lookup to complete each time you sort, pivot, etc.

Click the top left "Arrow" in the "Master" Tab
Right Click and select "Copy"
Open a new Excel Doc
Right Click and select "Paste Special" > "Values"

From here you can create pivot tables then sort them into pie charts, graphs, and all sorts of fun reports to see how your keywords are performing.

I personally like to start with a quick pie chart to see which category of terms brings in the most traffic. At times we will have a drop or rise in traffic, and it's good to understand which category of terms is fluctuating. Copying and pasting terms by dates (weeks, months, or even a set of a few days) helps me see which categories are fluctuating on a timeline trend. Knowing which categories bring in the most traffic, I can then make decisions on which parts of the website we need to focus our efforts on to increase traffic.

See how much fun categorizing your terms can be?
Now that I have a template I work off of, when traffic goes up I can quickly categorize the terms and let our executives know if our recent efforts have worked.

Friday, January 18, 2013

SEO Issues - is it Penguin? Is it Panda? or is it me?

The following story is one that has been several months in the making. It's one that I have lived through one too many times as an SEO, and I am sure other SEOs have faced it too. I fought with the thought of writing this for fear that someone from the company might read it and get angry that the story is told. But it's something I think people out there can learn from, and it speaks to so many others in this industry, showing them that they are not alone.

It's long, it's a bit technical (I tried to keep it simple), and it has some personal frustrations laid out in words. My only hope is that you get value out of reading this as much as living it has made me a better person (or well, a better SEO).

It Begins


I started working on this website's SEO in May 2012 at which time I was told the site's traffic was declining due to Panda updates. In February of 2012 the traffic from SEO was the best they had ever seen, but soon after that there was a steady decline.
Traffic from February 2012 - May 2012
Before digging into any possible SEO issues, I first checked Google Trends to ensure that the decline wasn't searcher related. Oftentimes a drop in traffic can simply mean that users aren't searching for the terms the website ranks for as much as they were in the past.

Top Key Terms in Google Trends
Looking at the same time frame as the traffic data, I noticed an increase in searches for the top 3 terms the website ranked for, and there appeared to be a decline from March to April around the same time the traffic was reflecting one. But there was a drop in the website's traffic in April from the 23rd to the 24th, and then a significant one on the 25th. The website I was working on had two SEOs already working on it: an agency and a consultant. Both had already done a good amount of research and some work to get the website on track. Both were stressing that the drop in traffic was due to the Panda updates by Google. I looked at SEOmoz's Google Algorithm Change History and found an update to Google's Panda on April 19th and an update to Penguin on April 24th. Given that the traffic significantly dropped on the 24th, my best guess was that it was possibly Penguin related, but it still needed further exploration.

Figuring Out What Was Hit by Penguin.


The site is/was broken up into sections by keyword focus. At one point, I could tell that someone really had a good head on their shoulders for SEO, but the strategy that was used was outdated. Perhaps the site was originally optimized several years before and just needed some cleanup to bring it up to 2012's optimization standards. So, understanding Penguin and identifying which part of the site was driving the bulk of the organic traffic was going to be my next step in solving this mystery. Once I understood why and where, then I could start to establish what to do to solve the problem.

I broke the site traffic report out by sections as best I could in Google Analytics. There was a bit of a struggle, as all of the pages of the site resided on the main domain. Without a hierarchy in place, breaking out the sections had to be accomplished with a custom report and head-match filters for landing pages. I hadn't had to do this before, so the agency that was already working with the site helped build the first report, and I began building out the other reports from there.
Click to View Larger
Section 1 over 72% of traffic

Just focusing on April and May I created a Dashboard in Google Analytics focusing on organic Traffic and identifying the sections of the site. Looking at the different sections - Section 1 was the bulk of the traffic with over 72% and Section 2 coming in second with just over 15%. Subs of Section 3 and other one-off pages make up the difference.

Both Section 1 and Section 2 dropped off after the April 24th date, so clearly they were the bulk of what was pulling the overall traffic numbers down. Since Section 1 was the majority of the traffic, I presented to the executive responsible for the site that we address any issues with that page first.

Actual screenshot of Section 1 presented
I took all of the research from the agency and consultant and we quickly reworked the pages to represent a hierarchy in the URL structure, and cleaned up any issues from the outdated optimization that was done.

Soon after Section 1 was addressed, we did the same with Section 2, and then worked on Section 3 (and sub pages, rolling them up into a solid section) and then added a few pages to grab any new opportunity.

Not Quite As Easy as it Looks


The projects were launched in increments - first URL hierarchy fix to Section 1 and then the page redesign. Next was a full launch of URL fixes and page redesign to Section 2, and then lastly Section 3 and the new Section 4.
Section 1 - Section 2- Section 3 Launch Dates and Organic Traffic
Soon after Section 1 was launched, traffic started declining rapidly. I was asked several times why traffic was getting worse, and I started digging some more. Every time I looked at the Impressions of the new URLs from Section 1 they weren't getting any traction, but the previous URLs still were. I began looking at the history of the website, trying to find out why it had done so well at one point but was not doing well at that time. One of the things I noticed was that there was a lack of priority linking to these pages, though at some point there had been links to some of them individually from the homepage. Google matches a hierarchy of pages to the directory structure and to how links are presented on a site. This site had every page on the first level and linked to those pages from the homepage, which told Google that every page was the most important page. It worked at one time, but as Google rolled out their 2012 updates these pages were getting hit, and those links on the homepage weren't there anymore. Before the launch of Section 2, I had them put links to the main directory for each section on the homepage. The links would tell the search engines that these are important pages of the website, without being so obnoxious as to put a dozen or more links on the homepage and discourage users (avoiding the appearance of spamminess).

But - even after adding the links to the homepage, the traffic to those pages was still declining. Pressure was put on me to figure out what was wrong. In addition, accusations were flying that I single-handedly ruined the SEO for the site. I spent every waking hour looking at reports and trying to figure out what was going on. I consulted friends in the industry, and read every article I could find to figure out what Panda or Penguin updates were affecting these pages.

Then it hit me - just as the links to these sections would help them get recognized as important pages, so would the other pages that were being linked to from the homepage. In fact a set of them linked to the website's search results with queries attached to them, mimicking pages but showing search results. On those search result pages there were over 200 links with multiple (we're talking hundreds - possibly thousands of) combinations of parameters. The bots were coming to the homepage, following the links to the search results pages, and then getting stuck in this vortex of links and parameter-generated URLs - not allowing any crawl time for the pages that once were getting rankings. This also explains why the new URLs weren't showing very many impressions in the Webmaster Tools data - those pages just weren't getting crawled.

There was a project underway that would solve the many links on the search pages, and there was also talk of using Ajax to show the results. When that project launched, the bots would go to the URL from the homepage but would then essentially not go much further. With this project a few months out, I made the case to add the search page to robots.txt so the bots would recognize the Sections as important pages. After several weeks of attempting to convince the powers that be, the URL was eventually added to the robots.txt file.
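For reference, a robots.txt entry along these lines blocks crawlers from a search results path (the /search path here is hypothetical - the post doesn't name the site's real URL):

```
User-agent: *
Disallow: /search
```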

Immediately after the search page was added to the robots.txt Google Webmaster tools presented me with a warning:
Warning in Webmaster Tools
In most cases, a warning from Google should never be taken lightly, but in this case it was exactly what I wanted. In fact it proved to me that my theory was correct, and that the site was hopefully headed down the right path.


Panic, Questioning, and a Third Party


As with every up in the SEO world, there must be a down. Soon after the search result page was added to the robots.txt, the organic traffic to the site dropped, and continued to drop. Throughout those grueling three months there were several Google Panda and Penguin updates. I documented each and every one of them in Google Analytics, and continued answering questions, gathering data, and dealing with close scrutiny suggesting that the work I was doing was complete BS.
Organic Traffic from September 2012 - November 2012
I sat in numerous meetings, some of which I walked out of crying (I'm not afraid to admit it), being questioned about the road I had taken and why we weren't seeing results. There were people within the company recommending that they roll the pages back to where they were before, and even change the URLs. I fought hard to make sure they didn't touch a thing. I sent around an article posted on Search Engine Land by Barry Schwartz citing Google's patent that "tricks" search spammers.

The patent states:

When a spammer tries to positively influence a document’s rank through rank-modifying spamming, the spammer may be perplexed by the rank assigned by a rank transition function consistent with the principles of the invention, such as the ones described above. For example, the initial response to the spammer’s changes may cause the document’s rank to be negatively influenced rather than positively influenced. Unexpected results are bound to elicit a response from a spammer, particularly if their client is upset with the results. In response to negative results, the spammer may remove the changes and, thereby render the long-term impact on the document’s rank zero. Alternatively or additionally, it may take an unknown (possibly variable) amount of time to see positive (or expected) results in response to the spammer’s changes. In response to delayed results, the spammer may perform additional changes in an attempt to positively (or more positively) influence the document’s rank. In either event, these further spammer-initiated changes may assist in identifying signs of rank-modifying spamming.
But the article and my pleas fell on deaf ears...

It had gotten so heated and there was fear that nothing was being done while traffic was significantly declining that the company brought in yet another SEO consultant to look at the site objectively.

Just as the consultant was starting his audit, and just as the traffic hit the lowest I ever thought it could possibly go, the tide turned: the next day traffic went up. The last week in November (roughly 3 months after we blocked the search result page) I saw an increase in traffic in Google Analytics to Section 1:
Section 1 Organic Traffic
I quickly pulled up my report to check the Section's impressions from the Webmaster Tools data, and there was a significant increase as well:
Section 1 Impressions from Webmaster Tools Data
On December 3, 2012 I logged into Webmaster Tools and saw that the warning had gone away:
It was the "hallelujah" moment that every SEO dreams of, and very few get. All the work I had done, the fighting for what I believed in - it all finally paid off.

To this day traffic continues to increase - we can now focus on some of the cleanup still left to do, and then onto projects that will attract new opportunity.
Organic Traffic from November 2012 - January 17, 2013 (day before this post is written)
Quick Note: 
I forgot to mention a post I wrote months ago while going through all of this - SEO - Panda and the Penguins. It helps to give a bit of perspective of some of the linking stuff I didn't get into in this post.