Sunday, August 4, 2013

SEO Analysis for Good!

Just a week ago I saw a post come through the Seattle Digital Eve mailing list from someone asking for help on behalf of a friend.

The post:
I have a friend who is looking to improve the SEO on their ecommerce site.  They are a small business, and the SEO people they've talked to want to sell them a big, complex package, when what they really need is some coding improvements on the site, consolidation of 2 sites into one while preserving link juice, better keyword/metadata, etc.  
Anyone out there with some decent SEO expertise?




Offering to Help


I emailed the person that posted to the group:
If it's a small project I can probably spend some time on it.
Even though I have a ton of experience, I can keep the fees down since I have a day job.

My email was quickly forwarded on, and the business owner (Martin) emailed me right away with a simple "Hey Jenn, I am interested in your SEO skills. Please call me...". I called him on my way home that evening while stuck in traffic. Martin described the two websites for me: how they used to get rankings and traffic, and now don't. He said he had done some work to one of them, and traffic kicked up. He asked me to take a look at them and see whether it was something to do with the websites themselves, whether either had been hit by Panda or Penguin, or whether people just aren't searching for rugs online as much as they used to. I quickly thought - how odd, this is almost exactly what we have been going through with the websites at ADP, where I'm constantly trying to figure out whether an up or down is a general loss of interest, a change in searcher behavior, or a penalty or hit from a Panda or Penguin update. While this man's sites are a small version of what I deal with on a daily basis, this could not only be fun, but should be fairly easy to figure out.

Martin asked what my hourly rate is - but I immediately responded with a giggle (at $250/hr I'm sure it's more than he was expecting to pay) and told him that for the initial conversation, and if it's something quick, I won't charge him. I then asked if he has a Google Analytics account attached to the websites, to which he responded with a "Yes". I told him that he can look at the Webmaster Tools data in Google Analytics and compare impressions vs. average position, and then impressions vs. click-through rate. If the average position and CTR stay fairly steady but impressions drop, that is a quick way to see if interest has gone down. Another way to back that up is to look at how many terms get impressions one week compared to the next. If those numbers drop, there is a good chance that rankings are dropping and there is a penalty. He was a bit overwhelmed, and asked whether, if he gave me the login, I would take a look for him. I excitedly said "Sure" (most people don't like to give out their login, so it's easier to tell them how to do it themselves - but this was the next best thing). I told him the next steps: I would spend some time that evening and see what I could find, then come up with a diagnosis and a plan of attack going forward, which he could either carry out himself or, if he has the budget, hire me to help him with.
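The quick check I described to Martin can be sketched in code. This is a minimal, hedged illustration of that rule of thumb - the weekly numbers and thresholds are invented, not real client data:

```python
# Diagnostic heuristic from the conversation above: if average position and
# CTR hold steady while impressions fall, interest is likely down; if the
# number of terms earning impressions also falls, a penalty is more likely.
# All numbers and thresholds here are made up for illustration.

def diagnose(weeks):
    """weeks: list of dicts with impressions, avg_position, ctr,
    and terms_with_impressions, oldest first."""
    first, last = weeks[0], weeks[-1]
    impressions_drop = last["impressions"] < 0.8 * first["impressions"]
    position_steady = abs(last["avg_position"] - first["avg_position"]) <= 2
    ctr_steady = abs(last["ctr"] - first["ctr"]) <= 0.01
    terms_drop = last["terms_with_impressions"] < 0.8 * first["terms_with_impressions"]

    if impressions_drop and terms_drop:
        return "likely penalty / algorithm hit"
    if impressions_drop and position_steady and ctr_steady:
        return "likely drop in searcher interest"
    return "inconclusive"

weeks = [
    {"impressions": 10000, "avg_position": 8.2, "ctr": 0.031, "terms_with_impressions": 420},
    {"impressions": 6200, "avg_position": 8.5, "ctr": 0.030, "terms_with_impressions": 250},
]
print(diagnose(weeks))  # -> likely penalty / algorithm hit
```

The exact cutoffs (20% drops, 2 positions) are arbitrary; the point is the shape of the comparison, not the numbers.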

The Evaluation


That evening I settled in, opened up the computer while watching Hulu on the TV (we don't have cable), looked through both websites, and then logged into GA to dig through the analytics.

Checking Behavior vs. Penalty/Panda/Penguin


I started by going back as far as I could in the analytics account, looking at organic traffic only. The month both sites did best was just a few months back this year. I also checked the timeline against Penguin and Panda updates via the Moz.com Google Algorithm Change timeline and noted whether any clear drops synced up with an update. There was a slight drop with one of the updates that also hit one of the sites I work on for my day job, which is built similarly to his.
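That eyeball check against the Moz timeline can be sketched programmatically too. The update dates and weekly visit counts below are invented for illustration only:

```python
# Flag weekly organic-traffic drops that land within a few days of a known
# algorithm update - the same check done by eye against the Moz timeline.
# Dates and visit counts are hypothetical.
from datetime import date

updates = [date(2013, 1, 22), date(2013, 3, 14)]  # invented update dates

def drops_near_updates(weekly, threshold=0.25, window_days=10):
    """weekly: list of (week_start_date, visits), oldest first. Returns the
    weeks whose week-over-week drop exceeds threshold near an update."""
    flagged = []
    for (d1, v1), (d2, v2) in zip(weekly, weekly[1:]):
        if v1 and (v1 - v2) / v1 > threshold:
            if any(abs((d2 - u).days) <= window_days for u in updates):
                flagged.append(d2)
    return flagged

weekly = [
    (date(2013, 1, 7), 1200),
    (date(2013, 1, 14), 1150),
    (date(2013, 1, 21), 800),   # big drop the week of the first update
    (date(2013, 1, 28), 780),
]
print(drops_near_updates(weekly))
```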

I first pulled impressions against average position to see whether position stayed the same while impressions went down. Given the slight drop after an algorithm update, though, I figured there would be a drop in position, impressions, and the number of terms getting impressions.

My First Chart:

Site #1
Impressions vs. Avg Pos.
Note: I took out numbers to protect the client even though he gave me permission.
Site #2
Impressions vs. Avg Pos
As you can see - there is a drop in impressions but the average position stays fairly consistent, even more so for Site #2. Both improved in position and impressions over the last few weeks.

My Second Chart:
Site #1
The number of keywords showing impressions week over week for site #1
Site #2
The number of keywords showing impressions week over week for site #2
I noticed that the number of keywords getting impressions (meaning how many terms showing up when a person searches regardless of position) drops when the impressions drop in the earlier charts, and then goes up when impressions go up.

This clearly shows a penalty, and given that there was a Penguin update just before that drop, it is pretty clear that the site took a hit from that update, then saw an improvement when he completed the little bit of work he did.

What Got Hit?


Knowing now for sure that both sites took a hit, the next step was to figure out exactly what was hit. With the site I manage at my day job, I will usually run a category report to find out which terms were affected so that we can evaluate and establish a plan to recover. I didn't have time to set up the categories of terms to run that report (it takes days to categorize terms; since that has already been done for the site I manage, it now takes me just a few minutes), so this time I grabbed the number of words in each term. If the phrase has just 1-2 words, it's safe to say it is a broad term, and if it has 3-5 words, it's more exact. Penguin tends to focus on sites that have optimized for long tail terms, and less on the broad terms, so this is a faster way to get a similar understanding.
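The word-count shortcut is simple enough to show in a few lines. The terms below are invented stand-ins, not the client's data:

```python
# Bucket each referring term by how many words it contains, treating
# 1-2 words as broad and 3+ as long tail, per the shortcut above.
from collections import Counter

def bucket(term):
    n = len(term.split())
    return "broad" if n <= 2 else "long tail"

terms = ["rugs", "area rugs", "red wool area rugs", "cheap 8x10 area rugs online"]
counts = Counter(bucket(t) for t in terms)
print(counts)  # e.g. Counter({'broad': 2, 'long tail': 2})
```

Run this over the high month and the low month separately and compare the two Counters - if "long tail" shrinks while "broad" holds, that matches the Penguin pattern described above.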

I ran a comparison to see how things were in his big traffic months compared to the recent months where there was a drop. I took 4 weeks in the high traffic month, from a Saturday through the last Sunday, and compared it to the most recent 4 weeks, Saturday through the most recent Sunday. This gives an exact day-of-week to day-of-week comparison and references the beginning of a month against the end of a month. Ideally it should also compare the same time of year to reflect searcher behavior for the day of the week, time of month, and time of year, but in this case the day of week and time of month were going to be good enough.
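The window alignment above can be expressed as a tiny helper - a sketch only, assuming the window runs from the Saturday 29 days back through a chosen Sunday:

```python
# Build a 4-week comparison window that starts on a Saturday and ends on a
# Sunday, so each day of week lines up between the two periods compared.
from datetime import date, timedelta

def four_week_window(end_sunday):
    """Return (start, end): the Saturday 29 days before through the Sunday."""
    assert end_sunday.weekday() == 6, "window must end on a Sunday"
    return end_sunday - timedelta(days=29), end_sunday

start, end = four_week_window(date(2013, 7, 28))
print(start, "->", end)
print(start.strftime("%A"), "->", end.strftime("%A"))  # Saturday -> Sunday
```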

Example of Keyword Data with Count
Note: Terms and traffic are not representative - I changed them to protect the client.
The table above is what the data looks like. Do note, I changed the top keywords and numbers to protect the client - but this gives you an idea of what I was working with. From there, I created a pivot table and played around with the data to get more insight into what was going on: number of words in a phrase, visits, pages/visit, etc. It all helped me understand what was going on before and after the update.
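For readers who don't live in Excel, here is a pure-Python stand-in for that pivot-table step. The keywords and numbers are invented, just like the table above:

```python
# Aggregate visits by word count in the phrase - a minimal pivot-table
# equivalent for the analysis described above. All data is made up.
from collections import defaultdict

rows = [  # (keyword, visits, pages_per_visit)
    ("area rugs", 900, 2.1),
    ("red wool area rugs", 120, 3.4),
    ("rug shop seattle wa", 80, 2.9),
    ("rugs", 400, 1.8),
]

pivot = defaultdict(lambda: {"visits": 0, "rows": 0})
for kw, visits, ppv in rows:
    word_count = len(kw.split())
    pivot[word_count]["visits"] += visits
    pivot[word_count]["rows"] += 1

for word_count in sorted(pivot):
    print(word_count, pivot[word_count])
```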

Keyword Count - showing how many words in a phrase were driving traffic from high month compared to low month.
Note: numbers and details have been removed/changed to protect the client.
The 3-5 word terms dropped from the high month to the recent months. This shows that the longer tail terms were hit, which is pretty indicative of Penguin.

Looking at the Sites


Having spent the first hour of my time running the reports and pulling charts, I spent the rest of the time looking through the sites, now that I knew what to look for. The sites were once optimized for long tail terms, but something happened and they lost that traffic. As I dug through, I found that both sites had categories for the two-word terms (such as "area rugs") with links to individual pages for each item that fit in that category. The first thing I noticed was that there is a URL hierarchy (something the website I work on lacked). So he was good there...

I started noticing, as I was looking through both websites, that they were structured exactly the same way. The navigation on the left was the same, linking to different URLs, but the content on the pages looked very similar. I grabbed a couple of the URLs that were focused on the long tail terms and pasted them into Copyscape. The report kept pulling not only the other site as the first match, but other sites that sold the same products. This is a very common issue with eCommerce websites - since they don't have the time to write their own copy for each product, they tend to pull it in dynamically through syndication. Without enough unique content on those pages, the site appears to be duplicating what all those other sites have. It's not a seriously damaging issue on its own, as Google tends to understand syndicated content, but if a site doesn't support that content with something unique, it just won't rank as well as the ones that do.
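Copyscape does this at web scale, but the core idea - measuring how similar two blocks of copy are - can be sketched with the standard library. The product descriptions below are invented:

```python
# Compare two product descriptions and report how similar they are,
# a toy version of the duplicate-content check done with Copyscape.
from difflib import SequenceMatcher

page_a = "This hand-tufted wool area rug adds warmth and texture to any room."
page_b = "This hand-tufted wool area rug adds warmth and style to any room."

ratio = SequenceMatcher(None, page_a, page_b).ratio()
print(f"{ratio:.0%} similar")
if ratio > 0.8:
    print("likely duplicate content")
```

The 0.8 cutoff is arbitrary; real duplicate-content tools compare against the whole web, not one known page.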

Martin's sites had a bit more of an issue, though, since he has two sites with the exact same content and the exact same structure. When I compared the terms that both sites got visits from during the high month, I noticed that not only were a lot of the terms the same, but a lot of the terms included site #2's domain and name.

Compare Terms from both sites with visits from organic traffic.
Note: the domain name has been changed, and so have the numbers to protect the client.
I think at this point it was very safe to say that the site took a hit by the Penguin update to the long tail terms due to both sites duplicating each other.

My Email and Recommendation


After spending a couple of hours on the site, I drafted this email and attached the excel document I used to analyze the sites (note: the email below is changed slightly to protect the client):
Hey Martin -
So I dug into both sites and the Google Analytics to see what’s going on. I’m attaching my excel doc if you want to see my work, but it looks like both sites definitely took a hit of some sort.
 Moz.com keeps a list of when updates happen so you can keep an eye on things: http://moz.com/google-algorithm-change. There was an update at the end of January and then another big update in March that may have led to you losing your rankings. I've seen this drop in other sites that are built very similarly to yours – so I dug into the analytics to make sure that my assumptions are correct.
 What I found: I first compared visits from organic traffic (SEO) against your average position the past few months (Webmaster Tools only goes back 90 days, so I couldn't go back to January, unfortunately).
 Site #1 definitely saw a decrease in traffic along with the drop in conversions (pasting the charts below for you to look at). With Site #2 there was a drop in traffic, but the average position didn't seem to drop as much. Usually this would be a sign that people aren't searching as much, so I wanted to check your keyword count and impressions week over week. If the number of terms seeing impressions drops from one week to the next, it is usually an indicator of a penalization or a hit from a Panda or Penguin update. I'm not pasting those charts in here since they are really raw, but you can see them in the excel sheet. The terms that have 2-5 words in them took a huge hit, while the one-word terms and longer tails appear to be sticking around. I toggled from keyword count to visits from the keywords and those sets stay pretty consistent in dropping.
 What this means is that you most definitely took a hit in rankings from the updates. Not just rankings dropping, but a bulk of your 3-5 word terms dropped out of the index completely. Those 3-5 word terms are also the bulk of where your visitors come from – those long tails are higher converting terms and can really affect revenue if they drop off. It looks like both sites are built very similarly and have a lot of the same content. I compared the top referring terms both sites saw in your highest traffic month and both get traffic for "your domain", which isn't good. They both get traffic from "broad term", but site #2 has site #1 beat there. They also both have several long tail terms that are the same.
 When I run a report on copyscape.com to check for duplicate content – site #2, along with a few others, comes up (included link directly to copyscape). The "product" rugs page on both sites is exactly the same – almost word for word.
 It’s kinda fun to have two sites show up for the same terms, since you could get double the traffic. In fact that’s what my company does – and what I manage. We have dozens of “portal” sites to grab leads to sell to car dealers. But if Google has any idea that both sites are connected then both sites get penalized. I think this might be what has happened here.
 Your first solution would be to get unique copy on all of the pages of the site. I know it’s tough writing for all of those pages, and copy writers can be expensive. There is an alternative called TextBroker (http://textbroker.com). They have writers that bust out copy pretty quickly (2-3 day turnaround) at a pretty reasonable rate.
 I would recommend getting an account set up and start asking them to write for your pages. Even your homepage content – there is a lot of it, but it looks to be pieced together from other content on the web.
 The order I would have them do it in is in order of the pages that had the most traffic in your highest month, and then work down from there.
 Once you get them going on that – I can do a full keyword analysis, check to see where the opportunity might lie, and get you a complete plan.
 The excel doc is attached – let me know if you have any questions.
 Hopefully this was helpful :)
All in all it took me just a couple of hours since this is what I do for our executives regularly, so I didn't charge Martin for the work.

SEO for Good!


Martin was so excited and appreciative of the work I had done, and what I had found that he asked me if he could pay me in some way - "..if anything to help the school in Nicaragua". I gave Martin the link to donate to the school, and he did.

The money immediately went to help buy supplies for my husband's students we are bringing with us. They had to pay for their immunizations out of their own paychecks, don't have sleeping bags (we are loaning bags to them), and pay for anything else they need themselves. I wanted to help them so that they could focus on helping build the school, not stress over having everything they need for the trip.

In the end, I helped Martin with his websites because I like to help small businesses succeed, and Martin returned the kindness by helping the students with their supplies, so that they in turn could help build the school for children in Nicaragua.

Everybody wins!







Tuesday, July 30, 2013

Anatomy of the URL and Stuff

I'm sure you are looking at the URL above and thinking to yourself, "Wow, I never realized that all that stuff meant something." Oddly enough, it actually does... As the world wide web has changed into a search-friendly, user-interactive playground, the formation and meaning of the URL has evolved considerably into a very significant factor, not only in search engine compliance but in how people use websites. Lately I have been helping clients understand how their websites are structured and how servers, browsers, and users work together. It's something we search optimizers view as simple, yet it can be so complex to someone who doesn't understand how it all works. So here is the URL broken down piece by piece and explained.

First - What is a URL? 
A Uniform Resource Locator is a website address that holds very important information between each "." and "/", much like the address to your home contains a house number, city, state, country, etc. This allows the browser to connect to a specific website, directory/path, and/or file so that the user sees the page you want them to see. A URL consists of the following:


Hypertext Transfer Protocol Established by English computer scientist Tim Berners-Lee in 1990, the hypertext transfer protocol is a request/response standard in which the client is the application (a user on a web browser such as IE, Firefox, Safari, etc.) and the server hosts the website itself. The client submitting HTTP requests is typically referred to as a user-agent (or user), with the responding server—which stores or creates resources such as files (.html, .asp, .php, .css, etc.) and images—referred to as the origin server.*

  WWW (World Wide Web) or "sub-domain" 
The WWW is typically placed before the main domain of your website URL, referencing the World Wide Web. Remember the game you played in elementary school where you could start your home address with your house number, street, city, state, and then go as far as your country, continent, and even Earth? The WWW is the address starting with "Earth". In some cases, what we call a "sub-domain" can replace the WWW in your URL, referencing a whole new website within your existing domain. Search optimizers can use this as a way to target certain key terms. For example, a real estate agent targeting a specific city will use http://city.domain.com and thus have a leg up when ranking for anything within that city. In most cases the sub-domains will link to the main domain and, since they are treated by most search engines as a domain all its own, that counts as an external link, boosting the rankings for the main domain. It is highly recommended that you avoid this linking technique, though, as it is only tricking the search engines and in the end will hurt your rankings rather than help.

  Domain Name System (or DNS) 
The Domain Name System was established so that the common user can understand, in simple terms, the location of a website. A website's files are usually stored on a server that points to a specific IP address (much like a phone number directs someone's call to your phone). In order for the general public to know where to locate a certain website and its files, the domain name resolves to that particular IP address. In addition, the Domain Name System also stores other types of information, such as the list of mail servers that accept email for a given domain (such as you@yourdomain.com). 

Top-level Domain Extension 
The domain extension originally consisted of the generic gov, edu, com, mil, and org. With the growth of the internet, the addition of country extensions and other such categories have come into play. The most recognized of the extensions is of course the .com. If you are optimizing for a specific country and language, then the best route to take is to register your domain with that specific country's extension. This will help the search engines recognize that you are targeting that particular audience and will rank that site accordingly. Be sure that your country specific site is in the native language for that country to avoid any duplicate content issues. Do also be careful of linking from that domain to your main domain as once again the site will be penalized. 

Directories and Files 
Here's where the fun stuff comes into play. Just as your computer organizes your Word docs, Excel files, and other such files into folders, a server structures your website files the same way. A "directory" or "path" is much like a "folder" on your computer. In standard (old school) HTML development - before the days of dynamic websites powered by databases and user interactivity - a file would be created, named "index.html" or "default.html", and placed either in the main domain folder (which the DNS resolves to on the server) or in a named folder (to help the webmaster organize the site's files). As the technology grew and database-driven, interactive websites advanced, the structure has pretty much stayed the same, with the addition of "parameters" that reference part of a database and return content on a page based on those parameters. (Have I lost you yet?) Let's go back to the basic structure of static HTML files and go from there...
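Before moving on, here is a quick sketch of the pieces described above, pulled apart with Python's standard urllib.parse. The URL itself is made up for illustration:

```python
# Break an example URL into the components covered above: protocol,
# sub-domain + domain + extension, directory/file, and parameters.
from urllib.parse import urlparse, parse_qs

url = "http://www.example.com/rugs/area-rugs.php?color=red&size=8x10"
parts = urlparse(url)

print(parts.scheme)           # http            (the protocol)
print(parts.netloc)           # www.example.com (sub-domain + domain + extension)
print(parts.path)             # /rugs/area-rugs.php (directory and file)
print(parse_qs(parts.query))  # {'color': ['red'], 'size': ['8x10']} (parameters)
```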

A dynamic website is one that has a few static pages (in other words, pages that are coded and only editable by a developer) with parameters that pull in content or trigger specific actions from a database. A basic dynamic page pulls words, images, etc. from a database, and can create multiple pages with different content from one basic page. A more complex dynamic page (or site) is something like Facebook or Twitter, which recognizes whether or not you are signed in with a username and password and shows you either your profile page (if you are signed in) or a "please sign up" page (if you are not signed in or don't have an established username).
In order to help understand this, let's talk about how a database works. A database is essentially similar to an Excel spreadsheet or a table in a Word document: it has a unique identifier for each line (or row) and holds different content for each line item. Example:
Username | Email | First Name | Last Name
Sujo234 | bob@bobsemail.com | Bob | Sujo
Forjill23 | jill@jillsemail.com | Jill | Forman
In this example the username is the unique identifier with the email, first name, and last name as different parameters for that username.

The content will be different on each page. With dynamic content the possibilities are endless as far as how many pages you can create from developing and designing just one file. A great example of a dynamic page created for search optimization purposes is on usedcars.com - if you search for "used cars in oslo mn" you see the "UsedCars.com Oslo MN" page in the results. Look at the URL in the address bar when you go to that particular page - http://www.usedcars.com/browse/mn-24/oslo-163.aspx. In this case the page is pulling in the unique IDs "oslo-163" and "mn-24", just as the username is the unique ID in the table above.  
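The dynamic-page idea above - one template, many pages, each built by looking up a unique ID - can be sketched in a few lines. The data and page template here are invented; this is not how usedcars.com is actually implemented:

```python
# One template, many pages: each page is rendered by looking up a unique
# ID in a table, like the city/state IDs in the usedcars.com URL above.
locations = {  # unique ID -> record (invented data)
    "oslo-163": {"city": "Oslo", "state": "MN"},
    "fargo-58": {"city": "Fargo", "state": "ND"},
}

def render(location_id):
    """Build the page title for one location from its database row."""
    row = locations[location_id]
    return f"Used Cars in {row['city']}, {row['state']}"

print(render("oslo-163"))  # Used Cars in Oslo, MN
print(render("fargo-58"))  # Used Cars in Fargo, ND
```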

SEO Friendly URL 
In order to make your dynamic URL friendly for search engines, you must use a rewrite. A great resource for rewriting a URL is the Apache Rewriting Guide. Some open source content management systems (such as WordPress, Drupal, etc.) already do the rewriting for you, and all you have to do is enter what you want the URL to be (be sure to include your key terms, separated with dashes "-" and not underscores "_", for search happiness). Who would have thought a URL could be so complicated? But when it comes to search optimization and understanding basic website development, it is very important to understand how the URL works, how it is structured, and how to make sure your site is URL and search engine compliant.

*http://en.wikipedia.org/wiki/Http_protocol
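The "dashes, not underscores" rule is easy to automate. Here is a simplified sketch of turning a keyword phrase into the kind of friendly slug a rewrite rule would then map to the dynamic page - real CMS slug generators handle more edge cases than this:

```python
# Turn a keyword phrase into an SEO-friendly slug: lowercase, with runs of
# anything non-alphanumeric collapsed to single dashes (never underscores).
import re

def slugify(phrase):
    slug = phrase.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumerics to dashes
    return slug.strip("-")

print(slugify("Red Wool Area Rugs (8x10)"))  # red-wool-area-rugs-8x10
```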


Sunday, July 28, 2013

Facetwitetiquette - How to Suck at Facebook and Twitter


Facebook

What not to do on Facebook

The constant Gardener, Mafia Hitman, Virtual Pet Owner, or Whatever

the updates and invites never end...

The Bad Marketer

become a fan, join my group, comment on one of my million updates...

The Crude Photo Tagger

tagging everything from that party last night you don't remember to taking shots off of a girl's belly button.

The Rash

following you around commenting on all your posts and liking every photo...

The Unfiltered

they post everything and anything...

The Most Popular Person EVAR

inviting everyone to everything...

The Twitterfied

linking twitter updates to Facebook flooding their profile with meaningless tweets...

The Bored Quizzer

taking every quiz available...

The Passive Aggressor

posting well thought out updates without mentioning names...

The Annoyingly Proud Parent

using their child as their profile pic...

Twitter

What not to do on Twitter

The Unproductive Tweeter

tweets updating every minute...

The Retweeter

retweeting everything they see...

The Conversationalist

bouncing back and forth between one person...

The Untweeter

Tweeting from Foursquare, tweetmeme, or the like...

Tuesday, July 23, 2013

Categorizing Keywords

For those of you SEOs who manage very large sites and map your keyword categories to sections of your website - you know how difficult it is to categorize your terms and track their performance. Well, I have to say that after searching, asking, and digging around for a tool that does exactly what I am talking about, I finally came up with a solution. It's a bit of a workaround in Excel - but it's the best I can do until someone comes up with a tool that categorizes keywords for SEO.

Know Your Keywords and Categories


Before you get started categorizing the terms that come to your site, you should know what keywords you are targeting, and the combinations of those terms as well. I'm going to use a flower shop's website as an example for this particular blog post, but categorizing is something you can do with any website. At the very least, you can categorize terms into "Broad" and "Branded" to get you started.

Most keyword tools can help you establish what categories to target. Google's Keyword Tool or WordTracker are just a couple of the many tools available on the web.

Another way to figure out which terms fit in which categories is by grabbing search data (referring terms in Google Analytics) for your site for the past few months or year. I personally spent some time categorizing keywords in Excel by using filters - for example, having the sheet show all terms including "anniversary" for the "anniversary flowers" category. It takes a lot of work and time, but in the long run you will have a more accurate list of the terms to run the Lookup against.
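That Excel filter step can be sketched in Python too - a minimal example of pulling every term containing a category word. The terms are invented for the flower shop example:

```python
# Pull every referring term containing a category word (e.g. "anniversary")
# into that category's list - the same thing the Excel filter does above.
terms = [
    "anniversary flowers",
    "1st anniversary flower ideas",
    "birthday bouquet",
    "red roses",
]

anniversary_terms = [t for t in terms if "anniversary" in t]
print(anniversary_terms)
```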

Setting Up Your Template

Download the Template

Now that you have all the terms possible in all of your categories it's time to start setting up your template. You are going to want to Download the template I have set up in Excel. You can start from a fresh Excel document if you want, but the template has directions (in case you lose this blog post somehow) and the Lookup formula is in there.

Once you have downloaded the template it's time to get it set up to work for your keywords.

In the following steps - I am going to walk you through setting up the template and then categorizing the terms. If you don't have terms that you can use already, I have a zip file you can download and walk through the example with me to get familiar with how this works.

Copy your first set of categorized terms and paste them into the first tab, marked "Broad". Since every site usually has a "Broad" category of terms, I figure that's probably the best one to start with. In the case of this example, "flower shop", "online flower shop", and "best flower shop" are the terms that fit under the Broad category.

If you have the .zip folder downloaded, open up the "Terms" Excel doc and you will see the words already categorized for you. There are "Broad", "Branded", "Birthday", "Anniversary", and "Wedding". Click the drop-down next to "Category", click "Select All" (to deselect all), and then click "Broad". You will see the terms filter to just the "Broad" category.

Next, select all of the terms in the "Keyword" list - copy and paste them into the "Broad" Tab.
We will then sort the terms in alphabetical order. (An exact-match lookup doesn't strictly need sorted data, but sorted tabs are much easier to scan and de-duplicate.)


Highlight the Column with your keywords
Click "data" > "sort"
Select "My data has headers"
Select under "sort by" the column your keywords are under (should be column A)
Click OK

Double click the Tab and rename it with the one word name of your category.
Highlight all of your keywords in the column (just the cells that have words, not any blank cells).
Type the name of the category (stick to one-word naming) into the Name Box in the upper left. You have now named your table.

Do this for "Branded" and the other categories as well. You are going to have to create a new tab in the template to fit all the categories.

If you have not downloaded the .zip file and are working off of your own terms, creating new tabs and naming them is probably going to be something you will need to do. But don't worry, the template will still work.

Now that you have all of your keywords in your Template's Tabs with names and sorted it's time to set up your Lookup string.

Setting up Your Lookup


The way the Lookup works in this case is we are going to ask Excel to look at one Keyword (one cell) and match it up to one of the terms in the Tabs we have set up. If it matches one of those terms then we tell Excel to place the word into that Cell. If it doesn't, then we just leave that cell blank.

The string looks like this:
=IF(ISNA(VLOOKUP(B2,Broad,1,FALSE)),"","Broad")
  • B2 is the cell of the keyword we want to look up.
  • "Broad" (the second argument) is the named range we want to look for that keyword in - the one you named on the "Broad" Tab.
  • 1 tells VLOOKUP to return the value from the first (and only) column of that range.
  • FALSE tells the Lookup to do an exact match. TRUE would do an approximate match, which won't work for our purposes here.
  • We leave the "" as a blank - but you can put "not categorized" or "misc" to show that a term isn't in a category. Though for our purposes here, we keep it blank.
  • "Broad" (the last argument of IF) tells Excel to put the word "Broad" in the cell if the keyword matches one of those in the Broad range or Tab.


See - it's that easy...

What you are going to do next is replace the word "Broad" or "Cat1" with the name of your table, Tab, and category. This is why we name the Table, the Tab, and the Category the same so that our life is much easier when setting this string up.
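If you find code easier to read than Excel formulas, here is the same exact-match idea sketched in Python, using the flower shop example terms:

```python
# Mirror of the exact-match lookup: label a keyword "Broad" if it matches
# one of the Broad terms exactly, otherwise leave it blank.
broad = {"flower shop", "online flower shop", "best flower shop"}

def categorize(keyword):
    return "Broad" if keyword in broad else ""

print(categorize("online flower shop"))       # Broad
print(repr(categorize("anniversary flowers")))  # '' (blank - not in category)
```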

Now your template is ready for you to paste some keywords with data and grab some numbers.

Gathering Your Data


Open up your Google Analytics account - if you don't have Google Analytics, pretty much any tracking tool that has a list of referring terms with some sort of data is fine. You can expand and contract the columns to the right of the terms as you wish. The template you will download will have the columns set up just for the purpose of exporting referring terms with visits and such from Google Analytics though.

Log into your Google Analytics account.
Click "Traffic Sources" > "Sources" > "Search" > "Organic"
Select the date range you would like to report on.
Scroll to the bottom of the report and show 5,000 rows.
Scroll back to the top, click "Export", then select "CSV".
After the file has downloaded, open the excel file.
Highlight JUST the cells that include the keywords and your data (ignore the first few rows at the top with the date information, and the rows at the bottom that summarize the data).
Copy those cells, and paste into your "Master" Tab.

Note: If you have multiple dates you would like to track, you can export the different date ranges, and then add which keywords go with what date in the Master Tab. This will allow you to see trends of categories.

I added an Excel doc called "Analytics Organic Search Traffic" with some terms and fake data that you can play with. There are three tabs that I added dates for each day's data. Start with just the one day and play with that to get familiar with percentages. From there you can play with all three dates and work on your trends to see what categories are trending up and down.

Completing Your Lookup


Now that you have copied and pasted the keywords into the "Master" Tab it's time to get all of those terms categorized.

Select the top row with your categories and your "All Categories" cell
Copy just those cells in the top row
Highlight the next row (the same cells, just below) and hold down the "Shift" key
Scroll down to the last keyword record
Still holding the Shift key, select the last cell under the "All Categories" column - this highlights all of the cells for those categories to Lookup the keywords.
Hit "CTRL+V" on your keyboard (this quickly pastes the Lookup formulas for each line)
Be patient, as it may take a while for your Lookup to complete (depending on how many keywords and records you have)
The "Master" Tab should look something like this:

Playing With Your Data

The most efficient way to gather information from your data is to copy the entire "Master" Tab and paste as values into a new Excel sheet.  This way you won't have to wait for the Lookup to complete each time you sort, pivot, etc.

Click the top left "Arrow" in the "Master" Tab
Right Click and select "Copy"
Open a new Excel Doc
Right Click and select "Paste Special" > "Values"

From here you can create pivot tables then sort them into pie charts, graphs, and all sorts of fun reports to see how your keywords are performing.

I personally like to start with a quick pie chart to see which category of terms brings in the most traffic. At times we will have a drop or rise in traffic, and it's good to understand which category of terms is fluctuating. Copying and pasting terms by dates (weeks, months, or even a set of a few days) helps me see which categories are fluctuating on a timeline trend. Knowing which categories bring in the most traffic, I can then make decisions on which parts of the website we need to focus our efforts on to increase traffic.
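The pivot-table and pie-chart step boils down to summing visits per category and computing each category's share of the total. A rough Python equivalent, with made-up numbers standing in for the pasted-as-values "Master" data:

```python
from collections import defaultdict

# Hypothetical (category, visits) rows, the kind of values you'd have
# after pasting the "Master" tab as values.
rows = [("Broad", 400), ("Persian", 250), ("Outdoor", 100), ("Broad", 250)]

def traffic_share(rows):
    """Total visits per category and each category's percentage share,
    the same numbers a pivot table / pie chart would show."""
    totals = defaultdict(int)
    for category, visits in rows:
        totals[category] += visits
    grand = sum(totals.values())
    return {cat: round(100 * v / grand, 1) for cat, v in totals.items()}

print(traffic_share(rows))   # {'Broad': 65.0, 'Persian': 25.0, 'Outdoor': 10.0}
```

Run the same calculation per date range and the shifts between the shares are exactly the category trends described above.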

See how much fun categorizing your terms can be?
Now that I have a template I work off of, when traffic goes up I can quickly categorize the terms and let our executives know if our recent efforts have worked.

Thursday, June 20, 2013

How long does it take for Google to recognize 301s?

Or Better Yet - 

It's been over a year and Google still doesn't have the new URLs in the Index


Just over a year ago, I started working on this website that had over 900k files at the top level of the domain. We changed the structure of the URLs to a more organized hierarchy. The pages' content changed slightly, but most importantly, instead of all of the site's pages residing directly under the main domain (let's use a computer broad-to-longtail term structure as an example - like domain.com/computer.html, domain.com/laptop-computer.html, and domain.com/500gb-laptop-computer.html), we changed them to a more representative directory-to-file hierarchy (example - domain.com/computer/ to domain.com/computer/laptop-computer/ to domain.com/computer/laptop-computer/500gb.html).
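To make the restructuring concrete: the flat-to-hierarchy change amounts to a lookup table of old paths to new paths that a 301 handler consults. A minimal sketch using the example URLs above (the mapping is illustrative, not the site's real redirect table):

```python
# Illustrative flat-to-hierarchy 301 map, built from the example URLs
# in the post. A real site would generate this from the database.
redirects = {
    "/computer.html": "/computer/",
    "/laptop-computer.html": "/computer/laptop-computer/",
    "/500gb-laptop-computer.html": "/computer/laptop-computer/500gb.html",
}

def redirect_for(path):
    """Return the (status, location) a 301 handler would send, or None
    if the requested path has no redirect entry."""
    new_path = redirects.get(path)
    return (301, new_path) if new_path else None

print(redirect_for("/laptop-computer.html"))  # (301, '/computer/laptop-computer/')
```

In practice the same mapping is usually expressed as server rewrite rules, but the lookup logic is identical.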

Why the URL Hierarchy?


The quick and simple explanation as to why we did this is that while URLs are fairly dynamic these days, the bots like to see and understand how a website is organized on a server. Remember the old-school folder and file structure from back when sites were built in HTML? The URLs you have today should represent that organized file structure as much as possible. I cover this in my SEO Workshop (slide 23) - but I also found a pretty good article that explains the hierarchy relatively simply and quickly.

The process in setting the 301


Since the entire 27+ million pages on the site were mostly files located directly under the main domain, it was difficult to understand which pages fit under which category so that we could organize them. I went to our keyword analysis, bucketed each focus term out, and then organized the correlating URLs to fit within each bucket. Once that was done, I worked with the Developers to pull the naming from the database (dynamically) into the directory and file structure that fit the buckets. Some of the keywords I knew I eventually wanted to build out with supporting pages, so those got directory levels instead of page levels for future optimization (and to limit more 301 redirecting later on).

I mention a bit about breaking the site up into sections for analytics purposes in my previous post "SEO Issues - is it Penguin? Is it Panda? or is it me?" under "Figuring out what was hit by Penguin". The "video" to the left is a quick (and very raw) animation to help explain exactly what we did. Now that the site is organized, it not only helps the bots understand the structure, but also helps us understand which sections bring in what SEO traffic in Google Analytics.

How Long Does it Take Google to See New URLs via a 301 Redirect?


This whole undertaking was completed over a course of 2-3 months, starting in June 2012 (last year) and finishing up with the last of the redesigns and URL changes in August, with one final directory change (no redesigned pages) in January of this year (2013). The most important ones are still showing 550,000 pages in Google's Index (11 months later):
As I Googled to see if others have a solution for speeding up the de-indexing of these old URLs, or even if anyone has had the same problem, I found a lot of questions in various forums (both reliable and unreliable) but no real articles, blog posts, or anything from reputable SEOs. The most common answer in the forums is to just "wait". It's, of course, what I tell others when they ask me: "Be patient, Google will eventually hit those pages again, recognize that they have changed, and correct the index." But after nearly a year and so many pages, this was getting ridiculous.

I spoke with my friend (and SEO mentor) Bruce Clay who came back with the suggestion to add an .xml sitemap and submit it to Google with the old URLs we want removed.

It was kinda making sense: because those old URLs are no longer linked to, and there are so many of them, Google wasn't crawling them as much anymore. They were just sitting there in the index and not getting "updated".

Unfortunately, getting a sitemap added is not an easy feat. I would have to define the strategy and present it to the powers that be, with data to back up the success metrics, in order to get the project prioritized. With so many other initiatives needed for SEO, all of which were more important and affected the business in a positive way, it was in my best interest to keep pushing those and not deal with the sitemap.

My workaround, though, was about as black hat as I would get (Matt Cutts, if you are reading this, I apologize and throw myself at your mercy, but it had to be done). One weekend over a month ago, I grabbed one of my many impulsively purchased domains and quickly set up hosting and an old-school HTML site that consisted of one page. I then exported all of the links from the Google "site:" search through a Firefox plugin called SEOquake that exports the results into a CSV file. It's not the prettiest, and there was a lot of work still needed to get down to just the URLs, but it was the best solution I could find (note: if any SEO reading this knows of an easier way to do this, please add it to the comments for posterity). I then parsed out the parameters in the URLs in a separate document and used those as the anchor text for each URL. Finally, using Excel, I concatenated the URLs and parameters (now anchor text) into an HTML href string.
Then copying and pasting the "string" column into the html code, the page looked like:
The page wasn't the prettiest, and it had thousands of links (the above is just an example) so it was bad all around, but the point was to get those links crawled by Google.
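The concatenation step amounts to wrapping each URL in an anchor tag whose text comes from the URL's last path segment. A rough Python equivalent of that Excel string-building, where the parsing rules (strip ".html", turn hyphens into spaces) are my assumptions about the parameter extraction:

```python
def href_string(url):
    """Build the <a href> line the Excel CONCATENATE produced, using
    the URL's last path segment as the anchor text (parsing rules here
    are illustrative assumptions)."""
    anchor = url.rstrip("/").rsplit("/", 1)[-1]
    anchor = anchor.replace(".html", "").replace("-", " ")
    return '<a href="{0}">{1}</a>'.format(url, anchor)

urls = ["http://domain.com/500gb-laptop-computer.html"]
for u in urls:
    print(href_string(u))
# <a href="http://domain.com/500gb-laptop-computer.html">500gb laptop computer</a>
```

Paste the output of a loop like this into a bare HTML page and you have the same wall-of-links page described above.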

Of course every SEO knows that you can't just build a website and expect it to immediately get crawled - right? 

So I set it up in Google Webmaster Tools and submitted the page to the index:
I even got fancier to ensure Google would see the page and crawl all of those old URLs, and +1'd it on Google.

Did it work?


I checked the URLs this evening to see how many Google is seeing and the number has dropped from 550,000 to now only 175.

I took the domain off of the server, and now have it parked elsewhere (back where it belongs) and removed the webmaster tools account. All traces of it ever existing are now gone, and the small moment of my attempt to get those URLs removed has passed.

Thanks For the Advice Jenn - Now I'm Going to Try This!


If you have come across this post and you need to do something similar - I'm going to put up the same disclaimer they do in commercials when a very dangerous stunt is performed.
Do not attempt this at home - this stunt was performed by a trained professional on a closed course.

So, don't go adding a bunch of links to a random domain just because your attempt from a few weeks ago to 301 pages isn't working yet. The links on the external domain were far too many for the domain and page, and were extremely spammy. In addition, all those links pointed to pages that were redirecting and were supposed to pass value to the new URLs, so those pages now had many spammy links pointing to them from a very spammy domain. If left up too long, or not done correctly, this could actually cause more damage than it could ever help.

If you have any questions, or feel you need to try this same strategy, please don't hesitate to contact me. I'm here to help, and want to ensure that your website has considered all possible options before attempting any such trickery.

Some Helpful Links on the Very Subject:



Wednesday, February 20, 2013

Conference Adventures Part..: Dernier Versement (final installment)

All-in-all the event went as well as could be expected. Financially, I am left having to put up ~$1,000 to cover the remaining expenses of the event. The total cost for the event ran me ~$9,000.00 (slightly under), with ~$8,500 (give or take a few hundred) in registrations and no money from sponsors (sponsors this year received their sponsorships in exchange for distributing swag or offering award prizes).


Why do I do This?


When I started writing my first post documenting the organizing of this conference, I mentioned in my post titled "Conference Adventures Part Un - 18 Days 'til":

Planning and organizing a conference is not an easy feat by any means, and I often ask myself why I keep doing it...

Which now brings me back to asking myself once again: why do I keep doing it? I honestly can't find a one-sentence answer that rationalizes the time, effort, stress, financial strain, or the pressure it all puts on me.

I'm sure you're probably reading frustration in my post here, and probably in my previous posts, but know this - over the last few days I have been coming up with ideas, excited that registrations are already coming in, and I almost have a full speaker list for EmMeCon Seattle, which isn't happening until June. I have even started the groundwork of organizing a 2-day Search and Social Series, and/or an SEOGoddess 4-hour SEO Workshop in April (so that I don't have to wait too long to do another event).

So why this odd addiction to holding events? I got to talking with a few of the attendees and speakers at this last EmMeCon, and the one word that kept popping out of my mouth (and being repeated back to me) was integrity. My events, though small and not highly profitable, still have integrity. I haven't sold out with mindless topics, uninspiring speakers, and tickets priced insanely low just to appease sponsors with more attendees. Even after all these years of organizing conferences, I still constantly remind myself, through the entire process, what it is that I wanted to do when I started these events. With EmMeCon, I want people to gain inspiration from the amazing people I have been lucky enough to have access to. People like David Evans Ph.D., who has taught at the University of Washington, educating Masters students on Psychographic Segmentation and the importance of understanding the minds of the users they are marketing to. Or Gillian Muessig, who has guided not one but 3 children into thriving adults, and in the process molded 2 of them into very successful and inspiring SEOs. The list of inspirational individuals whose brains I feel privileged to pick, gain inspiration from, or have been helped by in some way is a mile long, and ever growing.

Because of this desire to share, I take careful consideration with the details of every event I organize. It may run me ragged, and I get flustered and exhausted from it, but reading the tweets, hearing the feedback, and knowing that at least one person (if not many) has gained inspiration from the event is what I deem success.

On Thursday night we wrapped up the event with a packed house for the Meetup Group organized by Chase McMichael (CEO of Infinigraph). The tweets were still coming through strong, and the room was full of questions and discussions. After the Meetup wrapped, I began packing things up, and while exhausted I was bouncing around with excitement as the folks who lagged behind thanked me for putting on such a great event, asked me all sorts of questions about how I got into this and how I came up with the idea for the event, and even asked more about the event in Seattle.

It's that feedback that I get that keeps me going. 

I promise that I won't "sell out" and start making this about the money, I won't ever forget what this event (or any of my events) are there to accomplish, and I promise never to lose the integrity that I still hold onto.

If I do - someone please take me out back and put me out of my misery...?

Friday, February 1, 2013

Conference Adventures Part..: en Cours (in progress)

It's the Friday after the event finished, and I'm finally getting a chance to sit down and relax long enough to write a summary of how the event went during the week.

How did the event go?


I have been asked the same question several times throughout the course of the week. The first morning of the first day there were only a handful of people in the audience. I had spent the evening before setting up the room and the audio visual until midnight, and didn't get to sleep until well after 2am. With just 3 hours of sleep (I spent another 1.5 hours typing up notes for the next day), I walked up on stage, welcomed everyone, explained a bit about the conference, dove into announcements, and then introduced our first mini-Keynote, Irene Koehler. I am always worried about the first speaker of the event. The speaker and the subject chosen set the tone for the remainder of the event. Irene is a seasoned and very respected speaker. She is an expert in her field and knows more about social marketing on LinkedIn than anyone I know. I think Benj's tweet summarizes the one word Irene gave us all, used for the remainder of the event: "Stalking"
The room filled up as people trailed in throughout the day. The VIP room was a hit, and no one tried to sneak in for lunch. I didn't hear a single complaint, and none of the volunteers said anything to me about anyone being upset. I have to give a special shout-out to Tracy, Lydia, Brenda, and my daughter Katie for all their help on the first day. Especially Tracy, whose car was randomly dinged by a crazy driver in the parking garage. Poor Tracy was so distraught that she couldn't come back for the remainder of the event.

Everything went so smoothly the first day that, one by one, my volunteers said "It looks like you have everything covered, so I won't be coming tomorrow," or something similar. The evening of the event was one of the best nights after a conference I have ever had. The Speaker's Dinner was great (despite our food coming 85 minutes after we arrived and my steak being very, very well done). Bill Leake and Aaron Kronis blew through the wine and entertained my daughter while I moved on to my Birthday party with my co-workers and a few other friends. All-in-all, day one is going into my record book as a perfect first conference day.

As the sun rose the next day, I was already busy getting ready to head over to the hotel. My daughter got ready for school but then crashed and said she felt very ill. I told her to stay home, and went on to the event. I struggled with the problem of how to check people in when I needed to be on stage most of the day. I asked our hotel rep, and she sent one of their staff, but the staff member who arrived was clearly very upset that she had to be there. I asked Stephan (who I work with and who had a pass to the event) if he would help out by checking people in. We moved the table into the room so he could watch the talks and hand people their badges as they trailed in. It worked out great!

At some point in the middle of the day I was hit by a ton of bricks and could barely get my energy up enough to even introduce the speakers. I hadn't had anything to drink the night before, so I had no idea why I was feeling so ill. Hoping it was just stress, I powered through. The Meetup organizer showed up, and I talked to her about getting set up for the talks that evening and then leaving. I asked Aaron Kronis to help her out that evening, and I eventually went home to bed. I really don't remember much of that day - I remember waving to speakers and saying "just introduce yourself", checking the video camera, taking pictures, and then sitting down to rest until the next speaker went up.

The last day was the quietest day I have ever experienced at any of my events. Even my workshops have had a better turnout than this conference did. Was it the free passes that kept people from wanting to come to the event? If they all pay, will they show up and stay the whole time? Bill and I talked about where this conference should be headed: cutting it down to 2 days, holding it just in Seattle, and looking into other cities (like NY or Vegas). The conference has an "eclectic" (as Bill called it) array of speakers and topics, and as a result we get such a mix of attendees that it's quite refreshing. It took over 10 years for TED to finally gain some notoriety; perhaps this event is on the same path.

Sunday, January 27, 2013

Conference Adventures Part Sept: Pricing

I woke up this morning not having to go to Personal Training (did I mention I have been going every morning for the past 8 weeks to lose weight for my 40th Birthday?), and I was actually looking forward to the chance to sleep in. I went back to sleep (after the cat settled down around 5:30 am) and woke up at 9:30 am ready to take on the world. A friend called, bored and wanting to get out of the house, and my daughter needed to do something other than Skype with her friends all day, so we all went shopping for the day. First stop Staples, then Target, then Macy's (and the Hillsdale Mall). It was nice...

2 Days Until the Event


Status Update: 
Speaker's Presentations (PPT or other): 1 (Simon still Rocks!)
Agenda: In print form and solid as it could get
Logistics: BO's approved, Tracy Ng all set to help out, ready to get this party started
Marketing/Promotion: Facebook Poll set up for the Awards, no way I'm going to get to update Lanyrd or finish last year's videos, print materials for the event designed and ready to print tomorrow (I ran out of ink tonight).
Master of Ceremonies: Me
Volunteers: 2 Confirmed (Lydia and Tracy)


Figuring out Pricing 

One of the key aspects of each conference I plan is how that event is priced. Price it too high, and people will not only refrain from registering, but they will complain (and often publicly). Price it too low and you de-value what the event has to offer... not to mention that with less money coming in, there is less available for quality food, a decent location, and extras like parties, swag, etc. So I try to find a decent location that has a well-lit space, high ceilings (I got a complaint about low ceilings once), and serves high-quality food. Once the estimated overhead is figured out, I take the number of attendees from the previous year, divide the set costs among that number, and then add the fluid costs (like food) to each registration. Essential Passes don't get food and extras (parties, Speaker's Dinner, activities, etc.), so they don't pay for food and such. Full Passes get food and extras, but they don't get the hotel stay included, so they don't pay for a stay. VIP passes get it all, so they pay for it all. It's pretty straightforward... The downside is that I do not make any money off of the events. For now, I am ok with that, but someday it would be nice to get a bit of a kick-back from them.
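The overhead math above can be sketched as a tiny function: split the set costs across last year's attendee count, then add the fluid per-head costs only for passes that include them. All numbers below are made up purely for illustration, not the event's real figures:

```python
# Hypothetical sketch of the pricing approach: fixed overhead is split
# across the expected attendee count, then per-head fluid costs (like
# food) are added only for the passes that include them.
def pass_price(fixed_overhead, expected_attendees, food_cost, includes_food):
    base = fixed_overhead / expected_attendees  # everyone's share of set costs
    return round(base + (food_cost if includes_food else 0), 2)

# Assumed numbers, purely for illustration.
print(pass_price(7000, 100, 45, includes_food=False))  # Essential-style pass: 70.0
print(pass_price(7000, 100, 45, includes_food=True))   # Full-style pass: 115.0
```

A VIP-style pass would simply add further per-head costs (hotel nights, parties) on top of the same base.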

So, I checked out my competition this evening, worried that my next event is maybe priced too high. SMX Advanced is an SEO conference held in Seattle every year, and it's a pretty popular one. This year they are the week after EmMeCon in Seattle, but the audience attending is so different (since we don't do SEO) that I'm not worried we will lose attendees because of it. What I did notice is their pricing. Their passes work differently than EmMeCon's.



Basic Pass


They have a "Networking" Pass that gets you into their Expo Hall (which we do not have) but not into sessions or workshops. That pass is $99 pre-reg and $139 regular reg. The closest we have is our Essential, but ours is sessions only, priced at $198 pre-reg and $468 regular registration. I'm now wondering if I should lower the price - but 3 days of sessions should be at least $100/day, and with the few hundred attendees we get, the average of all passes by the time of the event pays its part of the set overhead.

Mid-Level Pass


They have 4 passes, but their mid-level is probably their "All Access Pass" for $1,595, which includes sessions and access to the expo hall but doesn't include Workshops. Workshops are $895, so the two purchased separately would be $2,490 (about $100 more than their bundled "All Access + Workshop" pass). From pre-reg to regular registration, rates go up roughly $200-$300. Our comparable pass would be our Full Pass, which goes for $568 pre-reg and $1,178 regular registration. I priced the pre-reg on the Full Pass so low because it covers its portion of the set overhead just perfectly and encourages people to register early (since it's just $100 more than the regular Essential Pass). It seems to work pretty well, because we have people register for Seattle months before the turn of the new year (9-10 months before the event). At times we have had workshops the day before the conference and included them for all attendees - then sold them separately for $198 for a full day. That price point has become our sweet spot for workshops.

Ultimate Pass


The pass that includes it all for SMX Advanced is their "All Access + Workshop" for $2,395 pre-reg and $2,895 regular registration. I tried to see if attendees get all of the workshops, or just one, but it appears they just get one with the registration. Our comparable is our VIP Pass, which we sell for $1,687.85 - and it includes 3 nights at the hotel. I use the Marriott in Pioneer Square for this event (well, for every event I hold in Seattle), so I know what the rates per night are. To be honest, the hotel gives us a bit of a discount, so I pass that along in the price of the pass.

SMX has been around for many years, and people have told me they get great value out of the event. But the value they get is out of the networking, not the sessions. I stopped going to SMX 5 years ago because I wasn't learning anything anymore, but I did like getting to know the other SEOs there. That reason is why I hold my events (focusing on the networking, but providing really valuable topics as well). So how do they get people to pay those prices, plus get enough booths in the expo hall and sponsors as well? They must make a killing. Perhaps someday EmMeCon will be a big conference with an expo hall and sponsors too...

Thursday, January 24, 2013

Conference Adventures Part Six: Complaints Happen

Time is ticking and the event is fast approaching. My day job has had me busier than I have been since the first few months after I began working there. I had gotten used to the quiet in the office, but the execs had an off-site to solve a lot of issues within the corporation, and as a result stuff is now getting done. It's a great thing to be a part of, but it couldn't have come at a worse time, as I am in the home stretch of the conference now.

Today I saw a Tweet about BlueGlass LA coming up in May, and (since I know the CEO Richard Zwicky very well) I dug around to check it out a bit. I found that this event, which is talked about a lot and seems to have a very loyal following, only has 175 attendees. That's just slightly less than the number EmMeCon gets, and we often get called a "Boutique Conference"; I have even gotten feedback from 2 people who said they were surprised at how small the event was. I felt so crappy that I couldn't get more people, but I kinda don't want more, because then the event loses the specialness it has. Perhaps BlueGlass has found their events to be the same.

4 Days Until the Event


Status Update: 
Speaker's Presentations (PPT or other): 1 (Simon Rocks!)
Agenda: Had to move a couple speakers around - but good now
Logistics: Meals planned, Speaker's Dinner confirmations and meals sent to hotel, AV guy all set to show up Monday.
Marketing/Promotion: Tweets scheduled, Facebook updates scheduled, still need to update Lanyrd, finish last year's videos (probably not going to happen now), and print materials for the event (signs, schedules, promotion of next events), and...
Master of Ceremonies: Yup it's Me
Volunteers: 2 Confirmed


People Complain


One of the aspects of organizing an event that really gets under my skin is the complaints. My first year I didn't get a single complaint, at least not to my face. Everyone was very happy with every aspect of the event, and I even got people coming up to me commending me for doing such a great job with the speakers, topics, food, location, etc. But then we grew - and I didn't get to say "hi" or get to know all of the people attending on a more personal level like I did the first year. Now I get people attending who don't even so much as smile at me, let alone come up and say "hi" or "thanks for organizing such an awesome event".

Today I got my first complaint leading up to the event. An attendee who had been giving me difficulty from the beginning said they were frustrated at the "lack of organization". Ugh - the event hasn't even started...

I try to remind myself that this just comes with planning an event. No matter what you do, there will always be someone complaining about something. Either there are too many emails, or not enough. Or the emails didn't have the information they felt they should have gotten, or the information wasn't displayed prominently enough in the email. Oh, and the food - the complaints about the food. I try to get away with not serving food if I can, but I feel bad that people pay money to be there, and not feeding them is just rude. I finally found a balance: the Essential Passes (which go for $150 - $468 depending on when people register) don't get Lunch or Dinner. That tends to satisfy the majority of people, since the overhead of serving them food far outweighs what they pay. The Full and VIP Passes get Lunch and the Speaker's Dinner. I then get to spend more on the food so we get a great quality meal, and it allows me a chance to honor our speakers for the time and effort they put into speaking.

I have to say my favorite complaints are the ones where people are just whacky. One person at our Hawaii event complained that her picture was taken while she was wearing her bikini... well, don't wear a bikini at a conference then? Or the woman who asked if she could sell her Luau ticket to someone else since she didn't want to go - the Luau was sponsored; she didn't pay for it in her pass (ok, that's more of an odd question than a complaint). But my ultimate favorite complaints of all are from the people who emailed me complaining that I didn't hold the Hawaii conference last year. Seriously, all the work I put into it for no money, and they are upset with me for not wanting to do it again?

So - complaints happen. It's still difficult to take, since I am a person who wants to make sure everyone is having a good time. So, here we go - and here I brace myself to be ready to take the punches thrown in my direction.