
Tuesday, December 10, 2013

So You Wanna Learn SEO?

It's been a while since I have posted anything, mainly because it's been a pretty crazy couple of months at the office. Traffic from SEO has gone up, now bringing in over 92% of the total lead volume (go us!), and when things get going good it just makes me want to keep doing more. So the things that help me relax, like writing, tend to get pushed to the sidelines.

What inspired me to write something tonight was an update I saw in my Facebook feed. I see the question come through often, and even get the question myself more often than you would think. You see, people want to learn SEO. Not necessarily so they can be SEOs themselves, but to understand what it is us SEOs are talking about when we say "SERP", "Meta Tags", or "Canonical Tags". So they can envision why we say you can't have two pages with the same content on them (duplicate content), or why we tell them they have to have links to important pages of their website from their homepage (but not too many links). So I put together a long list of resources recommended by some of the best experts in the industry, and even some sent over by newbies learning SEO who have read, completed, or tried the following and highly recommend them.

Get Started - SEO Resources for Beginners


The Next Step - Buy The Book


Get Certified - Academies and Certification


This should be enough to get you started - perhaps just enough to be slightly more than dangerous even. Of course if you ever have any questions about SEO, I am always available - just contact me. I will at times hold a workshop that has been streamlined to 4-5 hours and covers all the basics you will need for SEO. The workshops are small and I am available to answer questions as you have them, so it's a great time to get a list of your questions together, and perhaps even have me take a look at your site for just a few hundred bucks.

I do occasionally take on consulting if you need more than just a quick question. A basic SEO Audit runs around $1,000 to $2,500 (depending on the size of the site, and how in-depth you need it) and generally takes a week or two (remember I still have a full time job with ADP, and that takes up my daytime). I have been known to find some really interesting issues with sites even in the most basic audits, and have managed to bring them back from the dead after Panda and Penguin updates. Ongoing support can come in the form of an hourly retainer - which I am happy to discuss. Do remember though, I only take on 1-2 clients at a time so that I can give all of my attention and focus to the work and make sure I am there whenever you need me (I demand quality customer service, and therefore ensure my clients get the best service possible as well).

Thursday, October 3, 2013

Secret to Making Viral Content

I have been in online marketing for many, many years now and have watched as others have made the oddest content go viral. Viral content is like the Loch Ness Monster: few have seen it, some swear it exists, but it is ever so elusive, and only shows up on random occasions. I was there when JibJab released their funny animation in 2004 titled "This Land is Your Land" spoofing the Presidential Election between George W. Bush and John Kerry to the tune originally by Woody Guthrie. Gregg Spiridellis (co-creator of the famous viral video) said it himself - its success came down to four key points:
1) First and foremost, it was the right piece of content at the right time. The political dialogue was totally asinine and we seemed to capture that spirit. The writing and enhanced production values made it something people wanted to watch.
2) Broadband penetration has skyrocketed since our 2000 election parody making our work accessible to a much larger audience.
3) Processor speeds have also increased making it more enjoyable to watch video/animation on a computer.
4) Everyday people are more comfortable with technology. I can't tell you how many times we heard "my grandmother sent it to me". Grandmas were not emailing in 2000.
Remember this is coming from 2004 - and viral content is now easier than ever to generate. I was in such awe of what they had accomplished that I have been trying to recreate the same effect for years now.

So you have to understand how amazed I am that one simple Google search for "Car Funny" brought me to the cutest baby pic with a statement that would spark a debate among car enthusiasts. Now, it's not some unique meme I knew immediately would go viral; it was among hundreds I gathered and scheduled to post on my fun Facebook Page called "POS Cars" that supports my even more fun website EmbraceYourPOS.com. The website was created for the sole purpose of making me feel better about owning a 95 Toyota Corolla with over 320k miles that shakes when I go above 65, doesn't unlock from the driver's side, has a tape player that doesn't work, zip ties holding the bumper on, and the list goes on...

That photo was posted on September 7, 2013, nearly one month before it truly went viral. I hadn't spent a dime on promoting the post, nor did I spend anything on the Facebook page to encourage likes.

Yesterday I noticed the post had hundreds of shares - and a lot of comments. This afternoon I checked again and the photo has officially gone viral with over 21k likes, over 10k shares, and 1,348 comments.

The funny thing is I didn't realize the photo had a misspelling, which gets commented on almost as much as the smack talking among the Mustang and Duramax owners.

So what is it that made the photo go viral suddenly?

To be honest, I can't tell you why, so asking me to do the same thing for your Facebook page isn't going to work. I wouldn't feel comfortable taking your money for the consulting fees, and you probably won't get results from it anyways.

What I do look for when posting content is something controversial that can spark a debate. This clearly fit that bill. A photo that is cute, or just funny in the right context, helps too. What I also think played a part in the photo going viral is that there is no hidden agenda to the post or the page. I run the advertising for UsedCars.com and you could probably tie this page to that website somehow and start making money, but that wasn't the point. I have scheduled posts going to the page daily that link to specific years of cars on the site, but the page doesn't generate a lot of revenue from them. Nothing near what the SEO for the site does anyways.

You can spend money on sponsored posts - I have often thrown $15-$20 here and there to try and get something going - but it won't guarantee anything other than people seeing your post. If you are still chasing the viral Loch Ness Monster thinking that if you catch one it will make you a ton of money, think again. No matter how hard you try to recreate, or even just create, something that will go viral, the odds are a million to one. If you do hit that viral moment, there is an even slimmer chance that the virality will generate revenue.

But by all means - keep posting that controversial, cute, and funny stuff for your fans. They will love you for it, and when it comes to asking them to give you money for something in return, they just might be more likely to do it.

Want to try one of the baby pics for your site? Here's a Google search for you with plenty to choose from - enjoy

Sunday, August 4, 2013

SEO Analysis for Good!

Just a week ago I saw a post come through the Seattle Digital Eve mailing list from someone asking for help on behalf of a friend.

The post:
I have a friend who is looking to improve the SEO on their ecommerce site.  They are a small business, and the SEO people they've talked to want to sell them a big, complex package, when what they really need is some coding improvements on the site, consolidation of 2 sites into one while preserving link juice, better keyword/metadata, etc.  
Anyone out there with some decent SEO expertise?




Offering to Help


I emailed the person that posted to the group:
If it's a small project I can probably spend some time on it.
Even though I have a ton of experience, I can keep the fees down since I have a day job.

My email was quickly forwarded on and the business owner (Martin) emailed me right away with a simple "Hey Jenn, I am interested in your SEO skills. Please call me...". I called him on my way home as I was stuck in traffic that evening. Martin described the two websites for me, how they were getting rankings and traffic, and now aren't. He said he had done some work to one of them, and traffic kicked up. He asked me to take a look at them and see if it is something to do with the websites, if either had been hit by Panda or Penguin, or if people just aren't searching for rugs online as much as they used to. I quickly thought - how odd, this is almost exactly what we have been going through with the websites at ADP. They have gone through the same ups and downs, all the while me trying to figure out if it was just general loss of interest, a change in searcher behavior, or if the site was going through a penalty or a hit from general Panda and Penguin updates. While this man's sites are a small version of what I deal with on a daily basis, this could not only be fun, but should be fairly easy to figure out.

Martin asked what my hourly rate is - but I immediately responded with a giggle (at $250/hr I'm sure it's probably more than he was expecting to pay) and told him that for the initial conversation, and if it's something quick, I won't charge him. I then asked him if he has a Google Analytics account attached to the websites, to which he responded with a "Yes". I told him he could look at the Webmaster Tools data in Google Analytics and compare impressions vs. Avg position, and then impressions vs. Click Through Rate. If the Avg position and CTR stay fairly steady, but impressions drop, that is a quick way to see if interest has gone down. Another way to back that up is to look at how many terms get impressions one week compared to the next week. If those numbers drop, then there is a good chance that rankings are dropping and there is a penalty. He was a bit overwhelmed, and asked if he gives me the login whether I would take a look for him. I excitedly said "Sure" (since most people don't like to give out their login, it's easier to tell them how to do it - but this was the next best thing). I told him the next steps - I would spend some time that evening and see what I can find, then come up with a diagnosis and a plan of attack going forward, which he can either do himself or, if he has the budget, hire me to help him with.
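If you'd rather script this sanity check than eyeball it in the GA interface, here's a rough sketch of the same idea in Python with pandas. It assumes a hypothetical CSV export of the search query data with columns named week, query, impressions, and avg_position (your export will be named differently - adjust accordingly):

# minimal sketch - the file and column names below are placeholders
import pandas as pd

df = pd.read_csv("wmt_queries.csv")

weekly = df.groupby("week").agg(
    impressions=("impressions", "sum"),
    avg_position=("avg_position", "mean"),
    queries_with_impressions=("query", "nunique"),
)

# Interest dropping: impressions fall while avg_position holds steady.
# Penalty suspect: the count of queries getting impressions falls week over week.
print(weekly)
print(weekly.pct_change())  # week-over-week change for a quick eyeball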

The Evaluation


That evening I settled in, opened up the computer while watching Hulu on the TV (we don't have cable), looked through both websites, and then logged into GA to dig through the analytics.

Checking Behavior vs. Penalty/Panda/Penguin


I started by going back as far as I could in the analytics account, looking at organic traffic only. The month that both sites did the best was just a few months back this year. I also checked the timeline against Penguin and Panda updates via the Moz.com Google Algorithm Change timeline and noted if there were any clear drops that sync up with an update. There was a slight drop around one of the updates that also hit one of the sites I work on for my day job, which is built similarly to his.

I first pulled the Impressions against the Avg position to see if the position stays the same while impressions go down. Though given that there is a slight drop after an algorithm update, I figured there would be a drop in position, impressions, and the number of terms getting impressions.

My First Chart:

Site #1
Impressions vs. Avg Pos.
Note: I took out numbers to protect the client even though he gave me permission.
Site #2
Impressions vs. Avg Pos
As you can see - there is a drop in impressions but the average position stays fairly consistent, even more so for Site #2. Both improved in positions and impressions the last few weeks.

My Second Chart:
Site #1
The number of keywords showing impressions week over week for site #1
Site #2
The number of keywords showing impressions week over week for site #2
I noticed that the number of keywords getting impressions (meaning how many terms show up when a person searches, regardless of position) drops when the impressions drop in the earlier charts, and then goes up when impressions go up.

This clearly shows a penalty, and given that there was a Penguin update just before that drop, it is pretty clear that the sites took a hit from that update, then saw an improvement when he completed the little bit of work he did.

What Got Hit?


Knowing now for sure that both sites took a hit, the next step is to figure out exactly what was hit. With the site I manage at my day job I will usually run a category report to find out which terms were affected so that we can evaluate and establish a plan to recover. I didn't have time to set up the categories of terms to run the report (it takes days to categorize terms, but since that has already been done for the site I manage - it takes me just a few minutes or so to categorize now) so this time I grabbed the number of words in each term. If the phrase has just 1-2 words then it's safe to say it is a broad term, and if it has 3-5 words then those are more exact. Penguin tends to focus on sites that have optimized for long tail terms, and less on the broad terms. So, this is a faster way to get a similar understanding.
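If you want to do the word-count trick outside of Excel, here's a quick sketch in Python with pandas. The file and column names (keyword, visits, and a period column marking "high month" vs. "recent month") are placeholders, not the client's actual export:

# minimal sketch - file and column names are made up
import pandas as pd

kw = pd.read_csv("organic_keywords.csv")
kw["word_count"] = kw["keyword"].str.split().str.len()
kw["bucket"] = kw["word_count"].apply(
    lambda n: "broad (1-2 words)" if n <= 2 else "long tail (3+ words)"
)

# compare visits by bucket for the high month vs. the recent month
print(kw.groupby(["period", "bucket"])["visits"].sum())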

I ran a comparison to see how things were in his big traffic months compared to the recent months that there was a drop. I took 4 weeks in the high traffic month from a Saturday through the last Sunday and compared it to the last 4 weeks Saturday through the most recent Sunday. This would give me an exact day of week compared to that day of week and reference the beginning of a month to the end of a month. Ideally it should compare to the same time of year to reflect searcher behavior for the day of the week, time of month, and the time of year, but in this case the day of week and time of month was going to be good enough.

Example of Keyword Data with Count
Note: Terms and traffic are not representative - I changed them to protect the client.
The table above is what the data looks like. Do note, I changed the top keywords, and numbers to protect the client - but this gives you an idea of what I was working with. From there, I created a pivot table and played around with the data to give me more insight into what was going on. Number of words in a phrase, visits, pages/visit, etc. It all helped me understand what was going on before and after the update.

Keyword Count - showing how many words in a phrase were driving traffic from high month compared to low month.
Note: numbers and details have been removed/changed to protect the client.
The 3-5 word terms dropped from the high month to the recent months. This shows that the longer tail terms were hit, which is pretty indicative of Penguin.

Looking at the Sites


Having spent the first hour of my time running the reports and pulling charts, I spent the rest of the time looking through the sites, now that I knew what to look for. The sites were once optimized for long tail terms, but something happened and they lost that traffic. As I dug through, both sites had categories for the two word terms (such as "area rugs") with links to individual pages for each item that fit in that category. The first thing I noticed is that there is a URL hierarchy (something the website I work on lacked). So he was good there...

I started noticing, as I was looking through both websites, that they were structured exactly the same way. I also saw that the navigation was the same on the left, linking to different URLs, but the content appeared to look very similar on the pages. I grabbed a couple of the URLs that were focused on the long tail terms and pasted them into Copyscape. The report kept pulling not only the other site as the first match, but other sites that sold the same products. This is a very common issue with eCommerce websites - since they don't have the time to write their own copy for each product, they tend to pull it in dynamically through syndication. Without enough unique content on those pages, the site appears to be duplicating what all those other sites have. It's not a hugely serious issue, as Google tends to understand syndicated content, but if a site doesn't support the content with something unique it just won't rank as well as the ones that do.

Martin's sites had a bit more of an issue though, since he has two sites with the exact same content and the exact same structure. When I compared the terms that both sites got visits from during the high month, I noticed that not only were a lot of the terms the same, but a lot of the terms contained site #2's domain and name.

Compare Terms from both sites with visits from organic traffic.
Note: the domain name has been changed, and so have the numbers to protect the client.
I think at this point it was very safe to say that the sites took a hit on the long tail terms from the Penguin update due to both sites duplicating each other.

My Email and Recommendation


After spending a couple of hours on the site, I drafted this email and attached the excel document I used to analyze the sites (note: the email below is changed slightly to protect the client):
Hey Martin -
So I dug into both sites and the Google Analytics to see what’s going on. I’m attaching my excel doc if you want to see my work, but it looks like both sites definitely took a hit of some sort.
Moz.com keeps a list of when updates happen so you can keep an eye on things: http://moz.com/google-algorithm-change
There was an update at the end of January then another big update in March that may have led to you losing your rankings. I've seen this drop in other sites that are built very similarly to yours – so I dug into the analytics to make sure that my assumptions are correct.
What I found: I first compared visits from organic traffic (SEO) against your average position the past few months (Webmaster Tools only goes back 90 days, so I couldn't go back to January unfortunately).
Site #1 definitely saw a decrease in traffic along with the drop in conversions (pasting the charts below for you to look at). With Site #2 there was a drop in traffic, but the average position seemed to not drop as much. Usually this would be a sign that people aren’t searching as much, so I wanted to check your keyword count and impressions week over week. If the number of terms seeing impressions drops from one week to the next, it is usually an indicator of a penalization or a hit from a Panda or Penguin update. I’m not pasting those charts in here since they are really raw, but you can see them in the excel sheet. The terms that have 2-5 words in them took a huge hit, while the one word and longer tails appear to be sticking around. I toggled from keyword count to visits from the keywords and those sets stay pretty consistent in dropping.
What this means is that you most definitely took a hit in rankings from the updates. Not just rankings dropping, but a bulk of your 3-5 word terms dropped out of the index completely. Those 3-5 word terms are also the bulk of where your visitors come from – those longtails are higher converting terms and can really affect revenue if they drop off. It looks like both sites are built very similarly and have a lot of the same content. I compared the top referring terms both sites saw in your highest traffic month and both refer traffic for “your domain”, which isn't good. They both get traffic from “broad term”, but site #2 has the edge there. They also both have several long tail terms that are the same.
When I run a report on copyscape.com to check for duplicate content, site #2 along with a few others comes up (included link directly to Copyscape). The “product” rugs page on both sites is exactly the same – almost word for word.
 It’s kinda fun to have two sites show up for the same terms, since you could get double the traffic. In fact that’s what my company does – and what I manage. We have dozens of “portal” sites to grab leads to sell to car dealers. But if Google has any idea that both sites are connected then both sites get penalized. I think this might be what has happened here.
 Your first solution would be to get unique copy on all of the pages of the site. I know it’s tough writing for all of those pages, and copy writers can be expensive. There is an alternative called TextBroker (http://textbroker.com). They have writers that bust out copy pretty quickly (2-3 day turnaround) at a pretty reasonable rate.
I would recommend getting an account set up and asking them to start writing for your pages. Even your homepage content – while there is a lot of it, it looks to be pieced together from other content on the web.
I would have them start with the pages that had the most traffic in your highest month, and then work down from there.
 Once you get them going on that – I can do a full keyword analysis, check to see where the opportunity might lie, and get you a complete plan.
 The excel doc is attached – let me know if you have any questions.
Hopefully this was helpful :)
All in all it took me just a couple of hours since this is what I do for our executives regularly, so I didn't charge Martin for the work.

SEO for Good!


Martin was so excited and appreciative of the work I had done, and what I had found that he asked me if he could pay me in some way - "..if anything to help the school in Nicaragua". I gave Martin the link to donate to the school, and he did.

The money immediately went to help buy supplies for my husband's students we are bringing with us. Since they had to pay for their immunizations out of their paychecks, don't have sleeping bags (we are loaning bags to them), and have to pay for anything else they need themselves, I wanted to help them so that they could focus on helping build the school and not stress about having everything they need for the trip.

In the end, I helped Martin with his websites because I like to help small businesses succeed, and Martin returned the kindness by helping the students with their supplies, so that they in turn could help build the school for children in Nicaragua.

Everybody wins!







Tuesday, July 30, 2013

Anatomy of the URL and Stuff

I'm sure you are looking at the URL above and thinking to yourself, "Wow, I never realized that all that stuff meant something." Oddly enough it actually does... As the world wide web has changed into a search friendly, user interactivity playground, the formation and meaning of the URL has evolved considerably into a very significant factor in not only search engine compliance but in how people use websites. Lately I have been helping clients understand how their websites are structured and how the chain from server to browser to user works. It's something us search optimizers view as so simple, yet it can be so complex to someone who doesn't understand how it all works. So here is the URL broken down piece by piece and explained.

First - What is a URL? 
A Uniform Resource Locator is a website address that holds very important information between each "." and "/", much like the address to your home contains a house number, city, state, country, etc. This allows the browser to connect to a specific website, directory/path, and/or file so the user sees the page you want them to see. A URL consists of the following:


Hypertext Protocol
Established by English physicist Tim Berners-Lee in 1990, hypertext protocol is a request/response standard in which the client is typically the application (the user on a web browser such as IE, Firefox, Safari, etc) and the server is what hosts the web site itself. The client submitting HTTP requests is typically referred to as a user-agent (or user), with the responding server—which stores or creates resources such as files (html, .asp, .php, css, etc) and images—referred to as the origin server.*

WWW (World Wide Web) or "sub-domain"
The WWW is typically placed before the main domain of your website URL, referencing the World Wide Web. Remember the game you played in elementary school where you could start your home address with your house number, street, city, and state, and then go off as far as your country, continent, and even Earth? The WWW is the "Earth" part of the address. In some cases, what we call a "sub-domain" can replace the WWW in your URL, which references a whole new website within your existing domain. Search optimizers can use this as a way to target certain key terms. For example, a real estate agent targeting a specific city will use http://city.domain.com and thus will have a leg up when ranking for anything within that city. In most cases the sub-domains will link to the main domain and, since they are treated by most search engines as a domain all its own, they will count as an external link credit, boosting the rankings for the main domain they are linking to. It is highly recommended that you avoid this linking technique, as it is only tricking the search engines and in the end will hurt your rankings rather than help.

Domain Name System (or DNS)
The Domain Name System was established so that the common user can understand, in simple terms, the location of a web site. A web site's files are usually stored on a server that points to a specific IP address (much like a phone number directs someone's call to your phone). In order for the general public to understand where to locate a certain website and its files, the specific domain name resolves to that particular IP address. In addition, the Domain Name System also stores other types of information, such as the list of mail servers that accept email for a given domain (such as you@yourdomain.com).

Top-level Domain Extension 
The domain extension originally consisted of the generic gov, edu, com, mil, and org. With the growth of the internet, country extensions and other such categories have come into play. The most recognized of the extensions is of course the .com. If you are optimizing for a specific country and language, then the best route to take is to register your domain with that specific country's extension. This will help the search engines recognize that you are targeting that particular audience and rank the site accordingly. Be sure that your country specific site is in the native language for that country to avoid any duplicate content issues. Do also be careful of linking from that domain to your main domain, as once again the site will be penalized.

Directories and Files 
Here's where the fun stuff comes into play. Just as your computer organizes your Word docs, Excel files, and other such files into folders, a server structures your website files the same way. A "directory" or "path" is much like a "folder" on your computer. In standard (old school) html development (before the days of creating dynamic websites powered by databases and user interactivity) a file would be created, named "index.html" or "default.html", and placed either in the main domain folder (which the DNS resolves to on the server) or in a named folder (to help the webmaster organize the site's files). As technology grew and more ways to develop interactive, database driven websites advanced, the structure has pretty much stayed the same, with the addition of "parameters" that reference a part of the database and return content on a page based on those parameters. (Have I lost you yet?) Let's go back to the basic structure of the static html files and go from there...

A dynamic website is one that has a few static pages (in other words, the pages are coded and are only editable by a developer) with parameters that pull in content or trigger specific actions from a database. At its most basic, a dynamic page pulls words, images, etc. from a database, and can create multiple pages with different content from one basic page. A more complex dynamic page (or site) is something like Facebook or Twitter, which recognizes whether or not you are signed in with a username and password and will show you either your profile page (if you are signed in) or a "please sign up" page (if you are not signed in or don't have an established username).
In order to help understand this, let's talk about how a database works. A database is essentially similar to an Excel spreadsheet or a table in a Word document: it has a unique identifier for each line (or row) and holds different content for each line item. Example:
Username | Email | First Name | Last Name
Sujo234 | bob@bobsemail.com | Bob | Sujo
Forjill23 | jill@jillsemail.com | Jill | Forman
In this example the username is the unique identifier with the email, first name, and last name as different parameters for that username.

The content will be different on each page. With dynamic content the possibilities are endless as far as how many pages you can create from developing and designing just one file. A great example of how a dynamic page is created for search optimization purposes is on usedcars.com - if you search for "used cars in oslo mn" you see the "UsedCars.com Oslo MN" page in the results. Look at the URL in the address bar when you go to that particular page - http://www.usedcars.com/browse/mn-24/oslo-163.aspx. In this case the page is pulling in the unique IDs "OSLO 163" and "MN 24", just as the username is the unique ID in the above table.
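If it helps to see the idea in code, here's a bare-bones sketch in Python (the data is made up, and this is obviously not how usedcars.com is actually built): one template function, many URLs, and the content pulled from the database row that matches the unique IDs in the URL.

# toy example - the inventory data and page template are invented
inventory = {
    ("mn", 24, "oslo", 163): {"title": "Used Cars in Oslo, MN", "listings": 12},
    ("wa", 48, "seattle", 7): {"title": "Used Cars in Seattle, WA", "listings": 450},
}

def render_page(state, state_id, city, city_id):
    row = inventory[(state, state_id, city, city_id)]
    # the same template produces a different page for every row in the database
    return "<h1>" + row["title"] + "</h1><p>" + str(row["listings"]) + " cars available</p>"

print(render_page("mn", 24, "oslo", 163))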

SEO Friendly URL 
In order to make your dynamic URL friendly for search engines you must use a rewrite. A great resource for rewriting a URL is the Apache Rewriting Guide. Some open source content management systems (such as Wordpress, Drupal, etc) already do the rewriting for you, and all you have to do is enter what you want the URL to be (be sure to include your key terms separated with dashes "-" and not underscores "_" for search happiness).

Who would have thought a URL could be so complicated? But when it comes to search optimization and understanding basic website development it is very important to understand how the URL works, how it is structured, and how to make sure your site is URL and search engine compliant.

*http://en.wikipedia.org/wiki/Http_protocol


Sunday, July 28, 2013

Facetwitetiquette - How to Suck at Facebook and Twitter


Facebook

What not to do on Facebook

The constant Gardener, Mafia Hitman, Virtual Pet Owner, or Whatever

the updates and invites never end...

The Bad Marketer

become a fan, join my group, comment on one of my million updates...

The Crude Photo Tagger

tagging everything from that party last night you don't remember to taking shots off of a girl's belly button...

The Rash

following you around commenting on all your posts and liking every photo...

The Unfiltered

they post everything and anything...

The Most Popular Person EVAR

inviting everyone to everything...

The Twitterfied

linking twitter updates to Facebook flooding their profile with meaningless tweets...

The Bored Quizzer

taking every quiz available...

The Passive Aggressor

posting well thought out updates without mentioning names...

The Annoyingly Proud Parent

using their child as their profile pic...

Twitter

What not to do on Twitter

The Unproductive Tweeter

tweets updating every minute...

The Retweeter

retweeting everything they see...

The Conversationalist

bouncing back and forth with one person...

The Untweeter

Tweeting from Foursquare, tweetmeme, or the like...

Tuesday, July 23, 2013

Categorizing Keywords

For those of you SEOs that manage very large sites and map your keyword categories to sections of your website - you know how difficult it is to categorize your terms and track their performance. Well, I have to say that after searching, asking, and digging around for a tool that does exactly what I am talking about, I finally came up with a solution. It's a bit of a workaround in Excel - but it's the best I can do until someone comes up with a tool that categorizes keywords for SEO.

Know Your Keywords and Categories


Before you get started categorizing the terms that come to your site, you should know what keywords you are targeting, and the combinations of terms as well. I'm going to use a flower shop's website as an example for this particular blog post. Categorizing is something you can do with any website. At the very least, you can categorize terms into "Broad" and "Branded", to get you started.

Most keyword tools can help you establish what categories to target. Google's Keyword Tool or WordTracker are just a couple of the many tools available on the web.

Another way to figure out which terms fit in which categories is by grabbing search data (referring terms in Google Analytics) for your site for the past few months or year. I personally spent some time going through and categorizing keywords in Excel by using the filters and having the sheet show all terms containing "anniversary" for the terms around "anniversary flowers". It takes a lot of work and time, but in the long run you will have a more accurate account of the terms you will need to do the Lookup against.

Setting Up Your Template

Download the Template

Now that you have all the terms possible in all of your categories it's time to start setting up your template. You are going to want to Download the template I have set up in Excel. You can start from a fresh Excel document if you want, but the template has directions (in case you lose this blog post somehow) and the Lookup formula is in there.

Once you have downloaded the template it's time to get it set up to work for your keywords.

In the following steps - I am going to walk you through setting up the template and then categorizing the terms. If you don't have terms that you can use already, I have a zip file you can download and walk through the example with me to get familiar with how this works.

Copy your first set of categorized terms and paste them into the first Tab marked "Broad". Since every site usually has a "Broad" category of terms, I figure that's probably the best one to get started with. In the case of this example, "flower shop", "online flower shop", and "best flower shop" are the terms that fit under the Broad category.

If you have the .zip folder downloaded, open up the "Terms" Excel doc and you will see the words already categorized for you. There are "Broad", "Branded", "Birthday", "Anniversary", and "Wedding". Click the drop down next to "Category", click "Select All" (to deselect all), and then click "Broad". You will see the terms filtered down to just that "Broad" category.

Next select all of the terms in the "Keyword" list - copy and paste them into the "Broad" Tab.
We will then need to sort the terms in alphabetical order so that the Lookup string can go through them in order. If you don't then the Lookup won't work.


Highlight the Column with your keywords
Click "data" > "sort"
Select "My data has headers"
Select under "sort by" the column you keywords are under (should be column A)
Click OK

Double click the Tab and rename it with the one word name of your category.
Highlight all of your keywords in the column (just the cells that have words, not any blank cells).
Type the name of the category (stick to one word naming) into the upper left Name Box (just to the left of the formula bar). You have now named your table.

Do this for "Branded" and the other categories as well. You are going to have to create a new tab in the template to fit all the categories.

If you have not downloaded the .zip file and are working off of your own terms, creating new tabs and naming them is probably going to be something you will need to do. But don't worry, the template will still work.

Now that you have all of your keywords in your Template's Tabs with names and sorted it's time to set up your Lookup string.

Setting up Your Lookup


The way the Lookup works in this case is we are going to ask Excel to look at one keyword (one cell) and match it up to one of the terms in the Tabs we have set up. If it matches one of those terms, then we tell Excel to place the category name into that cell. If it doesn't, then we just leave that cell blank.

The string looks like this:
=IF(ISNA(VLOOKUP(B2,Broad,1,FALSE)),"","Broad")
  • B2 is the cell of the keyword we want to look for.
  • Broad is the named Table (the keyword column on the "Broad" Tab, e.g. Broad!A$2:A$999998) we want to look for that keyword in.
  • 1 tells the Lookup to return the first (and only) column of that Table.
  • FALSE is telling the Lookup to do an exact match. TRUE would do an approximate (closest) match, so in this case it won't work.
  • We leave the ,"", as a blank - but you can put "not categorized" or "misc" to show that it isn't in a category. Though for our purposes here, we keep it blank.
  • ,"Broad" is telling Excel to put the word "Broad" in the cell if the keyword matches one of those in the Broad Table or Tab.


See - it's that easy...

What you are going to do next is replace the word "Broad" or "Cat1" with the name of your table, Tab, and category. This is why we name the Table, the Tab, and the Category the same so that our life is much easier when setting this string up.
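A quick aside: if you ever outgrow Excel, the same exact-match lookup is easy to do in Python with pandas. The category lists below come from the flower shop example and the file name is a placeholder, so treat this as a rough sketch, not part of the template:

# rough sketch of the same lookup in pandas - categories and file name are placeholders
import pandas as pd

categories = {
    "Broad": {"flower shop", "online flower shop", "best flower shop"},
    "Anniversary": {"anniversary flowers", "flowers for anniversary"},
}

master = pd.read_csv("organic_keywords.csv")  # needs a "keyword" column
for name, terms in categories.items():
    # put the category name in the column when the keyword matches, else leave it blank
    master[name] = master["keyword"].isin(terms).map({True: name, False: ""})

Either way, the idea is the same as the Excel string above.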

Now your template is ready for you to paste some keywords with data and grab some numbers.

Gathering Your Data


Open up your Google Analytics account - if you don't have Google Analytics, pretty much any tracking tool that has a list of referring terms with some sort of data is fine. You can expand and contract the columns to the right of the terms as you wish. The template you will download will have the columns set up just for the purpose of exporting referring terms with visits and such from Google Analytics though.

Log into your Google Analytics account.
Click "Traffic Sources" > "Sources" > "Search" > "Organic"
Select the date range you would like to report on.
Scroll to the bottom of the report and show 5,000 rows.
Scroll back to the top, click "Export", then select "CSV".
After the file has downloaded, open the excel file.
Highlight JUST the cells that include the keywords and your data (ignore the first few rows at the top with the date and report information, and the rows at the bottom that summarize the data).
Copy those cells, and paste into your "Master" Tab.

Note: If you have multiple dates you would like to track, you can export the different date ranges, and then add which keywords go with what date in the Master Tab. This will allow you to see trends of categories.

I added an Excel doc called "Analytics Organic Search Traffic" with some terms and fake data that you can play with. There are three tabs, each with a date added for that day's data. Start with just the one day and play with that to get familiar with percentages. From there you can play with all three dates and work on your trends to see which categories are trending up and down.

Completing Your Lookup


Now that you have copied and pasted the keywords into the "Master" Tab it's time to get all of those terms categorized.

Select the top row with your categories and your "All Categories" cell
Copy just those cells in the top row
Highlight the next row (same cells just below) and hold down the "shift" key
Scroll down to the last keyword record
Holding down the shift key select the last cell under the "All categories" - this highlights all of those cells for those categories to Lookup the keywords.
Hit "CTRL+V" on your keyboard (this quickly pastes the Lookup formulas for each line)
Be patient, as it may take a while for your Lookup to complete (depending on how many keywords, and records you have)
The "Master" Tab should look something like this:

Playing With Your Data

The most efficient way to gather information from your data is to copy the entire "Master" Tab and paste as values into a new Excel sheet.  This way you won't have to wait for the Lookup to complete each time you sort, pivot, etc.

Click the top left "Arrow" in the "Master" Tab
Right Click and select "Copy"
Open a new Excel Doc
Right Click and select "Paste Special" > "Values" (so the categories paste as values, not formulas)

From here you can create pivot tables then sort them into pie charts, graphs, and all sorts of fun reports to see how your keywords are performing.

I personally like to start with a quick pie chart to see which category of terms brings in the most traffic. At times we will have a drop or rise in traffic, and it's good to understand which category of terms is fluctuating. Copying and pasting terms by dates (weeks, months, or even a set of a few days) helps me see which categories are fluctuating on a timeline trend. Knowing which categories bring in the most traffic, I can then make decisions on which parts of the website we need to focus our efforts on to increase traffic.
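If you end up doing this often, the pivot-and-pie-chart step can also be scripted. Here's a rough pandas equivalent of the category share and trend reports, assuming you've exported the categorized "Master" data to a CSV (the file and column names are placeholders):

# minimal sketch - file and column names are placeholders
import pandas as pd

terms = pd.read_csv("master_categorized.csv")  # keyword, category, date, visits

share = terms.groupby("category")["visits"].sum()
print(share / share.sum())  # share of traffic per category

# trend by date to see which categories are rising or falling
trend = terms.pivot_table(index="date", columns="category", values="visits", aggfunc="sum")
print(trend)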

See how much fun categorizing your terms can be?
Now that I have a template I work off of, when traffic goes up I can quickly categorize the terms and let our executives know if our recent efforts have worked.

Thursday, June 20, 2013

How long does it take for Google to recognize 301s?

Or Better Yet - 

It's been over a year and Google still doesn't have the new URLs in the Index


Just over a year ago, I started working on this website that had over 900k top level domain files. We changed the structure of the URLs to a more organized hierarchy. The pages' content changed slightly, but most importantly, instead of all of the site's pages residing directly under the main domain (let's use a computer broad-to-longtail term structure for example - like domain.com/computer.html and domain.com/laptop-computer.html and domain.com/500gb-laptop-computer.html), we changed them to a more representative directory-to-file hierarchy (example - domain.com/computer/ to domain.com/computer/laptop-computer/ then domain.com/computer/laptop-computer/500gb.html).

Why the URL Hierarchy?


The quick and simple explanation as to why we did this is that while URLs are fairly dynamic these days, the bots like to see and understand how a website is organized on a server. Remember your old school folder and file structure back when sites were built in html? The URLs you have today should represent that organized file structure as much as possible. I cover this in my SEO Workshop (slide 23) - but I also found a pretty good article that explains the hierarchy relatively simply and quickly.

The process of setting up the 301s


Since the entire 27+ million pages on the site were mostly files located directly under the main domain, it was difficult to understand what pages fit under what category so that we could organize them. I went to our keyword analysis, bucketed each focus term out, and then organized the correlating URLs to fit within that bucket. Once that was done, I worked with the developers to pull the naming from the database (dynamically) into the directory and file structure that fit the buckets. Some of the keywords I knew I eventually wanted to build out with supporting pages, so those got directory levels instead of page levels for future optimization (and to limit more 301 redirecting later on).
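To give you a feel for what the mapping looked like (using the made-up computer example from above, not the real site), each old flat URL was bucketed and paired with its new home in the hierarchy, and each pair became a permanent 301 redirect on the server:

# toy example - URLs are from the invented computer example, not the actual site
redirect_map = {
    "/computer.html": "/computer/",
    "/laptop-computer.html": "/computer/laptop-computer/",
    "/500gb-laptop-computer.html": "/computer/laptop-computer/500gb.html",
}

for old, new in redirect_map.items():
    # printed in the Apache mod_alias style just for illustration
    print("Redirect 301 " + old + " " + new)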

I mention a bit about breaking the site up into sections for analytics purposes in my previous post "SEO Issues - is it Penguin? Is it Panda? or is it me?" under "Figuring out what was hit by Penguin". The "video" to the left is a quick (and very raw) animation to help explain exactly what we did. Now that the site was organized it not only helps the bots understand the structure, but helps us understand what sections bring in what SEO traffic in Google Analytics.

How Long Does it Take Google to See New URL via 301 Redirect?


This whole undertaking was completed over the course of 2-3 months, starting in June 2012 (last year) and finishing up with the last of the redesigns and URL changes in August, with one last directory change (no redesigned pages) in January of this year (2013). The most important of the old URLs are still showing 550,000 pages in Google's Index (11 months later):
As I Googled to see if others have a solution for getting these old URLs out of the index faster, or even if anyone has had the same problem, I found a lot of questions in various forums (both reliable and unreliable) but no real articles, blog posts, or anything from reputable SEOs. The most common answer in the forums is to just "wait". It's, of course, what I tell others when they ask me: "Be patient, Google will eventually hit those pages again, recognize that they have changed, and correct the index." But after nearly a year and so many pages, this is getting ridiculous.

I spoke with my friend (and SEO mentor) Bruce Clay who came back with the suggestion to add an .xml sitemap and submit it to Google with the old URLs we want removed.

It kinda made sense: because those old URLs are no longer linked to, and there are so many of them, Google wasn't crawling them as much anymore. They are just sitting there in the index - not getting "updated".

Unfortunately getting a sitemap added is not an easy feat. I would have to define the strategy and present it to the powers that be with data to back up the success metrics in order to get the project prioritized. With so many other initiatives needed for SEO, all of which were more important and affected the business in a positive way, it was in my best interest to keep pushing those and not deal with the sitemap.
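For what it's worth, generating the sitemap itself would have been the easy part. Something like this (a sketch only - I never actually built it, and the file names are placeholders) would spit out an .xml sitemap of the old URLs. Keep in mind Google caps a single sitemap file at 50,000 URLs, so 550,000 old URLs would have to be split across multiple files:

# sketch of the sitemap suggestion - never implemented, file names are placeholders
with open("old_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

with open("sitemap_old_urls.xml", "w") as out:
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in urls:
        out.write("  <url><loc>" + url + "</loc></url>\n")
    out.write("</urlset>\n")

Getting it built, deployed, and prioritized internally was the hard part.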

My workaround, though, was about as black hat as I would get (Matt Cutts, if you are reading this, I apologize and throw myself at your mercy, but it had to be done). One weekend over a month ago, I grabbed one of my many impulsively purchased domains and quickly set up hosting and an old school html site that consisted of one page. I then exported all of the links from the Google "site:" search through a Firefox plugin called SEOquake that exports the results into a csv file. It's not the prettiest, and there was a lot of work still needed to get to just the URLs, but it was the best solution I could find (note: if any SEO reading this knows of an easier way to do this - please add to the comments for posterity). I then parsed out the parameters in the URLs in a separate document and used those as the anchor text for each URL. Finally, using Excel, I concatenated the URLs and parameters (that were now anchor text) into html href strings.
Then copying and pasting the "string" column into the html code, the page looked like:
The page wasn't the prettiest, and it had thousands of links (the above is just an example) so it was bad all around, but the point was to get those links crawled by Google.
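If you're curious what that Excel concatenation boils down to, here's a rough Python equivalent (and again, don't actually do this - the file names here are placeholders):

# rough equivalent of the Excel concatenation - file names are placeholders
from urllib.parse import urlparse

with open("old_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

links = []
for url in urls:
    # use the last chunk of the path as the anchor text, dashes turned into spaces
    slug = urlparse(url).path.rstrip("/").split("/")[-1]
    anchor = slug.rsplit(".", 1)[0].replace("-", " ")
    links.append('<a href="' + url + '">' + anchor + "</a>")

with open("links_page.html", "w") as out:
    out.write("<html><body>\n" + "\n".join(links) + "\n</body></html>")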

Of course every SEO knows that you can't just build a website and expect it to immediately get crawled - right? 

So I set it up in Google Webmaster Tools and submitted the page to the index:
I even got fancier to ensure Google would see the page and crawl all of those old URLs, and +1'd it on Google+.

Did it work?


I checked the URLs this evening to see how many Google is still showing, and the number has dropped from 550,000 to only 175.

I took the domain off of the server, and now have it parked elsewhere (back where it belongs) and removed the webmaster tools account. All traces of it ever existing are now gone, and the small moment of my attempt to get those URLs removed has passed.

Thanks For the Advice, Jenn - Now I'm Going to Try This!


If you have come across this post and you need to do something similar - I'm going to put the same disclaimer they do when a very dangerous stunt is performed in commercials. 
Do not attempt this at home - this stunt was performed by a trained professional on a closed course.

So, don't go adding a bunch of links to a random domain thinking that your attempt just weeks ago to 301 pages isn't working. The links on the external domain were too many for the domain and page, and were extremely spammy. In addition, all of those links pointed to pages that were redirecting and were supposed to pass value to the new URLs, so those pages now had many spammy links pointing to them from a very spammy domain. If left up too long, or not done correctly, it could actually cause more damage than it ever helps.

If you have any questions, or feel you need to try this same strategy, please don't hesitate to contact me. I'm here to help, and want to ensure that your website has considered all possible options before attempting any such trickery.

Some Helpful Links on the Very Subject:



Wednesday, February 20, 2013

Conference Adventures Part..: Dernier Versement (final installment)

All-in-all the event went as well as could be expected. Financially, I am left having to put up ~$1,000 to cover the remaining expenses of the event. The total cost for the event ran me ~$9,000.00 (slightly under) with ~$8,500 (give or take a few hundred) in registrations and no money from sponsors (sponsors this year received their sponsorships in exchange for distributing swag, or offering award prizes).


Why do I do This?


When I started documenting the organizing of this conference, I mentioned in my post titled "Conference Adventures Part Un - 18 Days 'til":

Planning and organizing a conference is not an easy feat by any means, and I often ask myself why I keep doing it...

Which brings me back to asking myself once again: why do I keep doing it? I honestly can't find a one sentence answer that rationalizes the time, effort, stress, financial strain, or the pressure it all puts on me.

I'm sure you're probably reading frustration in my post here, and probably in my previous posts, but know this - the last few days I have been coming up with ideas, excited that registrations are already coming in, and I almost have a full speaker list for EmMeCon Seattle, which isn't happening until June. I have even started the groundwork of organizing a 2 day Search and Social Series, and/or an SEOGoddess 4 hour SEO Workshop in April, so that I don't have to wait too long to do another event.

So why this odd addiction to holding events? I got to talking with a few of the attendees and speakers at this last EmMeCon and the one word that kept popping out of my mouth (and repeated back to me) was integrity. My events, though small and not highly profitable, still have integrity. I haven't sold out with mindless topics, uninspiring speakers, and selling tickets at an insanely low cost just to appease sponsors with more attendees. Even after all these years of organizing conferences, I still constantly remind myself through the entire process what it is that I wanted to do when I started these events. With EmMeCon, I want people to gain inspiration from the amazing people I have been lucky enough to have access to. People like David Evans Ph.D., who has taught at the University of Washington, educating Masters students on psychographic segmentation and the importance of understanding the minds of the users they are marketing to. Or Gillian Muessig, who has guided not one, but 3 children into thriving adults, and in the process molded 2 of them into very successful and inspiring SEOs. The list of inspirational individuals whose brains I feel privileged to pick, to gain inspiration from, or to have been helped by in some way is a mile long, and ever growing.

Because of this desire to share, I take careful consideration with the details of every event I organize. It may run me ragged, and I get flustered and exhausted from it, but reading the tweets, hearing the feedback, and knowing that at least one person (if not many) has gained inspiration from the event is what I deem as success.

On Thursday night we wrapped up the event with a packed house for the Meetup Group organized by Chase McMichael (CEO of Infinigraph). The tweets were still coming through strong, and the room was full of questions and discussions. After the Meetup wrapped I began packing things up, and while exhausted I was bouncing around with excitement as the folks that lagged behind thanked me for putting on such a great event, asked me all sorts of questions on how I got into this, how I came up with the idea for the event, and even asked more about the event in Seattle.

It's that feedback that I get that keeps me going. 

I promise that I won't "sell out" and start making this about the money, I won't ever forget what this event (or any of my events) is there to accomplish, and I promise never to lose the integrity that I still hold onto.

If I do - someone please take me out back and put me out of my misery...?