
Monday, August 4, 2014

SEO

That's right, I am an SEO. So, what does that mean? It means that I optimize websites so that they show up in the search engines for certain terms - terms usually focused on what your key audience might be searching for.

Some Stats About SEO:
  • 93% of online experiences begin with a search on Google, Bing, or Yahoo!
  • Google owns 65%-70% of the search market share.
  • 70% of users will click on an organic (SEO) result over a paid one.
  • 70-80% of users ignore the paid ads entirely, focusing on the organic results.
  • 75% of users never go past the first page of search results (positions 1-10).
  • SEO beats social media by more than 300% in traffic for most content sites.
  • Traffic from SEO has a 14% close rate, while outbound leads (such as direct mail or print advertising) have a 1.5% close rate.
  • On Google, about 18% of organic clicks go to the 1st position, and about 10% go to the 2nd and 3rd positions.
I primarily work as an in-house SEO, which means I work within a large company's organization rather than at an agency or as a consultant. I have, in the past, helped some companies out as an SEO consultant, but if it takes time away from my job and career I will generally offer to recommend someone else instead.

Most of my SEO consulting consists of reviewing the website and any issues the company has found. I review the analytics, looking at SEO traffic as well as traffic from other sources, and I look at Google's Webmaster Tools data to gauge how the site's current SEO is doing and how impressions for key terms compare to clicks. Sometimes a simple change to the meta tags - a different title and description - can increase the click-through rate from search results, therefore increasing traffic. In other cases, a complete restructuring of the site along with basic SEO implementation is needed to increase rankings.

After I review the site, I put together a list of recommendations, each with an estimate of the effort required and a measure of its likely impact. The report also compares potential traffic against current traffic from SEO so that the client can see where the biggest gaps are. At times the reporting I send over can be pretty technical, but rest assured I spend time making sure all the data is easy to understand and that a clear direction is not only explained but laid out in full detail in the final recommendations. From there it is up to the client to decide whether to do the work themselves, have their current employees do it, hire an agency or another consultant, or have me do the work. Since I have a background in design and development, any work needed for SEO - or simply to increase conversion rates from SEO traffic - is fairly easy for me to do and can happen pretty quickly. It all depends on how much I have on my plate with my full-time job at the time the work needs to get done.

If you're not sure you want to have me, or someone else, optimize your site, that's no problem. Most people can pick up the basics of SEO themselves. I always like clients to have some understanding of SEO before I work with them; if they don't have time to learn, that's perfectly acceptable, as I can explain how things work in ways most people pick up quickly. The following is a checklist I have come up with that will help anyone understand and get started with SEO quickly and easily. Of course, Google and the other search engines use countless algorithms to determine which site gets to show up for a given term, but this at least gets you on your way to understanding the basics of SEO.
  1. Keywords – You can't do anything with SEO until you know which keywords you are optimizing for. Once you have your basic list, structuring your site, and any work you do on it, around those keywords will all fall into place. I usually recommend one or two broad terms that describe the website. These terms should be only one word, and very rarely more than two. From there, a few two-to-three-word terms that might describe a sub-category will help you structure your plan and organize for SEO. Your longtail terms (as SEOs put it), or exact match terms (as paid search people call them), are the phrases that are more specific. These phrases tend to be the biggest payoff for SEO, since they represent the searches users run when they really know what they want and are ready to buy; therefore they tend to convert faster and at a higher rate. I talk more about this in depth in my workshops, and in my book titled "Search and Social" that is currently in the works. So stay tuned for the book that helps you really understand SEO on a very detailed level.

    Keywords should appear in each of the following places for SEO (a markup sketch pulling several of these together follows this checklist):
    • Keywords in title tag – The title tag is what shows up at the top of the browser. It is also what search engines use as the title of the "snippet" that displays in the results after a search has been completed. Having your keyword in the title tag not only helps SEO, but also aids the click-through, since users will recognize the word they searched for within your title, encouraging them to click your result over the others on the page.
    • Keywords in URL – Getting the keywords into the URL is very important for SEO. Start with the broad terms in the domain if possible. If not, put the category terms (the 2-3 word terms mentioned before) in a directory, and the exact-match longtail terms in (or as) the name of the file. Your URL hierarchy is very important for SEO, and having those keywords in there even more so.
    • Keyword density in document text – Listing your keyword over and over again in a short paragraph will harm your SEO more than it will do any good. A good way to think about density is to picture a page with 3 paragraphs of about 150 words each. Let's say you need to mention your keyword 9 times to get rankings. Mentioning it 9 times in the first paragraph and not at all in the others is bad. The trick is to distribute the keyword evenly: mention it 3 times in each paragraph, spread evenly throughout.
    • Keywords in anchor text – Anchor text is the text a user clicks on within a page's content to get to another page. Text that links back to your website should include the main broad keyword that describes the site. The trick is to make sure the page, and the whole site, linking to yours is relevant to the word in the anchor text. If the linking site isn't relevant, that can actually get your site in trouble, and too many irrelevant links will cause you to lose rankings.
    • Keywords in alt tags – The alt tag is the alternative text that displays in the rare case that an image doesn't load. It's a simple attribute that goes in the HTML that generates the image. For SEO purposes, having the keyword in the alt tag is important and will actually help rankings. Be sure to stick to the words relevant to that page, and don't list all of your keywords out with commas - that will get a site in trouble.
    • Keywords in meta tags – Be sure to get your keyword into your SEO meta tags: the description, title, and keyword tags that reside in the head of the HTML.
  2. Meta tags – Meta tags are (as explained above) the lines of code within the HTML that describe your page for SEO. This includes the title, description, and keyword tags.
    • Meta description tag – The description tag should be no more than 150 characters and include your keyword(s). Try to describe the page as fully as possible for SEO while keeping in mind that the user will see this text in the search results.
    • Meta keyword tag – Some SEOs will say the keyword meta tag doesn't make a difference. Google doesn't really pay attention to it, but the meta-driven search engines will, and there are thousands of other search engines aside from Google. So, for SEO purposes, and to help keep the page focused, I recommend listing out the keywords in the keyword tag with the broad terms first, then the category terms, then the longtail. You never know - it might actually help SEO.
    • Meta language – If the site is in English, adding the language meta tag helps the search engines know which language to display the site for. If you have other languages, make sure each is declared in the meta tag. In some cases it can really benefit SEO.
  3. Links – Internal – Linking internally to other pages of the site that are related to the page you are optimizing can be quite important for SEO - almost as important as (if not more so than) external links coming in.
  4. Anchor text has key term(s) in links – Just as important as the internal links themselves is having the keyword in the anchor text. I cannot stress enough how important it is for SEO to have those internal links, with the keywords in the anchor text.
  5. Content around the anchor text is relevant – If a section of pages is relevant to other pages, cross-linking them with a paragraph that mentions the page before and after the link is very helpful for SEO.
  6. Content – Content, content, and more content is the key to optimizing a site for SEO. Pages don't have to have large chunks of paragraphs; words placed here and there throughout the page work too. Too many SEOs put big blocks of content on the homepage of a website thinking it will benefit them. Sure, it helps for SEO, but it looks horrible and users don't fall for it. A short paragraph at the top describing the site, plus another block of text highlighting the value proposition of the product or service, placed in blocks around the page, is just fine. The trick is to search the term you are trying to rank for, look at the first few pages or sites ranking, and note how many words they have on the page and how many times they mention the keyword. Then simply do a little more. Once you have that content in place, you're on your way to rankings.
  7. Unique content – Unique content is key to making sure your SEO is in place. Not only do you need to watch out for other sites carrying the same content you have, but also check the other pages of your own site. If a block of content is repeated on more than one page, that content just won't be counted towards SEO. If a page is less than 15% unique content, it will even work against your SEO and can get the site penalized. Sites that use tracking tags or parameters, or that have issues validating URLs, often run into duplicate content and really harm the work they have done for SEO without even realizing it.
  8. Frequency of content change – A site that is recognized as a publication and pushes content several days a week (or even several times a day) trains the search engines to visit and look for updates regularly. In this case SEO works in the site's favor, with fresh content getting recognized and ranked quickly.
  9. Age of document – If a site is a brochureware site that doesn't update content often, the search engines will visit less often but give the pages more value the longer they stick around. Pages that are years old will rank better than newly added ones. So keep this in mind for your SEO and your site: are you a publication that pushes out content frequently and needs rankings fast, or a site with valuable content that gets better with age?
  10. File size – A page that takes a long time to load, or is extraordinarily large, can be quite detrimental to SEO. So be wary of how big the files you create are.
  11. Content separation – As mentioned before, breaking your content up throughout the page is more beneficial for SEO than blocking out whole paragraphs. If the site is a known publication, or the section is a blog or article section, then whole blocks of content are perfectly acceptable. But pages that go up and stick around a while to provide marketing information should have their content broken up throughout the page.
  12. Poor coding and design – This one gets overlooked a lot. Sure, search engines can't tell good design from poor design, but your users sure can. If a user comes to your site from Google and then immediately bounces, Google will mark down the value of that page for SEO, affecting your rankings. So pay attention to design, look at your bounce rate data in Google Analytics, and improve it as much as possible.
  13. Duplicating content = NO – DO NOT DUPLICATE CONTENT... As mentioned earlier, this is very bad for SEO. If one page shares more than 80% of its content with any other page on the site, it can harm your SEO. So be sure that every page of the site is more than 80% unique.
  14. Invisible text = NO – Invisible text is content a site hides from users while still letting the search engines see it in the code. A div that is only one pixel high with overflow hidden, or white text on a white background, is a huge no-no for SEO and can actually get your site penalized. So don't do it!
  15. Domains & URLs – Check your domain and URLs often. Look for your keywords, check that the hierarchy is clear and set properly for SEO, and make sure there are no funky issues like stray parameters, URLs that can be altered by typing anything in, or redirects to some odd URL. Check your trailing slashes and file extensions as well, making sure a wrong URL either 404s or resolves to the correct one.
  16. Keyword-rich URLs and filenames – Watch for those keywords in URLs and filenames. Long URLs stuffed with more than one keyword will cause issues, so always check and double-check the URL for SEO before going live.
  17. Site accessibility – Having a site that is accessible to those with disabilities - whether impaired sight or, if you have video, hearing - can actually help your SEO.
  18. Sitemap – Creating a page that links to all of your pages helps ensure that all of your pages get crawled for SEO. Many times I have seen websites with pages they don't link to, whose owners then wonder why those pages aren't ranking. If search engines can't crawl a page, they don't know to rank it. You can also create an XML sitemap file for Google, Bing, and Yahoo! site submission. But remember that an XML sitemap alone just won't cut it - you have to have links pointing to pages from multiple locations. Otherwise it just won't do any good for SEO.
  19. Website size – Keep an eye on the size of your website. Large corporate sites like Amazon.com and MSN.com are expected to have thousands, if not millions, of pages. If yours is a small-to-medium-sized company and website, yet the search engines somehow crawl millions of pages, you need to take another look at your SEO. Check your parameters, or other issues that might be generating more pages than your site should have.
  20. Website/domain age – The older the website, the better. A brand-new site loaded with pages and pages of content all in one day will get put in the "sandbox", as we SEOs call it. It will sit there for a few months before the search engines even give it the time of day. The reason is that search engines want to make sure the site is legitimate and not just a spam site out to grab rankings. Having an older domain is the best protection; if you have a new domain, roll out your pages slowly. Push a section one week, wait a few weeks, and push out another section. Having a blog is also good for SEO, as you can add posts with content that encourages the search engines to keep coming back regularly and learn that the site has something interesting and unique. Of course, the more traffic you can get in those first few months the better, so get your social media and advertising up and going.
  21. File location on site – This falls under the URL hierarchy category. Watch where pages and files are located on the site. For SEO, and for your users, the structure and location should make sense.
  22. Domains versus subdomains – Watch out for the use of sub-domains on your site. Too many websites put their blog on a sub-domain rather than in a directory. This won't hurt your SEO, but it won't help either: the search engines count a sub-domain as its very own website and don't associate its content with the rest of the site. It is more beneficial for SEO to have all of your content, no matter what it is, on your main domain in a directory. Keep it out of a sub-domain unless absolutely necessary.
  23. Bare domain versus www – Even in www.yoursite.com, the "www" is technically a sub-domain. Yes, a sub-domain... So try to use http://yoursite.com if you can. If the search engines already recognize your www.domain.com, though, leave it alone and let Google know that you prefer the www version over the bare domain. You can set this preference in Webmaster Tools.
  24. Hyphens in URLs – For SEO, it is recommended that you use "-" in your URLs rather than "_" or a space (which ends up rendering as %20). Search engines simply prefer the hyphen over the underscore or the space.
  25. URL length – For SEO purposes, keep your URL under 2,000 characters - but really, the shorter the better. Take care not to have more than 3-4 parameters, or a URL that reads like a really long sentence.
  26. IP address – Your IP address should reside in the country your website is targeting. A US English site should have an IP located in the US; a Canadian French site should have an IP in Canada.
  27. robots.txt – Blocking irrelevant content in robots.txt can really make a difference for your SEO. In the past it was recommended to block external CSS and image directories, but Google has since said it would like to crawl them; search engines are getting sophisticated enough to decipher all of the code and get a good idea of what the whole website is about. So only block pages and content you really don't want search engines to crawl.
  28. Redirects (301 and 302) – For SEO, redirecting an old URL to a new URL with a 301 will usually pass the old URL's value to the new one. But use 301 redirects sparingly: I have personally witnessed and dealt with sites where too many 301 redirects caused rankings to drop.
  29. Social actions – Social actions like Facebook Likes, tweets, shares, Google +1s, and so on add a lot of value for SEO. Any time a user takes an action to show they see value in a page, it signals to the search engines that the page is relevant and valuable, increasing your rankings.
    • Google+ – Yes, Google loves its own social media site, and providing a way for users to +1 your page and site will drive up rankings in Google.
    • Facebook 'Like' or 'Recommend' – The action of 'Liking' a page on Facebook sometimes helps with Google, but it really helps with Bing more than anything. Microsoft and Facebook have a very close relationship that lets Bing use social actions happening on Facebook to help drive rankings for sites.
    • Facebook comments – If you can, pull the comments that happen on Facebook about your site and the page into the page itself. It not only adds more unique content, but shows Bing and other search engines that the content on the page is valuable to users, driving up your SEO.
    • Twitter "tweet" – A simple tweet with your page's URL is always counted as a "vote" for your page and website. The more you can get, the better for SEO.
    • OGP (Open Graph Protocol) – OGP was developed and adopted by Facebook as a way to manage how a page or website looks when shared in social channels. Twitter and other social sites have followed suit, and my prediction is that Google will start to pay attention to OGP soon. So be sure to spend the time making sure your basic OGP tags are set on all of your pages. It could really help your SEO.
  30. Links – External – Links pointing to your site are important. As mentioned earlier, tread very carefully with your link building. Make sure the page(s) linking to your site and pages are relevant to your site. Do not use directory submission websites, don't buy links, and be wary of link exchange requests. Just as external links can benefit SEO, they can also harm it if not handled properly. Keep the following in mind for your SEO:
    • Quality of source of inbound links
    • Links from similar sites
    • Links from .edu and .gov sites
    • Age of inbound links
    • Links from directories
    • Links from Social Media
    • Links on pages that include social actions
  31. Schema – Google places a high emphasis on schema markup and the information in it. In the past they said that if you can get it in there, great. Now they look at schema information to help drive rankings for SEO. Not to mention that you can manage what displays in your snippet - star ratings, author information, embedded video, etc.
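To make several of the on-page items above concrete, here is a minimal markup sketch for a hypothetical flower shop page. Every name, URL, and value here is invented for illustration - treat it as a sketch of where the pieces go, not a definitive implementation:

<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Keyword in the title tag (item 1) -->
  <title>Anniversary Flowers | Example Flower Shop</title>
  <!-- Meta description: under 150 characters, keyword included (item 2) -->
  <meta name="description" content="Order anniversary flowers online - same-day delivery on roses, lilies, and custom bouquets.">
  <!-- Meta keywords: broad terms first, then category, then longtail (item 2) -->
  <meta name="keywords" content="flowers, anniversary flowers, same day anniversary flower delivery">
  <!-- Meta language (item 2) -->
  <meta http-equiv="content-language" content="en">
  <!-- Basic OGP tags (item 29) -->
  <meta property="og:title" content="Anniversary Flowers | Example Flower Shop">
  <meta property="og:url" content="http://example.com/flowers/anniversary-flowers.html">
</head>
<body>
  <!-- Keyword in the page copy and in the image alt text (items 1 and 6) -->
  <h1>Anniversary Flowers</h1>
  <img src="/images/anniversary-bouquet.jpg" alt="anniversary flowers bouquet">
  <!-- Keyword in internal anchor text, hyphenated keyword-rich URL (items 3, 4, and 24) -->
  <a href="/flowers/red-roses.html">red roses for anniversaries</a>
</body>
</html>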
Of course, there are thousands, if not millions, of algorithms that search engines use to determine rankings, which makes the list I've given you here a small slice of what really goes into optimizing a site. In all of my years optimizing websites, I have tried to write blog posts whenever I come across issues or get into deep discussions with my peers on SEO topics. But again, there is so much involved, and every site is different from the next. I have been teaching workshops since 2007 and have watched thousands of individuals try to learn SEO and optimize their own sites, only to find that they still need the help of an expert.

What I suggest is that you learn the basics, as much as you can, and start optimizing your site yourself. If you have a site that is older and hasn't been touched in years, go through and see if there are sections and pages you can add with unique content to build on what you already have. If your site is larger and the traffic just isn't where it should be, look at how you can restructure it to reflect the categories and longtail terms you found in your keyword analysis.

If you want to see how your different categories of terms are performing, you can use the handy template I created, along with instructions on how to grab the traffic you are seeing. For some clients, I have used the template to compare the estimated traffic from the keyword analysis against the actual current traffic, to show what is missing. I will use the top few terms in the keyword analysis to gauge, in the competitive report for SEO, how aggressive the category terms will need to be to get rankings. The categories with the most potential, the largest gaps, and the least aggressive competition are the ones I recommend tackling first. The competitive report also helps determine what will need to get done to generate rankings: is it just one page with a bunch of content and the word mentioned several times, or a whole directory of files and filenames covering a myriad of terms, all linking to one another?

For usedcars.com, the location pages - where we generated rankings for "used cars in..." searches with city and state - were fairly easy SEO. The content has a few lines of text seeded with the city and state from the database (also known as templatized content). Content for the page also comes from the inventory (car listings) in the database: one block of normal listings in that city, and one block of deals, where a back-end calculation looks up each car's VIN and price against its Kelley Blue Book value and returns the percentage difference, surfacing cars that are priced under value and are a good deal. Users love those listings. There is also a large map showing dealerships in the usedcars.com system located in that area. The map is generated from Google and helps those pages rank for that location.
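As a rough sketch, templatized content of that kind looks something like this, with placeholders standing in for values pulled from the database (the placeholder syntax and field names here are made up for illustration):

<h1>Used Cars in {{city}}, {{state}}</h1>
<p>Browse {{listing_count}} used cars for sale in {{city}}, {{state}}.
The deals below are priced under their Kelley Blue Book value.</p>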

Those pages were pretty easy to get ranking (after a lot of the mess was cleaned up), and they have held their rankings, providing close to 50% of that site's traffic from SEO.

A more complex project for usedcars.com - one that required more pages and more aggressive SEO - is what we called the Make/Model project. The goal was to get rankings for searches on car brands and on specific cars with years. We found that users who search "year make model" know exactly what they are looking for and are more likely to purchase, so ranking for all of those year, make, and model combinations was highly valuable to the business. The problem is that all the other car sites know the same strategy and have been very aggressive with their SEO. A set of rules for syndicated and dynamic content was put in place, along with a plan to roll out pages and content in phases. When I left in May of 2014 the project was still underway, but the pages were already seeing some traction. You can see how the pages were developed at http://www.usedcars.com/car/ - assuming they are still intact and being worked on as specified in the project.

I'm always happy to talk SEO with anyone, anytime. You can find me on Skype (as SEOGoddess) or fill out the contact form on my site with any questions. I'm usually pretty quick to respond and can help with any quick SEO issues or questions as you try to optimize on your own. I have even been known to look at a website while an agency is working on the SEO, just to make the site owner or boss feel comfortable that their agency really knows what they are doing.

There are also many resources other than myself or this blog, and plenty of SEOs with a lot of great experience. Ian Lurie is one of my favorite people in the world and has a very successful agency with a lot of great SEOs he has taken under his wing and turned into skilled professionals. His company, Portent, can also help with website design, social media, and paid search marketing. Give them a look and see if they fit your needs. Bruce Clay is also a very close friend and someone I go to regularly for help myself. He works with very large corporations on a large scale, including AT&T, CNN.com, Edmunds, and more. He is what some of us in the SEO industry call the "Godfather of SEO", since he was one of the original SEOs and set the standards for quality in optimizing.

I do have a larger list of SEOs I know and trust, so feel free to contact me and ask me for someone in your area, or who might specialize in a site that is much like yours.

Either way, SEO can be fun and you can really learn a lot quickly if you want. You can know enough to be dangerous, but if you stick with the general rule of “don't trick the search engines” you should, for the most part, be just fine.

In the end, a site with increased traffic from SEO is a site that is generating a lot of money - and that's just good for business.

Tuesday, July 30, 2013

Anatomy of the URL and Stuff

I'm sure you are looking at the URL above and thinking to yourself, "Wow, I never realized all that stuff meant something." Oddly enough, it does... As the world wide web has changed into a search-friendly, interactive playground, the formation and meaning of the URL has evolved considerably into a very significant factor, not only in search engine compliance but in how people use websites. Lately I have been helping clients understand how their websites are structured and how the chain from server to browser to user works. It's something we search optimizers view as so simple, yet it can be very complex to someone who doesn't understand how it all works. So here is the URL, broken down piece by piece and explained.

First - What is a URL? 
A Uniform Resource Locator is a website address that holds very important information between each "." and "/", much like the address of your home contains a house number, street, city, state, country, etc. It allows the browser to connect to a specific website, directory/path, and/or file so that the user sees the page you want them to see. A URL consists of the following:


Hypertext Transfer Protocol (HTTP) – Established by English physicist Tim Berners-Lee in 1990, the hypertext transfer protocol is a request/response standard in which the client is the application (the user on a web browser such as IE, Firefox, Safari, etc.) and the server is the machine hosting the website itself. The client submitting HTTP requests is typically referred to as the user-agent (or user), and the responding server - which stores or creates resources such as files (html, .asp, .php, .css, etc.) and images - is referred to as the origin server. (Source: http://en.wikipedia.org/wiki/Http_protocol)

  WWW (World Wide Web) or "sub-domain"
The WWW is typically placed before the main domain in your website's URL, referencing the World Wide Web. Remember the game you played in elementary school where you could write your home address starting with your house number and street, then city and state, and keep going out to your country, continent, and even Earth? The WWW is the part of the address that says "Earth". In some cases, what we call a "sub-domain" can replace the WWW in your URL, which references a whole new website within your existing domain. Search optimizers can use this as a way to target certain key terms. For example, a real estate agent targeting a specific city might use http://city.domain.com and thus have a leg up when ranking for anything within that city. In most cases the sub-domains link to the main domain and, since most search engines treat a sub-domain as a domain all its own, those links count as external link credit, boosting the rankings of the main domain they point to. It is highly recommended that you avoid this technique, as it is only tricking the search engines and in the end will hurt your rankings rather than help.

  Domain Name System (DNS)
The Domain Name System was established so that the common user can understand, in simple terms, the location of a website. A website's files are usually stored on a server that answers at a specific IP address (much like a phone number directs a call to your phone). So that the general public can locate a certain website and its files, the domain name resolves to that particular IP address. The Domain Name System also stores other types of information, such as the list of mail servers that accept email for a given domain (such as you@yourdomain.com).
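As an illustration, the records behind a domain might look something like this in a DNS zone file (the names and the IP address are invented for the example):

yourdomain.com.       IN  A      203.0.113.10            ; the domain resolves to the server's IP
www.yourdomain.com.   IN  CNAME  yourdomain.com.         ; www points at the same place
yourdomain.com.       IN  MX     10 mail.yourdomain.com. ; where email for you@yourdomain.com is accepted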

Top-level Domain Extension
The domain extension originally consisted of the generic gov, edu, com, mil, and org. With the growth of the internet, country extensions and other categories have come into play. The most recognized of the extensions is, of course, .com. If you are optimizing for a specific country and language, the best route is to register your domain with that country's extension. This helps the search engines recognize that you are targeting that particular audience, and they will rank the site accordingly. Be sure that your country-specific site is in the native language for that country to avoid any duplicate content issues. Also be careful about linking from that domain to your main domain, as once again the site can be penalized.

Directories and Files 
Here's where the fun stuff comes into play. Just as your computer organizes your Word docs, Excel files, and other documents into folders, a server structures your website's files the same way. A "directory" or "path" is much like a "folder" on your computer. In standard (old-school) HTML development - before the days of dynamic websites powered by databases and user interactivity - a file would be created, named "index.html" or "default.html", and placed either in the main domain folder (which the DNS resolves to on the server) or in a named folder (to help the webmaster organize the site's files). As the technology grew and database-driven, interactive ways of building websites advanced, the structure has pretty much stayed the same, with the addition of "parameters" that reference a part of the database and return content on the page based on those parameters. (Have I lost you yet?) Let's go back to the basic structure of static HTML files and go from there...

A dynamic website is one that has a few static pages (in other words, pages coded and editable only by a developer) that use parameters to pull in content or trigger specific actions from a database. A basic dynamic page pulls words, images, etc. from a database, and can thereby create multiple pages with different content from one basic page. A more complex dynamic page (or site) is something like Facebook or Twitter, which recognizes whether or not you are signed in with a username and password and shows you either your profile page (if you are signed in) or a "please sign up" page (if you are not signed in or don't have an established username).
To help understand this, let's talk about how a database works. A database is essentially similar to an Excel spreadsheet, or a table in a Word document, that has a unique identifier for each line (or row) and holds different content for each line item. Example:
Username    | Email                | First Name | Last Name
Sujo234     | bob@bobsemail.com    | Bob        | Sujo
Forjill23   | jill@jillsemail.com  | Jill       | Forman
In this example the username is the unique identifier with the email, first name, and last name as different parameters for that username.

The content will be different on each page. With dynamic content, the possibilities are endless as far as how many pages you can create from developing and designing just one file. A great example of a dynamic page created for search optimization purposes is on usedcars.com. If you search for "used cars in oslo mn", you see the "UsedCars.com Oslo MN" page in the results. Look at the URL in the address bar when you go to that particular page: http://www.usedcars.com/browse/mn-24/oslo-163.aspx. In this case the page is pulling in the unique IDs equal to "oslo-163" and "mn-24", just as the username is the unique ID in the table above.

SEO Friendly URL
In order to make your dynamic URLs friendly for search engines, you must use a rewrite. A great resource for rewriting a URL is the Apache Rewriting Guide. Some open-source content management systems (such as WordPress, Drupal, etc.) already do the rewriting for you, and all you have to do is enter what you want the URL to be (be sure to include your key terms, separated with dashes "-" and not underscores "_", for search happiness). A small sketch of such a rewrite follows at the end of this post. Who would have thought a URL could be so complicated? But when it comes to search optimization and understanding basic website development, it is very important to understand how the URL works, how it is structured, and how to make sure your site is URL and search engine compliant.
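As promised, here is a minimal sketch of a rewrite in an Apache .htaccess file. The script and parameter names are hypothetical - the point is only that the friendly, hyphenated URL on the left gets mapped internally to the dynamic, parameterized page on the right:

RewriteEngine On
# Serve the friendly URL /cars/honda-civic/ from the underlying dynamic script
RewriteRule ^cars/([a-z0-9-]+)/?$ /listing.php?model=$1 [L,QSA]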


Tuesday, July 23, 2013

Categorizing Keywords

For those of you SEOs who manage very large sites and map your keyword categories to sections of your website - you know how difficult it is to categorize your terms and track their performance. Well, after searching, asking, and digging around for a tool that does exactly what I am talking about, I finally came up with a solution. It's a bit of a workaround in Excel - but it's the best I can do until someone comes up with a tool that categorizes keywords for SEO.

Know Your Keywords and Categories


Before you get started categorizing the terms that come to your site, you should know which keywords you are targeting, and the combinations of terms as well. I'm going to use a flower shop's website as the example for this particular blog post, but categorizing is something you can do with any website. At the very least, you can categorize terms into "Broad" and "Branded" to get started.

Most keyword tools can help you establish which categories to target. Google's Keyword Tool and WordTracker are just a couple of the many tools available on the web.

Another way to figure out which terms fit in which categories is to grab the search data (referring terms in Google Analytics) for your site for the past few months or year. I personally spent some time categorizing keywords in Excel by using filters and having the sheet show all terms containing "anniversary" for the "anniversary flowers" category. It takes a lot of work and time, but in the long run you will have a more accurate account of the terms you will need to run the Lookup against.

Setting Up Your Template

Download the Template

Now that you have all the possible terms in all of your categories, it's time to start setting up your template. You will want to download the template I have set up in Excel. You can start from a fresh Excel document if you want, but the template has directions (in case you lose this blog post somehow) and the Lookup formula is already in there.

Once you have downloaded the template it's time to get it set up to work for your keywords.

In the following steps I am going to walk you through setting up the template and then categorizing the terms. If you don't have terms of your own to use, I have a zip file you can download so you can walk through the example with me and get familiar with how this works.

Copy your first set of categorized terms and paste them into the first Tab, marked "Broad". Since every site usually has a "Broad" category of terms, I figure that's probably the best place to start. In the case of this example, the "flower shop", "online flower shop", and "best flower shop" terms are the ones that fit under the Broad category.

If you have downloaded the .zip folder, open up the "Terms" Excel doc and you will see the words already categorized for you: "Broad", "Branded", "Birthday", "Anniversary", and "Wedding". Click the drop-down next to "Category", click "Select All" (to deselect all), and then click "Broad". You will see the terms filtered to just the "Broad" category.

Next, select all of the terms in the "Keyword" list, then copy and paste them into the "Broad" Tab.
We then need to sort the terms in alphabetical order so that the Lookup can go through them in order. If you don't, the Lookup won't work.


Highlight the Column with your keywords
Click "data" > "sort"
Select "My data has headers"
Select under "sort by" the column you keywords are under (should be column A)
Click OK

Double-click the Tab and rename it with the one-word name of your category.
Highlight all of your keywords in the column (just the cells that have words, not any blank cells).
Type the name of the category (stick to one-word names) into the Name Box at the upper left. You have now named your table.

Do this for "Branded" and the other categories as well. You are going to have to create a new tab in the template to fit all the categories.

If you have not downloaded the .zip file and are working off of your own terms, creating new tabs and naming them is probably going to be something you will need to do. But don't worry, the template will still work.

Now that all of your keywords are in your template's Tabs, named and sorted, it's time to set up your Lookup string.

Setting up Your Lookup


The way the Lookup works in this case is that we ask Excel to look at one keyword (one cell) and match it against the terms in one of the Tabs we set up. If it matches one of those terms, we tell Excel to place the category name in the cell. If it doesn't, we just leave the cell blank.

The string looks like this:
=IF(ISNA(VLOOKUP(B2,Broad,1,FALSE)),"","Broad")
  • B2 is the cell of the keyword we want to look up.
  • Broad (the second argument) is the named table we look for that keyword in - the range on the Broad Tab, e.g. Broad!A$2:A$999998.
  • 1 tells the Lookup to match against the first (and only) column of that table.
  • FALSE tells the Lookup to do an exact match. TRUE would settle for an approximate match, which won't work for our purposes here.
  • The "" leaves the cell blank when there is no match - you can put "not categorized" or "misc" there instead, but for our purposes here we keep it blank.
  • The final "Broad" tells Excel to put the word "Broad" in the cell if the keyword matches one of those in the Broad Table or Tab.


See - it's that easy...

What you do next is replace the word "Broad" (or "Cat1") with the name of your table, Tab, and category. This is why we give the Table, the Tab, and the Category the same name - it makes life much easier when setting up this string.

Now your template is ready for you to paste some keywords with data and grab some numbers.

Gathering Your Data


Open up your Google Analytics account. If you don't have Google Analytics, pretty much any tracking tool that produces a list of referring terms with some sort of data will work; you can expand and contract the columns to the right of the terms as you wish. The template's columns are set up specifically for referring terms exported from Google Analytics with visits and related data, though.

Log into your Google Analytics account.
Click "Traffic Sources" > "Sources" > "Search" > "Organic"
Select the date range you would like to report on.
Scroll to the bottom of the report and show 5,000 rows.
Scroll back to the top, click "Export", then select "CSV".
After the file has downloaded, open the excel file.
Highlight JUST the cells that include the keywords and your data (ignore the first few rows at the top with the date information, and the summary rows at the bottom).
Copy those cells, and paste into your "Master" Tab.

Note: If you have multiple dates you would like to track, you can export the different date ranges and then note which keywords go with which date in the Master Tab. This will allow you to see trends by category.

I added an Excel doc called "Analytics Organic Search Traffic" with some terms and fake data that you can play with. There are three tabs, one for each day's data. Start with just one day and play with that to get familiar with the percentages. From there you can work with all three dates and build trends to see which categories are trending up and down.

Completing Your Lookup


Now that you have copied and pasted the keywords into the "Master" Tab it's time to get all of those terms categorized.

Select the top row with your categories and your "All Categories" cell
Copy just those cells in the top row
Highlight the same cells in the next row down, then hold down the "shift" key
Scroll down to the last keyword record
Still holding the shift key, select the last cell under "All Categories" - this highlights all of the cells where the category Lookups will go
Hit "CTRL+V" on your keyboard (this quickly pastes the Lookup formulas for each line)
Be patient, as it may take a while for your Lookup to complete (depending on how many keywords, and records you have)
The "Master" Tab should look something like this:

Playing With Your Data

The most efficient way to gather information from your data is to copy the entire "Master" Tab and paste as values into a new Excel sheet.  This way you won't have to wait for the Lookup to complete each time you sort, pivot, etc.

Click the top left "arrow" in the "Master" Tab
Right-click and select "Copy"
Open a new Excel Doc
Right-click and select "Paste Special" > "Values"

From here you can create pivot tables then sort them into pie charts, graphs, and all sorts of fun reports to see how your keywords are performing.

I personally like to start with a quick pie chart to see which category of terms brings in the most traffic. At times we will see a drop or rise in traffic, and it's good to understand which category of terms is fluctuating. Copying and pasting terms by dates (weeks, months, or even sets of a few days) helps me see which categories are fluctuating on a timeline trend. Knowing which categories bring in the most traffic, I can then make decisions on which parts of the website we need to focus our efforts on to increase traffic.

See how much fun categorizing your terms can be?
Now that I have a template to work from, when traffic goes up I can quickly categorize the terms and let our executives know whether our recent efforts have worked.

Friday, January 18, 2013

SEO Issues - is it Penguin? Is it Panda? or is it me?

The following story has been several months in the making. It's one I have lived through one too many times as an SEO, and one I am sure other SEOs have faced. I wrestled with the thought of writing this for fear that someone from the company might read it and get angry that the story is told. But it's something I think people out there can learn from, and it speaks to so many others in this industry, showing them that they are not alone.

It's long, it's a bit technical (I tried to keep it simple), and it has some personal frustrations laid out in words. My only hope is that you get as much value out of reading this as living through it gave me in becoming a better person (or, well, a better SEO).

It Begins


I started working on this website's SEO in May 2012, at which point I was told the site's traffic was declining due to Panda updates. In February of 2012, traffic from SEO was the best the site had ever seen, but soon after that a steady decline set in.
Traffic from February 2012 - May 2012
Before digging into any possible SEO issues, I first checked Google Trends to make sure the decline wasn't searcher-related. Oftentimes a drop in traffic just means users aren't searching for the terms the website ranks for as much as they used to.

Top Key Terms in Google Trends
Looking at the same time frame as the traffic data, I noticed an increase in searches for the top 3 terms the website ranked for, along with a dip from March to April that the traffic appeared to reflect. But the website's traffic also dropped from April 23rd to the 24th, and then significantly on the 25th. The website I was working on already had two SEOs on it: an agency and a consultant. Both had done a great deal of research and some work to get the website on track, and both were stressing that the drop in traffic was due to Google's Panda updates. I looked at SEOmoz's Google Algorithm Change History and found a Panda update on April 19th and a Penguin update on April 24th. Given that traffic dropped significantly on the 24th, my best guess was that it was Penguin-related - but it still needed further exploration.

Figuring Out What Was Hit by Penguin


The site is (and was) broken up into sections by keyword focus. I could tell that at one point someone with a really good head for SEO had worked on it, but the strategy used was outdated. Perhaps the site was originally optimized several years before and just needed some cleanup to bring it up to 2012's optimization standards. So, understanding Penguin and identifying which part of the site was driving the bulk of the organic traffic was my next step in solving this mystery. Once I understood why and where, I could start to establish what to do to solve the problem.

I broke the site's traffic report into sections as best I could in Google Analytics. It was a bit of a struggle, as all of the pages of the site resided directly on the main domain. Without a hierarchy in place, breaking out the sections had to be done with a custom report using head matching on landing pages. I hadn't had to do this before, so the agency already working with the site helped build the first report, and I built out the other reports from there.
Section 1 over 72% of traffic

Focusing on just April and May, I created a dashboard in Google Analytics looking at organic traffic, broken out by the sections of the site. Section 1 was the bulk of the traffic with over 72%, and Section 2 came in second with just over 15%. Sub-pages of Section 3 and other one-off pages made up the difference.

Both Section 1 and Section 2 dropped off after April 24th, so clearly they were the bulk of what was pulling the overall traffic numbers down. Since Section 1 was the majority of the traffic, I proposed to the executive responsible for the site that we address any issues with that section first.

Actual screenshot of Section 1 presented
I took all of the research from the agency and the consultant, and we quickly reworked the pages to represent a hierarchy in the URL structure and cleaned up the leftover issues from the outdated optimization.

Soon after Section 1 was addressed, we did the same with Section 2, then worked on Section 3 (rolling its sub-pages up into one solid section), and then added a few pages to grab any new opportunity.

Not Quite as Easy as It Looks


The projects launched in increments: first the URL hierarchy fix for Section 1, then its page redesign; next a full launch of URL fixes and page redesign for Section 2; and lastly Section 3 and the new Section 4.
Section 1 - Section 2- Section 3 Launch Dates and Organic Traffic
Soon after Section 1 launched, traffic started declining rapidly. I was asked several times why traffic was getting worse, and I started digging some more. Every time I looked at impressions, the new Section 1 URLs weren't getting any traction, but the previous URLs still were. I began looking at the history of the website, trying to find out why it had done so well at one point but wasn't doing well now. One thing I noticed was a lack of priority linking to these pages, though at some point there had been links to some of them individually from the homepage. Google infers a hierarchy of pages from the directory structure and from where links to those pages appear on a site. This site had every page on the first level, with links to those pages from the homepage - which was telling Google that every page was the most important page. That worked at one time, but as Google rolled out its 2012 updates these pages were getting hit, and those links on the homepage weren't there anymore. Before the launch of Section 2, I had them put links to the main directory for each section on the homepage. The links would tell the search engines that these were important pages of the website, without being so obnoxious as to have a dozen or more homepage links discouraging users (avoiding the appearance of spamminess).

But even after adding the links to the homepage, the traffic to those pages was still declining. Pressure was put on me to figure out what was wrong, and with accusations flying that I had single-handedly ruined the site's SEO, I spent every waking hour looking at reports and trying to figure out what was going on. I consulted friends in the industry and read every article I could find on which Panda or Penguin updates might be affecting these pages.

Then it hit me: just as the links to these sections would help them get recognized as important pages, so would the other pages being linked to from the homepage. In fact, a set of them linked to the website's search results with queries attached, mimicking pages but showing search results. On those search results pages there were over 200 links with multiple (we're talking hundreds, possibly thousands of) combinations of parameters. The bots were coming to the homepage, following the links to the search results pages, and then getting stuck in this vortex of links and parameter-generated URLs - leaving no crawl time for the pages that once had rankings. This also explained why the new URLs weren't showing many impressions in the Webmaster Tools data: those pages just weren't getting crawled.

There was a project underway that would solve the many links on the search pages, and there was also talk of using AJAX to show the results. Once that project launched, the bots would reach the search URL from the homepage but would then essentially go no further. With the project a few months out, I made the case to add the search page to robots.txt right away, so the bots could recognize the Sections as the important pages. After several weeks of attempting to convince the powers that be, the URL was eventually added to the robots.txt file.
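For anyone facing the same situation, the change itself is tiny. A robots.txt rule blocking an internal search results page looks something like this (the path here is illustrative - use whatever path your site's search results live under):

User-agent: *
# Keep crawlers out of internal search results and their endless parameter combinations
Disallow: /search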

Immediately after the search page was added to robots.txt, Google Webmaster Tools presented me with a warning:
Warning in Webmaster Tools
In most cases, a warning from Google should never be taken lightly, but in this case it was exactly what I wanted. In fact, it proved that my theory was correct and that the site was hopefully headed down the right path.


Panic, Questioning, and a Third Party


As with every up in the SEO world, there must be a down. Soon after the search result page was added to robots.txt, the organic traffic to the site dropped, and continued to drop. Throughout those grueling three months there were several Google Panda and Penguin updates. I documented each and every one of them in Google Analytics, and continued answering questions, gathering data, and dealing with close scrutiny from people convinced the work I was doing was complete BS.
Organic Traffic from September 2012 - November 2012
I sat in numerous meetings - some of which I walked out of crying (I'm not afraid to admit it) - being questioned about the road I had taken and why we weren't seeing results. People within the company were recommending that we roll the pages back to the way they were before, even changing the URLs back. I fought hard for them not to touch a thing. I sent around an article posted on Search Engine Land by Barry Schwartz citing Google's patent on "tricking" search spammers.

The patent states:

When a spammer tries to positively influence a document’s rank through rank-modifying spamming, the spammer may be perplexed by the rank assigned by a rank transition function consistent with the principles of the invention, such as the ones described above. For example, the initial response to the spammer’s changes may cause the document’s rank to be negatively influenced rather than positively influenced. Unexpected results are bound to elicit a response from a spammer, particularly if their client is upset with the results. In response to negative results, the spammer may remove the changes and, thereby render the long-term impact on the document’s rank zero. Alternatively or additionally, it may take an unknown (possibly variable) amount of time to see positive (or expected) results in response to the spammer’s changes. In response to delayed results, the spammer may perform additional changes in an attempt to positively (or more positively) influence the document’s rank. In either event, these further spammer-initiated changes may assist in identifying signs of rank-modifying spamming.
But the article and my pleas fell on deaf ears...

Things had gotten so heated - and there was such fear that nothing was being done while traffic significantly declined - that the company brought in yet another SEO consultant to look at the site objectively.

Just as the consultant was starting his audit, traffic hit the lowest point I ever thought it could possibly go. Then, the next day, traffic went up. In the last week of November (roughly 3 months after we blocked the search result page), I saw an increase in traffic to Section 1 in Google Analytics:
Section 1 Organic Traffic
I quickly pulled up my report to check the section's impressions from the Webmaster Tools data, and there was a significant increase as well:
Section 1 Impressions from Webmaster Tools Data
On December 3, 2012, I logged into Webmaster Tools and saw that the warning had gone away. It was the "hallelujah" moment that every SEO dreams of and very few get. All the work I had done, all the fighting for what I believed in - it finally paid off.

To this day traffic continues to increase - we can now focus on some of the cleanup still left to do, and then move on to projects that will attract new opportunities.
Organic Traffic from November 2012 - January 17, 2013 (day before this post is written)
Quick Note: 
I forgot to mention a post I wrote months ago while going through all of this - SEO - Panda and the Penguins. It gives a bit of perspective on some of the linking issues I didn't get into in this post.

Monday, December 3, 2012

SEO Buzzwords - don't get sucked into the hype

I am often asked by people wanting to get into the SEO business where to get training. There are a lot of online resources available: articles, blog posts, videos, and even downloadable presentations. It's difficult to know what to believe, who to pay attention to, and what will work for any particular website. Most experienced SEOs will tell you to learn as much as you can and then simply start optimizing, learning from trial and error. But who has the time? Let alone wanting to risk a website losing rankings or, even worse, getting banned for using the wrong techniques? This industry is fickle and always changing - what works for one website may not work for another, and the big company that dictates how we should be doing our jobs changes its mind often.

I have sat back and watched how the industry began, has grown, and developed throughout the years. On one hand, it's been fun to be a part of something big that started from one company's idea and development and spawned an entire industry. On the other hand, because it is still a very young industry, and one dictated by the company that sparked it, we are all still developing standards and strategies, and learning every day. In fact, just the other day I saw a post on Facebook for a workshop on how to use the Google Disavow Tool. It scares me to see SEOs already taking a strategy that should not be taken lightly and making money off of "teaching" people how to do it themselves. It's like a surgeon teaching a child how to patch up a kidney: one seriously wrong move and the patient could die; one slightly overlooked part of the process and the kidney could fail over time, with no one knowing whether it was the surgery or the fact that the patient drinks vodka all day long.

In trying to learn and keep up with the latest in this capricious industry, we often find ourselves having to look up and research what the "experts" are talking about - those SEO buzzwords - coming across contradictory opinions, and quite frequently second-guessing ourselves (even highly experienced SEO experts second-guess themselves). I have too often seen people trying to do what they think is right and completely messing up their own site, and even clients' websites, because of all the hype and misinformation out there.

The truth is that it is all viable - it's all in how you approach it. Of course, hearing that probably doesn't help, so the following covers some of the most common strategies and SEO buzzwords, and will hopefully clear up any confusion you might have.

Link Building


Yeah, I started with the most common, yet most controversial, buzzword of all. The term "link building" began with the birth of the almighty Google itself. What was once a very simple and quick way to get rankings on the most popular search engine has slowly become an art form in itself. The basic idea is that a link from one site pointing to another site is counted as a "vote". The more links pointing from other sites to one site, the more votes, and thus higher rankings. With such an easy strategy to implement, and the growing popularity of the search engine using the algorithm, more and more spammers began to take advantage: paying website owners money for a link pointing to a website (a.k.a. purchasing links), asking a website to link to a site in exchange for a link back (a.k.a. link exchanging), submitting a website to directories (a.k.a. directory submissions), commenting on blog posts (a.k.a. commenting), and even submitting articles with links in them to article distribution sites. All of these means of obtaining links tricked the search engines into ranking websites that might not have otherwise deserved the positions they were given. In December of 2007 Google began cracking down on such strategies, not only with ever-evolving algorithms that catch sites that might be purchasing links, but by allowing webmasters to manually report one another. In the years since, we have seen a dramatic increase in the quality of the websites appearing in search results.


SEO Buzzwords from Link Building:

Text Links
Links that point to a page and contain a descriptive keyword or phrase. Many SEOs have used this strategy in the past because it gives context and relevance to a link. This means the search engines can read and index a page with all of its text links and assign a ranking based on the quality of the content, the links, and the destination of the links.

With Google's latest updates, the search engine no longer looks only at the text within the link itself, but rather at the words and relevance around the link. If a page on a dog breed website links to a pet-related website and contains terms like "puppy", "hound", and "paws", the search engine recognizes that the dog site and the pet site are, in fact, related. By contrast, an automotive website with an article on candy that contains one text link for "chocolate bon bons" pointing to a chocolate website just isn't going to count (believe me, I've seen it). In fact, it will hurt the website's rankings.

Link Bait
The idea behind 'link bait' is to encourage people to bookmark or link to your website from theirs. Personal blogs, social media sites, and other communities will usually link to a site if it offers something useful. Because of this, the search engines place a high value on the link. The best way to obtain these types of links is to write articles or white papers, a genuinely valuable blog post, or any sort of information your audience will find relevant. The more they share, the better the website ranks. The trick is to not force it - don't go out hiring people to share your posts; just let it happen naturally.

Link Juice
The 'search equity' that is passed from one page to another is called "link juice". The more relevant a page is, the more often it has been shared, and the more times it is visited, the higher the value the search engines place on it. The pages that page links to also gain extra value, because the original content is deemed useful to users.

Internal Linking
Almost self-explanatory, yet most individuals tend to overlook the importance of linking within their own website. In fact, in most cases, a link to a page from your homepage can be just as valuable, if not more so, than an external link. This, of course, does not mean you should add a link to every page of your website from your homepage; nor does it mean you should link to a few pages and then rotate them so that every page gets a chance at a high vote. It means that the pages that are most relevant to your users - the ones that make the most sense to continue to from the homepage - are the ones you should link to, and they become the second most valuable pages (next to your homepage) in the eyes of the search engines.

Taxonomy
Categorizing a website into a hierarchy and linking its pages to one another internally is one of the best ways to show the search engines which pages are most important and where they should rank. If a website is about cupcakes and selling supplies, the site should be organized by types of cupcakes (perhaps flavors) and then categories of supplies, with pages placed within each category where they make sense. From there, pages should link to one another where relevant, to show the search engines that this is category X with its set of pages, and this is category Y with its set of pages.

Internal Optimization


Internal optimization is often overlooked by agencies, simply because so many clients hire an agency to "optimize" their site only to say in the end that they don't have the resources to make the suggested changes, or that they simply can't make changes (whether for design, usability, or business reasons). Unfortunately, this leaves agencies in the predicament of having to please the client and do what they were hired to do (get the website ranking and increase traffic) with no other choice but to start link building. But a good SEO knows that internal optimization is really the heart and soul of obtaining legitimate rankings that will stick through all of the spam algorithm updates like Panda and Penguin. Below is a quick list with brief explanations for internal optimization.

Metatags


Title Tag - This often shows up as the title of your search engine result. The title tag should never be more than 70 characters, and should only contain the broadest term that describes your website (a quick way to sanity-check your tag lengths is sketched after these definitions).

Description Tag - The description tag will often appear in the search result as the description text if the key term searched is within the tag. If not, then the search engine will pull from the content on the page itself where the key term is located. If a page on your site is specific to a certain term, then this is a good time to get that term within the description.

Keyword Tag - The keyword meta tag was once the main source of how search engines determined what site would show up for what search. Now it isn't as relevant, but is still used by some meta crawler search engines (not Google - but Excite, and often Bing). List out a few of your target terms for the page you are optimizing to help you focus on what you want the page to rank for, and just in case a search engine is paying attention.
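To make those limits concrete, here's a minimal sketch in Python (the page URLs and tag text are hypothetical, invented for illustration) that flags titles over 70 characters and missing descriptions:

    # Hypothetical page metadata; in practice you'd pull these from your
    # own CMS or from a crawl of your site.
    pages = {
        "/": ("Oceans of Pets | Pet Supplies, Toys & Treats",
              "Shop food, toys, and treats for dogs, cats, and more."),
        "/puppies/ten-best.html": ("The Ten Best Puppies Everyone Should Own", ""),
    }

    TITLE_MAX = 70  # the character limit discussed above

    for url, (title, description) in pages.items():
        if len(title) > TITLE_MAX:
            print(f"{url}: title is {len(title)} chars (over {TITLE_MAX})")
        if not description:
            print(f"{url}: missing meta description")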

Content


Keyword density in document text - Simply put, search engines look at how often a term shows up within the content of a page. If a word is mentioned 10 times within 300 words on a page, the page won't get very good rankings. If a word is mentioned 10 times within 1,200 words, spread out once or perhaps twice per paragraph, then that page is more likely to rank better. A quick way to check densities is to paste the content of a page into Microsoft Word, do a search within the document (Find), type in the word, and click "Highlight All". It's a great visual for seeing where a term is placed.
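If you'd rather script it than eyeball it in Word, a rough density check is only a few lines of Python. This is just a sketch - the tokenizing is deliberately crude, and the thresholds are the rules of thumb from above, not anything the search engines publish:

    import re

    def keyword_density(text: str, term: str) -> float:
        """Return the percentage of words in `text` that equal `term`."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        if not words:
            return 0.0
        hits = sum(1 for word in words if word == term.lower())
        return 100.0 * hits / len(words)

    # 10 mentions in ~300 words is over 3% - stuffed, and likely to rank poorly;
    # 10 mentions in ~1200 words is under 1% - much closer to natural writing.
    print(keyword_density("cupcake " * 10 + "word " * 290, "cupcake"))  # ~3.3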

Content around the anchor text - As mentioned earlier, the words and context around an internal link are representative of the relevance of the destination page. The more terms a page has that are similar in context to the term you are optimizing for, the better.

Unique content - Any content borrowed, rented, or just plain stolen is considered a felony in the SEO world. There are algorithms in place that look not only for content that repeats within a site, but for content that exists on other sites as well. A quick way to check whether your site has unique content is to search on copyscape.com. Content on your site that also exists on other pages of your site (or on every page) simply won't get counted - it is essentially overlooked by the search engine - so any key terms within that duplicated content won't count. Duplicate content outside of your website is another story. If another website had the content first and you copied it, your site will get penalized. If your site had the content first and someone copied you, they would get penalized.
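Copyscape handles the external check; for a rough internal check between two of your own pages, even the Python standard library's difflib will give you an overlap score. A sketch only - this is not how the search engines actually measure duplication:

    import difflib

    def overlap(page_a: str, page_b: str) -> float:
        """Rough similarity between two pages' copy, from 0.0 to 1.0."""
        return difflib.SequenceMatcher(None, page_a, page_b).ratio()

    boilerplate = "Welcome to Oceans of Pets, your home for all things pets."
    print(overlap(boilerplate, boilerplate + " New chew toys are in!"))  # close to 1.0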

Frequency of content change - Search engines don't inherently know the difference between a blog, a news publication, or a brochure-ware site that remains static. The best signal they have developed for telling a cutting-edge news site from a static site is how often new content is generated. The more often a new page is created with a robust amount of text, the more often the search engine will come back and index, and the higher the priority those new pages will get. If your site is updated often and generates new content regularly, the search engines will adjust accordingly. If your site is static, don't worry - let it be, and the age of the pages will determine where they belong in the world of rankings (mentioned later).

Anchor text has key term(s) in links - What was a solid strategy for obtaining rankings for key terms in the past is now less relevant, and even considered bad SEO. It's more about keyword "essence" and the relevance of the terms around the anchor text than the anchor text itself (as mentioned above). Some of the more experienced SEOs are even finding that linking the word "more" or "click here" helps their rankings more than putting the key term within the anchor text.


Duplicating content - As mentioned before in the "Unique Content" bullet item, duplicating content on a site, or from another site is a very bad technique.

Invisible text - Nope, don't use white text on a white background with a bunch of keywords in it that only the search engine can see. Even 1-pixel-high divs with overflow hidden set in the stylesheet are a bad thing. Not only will you not get rankings, but your site will get penalized for it.

Overall Website


Age of website - The older a domain (or website) is, the higher the priority it will get within search rankings. A typical spam strategy is to buy a new domain and optimize it as much as possible to obtain quick rankings. Because of this, search engines will tend to ignore a website until it has been around for a few weeks, sometimes even months or years. If you have an older domain, don't go thinking you should change it because it's "stale" - its age is actually a good thing.

Poor coding and design - Search engines can't tell directly what good design is, but they can infer it from the popularity of the website. Social sharing, articles, blog posts, and all of the buzz about a website will only happen when the website is easy for visitors to use and gives all of the value a user is looking for. So make sure your website is easy on the eyes, gives a clear and concise value proposition with a call to action, and is easy to navigate.

Exact Match Domain - Many spammers create websites with a descriptive key term in the domain in an attempt to get rankings. Google announced in October of 2012 that it was rolling out an algorithm update to weed out low-quality exact match domains. For example: http://www.compareinterestrates.com/ or http://www.best-interest-mortgage-rates.com/

Keyword-rich URLs and filenames - Just as the exact match domain is taking a hit in the recent updates, the keyword-rich URL and filename strategy is as well. SEOs used to put their keywords within the URL, with dashes between words, in order to obtain rankings for long tail terms.
Site Accessibility - It's not talked about often, but designing your website with accessibility in mind can be genuinely beneficial. Someone who has poor vision, is hard of hearing, or has trouble clicking links and buttons is going to struggle with most websites. If your website's audience contains users that might need some extra help, keep this in consideration. Search engines notice, and it could help you rank over competitors that haven't bothered.

Website size - Big or small, size doesn't matter. Some SEOs stress that a website needs to have millions upon millions of pages, but I have personally witnessed websites get penalized for having too many pages. Don't let this happen to your site; keep the page count down to a manageable and reasonable number. If your site is a publication with thousands or even hundreds of thousands of pages of unique content, you should be fine - just watch your Webmaster Tools notifications. Most of the websites that trigger these warnings are ecommerce websites with masses of pages for each product. If you find your site is showing this kind of error, it's best to seek out an experienced professional to help you get your pages under control and managed properly.

Domains versus subdomains - A subdomain is a subset of a main domain. Often used as a place to store images, or for other purposes, a subdomain looks something like images.mysite.com. Too often websites put the highly valuable unique content of their blog on a subdomain. Unfortunately, search engines have historically treated a main domain and a subdomain as separate entities, and in the past SEOs took advantage of this to try to get multiple rankings on one results page with multiple subdomains. Just this year (2012) Matt Cutts announced that Google no longer treats them as fully separate for ranking purposes, but rather as an extension of the main domain. The upshot is that a subdomain won't earn those extra rankings, yet its content still isn't counted as part of the main domain. When setting up a blog, or any section of your website, it's best to simply add a new directory (e.g., www.mysite.com/blog) so that any content within that directory supports the domain as a whole.

Hyphens in URLs - When creating URLs for your website, it's still considered best practice to separate each word with a hyphen rather than a space or an underscore. For example, if you write a blog post or article titled "The ten best puppies everyone should own", the URL should be "www.mysite.com/the-ten-best-puppies-everyone-should-own.html" - or, to avoid getting pegged for keyword-rich URLs and to keep a set hierarchy, "www.mysite.com/puppies/ten-best.html".
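Most blogging platforms generate these slugs for you, but if you're building URLs yourself, it's a few lines of Python. A minimal sketch (the slugify name is mine, not any particular CMS's):

    import re

    def slugify(title: str) -> str:
        """Lowercase the title, drop punctuation, and hyphenate the words."""
        words = re.findall(r"[a-z0-9]+", title.lower())
        return "-".join(words)

    print(slugify("The ten best puppies everyone should own"))
    # -> the-ten-best-puppies-everyone-should-own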

URL length - A URL that is too long is a red flag for a keyword-rich URL. Try to keep your URLs simple, and keep that site hierarchy.

IP address - The IP address is the unique identifying number (like a phone number) of the server that hosts your website. If you are targeting a local audience, or perhaps just focusing on one country, be aware of where your website is hosted. A website that targets users searching in Canada but is hosted in the U.S. will have an IP that resides within the U.S. In this case, search engines will tend to rank the site for U.S. searchers rather than Canadian ones. If you aren't worried about focusing your SEO by location, then don't worry about your IP.

robots.txt - The robots.txt file is a very simple text file (editable in Notepad) that resides at the root of your server. The main case in which you need a robots.txt is when you want to block certain sections of your website. Some search engines will also let you list links to your XML sitemap in it for better indexing. For more information on setting up your robots.txt you can visit robotstxt.org.
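As an example, if your robots.txt blocked a /search section the way mine did, you can verify what a well-behaved crawler is allowed to fetch with Python's standard robotparser module (the domain here is hypothetical):

    from urllib import robotparser

    # Suppose http://www.mysite.com/robots.txt contains:
    #   User-agent: *
    #   Disallow: /search
    #   Sitemap: http://www.mysite.com/sitemap.xml

    rp = robotparser.RobotFileParser()
    rp.set_url("http://www.mysite.com/robots.txt")
    rp.read()  # fetches and parses the live file

    print(rp.can_fetch("Googlebot", "http://www.mysite.com/search?q=puppies"))       # False
    print(rp.can_fetch("Googlebot", "http://www.mysite.com/puppies/ten-best.html"))  # True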

XML Sitemap - Sitemaps are an easy way to let search engines know about all of the pages within your website that you would like to see indexed.
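The format itself is tiny. Here's a minimal sketch that writes a bare-bones sitemap.xml for a couple of hypothetical URLs using only the Python standard library:

    import xml.etree.ElementTree as ET

    urls = [
        "http://www.mysite.com/",
        "http://www.mysite.com/puppies/ten-best.html",
    ]

    # The urlset/url/loc structure is the core of the sitemaps.org protocol.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = page

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)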

Redirects (301 and 302) or Status Codes - 404, 301, 302... each of these numbers has a different meaning to a search engine. The most common is the 404, or "page not found": it basically means that the URL existed and now it doesn't. In the SEO world, the 301 is another code that is mentioned often. A 301 lets the search engine know that a URL has been moved, so we redirect the old URL to the new URL. My favorite explanation of these codes is from a dear friend of mine, Lindsay Wassell, at SEOmoz, in which she uses pictures to explain the different codes and what they mean.
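You don't need SEO tooling to see these codes for yourself. A quick Python sketch (the URL is hypothetical) that disables redirect-following so a 301 or 302 surfaces as-is instead of being silently followed:

    import urllib.error
    import urllib.request

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # don't follow; let the raw 301/302 surface

    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open("http://www.mysite.com/old-page")
        print(resp.getcode())  # e.g. 200 for a page that's simply there
    except urllib.error.HTTPError as err:
        # 301s and 302s land here with their Location header; 404s land here too
        print(err.code, err.headers.get("Location"))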

Some basic SEO buzzwords


Long Tail - A long tail term is what most SEOs are referring to when talking about a term of three to five or more words. When a user looking to buy a computer begins their search with the word "computers", they will often get more specific as they search, refining to something like "500 GB laptop computer". That is a long tail key term - the more specifically you can target your audience, the more likely they will be to convert when they find what they are looking for.

Indexed - Indexing is the term SEOs use when a search engine has crawled a website and its pages and started to display them within the search results. This doesn't affect rankings; it merely means that a page is within the database and recognized by the search engine. A quick and easy way to see if your website is indexed is to search with site: before your domain. For example: search for "site:oceansofpets.com".

SERP - Simply the "search engine results page", and it rolls off the tongue of SEOs quite often. Pronounced just as it looks (serp), the search engine results page is the page the user sees after completing a search.

Snippet - A search snippet is what SEO's use to describe the title and description a search engine displays on the search results page.


I think that should just about do it to get you started. With SEO there is no standard way of doing things. There is no true right and no true wrong; there is only what we try, fail or succeed at, and try again.

Please feel free to add anything I might have missed in the comments below. I'm hoping this will become a pretty comprehensive list that newbie SEO's can get started with.