
Thursday, January 10, 2008

Remove URL from the Google Index

There has been repeated discussion about how and why you should remove a URL (whether it's a whole website, a directory, or just one file) from the Google index.
The most common issue I have run into is when two or more domains resolve to the same website, resulting in more than one domain serving the exact same pages and content in the Google index. I discuss this in one of my earlier posts, "Problems with Multiple Domains".
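One common fix for the duplicate-domain problem (a sketch on my part, not something from the earlier post) is a sitewide 301 redirect so that every extra domain permanently points at the one canonical site. On an Apache server that might look like this in an .htaccess file; the domain names are placeholders:

```apache
# Hypothetical example: permanently redirect all requests on the
# duplicate domain to the canonical domain, preserving the path,
# so only one version of the site ends up in the index.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?duplicate-example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Because the redirect is a 301 (permanent), search engines should consolidate the duplicate URLs onto the canonical domain over time.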

Matt Cutts has posted a video covering more scenarios and going further in depth on the best practices for removing URLs from the Google index.
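For a single page, one standard removal method (my illustration, not taken from the video) is a robots meta tag in the page's head asking search engines not to index it:

```html
<!-- Hypothetical example: keep this one page out of the index.
     The tag goes inside the page's <head>; the crawler must be able
     to fetch the page (i.e., it must not be blocked in robots.txt)
     in order to see this directive. -->
<meta name="robots" content="noindex">
```

For whole directories, a robots.txt Disallow rule combined with the removal request in Webmaster Tools covers the same ground at a larger scale.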

Sunday, December 2, 2007

Ha! to all you Linking black hatters...

You know I have been talking about this for several years now. When I started as a full-time SEO technician several years back, the company I worked for (Visible Technologies) used linking as its main strategy for rankings. While they obtained rankings quickly, the search engines slowly started catching up with all the websites that used services like LinkMarket and LinkWorth to increase their rankings. The funny thing is that now I get to tell them "I told you so".

Does this mean that Google doesn't approve of linking? By all means, no. Google supports linking in every way; in fact, they encourage website owners to obtain external links to help their rankings. What they don't approve of is purchasing links to increase rankings.

The point of rankings within the search engine results pages (SERPs) is to help users find what they are looking for efficiently. If a website has next to no relevant content and ranks only because of a load of external links, is that helping the user find what they need?

No

If the website has content relevant to what the user is looking for, then chances are the user isn't only going to find what they need; they're going to be so excited about the site that they'll either blog about it or tell others about it by linking to it.

This was the whole basis of Google's algorithms from the beginning. The problem is that SEOs have been using black hat techniques to increase rankings quickly (paid linking, doorway pages, duplicate content, etc.), leaving Google and the other search engines to adjust their algorithms in order to keep returning the most relevant results.

When optimizing a website, always be sure to provide your users with relevant content and landing pages that reflect what each user would be looking for. For example, someone looking to start dating online would want a website that offers advice for online daters. Thus, when a user types in "Online Dating Advice", they should find a website and a webpage that reflect online dating advice.

Google talks more about this, and Matt also adds a bit about how Google made some algorithmic changes recently that resulted in a lot of websites losing PageRank and search results - Purchasing Links is BAD -

Read it and memorize it well...

Always remember that if you have to adjust because your site lost rankings, then you aren't optimizing correctly.

Tuesday, November 27, 2007

Google snippets

Matt Cutts visited us here in the Seattle area by heading to the Google Kirkland office. They decided to make a few videos of Matt while he was there and post them to the Google Webmaster Blog - Anatomy of a Search Result. Matt's video explains snippets (the title and description that show up in your search results). He used Starbucks as an example. The Starbucks site is a good one since it has limited content on the page but uses meta descriptions in its code.



Matt mentioned in the video that we have no control over the sitelinks presented underneath the snippet, which isn't entirely true. While we are unable to pay for extra sitelinks within the snippet, we are able to control them through Webmaster Tools. Under "Links" and then "Sitelinks" within Webmaster Tools, if Google has generated sitelinks for your website, you are able to manage them from there.


Always remember how the snippet works:


  1. If the search term is within the meta description of the web page then the description will appear in the snippet (with the term highlighted in bold).

  2. If the term isn't in the meta description or there isn't a meta description then the words around the content within the body of the page will be displayed (with the term highlighted in bold).

  3. If the page does not have content and does not have a meta description, then the description is generally pulled from the Open Directory Project (DMOZ).

  4. If there isn't any content on the page, no description in the meta tags, and the site hasn't been submitted to the Open Directory Project, then the description will be blank.
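To make the first rule concrete, here is a hypothetical page head showing the kind of meta description that would be used as the snippet when the search term ("online dating advice") appears in it; the title and wording are purely illustrative:

```html
<!-- Hypothetical example for rule 1: the search term appears in the
     meta description, so that description is what typically shows up
     as the snippet, with the term bolded. -->
<head>
  <title>Online Dating Advice</title>
  <meta name="description"
        content="Practical online dating advice for beginners:
                 writing a profile, staying safe, and planning
                 a first date.">
</head>
```

If the term is missing from that description, rules 2 through 4 take over, falling back to on-page content, then DMOZ, then nothing.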

A great example of the fourth scenario is a client I have been working with who came to me with a website created entirely in graphics. The website was a great design and had a lot of great information on it, but the text was embedded in images and placed on the site that way. The website wasn't even ranking for its own name. The only way to pull it up in the results was to do a site: search, which showed that the pages were getting indexed but the content wasn't getting recognized (because of the images).


I am currently in the process of pulling the text out of the images and recoding the HTML to keep the design the same while making the text recognizable to crawlers. I also added a site map for more efficient indexing, along with landing pages for the terms the client wanted to rank for. An XML sitemap has been submitted to Webmaster Tools, and a Google Analytics account is set up for tracking conversions.
The client should start to see better results soon after we launch the new site.
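The image-to-text recoding described above can be sketched as a simple before/after; the filenames and copy here are placeholders, not the client's actual markup:

```html
<!-- Hypothetical before/after: moving text out of graphics so
     search engines can read it. -->

<!-- Before: the heading exists only as pixels in an image,
     invisible to crawlers -->
<img src="heading.gif">

<!-- After: real text (which can still be styled with CSS to match
     the design), plus descriptive alt text on any image that
     remains genuinely pictorial -->
<h1>Online Dating Advice</h1>
<img src="couple-photo.jpg" alt="Couple meeting for a first date">
```

The design stays intact via CSS, while the words themselves become indexable content rather than decoration.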