Sunday, December 2, 2007
Does this mean that Google doesn't approve of linking? Absolutely not. Google supports linking in every way; in fact, they encourage website owners to obtain external links in order to help their rankings. What they don't approve of is the purchasing of links in order to increase rankings.
The point of rankings within the search engine results pages (SERPs) is to help the user find what they are looking for efficiently. If a website has next to no relevant content and ranks only because of a load of external links, is that helping the user find what they need?
If the website's content is relevant to what the user is looking for, then chances are the user isn't only going to find what they need; they are going to be so excited about the site that they will either blog about it or tell others about it by adding a link to it.
This was the whole basis of Google's algorithms from the beginning. The problem is that SEOs have been using black hat techniques to increase rankings quickly (paid linking, doorway pages, duplicate content, etc.), leaving Google and the other search engines having to adjust their algorithms in order to return the most relevant results.
When optimizing a website, always be sure to provide your user with relevant content, and landing pages that reflect what each user would be looking for. For example, if someone is looking to start dating online, they would want a website that offers advice for those wanting to date online. Thus, when a user types in "Online Dating Advice," they will find a website and a webpage that reflect online dating advice.
Google talks more about this, and Matt also adds a bit about how Google made some algorithmic changes recently that resulted in a lot of websites losing PageRank and search rankings - Purchasing Links is BAD -
Read it and memorize it well...
Always remember that if you have to adjust because your site lost rankings, then you aren't optimizing correctly.
Tuesday, November 27, 2007
Matt Cutts visited us here in Seattle by heading to the Google Kirkland office. They decided to make a few videos of Matt while he was there and post them to the Google Webmaster Blog - Anatomy of a Search Result. Matt's video explains the snippet (the title and description that show up in your search results). He used Starbucks as an example. The Starbucks site is a good one since it has limited content on the page but uses meta descriptions in its code.
Matt mentioned in the video that we have no control over the sitelinks presented underneath the snippet, which isn't entirely true. While we are unable to pay for the extra sitelinks within the snippet, we are able to control them through Webmaster Tools. Under "Links" and then "Sitelinks" within Webmaster Tools, if Google has generated sitelinks for your website, you are able to manage them from there.
Always remember how the snippet works:
- If the search term is within the meta description of the web page then the description will appear in the snippet (with the term highlighted in bold).
- If the term isn't in the meta description or there isn't a meta description then the words around the content within the body of the page will be displayed (with the term highlighted in bold).
- If the page does not have content and does not have a meta description, then the description is generally pulled from the Open Directory Project.
- If there isn't any content on the page, no description in the meta tags, and the site hasn't been submitted to the Open Directory Project, then the description will be blank.
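The four rules above boil down to a simple fallback chain. Here is a minimal sketch of that logic in Python; the function name, the 40-character context window, and the way the Open Directory description is passed in are all my own assumptions for illustration, not anything Google has published:

```python
def choose_snippet(term, meta_description, body_text, odp_description):
    """Hypothetical sketch of the snippet fallback rules described above."""
    # Rule 1: the term appears in the meta description -> use the meta description
    if meta_description and term.lower() in meta_description.lower():
        return meta_description
    # Rule 2: otherwise pull the words around the term from the body content
    if body_text:
        idx = body_text.lower().find(term.lower())
        if idx != -1:
            start = max(0, idx - 40)          # assumed context window
            end = idx + len(term) + 40
            return "..." + body_text[start:end] + "..."
        return body_text[:80]
    # Rule 3: no usable content or meta description -> Open Directory Project
    if odp_description:
        return odp_description
    # Rule 4: nothing available at all -> blank snippet
    return ""
```

The takeaway for site owners is the same either way: if you want control over your snippet, make sure the page has a meta description that actually contains the terms you are targeting.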
A great example of the fourth scenario is a client I have been working with who came to me with a website created entirely in graphics. The website was a great design and had a lot of great information on it, but the text was put in images and placed on the site that way. The website wasn't even ranking for its own name. The only way to pull it up in the results was to do a site: search, which showed that the pages were getting indexed but the content wasn't getting recognized (because of the images).
I am currently in the process of pulling the text out of the images and recoding the HTML to keep the design the same while making the text recognizable again. I also added an HTML site map for more efficient indexing, and landing pages for the terms the client wanted to rank for. There is an XML sitemap submitted to Webmaster Tools, and a Google Analytics account set up for tracking conversions.
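For anyone who hasn't submitted one before, an XML sitemap is just a plain file listing the URLs you want crawled. A minimal example follows the sitemaps.org protocol; the domain and pages here are made up for illustration, not my client's actual URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-11-27</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/landing-page.html</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```

Once the file is live on the site, you point Webmaster Tools at its URL and Google will report back any errors it finds while fetching it.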
The client should start to see better results soon after we get the new site launched.
Tuesday, November 20, 2007
I hate to say this, but the fact that this page is ranking is wrong on so many levels.
From a usability standpoint - most people searching for the Classmates address (such as myself) will see that Google has it on the map, so seeing this page in the results is very confusing. How many of these pages are ranking? I have come across the final form page on several occasions while searching for schools or other such affiliations (researching schools for the kids). The problem is that a user coming to this page for any reason other than to sign up for classmates.com is very confused as to what they should do. This page is meant to be the last in a five-page registration process in which a user starts from www.classmates.com and then flows through to the school they attended when they graduated.
I had mentioned this while I was the SEO Manager at classmates.com, but they chose to keep the pages as is and decided not to make the fix that would send users to the actual landing pages we had designed and launched in December 2006. The pages were designed to recognize who would be landing on them, give the user a clear understanding of why they were there, and what we would like them to do once there (the target audience, value proposition, and call to action). Those pages went from a 10% conversion rate (and that's being nice) to a 50% and often 60% conversion rate.
It's a shame that the registration form page is ranking higher than the intended landing page.
The second reason it's a bad thing to see this page listed is that the page is for the test high school that the QA department uses to ensure there are no bugs in the registration process.
If it were me, I would remove it from the index through Webmaster Tools, and then add it to the robots.txt to ensure that it wouldn't ever rank again.
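The robots.txt side of that is a one-line rule. The path below is purely hypothetical (I don't know the actual URL of the QA test page), but the shape of the fix would look like this, with the URL removal request going through Webmaster Tools first:

```
# robots.txt on www.classmates.com
# Hypothetical path for the QA test high school page
User-agent: *
Disallow: /register/qa-test-high-school
```

One caveat: robots.txt only stops the page from being recrawled, which is why the removal through Webmaster Tools needs to happen first; a blocked-but-already-indexed URL can otherwise linger in the results.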
Maybe even suggest that we not let it go to prod, and keep it on the QA servers just for testing.