
Thursday, October 24, 2024

SEO

Published in 2014 and updated in 2024

That's right, I am an SEO. What is SEO, and what does it mean to be an SEO? Search Engine Optimization is the method of enhancing a website's design and content to make it more visible in search engine results for relevant searches, allowing users to find it easily when searching on platforms like Google and Bing. Being a Search Engine Optimizer means optimizing websites so that they appear on search engines for certain terms. At the base of it all, we optimize for terms (keywords people use in search engines to find what they need) focused on what your key audience might be searching for.

Some Stats About SEO:
  1. Mobile Dominance: Mobile devices account for 59% of all global web traffic and generate 58% of all Google searches. This highlights the increasing importance of mobile optimization for SEO. (Source: All In One SEO)

  2. Local Searches: Nearly half (46%) of all Google searches have local intent, indicating the significance of local SEO for businesses looking to attract nearby customers. (Source: All In One SEO)

  3. Search Result Clicks: The first featured snippet on a search engine results page (SERP) captures 42.9% of clicks, showcasing the value of optimizing content to appear in these prominent positions. (Source: All In One SEO)

  4. SEO and Organic Traffic: Including videos in search results can drive 157% more organic traffic, underscoring the growing impact of video content in SEO strategies. (Source: All In One SEO)

  5. User Engagement: A staggering 72.6% of internet users are expected to access the web exclusively via their smartphones by 2025, emphasizing the need for mobile-first SEO strategies. (Source: All In One SEO)

  6. Conversion Rates: SEO typically delivers a higher conversion rate (2.4%) compared to pay-per-click (PPC) advertising (1.3%), making it a more cost-effective strategy for driving business results. (Source: WPBeginner)

  7. Speed and User Experience: Sites that load quickly and offer a good user experience are prioritized by search engines. Only 33% of websites meet the optimal thresholds for Core Web Vitals, which are critical for high SERP rankings. (Source: Ahrefs)

I primarily work as an in-house SEO, which means I work inside large companies rather than at an agency or as a consultant. In the past, I have helped some companies as an SEO consultant, but if it takes time away from my job and career, I will recommend someone else to consult. Most of my SEO consulting consists of reviewing the website and any issues the company is finding. I review the analytics, looking at SEO traffic as well as traffic from other sources, and I look at Google's Webmaster Tools data to gauge how the site's current SEO is doing and how impressions for key terms compare to clicks. Sometimes, a simple change to the meta title and description can increase the click-through rate from SEO, therefore increasing traffic. In other cases, a complete restructuring of the site and basic SEO implementation is needed to increase rankings.

After I review the site, I come up with a list of recommendations, along with how much effort each should take and a measure of its impact. The report also includes potential and current SEO traffic so that the client can see the biggest gaps. At times, the SEO reporting I send over can be pretty technical, but rest assured I spend time making sure all the data is easy to understand and that a clear direction is not only explained but detailed in the final recommendations. From there, it is up to the client to decide whether to do the work themselves, have their current employees do it, hire an agency or another consultant, or have me do the work. Since I have a background in design and development, any work needed for SEO, or simply to increase conversion rates from SEO traffic, is fairly easy for me to do and can happen quickly. It all depends on how much I have on my plate from my full-time job when the work needs to get done.
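A first pass at this kind of review can even be scripted. The sketch below uses only Python's standard library and a made-up page as input; it pulls out the title tag and meta description so you can check them against the basics covered in this post:

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect the <title> text and meta description from a page's HTML."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# A made-up page for illustration
page = """<html><head>
<title>Used Cars in Seattle, WA | Example Dealer</title>
<meta name="description" content="Browse used cars in Seattle and compare prices from local dealers.">
</head><body></body></html>"""

audit = MetaAudit()
audit.feed(page)
print(audit.title)
print(len(audit.description) <= 150)  # within the ~150-character guideline
```

From there, feeding in a real page fetched with urllib gives a quick audit of any URL you're curious about.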

If you're not sure you want me or someone else to optimize your site for SEO, it's no problem. Most people can pick up on the basics of SEO themselves. I always like to see clients having some understanding of SEO before I work with them. If they don't have time to learn, that's perfectly acceptable, as I can explain how things work in ways most people understand and pick up quickly. The following is a checklist I have come up with for SEO that will help anyone understand and get started in SEO quickly and easily. Of course, there are so many algorithms that Google and other search engines use to determine which site gets to show up for their respective terms, but this at least gets you on your way to understanding the basics of SEO.
  1. Keywords – you can't do anything with SEO until you know which keywords you are optimizing for. Once you have your basic list, structuring your site and any work you do around those keywords will all fall into place. I usually recommend one or two broad terms that describe the website. These terms should be only one word and very rarely more than two. From there, a few two- to three-word terms that might describe a sub-category will help you structure your plan and organize for SEO. Your longtail terms (as SEOs put it) or exact match terms (as paid search people call them) are the phrases that are more specific. These phrases are the biggest payoff for SEO, since they represent the terms users type when they really know what they want and are ready to buy. Therefore, they tend to convert faster and at higher rates. I talk more about this in depth on my site, masteroptimization.com.

    Keywords should be in the following items for SEO:
    • Keywords in title tag – The title tag shows up at the top of the browser. It is also what search engines use for the title in the “snippet” that displays in the results after a search has been completed. Having your keyword in the title tag not only helps SEO but will aid in the click, as the user will recognize the word they searched for within your title, encouraging them to click your result over the others on the page.
    • Keywords in URL – Getting the keywords in the URL is very important for SEO. Start with the broad terms in the domain if possible. If not, then put the category terms (the 2-3 word terms mentioned before) in a sub-directory, and use the exact match longtail terms as the name (or in the name) of the file. Your URL hierarchy is very important for SEO; having those keywords in there is even more so.
    • Keyword density in document text – Listing out your keywords over and over again in a short paragraph will harm your SEO more than do any good. A good way to explain how to watch your densities is to look at a page that has 3 paragraphs, each having about 150 words. Let's say you need to mention your keyword 9 times in order to get rankings. If you mention your keyword 9 times in your first paragraph and then not in the others, that's bad. The trick is to distribute your keyword evenly among the three: mention the keyword 3 times in each paragraph, spread evenly throughout each one.
    • Keywords in anchor text – The anchor text is the text that a user clicks on within a page's content to get to another page. The text that links back to your website should include the main broad keyword that describes the site. The trick is to make sure that the linking page, and the site it belongs to, are relevant to the word in the anchor text. If the site linking to your site isn't relevant, that will actually get your site in trouble, and too many irrelevant links will cause you to lose rankings.
    • Keywords in alt tags - The alt tag holds the alternative text that displays in the case that an image doesn't load. It's a simple attribute that goes into the HTML that renders the image. For SEO purposes, having the keyword in the alt tag is important and will actually help rankings. Be sure to stick with only the words relevant to that page, and don't list all the keywords separated by commas. That will get a site in trouble.
    • Keywords in meta tags – Be sure to include your keyword in your SEO meta tags, which are the description, title, and keyword tags that reside in the background of the HTML.
  2. Metatags—Meta tags are (as explained above) the lines of code within HTML that describe your page for SEO. This includes the title, description, and keyword tag. Beginning in late 2020, the meta tags have become less important for optimizing SEO. However, some SEOs still put an emphasis on these tags. 
    • Meta description tag—The description tag should be no more than 150 characters and include your keyword(s). Try to describe the page as much as possible for SEO, keeping in mind that the user will usually see this in the search results.
    • Meta keyword tag – Some SEOs will say that keyword meta tags don't make a difference. Google doesn't pay attention to them, but meta-driven search engines will, and there are thousands of other search engines besides Google. So, for SEO purposes, and to help keep the focus of the site's page, I recommend listing out the keywords in the keyword tag with the broad terms first, then the category terms, and then the longtail. You never know; it might actually help SEO.
    • Meta language—If the site is in English, adding the language meta tag will help search engines know which language to display the site in. If you have other languages, try to make sure the language is in the meta tag. In some cases, it can really benefit SEO.
  3. Links—Internal—Internal links to other pages of the site related to the page you are optimizing can be quite important for SEO, almost as much (if not more) than external links coming in.
  4. Anchor text has key term(s) in links – As much as the links pointing to other pages are important, the keyword in the anchor text is also important. I cannot stress enough how important it is for SEO to have those internal links and the keywords in the anchor text.
  5. Content Around the Anchor text is relevant—If a section of pages is relevant to other pages, cross-linking with a paragraph mentioning the page before and after the link is very helpful for SEO.
  6. Content - content, content, and more content is the key to optimizing a site for SEO. Pages don't have to have large chunks of paragraphs but can have words here and there throughout the page. Too many SEOs will put big blocks of content on the homepage of a website, thinking that it will benefit it. Sure, it helps for SEO, but it looks horrible, and users don't fall for it. A paragraph of 10 words at the top describing the site, and then perhaps another block of text highlighting the value proposition of the product or service in blocks around the page, are just fine. The trick is to search the term you are trying to rank for, look at the first few pages or sites ranking, and then look at how many words they have on their page and how many times they mention the keywords. Then, simply do a little more. Once you have that content in place for SEO, you're on your way to rankings.
  7. Unique Content - is key to ensuring your SEO is in place. Not only do you need to watch out for other sites with the same content, but look at other pages of your own site. If a block of content is repeated on more than one page, that content just won't be counted towards SEO. If a page has less than 15% unique content, it will even work against SEO and penalize the site. Sites that use tracking tags and parameters, or that have issues with validating URLs, can often run into duplicate content and really harm the work they have done for SEO without even realizing it.
  8. Frequency of content change - A site that is recognized as a publication and pushes content several days a week (or even several times a day) will train search engines to visit and see updates regularly. In this case, SEO will work to their benefit, with fresh content getting recognized and ranked quickly.
  9. Age of document - If a site is brochureware that doesn't update content often, search engines will visit less but give more value to the pages the longer they stick around. Pages that are years old will rank better than new ones added. So, keep this in mind for your SEO and your site. Are you a publication that pushes out content frequently and needs to get rankings fast, or are you a site that holds true with valuable content that gets better with age?
  10. File size - A page that takes a long time to load or is extraordinarily large can be quite detrimental to SEO. So be wary of how big that file you are creating is.
  11. Content separation - As mentioned before, breaking up your content throughout your page is more beneficial to SEO than blocking out whole paragraphs. If the site is a known publication or the section is a blog or article section of a site, then whole blocks of content are perfectly acceptable for SEO. But pages that go up and stick around for a while with the purpose of providing marketing information should have content broken up throughout for SEO.
  12. Poor coding and design - This one gets overlooked a lot. Search engines can't determine good design from poor design, but your users sure can. If a user comes to your site from Google and then immediately bounces, Google will mark the value of that page for SEO down, affecting your rankings. So pay attention to design, look at your bounce rate data in Google Analytics, and improve it as much as possible.
  13. Duplicating Content = NO - DO NOT DUPLICATE CONTENT… As mentioned earlier, this is very bad for SEO. If one page has more than 80% duplicate content to any other site, it can harm your SEO. So be sure that the content on every page of the site has more than 80% unique content.
  14. Invisible text = NO - Invisible text is content a site has hidden from users but allows the search engines to see. A div that is only one pixel high with the attribute to hide overflow or white text on a white background (both allowing search engines to see it in the code) is a huge no-no for SEO and can actually get your site penalized. So don't do it!
  15. Domains & URLs - Check your domain and URLs often. Look for your keywords, check to make sure the hierarchy is clear and set properly for SEO, and make sure that there are no funky issues like parameters that can be freely changed (by typing anything in) or redirects to some odd URL. Check your trailing slash and file extensions as well to make sure a wrong URL either 404s or resolves to the correct one.
  16. Keyword-rich URLs and filenames - Watch for keywords in URLs and filenames. Long URLs that mention more than one keyword will cause issues, so always check and double-check the URL for SEO before going live.
  17. Site Accessibility - Make sure your site is accessible to those with disabilities, whether they are sight-impaired or hard of hearing. If you have video, making it accessible can actually help your SEO.
  18. Sitemap – creating a page that links to all of your pages can ensure that all of your pages are getting crawled for SEO. Many times I have seen websites with pages that they don't link to, and the owners wonder why those pages aren't getting rankings. If search engines can't crawl a page, then they don't know how to rank it for SEO. You can also create an .xml file for Google, Bing, and Yahoo! site submission. But do remember that an XML sitemap alone just won't cut it; you have to have links pointing to pages from multiple locations. Otherwise, it just won't do any good for SEO.
  19. Website size - Keep an eye on the size of your website. Large corporate sites like Amazon.com and MSN.com are expected to have thousands, if not millions, of pages. If your site is a small to medium-sized company website, yet the search engines somehow crawl millions of pages, then you need to relook at your SEO. Check your parameters or other issues that might be causing more pages than your site should have.
  20. Website/Domain age - The older the website, the better. A brand new site that is loaded with pages and pages of content all in one day will get added to a sandbox, as we SEOs call it. It will sit there for a few months before the search engines even give it the time of day. This is because search engines want to ensure the site is legitimate and not just a spam site there to get rankings. Having an older domain is key to keeping your site from falling under this category. If you have a new domain, then roll out your pages slowly. Push a section one week, wait a few weeks, and push out another section. Having a blog is also good for SEO as you can add posts with content encouraging search engines to keep coming back regularly and learn that this site has something interesting and unique. Of course, the more traffic you can get in those first few months, the better, so get your social media and advertising up and going.
  21. File Location on Site - This falls under the URL hierarchy category. Watch out for where pages and files are located on the site. The structure and location should make sense for SEO and for your use.
  22. Domains versus subdomains, separate domains - Watch out for the use of sub-domains for your site. Too many websites will put their blog on a sub-domain and not in a directory. This won't hurt your SEO, but it won't help either. What happens is that the search engines count the subdomain as its very own website and don't link the content with the rest of the site. It is more beneficial for SEO to have all of your content, no matter what it is, on your main domain in a directory. Keep it out of the sub-domain unless absolutely necessary.
  23. Top-level domains (TLDs) - The top-level domain is the last portion of your domain name, such as .com or .org. Even in the case of www.yoursite.com, the “www” is considered a sub-domain. Yes, a sub-domain… So try to use http://yoursite.com if you can. If the search engines already recognize your www.domain.com, leave it alone and let Google know that you prefer the www version vs. just the bare domain. You can do this in your Webmaster Tools.
  24. Hyphens in URLs - For SEO, it is recommended that you use “-“ in your URL rather than “_” or even just a space (which ends up rendering to %20). Search engines just happen to prefer the hyphen to underscore or space.
  25. URL length - For SEO purposes, try to keep your URL under 2000 characters, but the shorter, the better. Pay attention not to have more than 3-4 parameters or a URL with a really long sentence.
  26. IP address - Your IP address should reside in the country your website is ranking in. A US English site should have an IP located in the US; a French-Canadian site should have an IP in Canada.
  27. robots.txt – Blocking irrelevant content in the robots.txt will really make a difference for your SEO. In the past it was recommended to block external CSS and image directories, but now Google has said they would like to crawl them. Search engines are getting more and more sophisticated at deciphering all of the code and getting a good idea of what the whole website is about. So only block pages and content you really don't want search engines to crawl.
  28. Redirects (301 and 302) - For SEO, redirecting an old URL to a new URL will usually pass the old URL's value to the new URL. But be careful to use 301 redirects sparingly. I have witnessed and dealt with sites that had too many 301 redirects, causing rankings to drop.
  29. Social Actions - Social actions like Facebook likes, tweets, shares, Google +1s, and so on will add much value for SEO. Any time a user has to take action to show that they see the value in the page, it will show the search engines that the page is relevant and valuable, therefore increasing your rankings for SEO.
    • Facebook 'Like' or 'Recommend'—The action of ‘Liking' a page on Facebook will sometimes help with Google, but it really helps with Bing more than anything. Microsoft and Facebook have a very close relationship, allowing Bing to use social actions that happen on Facebook to help drive site rankings.
    • Facebook comments – If you can, try to pull Facebook comments related to your site and the page into the page itself. This not only allows for more unique content but also shows Bing and other search engines that the content on the page is valuable to the user, therefore driving up your SEO.
    • X "tweet" - A simple tweet with your page's URL will always be counted as a “vote” for your page and website. The more you can get, the better for SEO.
    • OGP (Open Graph Protocol) – OGP was developed and adopted by Facebook as a way to manage how a page or website looks when shared on social channels. Twitter and other social sites have followed suit, and my prediction is that Google will start to pay attention to OGP soon. So be sure to spend the time and ensure your basic OGP tags are set for all your pages. It could really help your SEO.
  30. Links – External - Links pointing to your site are important. As mentioned earlier, tread very carefully with your link building. Ensure that the page(s) links to your site and pages are relevant to your site. Do not use directory submission websites, don't buy links, and be wary of link exchange requests. Just as external links can benefit SEO, they can also harm you if not done properly. Keep the following in mind for your SEO:
    • Quality of source of inbound links
    • Links from similar sites
    • Links from .edu and .gov sites
    • Age of inbound links
    • Links from directories
    • Links from Social Media
    • Links on pages that include social actions
  31. Schema – Google places a high emphasis on schema tags and information. In the past, they said that if you can get it in there, then that's great. Now, they look at schema information to help drive rankings for SEO. You can also manage what is displayed in your snippet, from star reviews, author information, embedded video, etc.
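The keyword-density advice in the checklist above (spread mentions evenly rather than stuffing one paragraph) is easy to sanity-check with a few lines of Python. This is a rough sketch, not a ranking model; the sample paragraphs and the "three mentions each" target are purely illustrative:

```python
import re

def keyword_counts(paragraphs, keyword):
    """Count case-insensitive whole-word keyword mentions per paragraph."""
    pattern = re.compile(r"\b" + re.escape(keyword) + r"\b", re.IGNORECASE)
    return [len(pattern.findall(p)) for p in paragraphs]

def evenly_distributed(counts, tolerance=1):
    """True when no paragraph carries far more mentions than another."""
    return max(counts) - min(counts) <= tolerance

balanced = [
    "Our used cars are inspected before sale. Every one of our used cars comes with a report. Browse used cars by price.",
    "Financing for used cars is available. Trade-ins toward used cars are welcome. Ask about certified used cars.",
    "Find used cars near you. Compare used cars side by side. Schedule a test drive in one of our used cars.",
]
stuffed = [
    "used cars, used cars, used cars everywhere",
    "no mention of the keyword here",
    "or here",
]

print(keyword_counts(balanced, "used cars"))      # [3, 3, 3]
print(evenly_distributed(keyword_counts(balanced, "used cars")))  # True
print(evenly_distributed(keyword_counts(stuffed, "used cars")))   # False
```

The same counting function works for checking a competitor's page once you have its text extracted.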
Of course, search engines weigh a huge number of signals to determine rankings, leaving the list I gave you here a small set of what really goes into optimizing a site. In all my years optimizing websites, I try to write blog posts when I encounter issues or get into deep-level discussions with my peers on SEO topics. But again, there is so much involved, and sites are all different from one another. I have been teaching workshops since 2007 and have watched thousands of individuals try to learn SEO and optimize their own sites, only to find that they still need the help of an expert.
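On the robots.txt point in the checklist, Python's standard library can verify what a given file actually blocks before you deploy it. The file contents and URLs here are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks only what you truly don't want crawled
robots_txt = """\
User-agent: *
Disallow: /checkout/
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://www.example.com/used-cars/seattle-wa"))  # True
print(rp.can_fetch("*", "https://www.example.com/checkout/step-1"))       # False
```

Running a check like this against every template URL on your site is a cheap way to catch an accidental Disallow before it costs you rankings.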

I suggest that you learn the basics as much as possible and start optimizing your site yourself. If you have an older site that hasn't been touched in years, go through and see if there are sections and pages you can add with some unique content to add to what you already have. If your site is larger and the traffic isn't where it should be, then look at how you can restructure it to reflect the categories and longtail terms you found in your keyword analysis.
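If you are optimizing your own site, the sitemap item from the checklist is one of the easiest pieces to automate. A minimal sketch that emits a sitemap.xml from a list of URLs (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page_url in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page_url
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/used-cars/seattle-wa",
])
print(xml_out)
```

Remember the caveat from the checklist: submitting this file alone won't do much without internal links pointing at the same pages.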

If you want to see how your different categories of terms are performing, you can use this handy template I created, along with instructions on how to pull the traffic you are seeing. For some clients, I have used the template to compare the estimated traffic from the keyword analysis against the actual current traffic to show what is missing. In the competitive report for SEO, I use the top few terms in the keyword analysis to see how aggressively the category terms will need to be pursued to get rankings. The categories with the most potential, the largest gaps, and the least aggressive competition are the ones I recommend tackling first. The competitive report will also help determine what needs to be done to generate rankings. Is it just one page with plenty of content and the keyword mentioned several times, or is it a whole directory of files with filenames that include a myriad of terms for SEO, all linking to one another?
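For those who prefer code to spreadsheets, the gap analysis in that template boils down to a subtraction and a sort. All the category names and traffic numbers below are hypothetical:

```python
def traffic_gaps(estimated, actual):
    """Rank keyword categories by the gap between potential and current SEO traffic."""
    gaps = {cat: estimated[cat] - actual.get(cat, 0) for cat in estimated}
    return sorted(gaps.items(), key=lambda item: item[1], reverse=True)

# Hypothetical monthly visits: keyword-analysis estimates vs. analytics data
estimated = {"used cars by city": 40000, "make/model pages": 25000, "deals": 8000}
actual = {"used cars by city": 21000, "make/model pages": 2000, "deals": 7500}

for category, gap in traffic_gaps(estimated, actual):
    print(category, gap)
```

The categories at the top of the output are the ones with the biggest untapped potential, which is where I'd look first.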

For usedcars.com, the location pages, where we generated rankings for “used cars in” city-and-state searches, were fairly easy for SEO. The content has a few lines of text seeded with the city and state from the database (also known as templatized content). Content for the page also came from inventory (car listings) provided by the database, with a block of normal listings in that city and a block of deals. For the deals, a back-end calculation looks up each car's VIN and price against the Kelley Blue Book value and returns the percentage difference, surfacing cars that are priced under value and are a good deal. Users love those listings. There is also a large map showing dealerships in the usedcars.com system located in that area. The map is generated from Google and helps those pages get rankings for that location.
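A toy version of that templatized content makes the idea concrete. The template text here is invented for illustration, not the actual usedcars.com copy:

```python
# One template, seeded per page with city/state values that would
# normally come from the listings database.
TEMPLATE = (
    "Browse used cars in {city}, {state}. Compare listings from dealerships "
    "near {city}, including deals priced under their Kelley Blue Book value."
)

def location_copy(city, state):
    """Render the location-page intro text for one city/state page."""
    return TEMPLATE.format(city=city, state=state)

print(location_copy("Seattle", "WA"))
print(location_copy("Portland", "OR"))
```

One template like this, combined with database-driven listings, is what lets a site spin up thousands of location pages without writing each one by hand.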

Those pages were pretty easy to rank (after a lot of the mess was cleaned up) and have come to provide close to 50% of the site's SEO traffic.

A more complex project for usedcars.com that required more pages and more aggressive SEO is what we called the Make/Model project. The goal was to get SEO rankings for car brands and for the year/make/model searches showing up in search trends. We found that users who search a “year make model” query know exactly what they are looking for and are more likely to purchase. So, ranking for all of those year, make, and model combinations was highly valuable to the business. The problem is that all the other car sites know the same strategy and have been very aggressive in their SEO. A set of rules for syndicated and dynamic content was put in place, along with a plan to roll out pages and content in phases. When I left in May of 2014, the project was still underway, but the pages were already seeing some traction.

I'm always happy to talk SEO with anyone anytime. You can find me on Skype (as SEOGoddess) or fill out the contact form on my site here with any questions. I'm usually pretty quick to respond and can help you with any quick SEO issues or questions as you try to optimize on your own. I have even been known to look at a website when an agency is working on the SEO just to make the site owner or boss feel comfortable that their agency really knows what they are doing.

There are also many resources besides myself or this blog, and plenty of SEOs with great experience. 

Either way, SEO can be fun and you can really learn a lot quickly if you want. You can know enough to be dangerous, but if you stick with the general rule, as John Mueller of Google stresses, “don't trick the search engines,” you should mostly be just fine.

In the end, a site with increased traffic from SEO is a site that is generating a lot of money, and that's just good for business.

Thursday, August 22, 2024

The AI Search Revolution

The introduction of AI-powered search tools like Google Gemini and ChatGPT has created a seismic shift in how users interact with online information. As these tools gain traction, they are changing how people search for information and signaling the potential end of traditional Search Engine Optimization (SEO). In its place, a new concept is emerging, one that might be dubbed "Search Optimization" (SO) or "Research Optimization" (RO). This shift marks a radical departure from how we have understood search engines and optimization practices for decades.

The Statistics Behind the AI Search Revolution

The release of ChatGPT in November 2022 marked the beginning of a new era in online search. Within just five days, ChatGPT gathered a million users, and within two months, it had reached 100 million users, making it one of the fastest-growing consumer applications in history. This explosive growth underscored the public's hunger for more interactive and conversational ways to search for information. Unlike traditional search engines, which rely on keyword-based queries and provide a list of links, ChatGPT and its successors engage users in a dialogue, delivering results in a conversational format that feels more intuitive and personalized.

Despite this rapid adoption, integrating AI into search engines like Bing and Google has not been without challenges. For example, when Google rushed to release its AI chatbot, Bard, it faced a series of missteps that led to a significant loss in market value—around $100 billion. These early failures highlighted the risks associated with AI-powered search, particularly the potential for inaccuracies in the generated results. However, the momentum behind AI search has not slowed, with tech giants and start-ups alike racing to develop the most advanced and reliable tools.

Recent surveys indicate that while over a quarter of U.S. adults have tried AI-powered search tools, the majority remain cautious or skeptical about their accuracy. This hesitancy is particularly pronounced among older generations, who may be less accustomed to the conversational style of AI search engines. However, younger users, particularly those in Gen Z, increasingly turn to AI for their search needs, suggesting that AI-powered search could become the norm in the coming years.

The Implications for Traditional SEO

The rise of AI-powered search engines poses an existential threat to traditional SEO. SEO has been the cornerstone of digital marketing for years, with businesses optimizing their websites to rank highly on search engines like Google. This involved a combination of keyword optimization, backlink building, and technical improvements to ensure websites were easily discoverable by search algorithms.

However, AI search tools operate on a fundamentally different model. Instead of providing a list of links based on keyword relevance, AI search engines generate answers in real time, drawing from vast datasets and deep learning models. This shift diminishes the importance of traditional SEO practices, as AI's ability to synthesize information across multiple sources reduces the need for users to click through to individual websites.

From SEO to SO: The Birth of Search Optimization

As AI search tools continue to evolve, the focus will likely shift from traditional SEO to what could be called "Search Optimization" (SO). In this new paradigm, the goal is not only to rank highly on a search engine results page (SERP) but to ensure that your content is included in the AI's response. This requires a different approach to content creation and optimization.

Search Optimization (SO) will prioritize creating content that is not only informative but also structured so that AI models can easily interpret and use it. This might involve the use of structured data, clear and concise language, and content designed to answer specific questions comprehensively. Unlike traditional SEO, which often focuses on getting users to click through to a website, SO will be about ensuring your content is integrated into the AI's narrative.
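Structured data is the most concrete of these tactics today. Below is a minimal schema.org Article block in JSON-LD; the headline and author are taken from this blog, while which fields any given AI system actually consumes remains an assumption:

```python
import json

# A minimal schema.org "Article" description in JSON-LD -- machine-readable
# structure that search engines (and, plausibly, AI models) can parse.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "The AI Search Revolution",
    "datePublished": "2024-08-22",
    "author": {"@type": "Person", "name": "SEOGoddess"},
}

json_ld = json.dumps(article, indent=2)

# This is what would be embedded in the page's <head>:
print('<script type="application/ld+json">')
print(json_ld)
print("</script>")
```

Because JSON-LD is unambiguous in a way free-form prose is not, it is a reasonable bet for content you want machines to quote accurately.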

Research Optimization: A New Frontier

Another term that might emerge in the wake of AI search engines is Research Optimization (RO). This concept would extend beyond search engines to include how AI tools are used in research and content generation. As AI becomes more adept at synthesizing information, businesses and researchers will need to optimize their data and findings to be easily accessible and usable by AI models.

Research Optimization could involve curating datasets, publishing research in machine-readable formats, and ensuring that all content is backed by credible sources. This would be particularly important in academic and scientific fields, where the accuracy and reliability of information are paramount.

The Future of Online Search

The rise of AI-powered search engines like Gemini and SearchGPT is likely to have profound implications for the future of online search. As these tools become more sophisticated, they will change how users interact with information and how businesses optimize their content.

While traditional SEO may not disappear entirely, its importance will likely diminish as AI search engines become the primary means of information discovery. In its place, new forms of optimization—Search Optimization and Research Optimization—will emerge, focusing on ensuring that content is accessible, accurate, and useful to AI models.

For businesses, adapting to this new landscape will be crucial. Those who can effectively optimize their content for AI-powered search engines will have a significant advantage in capturing traffic and engaging users. However, this will require a shift in mindset from focusing on keyword rankings to ensuring that content is structured and credible enough to be used by AI.

In the end, the rise of AI search tools represents both a challenge and an opportunity. While they may disrupt traditional SEO practices, they also offer new ways to connect with users and provide value. The key will be to embrace these changes and adapt to the new realities of online search. 

The question is, will your business be ready to make the leap?

Tuesday, February 20, 2024

Exposing the Truth Behind SEO Conferences: Does the Investment Hold Value?

Are traditional SEO conferences worth the hype and hefty price tags? 
Well... I have navigated the maze of industry events and uncovered some gems and pitfalls. As I mentioned in my article on Conferences in 2024 for MasterSEO.io, from the endless loop of newbie topics to cliques playing high school, speakers thinking they're rockstars, and the not-so-fun reality of harassment—SEO conferences are a mixed bag.

Traditional ones?

Meh. They often dish out beginner-level stuff seasoned pros find as thrilling as watching paint dry. Do we need a conference to explain "What is a robots.txt?" I think not. Then there's the agency parade, with success stories and pitches. As pros, we're here to level up, not be sold.

Networking?

It can feel like Survivor. My early solo conference adventures were like being in a sea of faces until a hero or two rescued me. And speakers? Some get the VIP treatment, creating an exclusive vibe. The Groundhog Day of repeated talks? Seriously?

I did things differently with my events—no basic talks, egos, or repeats. Speakers engaged, controversial topics were fair game, and it became a hit. Never fear though, 2024 has some cool conferences breaking the mold:
  • Botify Connect: Diving into adapting SEO strategies to a changing search landscape and the fascinating world of AI.
  • State of Search: 12 years strong, covering the nitty-gritty of SEO, PPC, and digital marketing.
  • WTSFest USA: Ladies in tech SEO, unite! A platform for leading women to share insights and build a kickass community.
So, when picking your conference, look for fresh content, killer networking, and inclusivity. Break away from the ordinary and snag your ticket to career growth and skill development. 

Tuesday, February 6, 2024

The Essential Guide to Canonical Tags and Best Practices in 2024

In 2009, Google introduced a game-changer in the SEO world—the canonical tag (rel="canonical"). This tag, discreetly placed in the <head> section of a webpage, allows website owners to declare their preferred version among similar or duplicate content. Let's delve into the historical context and why understanding canonical tags is crucial in 2024.

TL;DR

  • Canonical tags were introduced by Google in 2009 for SEO.
  • The tag helps webmasters control the preferred version among similar or duplicate content.
  • Google's 2009 announcement addressed identical or similar content accessible through multiple URLs.
  • Canonical tags consolidate link popularity and aid search engine indexing.
  • A Matt Cutts video emphasizes best practices for the canonical link element.
  • Canonicalization serves key purposes, including solving duplicate content issues.
  • Understanding canonical tags is crucial for SEO in 2024.
  • Canonical tags defend against content theft and optimize crawl budget.
  • Canonical URLs can be found in the HTML source or via Google Search Console.
  • Best practices include one canonical URL per page and consistent formatting.

Google Introduces the Canonical Tag

Google's announcement in February 2009 relieved webmasters grappling with identical or similar content accessible through different URLs. The canonical tag became the hero by helping webmasters control the URL displayed in search results, consolidating link popularity and other essential signals.

Imagine having two pages on your site, like "example.com/page" and "example.com/page?sort=alpha." You should inform search engines that these are essentially the same. By designating one as the canonical version, you guide search engines to index your preferred page, ensuring it receives the deserved ranking signals.
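Using the example above, the parameterized URL would carry a tag like this in its <head> (a minimal sketch, assuming the clean URL is your preferred version):

```html
<!-- Placed in the <head> of example.com/page?sort=alpha to tell search
     engines that example.com/page is the preferred, indexable version -->
<link rel="canonical" href="https://www.example.com/page" />
```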

The announcement itself, made on February 12, 2009, gave webmasters far greater control over which URL appears in search results, ensuring that link popularity and other properties were consolidated to the preferred version rather than split across duplicates.

The canonical tag operates as a simple yet powerful <link /> tag that is added to the <head> section of duplicate content URLs. It serves as a hint to search engines, indicating the preferred version of a URL. For instance, if a site sells Swedish fish, and the preferred URL is https://www.example.com/product.php?item=swedish-fish, the canonical tag would be added to URLs with slight variations, such as parameters for sorting, categories, tracking IDs, or session IDs.

Fast forward to 2024, and the canonical tag remains a crucial aspect of SEO strategy. However, its misuse has become a common challenge. Website owners sometimes neglect to specify the canonical URL, leading to confusion for search engines and potential negative impacts on search rankings.

Understanding the significance of the canonical tag is essential for maintaining a healthy SEO strategy. The tag helps search engines interpret the preferred version of the content, preventing the dilution of link popularity and other signals. It also addresses common questions, such as whether rel="canonical" is a hint or a command (it's a strong hint), if relative paths can be used (yes, they can), and the tolerance for slight differences in content.

Google's algorithm is lenient, allowing for canonical chains, but it strongly recommends updating links to point to a single canonical page for optimal results. Initially, the tag could be used across subdomains of the same site but not across entirely different domains.

One notable update in December 2009 expanded support for cross-domain rel="canonical" links, providing more flexibility for webmasters. An example from wikia.com showcased the successful implementation of rel="canonical" on the URL https://starwars.wikia.com/wiki/Nelvana_Limited, consolidating properties and displaying the intended version in search results.
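In practice, cross-domain canonicalization means a duplicate page hosted on one domain points back to the preferred URL on another. A sketch, using the Wikia URL from the example above (the duplicate's own domain is assumed):

```html
<!-- Placed in the <head> of a mirror page on a different domain, pointing
     back to the canonical Wikia article so signals consolidate there -->
<link rel="canonical" href="https://starwars.wikia.com/wiki/Nelvana_Limited" />
```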

Matt Cutts Explains the Canonical Tag

Matt Cutts (I bet you haven't heard that name in a while) published a video on February 22, 2009, explaining the canonical tag, and it remains a useful guide to how the tag is used today.
TL;DR
  • Matt Cutts discusses the canonical link element, an open standard for addressing duplicate content on the web.
  • The element is supported by Google, Yahoo!, and Microsoft and was announced in 2009.
  • Cutts emphasizes best practices, including standardizing URLs, consistent linking, and using 301 redirects.
  • The canonical link element allows webmasters to specify a preferred, clean URL version to reduce duplicate content issues.
In the opening of the video, Matt Cutts sets the stage by introducing the topic of discussion – the canonical link element. This element, he explains, is an open standard jointly announced by major search engines, including Google, Yahoo!, and Microsoft, back in 2009. Its primary purpose is to tackle the prevalent issue of duplicate content on the web, a complication that often disrupts the effectiveness of search engine rankings. Cutts underscores the pivotal role of the canonical link element in enhancing the overall quality of the web and provides additional context by mentioning its announcement date.

Cutts delves into the complexities associated with duplicate content as the video progresses, using different URLs as illustrative examples. He sheds light on the challenges webmasters and SEOs confront when dealing with multiple versions of the same page. The discussion expands to encompass various strategies for resolving duplicate content issues, with Cutts highlighting the significance of standardizing URLs, practicing consistent linking, and employing 301 redirects. He likens the canonical link element to "Spackle" – a tool that effectively repairs the cracks in the metaphorical wall of duplicate content.

Continuing the conversation in the third segment, Cutts provides further insights into best practices to mitigate duplicate content challenges. These practices include standardizing URLs, ensuring consistent linking, and utilizing 301 redirects. He elaborates on the role of Google's Webmaster Tools and Sitemap in addressing duplicate content. He acknowledges the persistent challenges that may arise, citing examples like session IDs, tracking codes, and breadcrumbs. The video concludes with practical advice for users to exercise caution, plan proactively, and avoid abusing the canonical link element. Cutts also recognizes the substantial contribution of Google engineer Joachim and expresses gratitude to others who played a role in developing the canonical link element.

The Essence of Canonicalization


Canonical tags serve several key purposes:
  • Solving Duplicate Content Issues: Addressing identical or similar content problems.
  • Guiding Search Engine Indexing: Helping search engines identify the most relevant page among duplicates.
  • Specifying Preferred Domains: Offering a way for webmasters to express their preferred domain.
  • Consolidating Incoming Links: Aiding in concentrating link influence on a specific page.
  • Protecting PageRank: Safeguarding your site's authority from content theft or duplication.

Why Canonical Tags Matter in 2024


Understanding the advantages of canonical tags in the SEO landscape is crucial:
  • Define Your Preferred Domain: Specify your chosen domain format for optimal results.
  • Control Search Results Inclusion: Decide which version of a page you want to see in search results.
  • Boost PageRank: Consolidate links to improve the authority of specific pages.
  • Defense Against Content Theft: Protect your site's integrity when others republish your content.
  • Optimize Crawl Budget: Efficiently manage crawls while avoiding duplicate content issues.

Unveiling Canonical URLs

The canonical URL works behind the scenes: it doesn't appear on the rendered page, but it sits in the HTML for search engine crawlers (and anyone who looks) to read. The format is simple: <link rel="canonical" href="CANONICAL-URL"/>.
Here's how you can find it:
  1. View HTML Source: Check the HTML source of a page for the canonical tag.
  2. Use URL Inspection Tool: Leverage Google Search Console's tool to identify the canonical URL selected by Google.

When to Deploy Canonical URLs

The primary reasons to use canonical URLs include:
  • Avoid Duplicate Content Issues: Prevent problems arising from similar or unintentionally duplicated content.
  • Syndicating Content: Inform Google when republishing content on other platforms.
  • Specify Your Preferred Domain: Clarify your preferred domain format to avoid confusion.

Canonical Tags Best Practices

Follow these best practices for effective use of canonical tags:
  • One Canonical URL Per Page: Ensure each page has only one canonical URL.
  • Valid and No "Noindex": Ensure the specified canonical URL is valid and doesn't have a "noindex" attribute.
  • Consistent Format: Maintain consistency in canonical tags to help Google identify your preferred domain.

Canonical Tags vs. 301 Redirections

Canonical tags and 301 redirections serve different purposes. A canonical tag is ideal when you want users to still be able to visit both pages while guiding search engines to the preferred version. A 301 redirect, in contrast, sends visitors straight to the target page; the source page is never shown.
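To make the contrast concrete, here is a sketch for a hypothetical duplicate at /page?ref=newsletter (URL invented for illustration):

```html
<!-- Option 1: canonical tag on /page?ref=newsletter - the page stays
     reachable for users, but ranking signals consolidate to /page -->
<link rel="canonical" href="https://www.example.com/page" />
```

A 301, by contrast, is configured server-side rather than in the page's HTML (for example, `Redirect permanent /old-page https://www.example.com/page` in an Apache config), so visitors never see the source page at all.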

In the End - Understanding Canonicals will Save Your SEO

Understanding canonical tags is pivotal for maintaining a robust SEO strategy. As we navigate the evolving digital landscape, these tags are an essential tool for webmasters striving to optimize their online presence.

In summary, the canonical tag introduced by Google in 2009 remains crucial for effective SEO in 2024. This tag addresses duplicate content issues, guides search engine indexing, and serves various purposes, including specifying preferred domains and consolidating links. Despite its significance, misuse is common, with some neglecting to specify the canonical URL, impacting search rankings.

Matt Cutts emphasized the tag's importance in a 2009 video, providing insights into best practices such as standardizing URLs and using 301 redirects. In the evolving digital landscape, understanding and correctly using canonical tags are essential for webmasters aiming to optimize their online presence. Following best practices enables webmasters to define their preferred domain, control search results, boost PageRank, defend against content theft, and optimize crawl budget—contributing to a more effective SEO strategy.

Monday, January 22, 2024

Robots Tags Explained

So, you're diving into the world of making your website shine on search engines, right? It's quite a journey! Now, here's the thing – there's a nifty trick that beginners sometimes miss out on, and that's using robots tags. These little meta tags are like secret agents for your website. They play a big role in telling search engines, especially Google, how to organize and show off your awesome content.

Curious to know more?

This beginner-friendly guide is all about the different robot tag settings, why they're a big deal, and when you might want to sprinkle some of that magic on your website.

What are Robot Tags?

Robot tags are snippets of code embedded in the HTML of your web pages to communicate instructions to search engine bots. These instructions guide the bots on how to treat your content in terms of indexing, following links, displaying snippets, and more. Let's dive into some common robot tags and their meanings:

1. all

This is the default setting: it places no restrictions on indexing or serving. Because it's already the default, listing it explicitly has no effect.

2. noindex

Use this tag when you don't want a particular page, media, or resource to appear in search results. It prevents indexing and displaying in search results.

3. nofollow

By using this tag, you instruct search engines not to follow the links on the page. It's useful when you want to keep search engines from discovering linked pages.

4. none

Equivalent to combining noindex and nofollow, it prevents both indexing and following links.

5. noarchive

This tag stops search engines from showing a cached link in search results. It prevents the generation of a cached page.

6. nosnippet

Use this tag if you don't want a text snippet or video preview in the search results. It prevents Google from generating a snippet based on the page content.

7. indexifembedded

Allows Google to index the content of a page if it's embedded in another page through iframes, despite a noindex rule.

8. max-snippet: [number]

Specifies the maximum length of a textual snippet for search results. You can limit the snippet length or allow Google to choose.

9. max-image-preview: [setting]

Sets the maximum size of an image preview in search results. You can choose between 'none,' 'standard,' or 'large.'

10. max-video-preview: [number]

Limits the duration of video snippets in search results. You can set a specific duration or allow Google to decide.

11. notranslate

Prevents the translation of the page in search results. Useful if you want to keep user interaction in the original language.

12. noimageindex

Stops the indexing of images on the page. If not specified, images may be indexed and shown in search results.

13. unavailable_after: [date/time]

Specifies a date/time after which the page should not appear in search results.
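Several of these directives can be combined in a single robots meta tag, separated by commas. For example, a page that allows indexing but limits how it is previewed might use something like this (the specific values are illustrative):

```html
<!-- Allow indexing, but cap text snippets at 50 characters, permit large
     image previews, and limit video previews to 10 seconds -->
<meta name="robots" content="max-snippet:50, max-image-preview:large, max-video-preview:10">
```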

Why Use Robot Tags?

Using robot tags is essential for controlling how your content is treated by search engines. It allows you to tailor the indexing, linking, and display settings based on your specific needs. Let's look at an example scenario to illustrate when you might use these tags.

Example Scenario:

Imagine you have a temporary promotion page on your website that you want to exclude from search results after a specific date. In this case, you would use the noindex tag to prevent indexing and the unavailable_after tag to specify the date after which the page should not appear in search results.

<meta name="robots" content="noindex, unavailable_after: 2024-02-01">

This ensures that the promotional page is not indexed and won't appear in search results after February 1, 2024.

In conclusion, understanding and correctly implementing robot tags is a valuable skill for any website owner or developer. It gives you the power to control how your content is presented in search results, ultimately influencing the visibility and accessibility of your website.