Friday, August 29, 2008

Optimize Your Landing Pages Without Disturbing Your Tests

When Mark (CEO of Smartsheet) first approached me about working for them in marketing, I was excited about the product and all it had to offer. I was even more impressed to discover during my first week in the new position that they had already implemented landing pages with a complete user flow: paid search ads pointed to a landing page, the landing page sent the user through a very simple registration process, and the user was then shown a template to help them get started based on the landing page or ad they came from. The experience was unique to what each user was interested in. How brilliant!

The VP of Marketing (Maria) had done an excellent job working with Widemile to optimize the original landing pages. By redesigning a select few of the 40-50 landing pages, testing the new designs against the originals, and applying the winning results across the pages that weren't tested, they raised the conversion rate from 4-5% to over 6-12%. (read the article on btobonline.com here)
With the website going through a complete redesign and a new product rolling out, I wasn't sure if we would see the same results, or if I could try some of the strategies I had used in the past at Classmates and Concur to increase conversions. Luckily I had 40+ landing pages to work with, all with templates associated with them. I started with the most popular pages based on natural search traffic. Many of the landing pages were indexed and had a PageRank of 4-6, depending on the page. I took the pages that were driving the most conversions naturally and applied three different designs across them.

We launched the new site and our BETA product in July 2008, and the landing pages were set up for A/B testing in 3 groups (A being a design based on the winning combinations, tested against 3 new design layouts). I took the paid search campaigns that had been set up previously and split them into content targeting and search only, instead of lumping everything into one campaign, where it was difficult to track the effectiveness of content ads versus search ads. In the first week after launch our paid search campaigns were converting at 6%. Over the following weeks I noticed that the content network was converting higher and that the landing pages were doing much better with the new designs, so I optimized the campaigns and created banners to drive more traffic to the landing pages for testing. By the end of July 2008 the overall conversion rate was over 9%.
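The post doesn't show how those before-and-after conversion rates were compared, so as a rough illustration (not Smartsheet's actual method), here is a standard two-proportion z-test in Python for checking whether a lift like 6% to 9% is statistically meaningful. The visitor and signup counts are hypothetical numbers chosen only to match the rates mentioned above.

```python
import math

def conversion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical counts: 6% of 2,000 visitors vs. 9% of 2,000 visitors
p_a, p_b, z = conversion_z_test(120, 2000, 180, 2000)
print(f"A: {p_a:.1%}, B: {p_b:.1%}, z = {z:.2f}")
# prints "A: 6.0%, B: 9.0%, z = 3.60"
```

A |z| above roughly 1.96 corresponds to significance at the 95% level, so a lift of this size on a couple thousand visitors per variant would be well worth acting on.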

In August I created ad groups for each of the top 20 converting landing pages, complete with text ads, banners, and a template the user would see once signed up. I implemented the winning test design on the pages and added some new elements for multivariate testing in 2 groups: one group emphasized a “Free Account” and the other had an “easy as 1-2-3 to sign up” block on the right. Our conversion rate jumped to over 11% (getting close to where we had been in the past). We noticed that users arriving on the “Free Account” pages were less likely to stick around, so we pulled the “Free Account” wording off the pages. Since I was testing other elements on those pages and not the “Free Account” wording, I grabbed a footprint of where the conversions were, so I could see whether removing the words would affect the conversion rate. At the same time I added a block listing our value props to some of the pages. Once again, since I didn’t have enough data to stop the test, and the block wasn’t one of the tested elements, I took a footprint of where the pages were at.
I did notice a drop in conversions after removing the “Free Account” wording, and an increase on the pages where I added the value props.
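The post doesn't describe what a "footprint" looks like in practice; as a minimal sketch in Python (the page names, counts, and file name are all hypothetical), the idea is simply to record a timestamped snapshot of each page's numbers before making a mid-test change, so the change can later be measured against that baseline:

```python
import datetime
import json

def take_footprint(page_stats, note, path="footprints.jsonl"):
    """Append a timestamped snapshot of per-page conversion data
    so a mid-test change can be compared against a baseline."""
    record = {
        "taken_at": datetime.datetime.now().isoformat(),
        "note": note,
        "pages": {
            page: {
                "visits": s["visits"],
                "signups": s["signups"],
                # Conversion rate at the moment of the snapshot
                "conv_rate": round(s["signups"] / s["visits"], 4),
            }
            for page, s in page_stats.items()
        },
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record

# Snapshot taken just before pulling the "Free Account" wording
before = take_footprint(
    {"lp-project-tracking": {"visits": 1200, "signups": 132}},
    note="before removing 'Free Account' block",
)
```

Comparing a later snapshot against `before` shows the effect of the untested change without stopping the multivariate test that is still running on the page's other elements.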

I charted a few of our results below for you to see:

I am happy to say that as a result of the small changes I made (while keeping the tests going) I was able to increase our overall conversion rate to nearly 15% by the end of August 2008.
The lesson learned is that you don’t have to keep stopping and starting new tests just to make small changes, and those small changes can still be tracked if you grab the data before you make them. You do still want your tests to run long enough to collect sufficient data on each change, but the idea is to limit the multivariate testing to 1-2 elements; that way you get faster results with fewer combinations. Do remember that these are tests, and the data you are collecting through multivariate, and especially A/B, testing shouldn’t be manipulated too often, otherwise your results could be skewed by all the changes you made along the way.

Just as I was able to get templates to show up when our users sign in, I took the time to set up the sheet I use to keep track of my landing pages as a template - complete with tips on how to keep track of it all. Just click the link here, sign up, and you should see the template once you have verified your email.
Happy Testing!

Saturday, July 19, 2008

SEO Survey

After signing up for one of Google's beta testing applications through a form that posts directly to a Google Docs spreadsheet, I thought I would give it a try and create an SEO survey through Google Docs myself. The results are published directly to the doc as the survey is submitted.
Take a moment and fill out the survey yourself. The results will update automatically as people submit it, and I will post the data in all its glory - pie charts and everything - for all to see. It will be interesting to see what comes of the answers.

The form is no longer available

Monday, June 16, 2008

The Misinterpretation of the internet...

On Monday, June 16, 2008, Gail Geronimos posted an article on her blog about Brent Frei (co-founder and chairman of the board at smartsheet.com), summarizing his very informative article posted to Venturebeat.com on April 21, 2008. She starts by stating that the article "was written by Brent Frei, founder of Bellevue."
Now mind you - the Bellevue she is referring to is my hometown, located across Lake Washington from the major metropolitan city of Seattle, WA. A few of us in the office got a chuckle out of Brent being the founder of Bellevue, WA - in actuality it was founded in 1869 by William Meydenbauer.
To me this is yet another example of how facts can be misinterpreted on the internet. Given that Gail Geronimos is from Australia, she would have had no idea that Brent wasn't the founder of Bellevue, or even that Bellevue is a city all its own. Through no fault of her own, she misinterpreted the byline at the bottom of Brent's article, which stated: "Brent Frei is founder of Bellevue, Wash.-based Smartsheet.com, a privately-held Software as a Service (SaaS) provider."
Remember the old rumor game, where one person would whisper a statement into someone's ear, that person would whisper it to the person next to them, and so on, until the person at the end said out loud what they had been whispered - and it was never what the original statement had been? Could it be that the internet is becoming the rumor mill we experienced in our younger days?
It's up to us marketing and PR people to keep track of not only our company as a whole, but also the company's representatives - and our own personal interpretations.
...and no - I am not actually a Goddess, I just play one on IT ;o)

More and more companies are hiring agencies or even in-house teams to monitor the rumors that fly on the internet as quickly as they become posted. But with today's technology it's becoming increasingly difficult to catch all that there is out there...

Jenn