Two weeks ago I got into a bear fight. Or rather, I made two bears fight. The winner would grace the cover of my website, Alaskagold.com: A traveler’s guide to South Central Alaska. This is a follow-up to that post.
In all seriousness, the main image of your website is important. It sets a tone. It communicates something about you and the content of the site. It’s prime real estate in the market of user attention. Don’t waste it. It warrants careful consideration and shouldn’t be left to the HiPPO [highest-paid person’s opinion].
When getting started with A/B testing with Google Website Optimizer, I chose to test the front page image of the site. Seems like a good place to start. I chose two very different images of bears – the cultural icon of the wild North.
Nice Bears Come in First!
I kinda figured that the Angry Bear would lose. It’s a scary picture, and you can almost smell that bear’s rotten, salmon-stinky breath. It does have an impact. Unfortunately, the impact was to make visitors back away from the page – possibly running for their lives [like this guy].

I imagine this guy is like a site visitor scrambling to click the "Get me out of here" button
And in this corner, a cute, docile bear. A pleasant Alaskan scene with a sweeping vista of mountains rising up from the sea, and there you spot a bear moving away from you. Perhaps his name is Teddy and he’s off to an important tea time with his friends. Or a pot of honey.
Optimizer Fails
If you remember, the Google Optimizer A/B test required that you have four URLs – the original, the A variation, the B variation, and the conversion URL. I only really cared about one metric – bounce rate. I wanted the results to tell me which image would give me the best chance of getting the user to click further into my site. My problem was the conversion URL – for my purposes, any URL in my domain would signify a conversion, but Optimizer wanted exactly one.
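As an aside, here’s roughly how a redirect-style test like this splits traffic. This isn’t Optimizer’s actual code – just a minimal sketch of the idea, with a made-up cookie value and variation paths matching the pages in my test:

```python
import hashlib

# Hypothetical variation URLs mirroring the pages in this test.
VARIANTS = ["/index-a.html", "/index-b.html"]

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into one variation.

    Hashing a stable visitor ID (e.g., a cookie value) keeps each
    visitor on the same page across visits, so per-variation bounce
    rates aren't muddied by people seeing both bears.
    """
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# A returning visitor always lands on the same variation.
print(assign_variant("cookie-1234"))
```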
I picked as my conversion URL the highest-percentage ‘next page’ – the page most people go to after the front page. But there weren’t enough conversions after running the test for three weeks. The Optimizer shrugged – “I can’t help you with that”. It worked great splitting up the traffic to the A and B pages, but could only say “No high-confidence winner found” in the end.
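The math behind that shrug is worth seeing: with only a handful of conversions, the gap between two rates is easily explained by chance, so no honest tool will call a winner. Here’s a quick sketch with hypothetical counts (I no longer have the raw numbers) using a standard two-proportion z-test:

```python
from math import erf, sqrt

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    Returns the p-value; a testing tool only declares a winner
    when this is small (high confidence the difference is real).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p-value

# Hypothetical numbers: decent traffic, but few conversions,
# as in the three-week test described above.
print(z_test_two_proportions(conv_a=12, n_a=500, conv_b=8, n_b=500))
# p ≈ 0.37 – nowhere near significant, hence "no high-confidence winner"
```

With a p-value around 0.37, a tool that wants 95% confidence rightly refuses to pick a side – a gap that size would need far more conversions to mean anything.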
I closed the test and chalked it up to ‘good experience’. Based on my own opinion – not exactly going out on a limb – I put the cute bear on the front page. But the question remained: perhaps my audience really wants to be scared by bears (is scaredbyAlaskanbears.com taken?).
Google Analytics to the Rescue
As it happens, I was doing my monthly review of my stats in Google Analytics and noticed the index-a and index-b variations showing up in the top content report. For your viewing pleasure…
Bang! I had my answer. Index-a.html, the teddy bear, had a 16% lower bounce rate and a 22-second increase in time on page. I think that’s a substantial improvement.
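For the curious, those figures are simple arithmetic on the Analytics report. The actual report numbers aren’t reproduced here, so the values below are hypothetical, chosen only to show how the math works (and assuming the 16% is a relative reduction):

```python
# Hypothetical Analytics figures, for illustration only.
bounce_a, bounce_b = 0.63, 0.75   # bounce rates for index-a / index-b
time_a, time_b = 96, 74           # avg. time on page, in seconds

reduction = (bounce_b - bounce_a) / bounce_b
print(f"bounce rate reduction: {reduction:.0%}")  # 16%
print(f"time on page gain: {time_a - time_b}s")   # 22s
```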
Conclusion
I’m pretty happy to come away from my A/B test with data to support the change to the site [and to have my metrics improve].
Now I’m interested in finding the best image for a travel site front page. There are conventions in travel marketing photography – for instance, people like to see people, not just scenery. And there are many other elements that affect a page’s bounce rate. The one that stands out to me is getting the visitor oriented and clearly setting their expectations for the rest of the site. But that’s a story for another day. I’ll take this little A/B testing victory and move along.