Email triage is a daily headache, but one email shows up like clockwork each week, and no matter how full my inbox or how busy I am, I take two minutes to read it. I always learn something new. Marketers, are you a fan of “Which Test Won” yet?
Here’s how it works: you get an email with a “Test of the Week”. Side by side, two versions of a marketing communication are presented — email headlines, ecommerce sites, landing pages for PPC and the like — and then you get to vote on “which test won”. How good is your judgment? In an instant you find out, as the next screen shows you the results and an analysis. (Re: the test below, Version B won. If that doesn’t make you want to test, I don’t know what will!)
What have I learned over the past 134 weeks? (list of 134 “Which Test Won” past tests here) As much as we THINK we know, actual results can surprise. As founder Anne Holland says, “no matter how big an expert you are, you are going to guess wrong sometimes because you’re not a true representative of who the marketplace, the page or email was designed for.” Yup.
So … if you could improve the chance of conversion, or stop people from abandoning your site, your registration forms or your cart, wouldn’t you want to do that? Yet, 73% of marketers aren’t doing any testing whatsoever.
Problem is, you need a certain volume of conversions per month just to run a conclusive test, so smaller marketers sometimes have no choice: you have to use judgment. For those cases, here are some lessons that have come out of “Which Test Won”:
- Use bigger, more prominent buttons.
- Match your headline to the headline of the ad or offer that drove the traffic.
- Get rid of extraneous navigation.
- Test your headline copy, offer copy, and button copy. (The word “Submit” as button copy is easily improved!) For instance, one test pitted a colon against a dash in an email subject line, and that alone made a real difference in responses. Who knew such a tiny factor would matter?
- Overlays can garner email opt-ins, among other things.
- Images: it completely depends on the market and the product. For instance, a happy, smiling human: will that help or depress responses? Size, use of video, or even no image at all can all make a difference. (In the test below, Version A won, due to a combination of image, offer headline and layout.)
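If you’re curious what “conclusive” actually means in the caveat above, a standard way to check it is a two-proportion z-test: it tells you whether the difference between two conversion rates is statistically real or just noise. Here’s a minimal sketch using only Python’s standard library — the traffic and conversion numbers are hypothetical, purely for illustration:

```python
# Sketch: two-proportion z-test on A/B conversion counts.
# All numbers below are hypothetical, for illustration only.
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_b - p_a) / se
    # Convert z to a two-sided p-value via the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Same 25% relative lift (2.0% vs 2.5% conversion), small vs. large split:
p_small = z_test(20, 1000, 25, 1000)       # 1,000 recipients per version
p_large = z_test(200, 10000, 250, 10000)   # 10,000 recipients per version
print(p_small, p_large)
```

Note that the lift is identical in both cases, but the small split is inconclusive (p ≈ 0.45, far above the usual 0.05 cutoff) while the large one is significant (p ≈ 0.02). That’s exactly why low-traffic marketers often can’t test their way to an answer and have to fall back on judgment.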
Marketers, do you rely on judgment all the way? Have you tried splitting your email lists, or testing headlines prior to important mailings? Have you tried testing alternate visuals in ad campaigns, website layouts or landing pages? What do you test, and what results have surprised you?