At the beginning of this year, one of our clients (a secure faxing agent) kicked off their CRO campaign. After walking them through Browser Media’s Conversion Rate Optimisation service, I conducted some really detailed user testing and analysis, went through a rigorous design process, and then created two strong variations for their service’s landing page and plans/pricing page. We looked at layout, we looked at copy, and we looked at navigation. These variations differ massively not only from the control, but also from one another – no meek tweaks here!

So why isn’t this post actually a case study about a crazy lift in conversions? Well, because the test is still running, and I reckon it’ll be another three weeks before I’ve got statistically significant results…

First: a little background

The client offers secure faxing plans on both a monthly and an annual subscription basis. They also offer an ‘enterprise level’ plan for organisations that do a lot of faxing, and an integration option.

They don’t do badly at all for sign-ups to the monthly plan, but comparatively had almost no interest in the annual and enterprise plans. After reviewing feedback from the user testing videos, we found that although the annual plans were clearly better value for users, they just weren’t promoted strongly enough on the plans/pricing page, so users were missing out. We created variations that promote the annual and enterprise plans more effectively.

To exclude users who are already subscribers to the faxing service, only new users are included in the test.
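As a sketch of how that exclusion might work in practice (this is an illustration, not the client’s actual implementation – the function name and the 50/50 split are my own assumptions):

```python
import hashlib

def assign_variant(user_id: str, is_subscriber: bool) -> str:
    """Bucket only new users into the split test; existing subscribers
    always see the control and are excluded from the results."""
    if is_subscriber:
        return "control"
    # Deterministic 50/50 split: hashing the user id means the same
    # visitor lands in the same variation on every visit.
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return "variation" if int(digest, 16) % 2 else "control"

# An existing subscriber never enters the test
print(assign_variant("visitor-123", is_subscriber=True))  # → control
```

Hashing rather than random assignment matters here: a visitor who flips between variations mid-test would muddy both buckets.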

A post on learning from an inconclusive split test found that only a third of tests run by an agency specialising in CRO drove a statistically significant result. I’m not calling my test ‘inconclusive’ just yet… it’s just bloody slow. While I’m obviously disappointed that I’m not reporting a landslide in terms of a winning variation, I am enjoying what I’m learning about my client’s users in the process.
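For anyone wondering what ‘statistically significant’ actually means here: the standard check for a split test is a two-proportion z-test on sign-ups. A minimal sketch follows – all the numbers are invented for illustration, not the client’s real figures:

```python
from math import sqrt, erf

def z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test comparing control (a) and variation (b).

    Returns the z-score and a two-tailed p-value; by convention the
    result is 'significant' when the p-value drops below 0.05."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 40 sign-ups from 1,000 control visits
# vs 55 sign-ups from 1,000 variation visits
z, p = z_test(40, 1000, 55, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these made-up numbers the p-value comes out around 0.11 – a promising-looking lift that still isn’t significant, which is exactly the ‘just bloody slow’ situation: you need more traffic before you can call it.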

Do inconclusive CRO tests reveal anything interesting about engagement?

You’d be forgiven for thinking that an increase in engagement leads to an increase in conversions, but it’s not always the case. If people are spending longer on your site without converting, they may well be dithering: struggling to complete their desired action, or failing to find the information they’re looking for.

Thanks to an improved call to action, visits to the client’s plans/pricing page have shot up, but this seems to be at the expense of visits to the service’s features page… where most of the juicy, persuasive content is. The increase in visits to the plans/pricing page has led to loads more visits to the sign-up pages for both the monthly plans and the annual plans… but it (almost) stops there. Despite dramatically improved engagement, we don’t have dramatically improved conversions.

Onwards and upwards

If I were measuring the success of the test based only on completed sign-ups, I’d be ignoring the improvement in engagement. Instead, because I can see that more users are moving deeper into the conversion funnel (although not completing it), I’ve identified a new sticking point in the user journey, and so have the focus for my next test: the sign-up pages.
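That sticking point can be found mechanically by comparing drop-off between each step of the funnel. A minimal sketch with invented visit counts (none of these figures are the client’s real numbers):

```python
def funnel_dropoff(steps):
    """Given ordered (step name, visits) pairs, return the
    step-to-step continuation rate for each transition."""
    rates = []
    for (name_a, n_a), (name_b, n_b) in zip(steps, steps[1:]):
        rates.append((f"{name_a} -> {name_b}", n_b / n_a))
    return rates

# Hypothetical funnel: healthy flow until the final step
funnel = [
    ("landing page", 10000),
    ("plans/pricing", 4200),
    ("sign-up page", 1900),
    ("completed sign-up", 95),
]
for transition, rate in funnel_dropoff(funnel):
    print(f"{transition}: {rate:.0%} continue")
```

In this made-up example, visitors flow happily as far as the sign-up page and then only 5% complete it – which is the shape of the problem described above, and why the sign-up pages are the focus for the next test.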