Life After A/B: The Curious Case of Mad Mimi


In early 2014, Mad Mimi commissioned Copy Hackers to optimize its pricing page for an A/B test.

The goal? To increase click-thrus to each plan's sign-up page.

The Control

[Image: Mad Mimi pricing table, control]

During a 6-week period, Mad Mimi and Copy Hackers tested two variants of the page:

Variation C

In one variant (Variation C), they replaced the infinity symbol — an abstraction that is only immediately intuitive for mathematicians and pool boys — with the layman’s term, Unlimited.

They also added Premier support and White glove support as benefits, and changed the copy on the main CTA buttons from Sign Up to Get Started.

[Image: Mad Mimi pricing table, Variation C]

Variation B

In the other variant (Variation B), in addition to the optimizations listed above, they decreased the number of showcased plans from four to three; changed the order of presentation so the most expensive plan was listed first; and changed the copy in the main CTA buttons to Get Started Now.

[Image: Mad Mimi pricing table, Variation B]

According to Copy Hackers' Joanna Wiebe, who worked on the project, both variants produced a massive, triple-digit increase in click-thrus: lifts ranging from 238% to 465% for Variation B, and from 492% to 598% for Variation C.

Making Variation C the clear winner, and the A/B test a high-five-able success.

Proving once more the awesome power of the A/B test: oftentimes, a few simple copy edits can make a HUGE difference.
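For context on those numbers: "lift" is simply the relative change in click-thru rate from control to variant, and a two-proportion z-test is one common way to check that such a change is statistically significant rather than noise. Here's a minimal Python sketch using hypothetical visitor counts, since Copy Hackers didn't publish the raw traffic data:

```python
import math

def two_proportion_ztest(clicks_a, n_a, clicks_b, n_b):
    """Two-proportion z-test: is the difference in click-thru rates real?"""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts -- the raw traffic numbers were never published.
control_clicks, control_visitors = 40, 4_000    # 1.0% click-thru rate
variant_clicks, variant_visitors = 276, 4_000   # 6.9% click-thru rate

lift = (variant_clicks / variant_visitors) / (control_clicks / control_visitors) - 1
z, p = two_proportion_ztest(control_clicks, control_visitors,
                            variant_clicks, variant_visitors)

print(f"Lift: {lift:.0%}")          # 590% -- in the range reported for Variation C
print(f"z = {z:.1f}, p = {p:.4f}")  # z = 13.5, p near 0 -> statistically significant
```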

Now fast forward a few months…

And guess what…?

The Mad Mimi pricing page is EXACTLY as it was before the test. Nary a change, optimization or copy edit has been made.

Now, am I shocked to discover this?

Only slightly.

For the reality is that, far too often, A/B test results end up going the way of the dodo.

A lot of [businesses] never actually update their page when a test wins. …Why that is the case is another post entirely… because I have noooo idea what’s holding people back from hard-coding winners.

~ Joanna Wiebe, Copy Hackers

So why do so many organizations devote time, budget, and resources to conducting an A/B test, only to ignore the results in the end?

Now, granted, I am not privy to the internal conversations of the Mad Mimi UX squad. I do not know why they have left their pricing page as-is in the wake of a statistically significant test that revealed the page's shortcomings… and opportunities.

But I have been privy to such conversations in other organizations and, in my experience, one of the main reasons A/B tests go unacted upon is POLITICS… and not the elephant and donkey kind, either.

Internal politics is a necessary evil in the corporate world. Sometimes it has its uses, resulting in a higher level of quality control. And sometimes it is a mind-numbing abyss that grinds innovation to a soul-crushing halt.

Here’s how it tends to go:

Someone of a certain title forms a hypothesis before the start of the A/B test and, upon hearing results that differ from said hypothesis, proclaims that the test must be flawed in some way.

In which case, another soon-to-be-equally-ignored-unless-it-proves-the-aforementioned-hypothesis A/B test is commissioned; or the initial findings are abandoned entirely, in favor of either no change at all or optimizations based on the gut feeling of that someone with a certain title.

Now, this is not to say that you and your organization should blindly follow the outcome of every A/B test.

Digital is both a science and an art, and should be informed by both data and human inspiration. There was many a time when Steve Jobs completely ignored user research, to Apple's (and the world's) benefit.

However, in the case of Mad Mimi, given the huge uptick in click-thrus and the minimal copy edits involved in achieving it, not permanently switching over to Variation C is a huge miss for Team Mimi: a loss not just in user experience, but in revenue as well.
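To put the revenue side in rough dollar terms, here's a back-of-the-envelope funnel sketch with entirely hypothetical numbers, since Mad Mimi's actual traffic, sign-up conversion, and plan mix aren't public:

```python
# Entirely hypothetical funnel numbers -- Mad Mimi's real traffic,
# conversion rates, and average plan price are not public.
monthly_visitors = 20_000    # visits to the pricing page
signup_conversion = 0.20     # of those who click through, 20% sign up
avg_plan_price = 16.00       # average monthly plan price, in dollars

def monthly_revenue(click_thru_rate):
    """Revenue from new sign-ups, given a pricing-page click-thru rate."""
    return monthly_visitors * click_thru_rate * signup_conversion * avg_plan_price

before = monthly_revenue(0.01)              # 1.0% click-thru rate (control)
after = monthly_revenue(0.01 * (1 + 4.92))  # +492%, the low end for Variation C

print(f"Before: ${before:,.0f}/mo  After: ${after:,.0f}/mo")
# Before: $640/mo  After: $3,789/mo -- same traffic, same product, new copy.
```

Even with conservative assumptions, leaving the winning variant unshipped leaves real money on the table, month after month.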
