A/B Testing Summary and Review

by Dan Siroker and Pete Koomen

Has A/B Testing by Dan Siroker and Pete Koomen been sitting on your reading list? Pick up the key ideas in the book with this quick summary.

What are the first things you do when setting up a business? You come up with a name, and maybe a logo, and you create a website. Most potential customers will find out about you by clicking on your website; many will buy what you sell online, too. Clearly your website is very important.

So you need to optimize it – to find the format that makes it easiest for people to use. And how do you do this? Through A/B testing. This book summary will show you how this method can significantly increase the number of people clicking on your site.

In this summary of A/B Testing by Dan Siroker and Pete Koomen, you’ll discover

  • how A/B testing helped Obama get elected;
  • why when it comes to websites, less is more; and
  • why even a failed experiment is a good one.

A/B Testing Key Idea #1: If you want to optimize your website and engage your visitors, try A/B testing.

Thanks to the internet, companies have unprecedented opportunities to find new customers. All you need is a website, and then anyone can see – and buy! – whatever you’re selling. But although it sounds easy, creating a site that entices people to spend is difficult. Luckily, there’s a surefire way to figure out what works: an A/B test.

The principle behind A/B testing is simple: By showing different versions of your site to randomly selected test groups over a short period of time, you generate data on which option is the most effective. This data offers organizations huge opportunities to increase visitor engagement.
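To make that concrete, here’s a minimal Python sketch of the mechanics. The 50/50 split, the function names and the in-memory counters are illustrative assumptions, not anything from the book: each visitor is hashed into group A or B, shown the corresponding version of the page, and counted when they complete the goal action.

```python
import hashlib
from collections import defaultdict

# Illustrative in-memory counters; a real site would send these events to its analytics backend.
impressions = defaultdict(int)
conversions = defaultdict(int)

def assign_variant(visitor_id: str) -> str:
    """Hash the visitor ID into a stable 50/50 split between variants A and B."""
    bucket = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16) % 2
    return "A" if bucket == 0 else "B"

def record_impression(variant: str) -> None:
    impressions[variant] += 1

def record_conversion(variant: str) -> None:
    conversions[variant] += 1

def conversion_rate(variant: str) -> float:
    """Share of visitors in a variant who completed the goal action (e.g. signed up)."""
    return conversions[variant] / impressions[variant] if impressions[variant] else 0.0
```

Hashing the visitor ID – rather than picking randomly on every pageview – keeps each visitor in the same group for the whole test, which is what makes the two groups comparable.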

For instance, the 2008 Obama campaign website allowed users to leave their email addresses if they wanted to sign up for a newsletter or contribute to the campaign. The page displayed a photograph of the candidate in an ocean of supporters waving “Obama 2008” flags next to a field for entering email addresses, along with a “sign up” button.

The Obama team wondered whether this was the best possible image and button combination. After a series of A/B tests using different images and texts, they found a winner: Visitors were 40.6 percent more likely to share their email address when the website showed a photo of Obama surrounded by his family, next to a button that said “Learn more.” For the Obama campaign, this change resulted in 2.8 million more email subscribers and an additional $57 million in donations.

As you can see, A/B testing offers huge advantages. And it’s worth noting that although this testing technology used to be complicated and expensive, that’s no longer the case. In recent years, A/B testing has become an established method to improve web pages. And to facilitate this, web optimization platforms, such as Optimizely, have made it easy for anyone to implement A/B testing.

A/B Testing Key Idea #2: In order to start A/B testing, you need a clear hypothesis and a definition of success.

Are you ready to start optimizing your website? Great. But before you begin, you need to consider exactly what you want to test, because without a definition of success and a hypothesis, the test won’t tell you much.

So start by clearly defining quantifiable success metrics – that is, metrics which measure whatever data is most relevant to you – in order to evaluate the testing later.

For instance, let’s say you’re an online magazine. It wouldn’t be a very good strategy to implement A/B testing that simply measured clicks, because clicks don’t indicate what readers think about the content. Instead, shares, comments or repeat visits might say more about what kind of content resonates with your readers.

On the other hand, if you’re an ecommerce company, success metrics might be completed purchases, products added to cart, or product pageviews. Ultimately, success metrics will vary according to your site.
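In code terms, choosing a success metric simply means deciding which logged event counts as a “success” and what to divide it by. A rough sketch – the event names and data structure below are assumptions for illustration only:

```python
def success_rate(events: list[dict], success_event: str, per: str = "visitor") -> float:
    """Successes divided by unique visitors or by pageviews.

    events: dicts like {"visitor": "v42", "type": "pageview" | "share" | "purchase" | ...}
    """
    successes = sum(1 for e in events if e["type"] == success_event)
    if per == "visitor":
        denominator = len({e["visitor"] for e in events})
    else:
        denominator = sum(1 for e in events if e["type"] == "pageview")
    return successes / denominator if denominator else 0.0

# An online magazine might track shares per visitor; an ecommerce site, purchases per visitor.
# magazine_metric = success_rate(events, "share")
# shop_metric = success_rate(events, "purchase")
```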

But as we mentioned earlier, success metrics aren’t the only thing to think about: You should also come up with a clear hypothesis.

To better explain this principle, let’s start with an example: The Clinton Bush Haiti Fund was established in the aftermath of the 2010 Haiti earthquake. To optimize its text-only donation page, the organization brought in Optimizely. They hypothesized that adding a picture above the donation information field would increase contributions – but surprisingly, the opposite happened.

So they formed a new hypothesis: Maybe, they speculated, the picture deterred contributions because visitors had to scroll further to find the donation field, making the process a little more time-consuming. Testing the theory, they placed the image beside the field instead – and this was a success! The optimized layout led to over one million dollars in additional aid. Imagine if they had just tried things at random instead of using testing based on hypotheses – they might never have found this solution.

A/B Testing Key Idea #3: A/B testing is a great way to explore major changes for your site.

Although A/B testing is great for tweaking and refining your online presence, it can also lead you to make major, widespread changes in how you structure your site.

This was what Disney experienced when it ran an A/B test on the homepage of one of its TV networks, ABC Family. When they examined the search logs, the Disney team noticed that a lot of visitors were searching for specific shows. So instead of making minor tweaks, they decided to create a completely different homepage structure that listed all the shows, making them easier to find.

Disney’s goal was to increase the number of clicks on the experimental page by 10 to 20 percent – a target they easily surpassed. In fact, as a result of the A/B tests, engagement went up by 600 percent.

The apparel manufacturer Chrome also used A/B testing to radically rethink its site. The company had three boxes of promotional content on its homepage, and it assumed that A/B testing would show that the left one did best because visitors saw it first. Instead, it learned that the central box always outperformed the others, regardless of content. This insight eventually led to a completely different website design.

Along those lines, it’s important to note that when you’re undergoing a major site redesign, A/B testing can be an important element during – and not just after – the process.

In fact, Netflix used A/B testing in this way when the company redesigned its user interface in 2011. The original interface suggested just four titles, each with a star rating and a play button underneath. But then the video service tested another variation, which showed near-endless rows of thumbnails with pictures and titles users could scroll through. This variation was hugely effective: It not only improved retention, it also increased engagement!

A/B Testing Key Idea #4: Declutter your site design to boost user engagement.

Imagine visiting a website and being immediately bombarded with images, pop-ups and requests to sign up. No doubt you would leave as soon as possible and never return.

This is something most of us intuitively grasp: When it comes to site design, less really is more. In fact, A/B testing has shown that removing any fields that aren’t absolutely crucial will have a huge impact on engagement.

For instance, returning to Optimizely’s work on the Clinton Bush Haiti Fund’s donation page, the optimization firm noticed that there were fields for users to input “Phone number” and “Title.” However, since the charity had no use for this information, Optimizely set up an A/B test for a version without these fields. The results were impressive: This small change led to an 11 percent increase in donated dollars per pageview.

So what can you do to leverage these findings and decrease clutter on your own site? Well, you can make use of hide functions and also break up longer forms onto multiple pages.

By using a hide function (a feature that displays information only when you ask for it) for the promotion code and shipping options forms in its checkout flow, retailer Cost Plus World Market increased revenue per visitor by 15.6 percent.

Of course, there are occasions when you can’t declutter or remove fields. In those cases, you should break a longer form into more pages.

This approach worked for Obama’s 2012 re-election campaign. The donation page simply couldn’t be pared down any more – every element was vital. But then the campaign team had an idea: In order to make the form appear shorter, they broke it down into two separate pages – one with the amount donated, the other with personal details. Together with other similar alterations, this small change amounted to an extra $190 million in donations.

A/B Testing Key Idea #5: Using the right language will engage people and make your website more effective.

A picture is worth a thousand words, as the saying goes. And although there’s a lot of truth to that, A/B testing shows us that if you really want to engage people, you need to find the right words.

And one way language can help make your website more effective is by making the meaning of clicks clearer to the end user.

So for instance, with the Clinton Bush Haiti Fund donation page, instead of using the standard “Submit” button on the donation form, the A/B testers experimented with a variation that said “Support Haiti.” Their hypothesis was that this language would make the meaning of the donation clear.

And they were right – it worked! The trial version made a huge difference, resulting in several extra dollars donated per pageview.

There’s another way to use language more effectively, especially if you want your website visitors to take action: use verbs and not nouns whenever possible.

This principle has been borne out by A/B testing, which has shown time and time again that if you want someone to do something, you need to tell them to do it. And this requires using the call-to-action imperative of a verb rather than the more descriptive function of a noun.

For example, LiveChat, a company that sells software which allows businesses to converse with visitors in real time, was hosting a free trial campaign to attract new clients. To improve the campaign’s impact, visual designer Lucy Frank wanted to test whether using a verb would increase customer action, so she set up an A/B test that pitted “Free Trial” against “Try It Free.” Changing just two words and using a verb improved the campaign click-through rate by 14.6 percent.

A/B Testing Key Idea #6: When it comes to A/B testing, failure is a blessing.

Scientists often say that there’s nothing more exciting than discovering something that doesn’t work, because you can learn so much from failure. Well, exactly the same applies to A/B testing: Failure can be a blessing.

This principle applies to every sort of failure, including the small ones. Because even dead ends that seem insignificant now can be useful later on.

For example, when e-commerce company Chrome used images to promote its urban biking products, it wondered whether using videos instead would improve conversion rates. But after a three-month trial period, A/B testing revealed that using a video instead of an image had no significant impact.

Although you could say that this test “went nowhere,” it was actually quite enlightening. Because now, if Chrome ever wants to implement video for future products, they can safely assume that it won’t hurt conversion rates.
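For context, “no significant impact” is normally decided with a statistical test comparing the two conversion rates, not by eyeballing them. Below is a rough sketch of a standard two-proportion z-test; the sample numbers are invented for illustration and are not Chrome’s actual data.

```python
from math import erf, sqrt

def two_proportion_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Convert the z-score to a two-sided p-value via the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Invented numbers: 480 conversions out of 10,000 image views vs. 465 out of 10,000 video views.
p = two_proportion_p_value(480, 10_000, 465, 10_000)
# p comes out around 0.6 – far above the usual 0.05 threshold – so the difference
# between image and video would be reported as not statistically significant.
```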

There’s another way failed A/B tests can help you, and that’s by giving you important information about your users.

In fact, this was gaming website IGN’s experience. In order to drive more traffic to the dedicated video site (which brought in a substantial part of their ad revenue), IGN tested the effect of moving the “Videos” link in the navigation bar from the far right to the far left. As it turned out, the move had a disastrous effect: Video link clicks decreased by an astonishing 92.3 percent.

But there was a silver lining! IGN quickly realized that this had occurred because most of the site’s users were return visitors, who weren’t able to find the link once it was moved. So although this test didn’t immediately improve traffic, it did teach IGN a valuable lesson about how many of its visitors were returning users, and not new ones.

A/B Testing Key Idea #7: Convince your colleagues to implement data-driven approaches to business.

Just as political movements need activists and lobbyists to bring about change, A/B testing needs evangelists. And if you want to implement this practice within your own organization, you might need to become a spokesperson for it.

And one of the best ways to convince other people of the impact of data-driven approaches is to communicate the results of simple tests that demonstrate the value of A/B testing. Regularly presenting images of the variations alongside their test results will convince stakeholders that data-driven testing works – and brings enormous benefits.

For example, Scott Zakrajsek, the global web analytics manager at Adidas, often uses safe bets – that is, A/B tests that are likely to quickly result in a clear win – to help others understand how A/B testing works.

Doing this on a regular basis and emphasizing the value of continuous improvement via A/B testing can lead to it becoming part of company culture.

This was Lizzy Allen’s experience. When Allen first joined IGN as an analyst in 2010, she was surprised that no one at the company had ever heard of A/B testing. She dedicated a month to introducing the concept to the company, and in order to make sure that her colleagues were engaged, she created the A/B Master Challenge. IGN’s employees were asked to vote on likely outcomes of A/B tests, and to predict which variations would prove to be the most effective.

Although pretty much everyone failed at making accurate predictions, the company developed a shared sense of humility. Thus people realized that there was a lot they could learn from data-driven approaches, which led to A/B testing becoming an established part of company policy.

In Review: A/B Testing Book Summary

The key message in this book:

If you want to optimize your website and engage your visitors, try A/B testing. But to adopt this data-driven approach, you need a clear hypothesis and a definition of success. Ultimately, doing this will engage your users and your employees – and make your website more effective.

Actionable advice:

Use a testing platform.

If you don’t have your own A/B technology, using a testing platform like Optimizely will allow you to conduct tests simply and easily without hiring a whole department of data analysts.

Suggested further reading: Contagious by Jonah Berger

Contagious examines what makes a product, idea or behavior more likely to be shared among many people. The book explores the question of whether contagious things are accidents or the results of good marketing, or whether contagiousness is an inherent feature of a product, idea or behavior. It argues that, far from being merely a matter of luck, the majority of very popular products and ideas are the result of a combination of savvy planning and execution.