<p>In this post we will go through all the steps to adopt A/B testing as a common practice in your project.<br></p><p>TL;DR: Follow our checklist with all the insights gathered from running A/B tests on our projects: <a href="https://devchecklists.com/ab-testing-checklist/">A/B Testing Checklist</a> & <a href="https://github.com/vintasoftware/ab-testing-checklist">A/B Testing Checklist Repository</a><br></p><h2 id="why-a-b-tests">Why A/B Tests</h2><p>Knowing what customers want can be a matter of life or death for many companies. From the most basic components of your site to major changes in the flow, it's important to know: which type of communication performs best, which images please your target users the most, and what makes customers follow the flow.<br></p><p>Any product can be tested, and testing brings heaps of value to your company. For a small product, you can do this by opening a communication channel with users or by running usability tests, for instance. However, not everything valuable will show up on the radar of these tests, and this strategy scales poorly: imagine having to schedule exploratory meetings with hundreds, perhaps thousands, of users and crunch all that data, on top of the long list of features I'm sure you already have to implement. Then come the methodological questions: how do you evaluate which scenario performs better? And how do you make sure your test scenario lines up with the real one? Things just got a little too complicated to handle by yourself while developing, managing, or mentoring.<br></p><h2 id="what-are-a-b-tests">What are A/B tests</h2><p>A/B tests are a remarkable way of getting this information, regardless of the question you want to answer. They work through a simple mechanism, with crystal-clear goals. A simple definition of A/B testing is: the practice of serving two variations of the same scenario to different segments of visitors at the same time and comparing which variation best achieves a specific goal. "A" is the control group, the current state of your system. "B" is the modified experience that tries to achieve that goal.<br></p><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/xznTOiUKxMRh693IniK3uixaSPOcaurwSZjMe3wSW8TGIwkJ1haX3c1emCUHtp1IHQwlXaFhcB6Vw0QsoP_-hs3oK3DIpfq26Y7qLdGaoyyVCnidYuXQnw49E8dKsqVNZEK2C9zx" class="kg-image"></figure><p>You can pursue one goal or many to enhance the experience, such as the following:</p><p>- <strong>Content Engagement</strong>: You want customers to engage with your product; evaluating and measuring every aspect of the content gives both you and your clients a better perspective on the product.</p><p>- <strong>Conversion rates</strong>: Conversion metrics are unique to each website. For an e-commerce site, they may be the gross sales of the products, or revenue, while for a B2B site, they may be the generation of qualified leads.</p><p>- <strong>Abandonment rates</strong>: This metric is very useful for e-commerce sites because it is directly associated with the use of virtual shopping carts. Cart abandonment is quite common, and knowing the ratio of abandoned shopping carts to initiated transactions can be crucial to increasing conversions.</p><p>- <strong>Bounce Rates</strong>: Bounce rate is a measure of "stickiness": the percentage of visitors who enter the site and then leave rather than continuing to view other pages within it.</p><p>- <strong>Others</strong>: Other metrics can be used as well, like page views, session duration, clicks, share rate, churn rate, or any other success metric that may impact your business. The sketch below shows how a few of these rates fall out of raw event counts.<br></p>
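<p>To make these metrics concrete, here is a minimal Python sketch of how they can be computed from raw event counts. The event names and numbers are made up for illustration:</p><pre><code class="language-python">
# Hypothetical daily event counts pulled from your analytics tool.
events = {
    "sessions": 12_000,             # total visits
    "single_page_sessions": 6_600,  # visits that saw exactly one page
    "carts_created": 1_800,         # checkouts initiated
    "orders_completed": 540,        # checkouts finished
}

conversion_rate = events["orders_completed"] / events["sessions"]
abandonment_rate = 1 - events["orders_completed"] / events["carts_created"]
bounce_rate = events["single_page_sessions"] / events["sessions"]

print(f"Conversion rate:  {conversion_rate:.1%}")   # 4.5%
print(f"Abandonment rate: {abandonment_rate:.1%}")  # 70.0%
print(f"Bounce rate:      {bounce_rate:.1%}")       # 55.0%
</code></pre>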
<p>Through A/B testing you will be able to see which variant impacted your customers the most, and then alter your strategy to be more appealing to your target audience. Check further examples in this Basecamp post, <a href="https://signalvnoise.com/posts/3945-how-we-lost-and-found-millions-by-not-ab-testing">How we lost (and found) millions by not A/B testing</a>, and see how much monetary value they found in A/B testing.<br></p><h2 id="where-on-your-product-to-do-a-b-tests">Where on your product to do A/B tests?</h2><p>Let's take the Basecamp example from the link above. In 2014, there was an odd drop in sign-up rates on Basecamp's website. The sign-up form had been removed from their homepage by a design iteration.<br></p><p>After that, metrics plunged. The issue was assessed and addressed within two weeks of A/B testing on the homepage, gathering the information needed to make an informed decision about which homepage was best for their business. After the sign-up form was brought back, sign-ups spiked. Basecamp kept the sign-up form on their homepage until very recently:<br></p><figure class="kg-card kg-image-card"><img src="https://lh5.googleusercontent.com/IzqTAGhyu7H6ErYrOtGJZsdMEwewBqlogZ3kJIZ8J14zOkT_EajH1YvL2T1Lq84xvlVOF8fH8AtK_pBthEzQYVz_lHgXPBwmDLYRKWZclMBtPgKqOyn-vW5eOvrtP-3P2ElT6y3T" class="kg-image"></figure><p>Nowadays, they have changed it once more, this time to a call-to-action button.<br></p><p>Landing pages are pages designed to convert visitors into leads, and testing different versions of them is just one of the many things you can do. You can A/B test any component of your website that might influence visitor behavior:<br></p><p>- Headlines</p><p>- Sub-headlines</p><p>- Paragraph Text</p><p>- Testimonials</p><p>- Call to Action Button</p><p>- Links</p><p>- Images<br></p><figure class="kg-card kg-image-card"><img src="https://lh3.googleusercontent.com/KsyNEyOCwtlvVYxeiNMf7W1oE9LZPqnyjYVYBKy2TtSvWEAvCttAzqDmg91vfUoSfn_5CwRRTmBZp62_Fkv0a2TCk-yk_skAre-mdol4mLLu-f2AoGOnzLtMgmhArXLA6wd3l-bm" class="kg-image"></figure>
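<p>Whichever component you test, visitors have to be split between variants consistently, so that a returning visitor always sees the same version. Testing tools handle this for you, but here is a minimal sketch of the underlying idea, bucketing a hypothetical visitor ID deterministically with a hash:</p><pre><code class="language-python">
import hashlib

def assign_variant(visitor_id: str, experiment: str) -> str:
    """Deterministically assign a visitor to 'A' (control) or 'B' (50/50 split)."""
    # Hash the visitor together with the experiment name so the same
    # person can land in different buckets across different experiments.
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # uniform in 0..99
    return "B" if bucket >= 50 else "A"

# The same visitor gets the same variant on every visit:
print(assign_variant("visitor-42", "homepage-signup"))
print(assign_variant("visitor-42", "homepage-signup"))  # identical output
</code></pre>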
<p>Let's take another example: Netflix is also a strong adopter of the experimenting philosophy. Every product change Netflix considers - yes, I'm surprised by that too - goes through a rigorous A/B testing process before becoming the default user experience. In their blog post <a href="https://medium.com/netflix-techblog/selecting-the-best-artwork-for-videos-through-a-b-testing-f6155c4595f6">Selecting the best artwork for videos through A/B testing</a> they show a commitment to testing everything possible: even the images associated with film titles are A/B tested, sometimes resulting in 20% to 30% more views for a title.</p><figure class="kg-card kg-image-card"><img src="https://lh6.googleusercontent.com/6pqzSRWg9-XYWAaplGiQZV9e9UCQGKSqJ_zhh6D9lbXY0uejaFzjqR6g1SePeVPx84J6EtvsjUf3teocD3qjNiIUS-FYA6dWWAyuL3SezI2MkUjk7bhkmbKCb-Xgzo0KijlB_LiS" class="kg-image"></figure><p>Finishing up the round of examples, Facebook has recently carried out some experiments in its own app.</p><figure class="kg-card kg-image-card"><img src="https://lh5.googleusercontent.com/6ISLLJJKFdEVHo4IhVkdhd94kSD0eSW_-L3Pni3OM5LOU4LNmRiP9Sn9b6ki3oStQgQitgwi2qsRDU6QRY-fcAYWm7HkjPrdLeOlsI8VvZQAUQDGwmI_GaU8F5JcJt2JqEbxIDiC" class="kg-image"></figure><p>Check out the usual header of the app. They tested minor decisions around design and information architecture, on the assumption that these things also need to be validated in the real world; more than 12 different icons ended up being tested on the tab bar.</p><h2 id="how-to-perform-a-b-tests">How to perform A/B tests</h2><p>Although it's relatively easy to set up an experiment, don't fall into the trap of adding experiments to all of your features. You need to be strategic about what you test. Testing everything just for the sake of testing isn't experimenting, nor will it validate everything. It's not possible to test every single possibility, and even if you could, doing so would only make you good at A/B testing while delaying the development of the right product.<br></p><p>From our research and experience here at Vinta, a well-structured A/B testing flow is:</p><p>- <strong>Start with research</strong>: Every project has room for improvement: look for those opportunities and discuss them with the team. Back these insights with data: use the analytics tools available, such as Google Analytics, heatmaps, and surveys, to collect data on visitor behavior and metrics. Use all this data to find out what is stopping visitors from converting. Remember this: <em>one in every seven A/B tests is a winning test</em>; doing this step right can improve that statistic and save you some time.</p><p>- <strong>Formulate a hypothesis</strong>: This is the time to build a well-defined hypothesis. Based on the insights from your research, build a hypothesis to increase conversions or achieve one specific goal. Here's an awesome article about formulating a good hypothesis: <a href="https://www.abtasty.com/blog/formulate-ab-test-hypothesis/">A/B Test Hypothesis Definition, Tips and Best Practices</a></p><p>- <strong>Create a variation</strong>: This is when you build the variation based on your hypothesis, to be A/B tested against the existing version. If you don't use a service to build experiments, this is where the dev team works to bring your variation to life.</p><p>- <strong>Test</strong>: Kick off the test and wait for the stipulated time in order to reach a statistically significant result. Unsure about how to define the experiment duration? Here's a great way to do it: <a href="https://vwo.com/ab-split-test-duration/">A/B split & multivariate test duration calculator</a> (there is also a rough sample-size sketch right after this list).</p><p>- <strong>Analyze results and draw conclusions</strong>: Analyze the test results keeping in mind monthly visitors, current conversion rate, the expected change in conversion rate, weekday/weekend/holiday variations, seasonality of your traffic, sample size, and any other variables that can interfere with your results. In case of a positive outcome, deploy the winning variation. If the results remain inconclusive, draw insights from them and apply those in a subsequent test.</p>
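<p>As mentioned in the testing step, duration comes down to how many visitors you need before a result is statistically meaningful. The calculator linked above does this for you, but here is a rough sketch of the classic two-proportion sample-size formula behind tools like it; the baseline and uplift numbers are made up:</p><pre><code class="language-python">
from scipy.stats import norm

def sample_size_per_variant(p_base, p_target, alpha=0.05, power=0.80):
    """Approximate visitors needed per variant for a two-sided z-test."""
    z_alpha = norm.ppf(1 - alpha / 2)  # about 1.96 for alpha = 0.05
    z_beta = norm.ppf(power)           # about 0.84 for 80% power
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    effect = p_target - p_base
    return int((z_alpha + z_beta) ** 2 * variance / effect ** 2) + 1

# Baseline conversion of 4.5%, hoping to detect a lift to 5.5%:
print(sample_size_per_variant(0.045, 0.055))  # roughly 7,500 per variant
</code></pre><p>Note how sensitive the result is to the effect size: halving the expected lift roughly quadruples the traffic you need, which is why tiny improvements take so long to validate.</p>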
<h2 id="useful-tools-to-a-b-test">Useful Tools to A/B test</h2><p>We have talked before about analytics tools; they are important for any company. One of them is Google Analytics, which provides a way to analyze your customers' behavior. If you don't have a web analytics package on your site, this is something you should fix right away: sign up for Google Analytics and start collecting data to base decisions and tests on. You can't do A/B tests without baseline data.<br></p><p>When it comes to actually implementing the experiments, there are many options available, such as <a href="https://www.optimizely.com/">Optimizely</a>, <a href="https://vwo.com/">VWO</a>, <a href="https://www.adobe.com/marketing/target.html">Adobe Target</a>, <a href="https://optimize.google.com">Google Optimize</a>, <a href="https://www.abtasty.com/">AB Tasty</a>, and many others. Here at Vinta, we recommend Google Optimize, which is easy to set up, freemium, and integrated with Google Analytics by default. Don't waste your time trying to build your own platform. In my experience, Optimize will cover 95% of your needs, and when it doesn't, you can complement it with an implementation of your own; they even provide a way for you to perform <a href="https://developers.google.com/optimize/devguides/experiments">Server-side Experiments</a>.<br></p><h2 id="google-optimize">Google Optimize</h2><p>As I said, we use Google Optimize because it's simple to set up and use. It's integrated with Google Analytics, and the free plan already includes the basic features needed by any application. In short, Optimize provides four types of experiments:<br></p><figure class="kg-card kg-image-card"><img src="https://lh5.googleusercontent.com/wbFteQVuDrx1wcveR6CKC7JsGjRPYAPtG4QFcCtZkI1qj7BmupC0XVltFc_6e2TyZ2SMals7f-Jse0r0EWyyx-X2gXm3b3DNylJx_9yuQzBbmPDIT7fCfMSHZu8LollnGplfg11U" class="kg-image"></figure><p>- <strong>A/B tests</strong>: The A/B test in its basic form. A randomized experiment using two or more variants of the same web page (A and B). Each variant is served at similar times so that its performance can be observed and measured independently of external factors.</p><p>- <strong>Multivariate Tests</strong>: A multivariate test (MVT) tests variants of two or more elements simultaneously to see which combination creates the best outcome. Instead of showing which page variant is most effective (as in an A/B test), an MVT identifies the most effective variant of each element and analyzes the interactions between them. This is very useful for optimizing multiple aspects of a landing page. The drawback of MVTs is that they can lead to unreliable results: with multiple variants, your visitor volume is spread across all of them, resulting in longer tests and the risk of never reaching the statistical reliability needed for decision-making (the sketch after this list shows how quickly the combinations multiply).</p><p>- <strong>Redirect Tests</strong>: A redirect test is a type of A/B test that allows you to test different web pages against each other. In redirect tests, variants are identified by URL or path instead of by page components. They are useful when you want to test two very different landing pages, a complete redesign of a page, or two different flows.</p><p>- <strong>Personalizations</strong>: In case you want to build a landing page for customers who access the site from a specific campaign, Optimize allows you to build a completely new page without the need for an actual deploy.<br></p>
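<p>To see why multivariate tests need so much more traffic than plain A/B tests, consider how fast element combinations multiply. A tiny sketch with made-up page elements:</p><pre><code class="language-python">
from itertools import product

# Hypothetical page elements under test and their variants.
headlines = ["Grow faster", "Ship with confidence", "Built for teams"]
hero_images = ["photo", "illustration"]
cta_buttons = ["Start free trial", "Book a demo"]

combinations = list(product(headlines, hero_images, cta_buttons))
print(len(combinations))  # 3 * 2 * 2 = 12 page versions to compare

# With 12,000 visitors split evenly, each combination gets only
# 1,000 visitors, often too few to reach statistical significance.
print(12_000 // len(combinations))  # 1000
</code></pre>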
<p>After creating the experiment, Optimize gives you all the tools to create variations, define traffic allocation, set goals, and more:<br></p><figure class="kg-card kg-image-card"><img src="https://lh3.googleusercontent.com/CUe__8d_lL67pdTpl-3rlAxJcWaiRn5eCXxvAF7jCAWhceloMYj75UDJ-SP2lROGws9lK71c6oZPAD89mX0_1PHNg2KMLa_NAXxgjtuiajJnxBtT6rsrObN381PBQO0csJDGghal" class="kg-image"></figure><p>- <strong>Create and edit variants</strong>: The specific changes to your web page are called variants. Optimize provides an editor through which you can create as many variants as you wish to test against your original page.</p><p>- <strong>Multiple options for user segmentation</strong>: The selection of which users will participate in your test can be done in various ways: by URL, by traffic source, by campaign, by Google Analytics audience, by geolocation, or even randomly.</p><p>- <strong>Goals</strong>: On Optimize, goals are also called objectives, and they are synced with your Google Analytics goals. They are the metrics used to measure your variants.</p><p>- <strong>Preview variants</strong>: Before publishing an experiment you can preview your changes to check that everything looks as planned.</p><p>- <strong>Consistency when showing experiments on the same device</strong>: Once a user accesses one of the variants, they will keep seeing that same variant for as long as the experiment is running.<br></p><p>Beyond creating and running experiments, Optimize also provides a default result analyzer that uses the <a href="https://www.analyticsvidhya.com/blog/2016/06/bayesian-statistics-beginners-simple-english/">Bayesian</a> method to analyze the performance of the experiment. The following images are an example of what Optimize reports can look like. The results are divided into sections: the first is a summary that displays the experiment status and an overview of the results; the second is the improvement overview card, which compares the performance of the original against your variant(s), with their percentage improvement on the experiment's objective(s); the third displays the performance of your variants against a chosen objective.</p><figure class="kg-card kg-image-card"><img src="https://lh4.googleusercontent.com/8nvNzS5mA5DkubENaypxotGetLD7Fqm5wNfYkvaKnH6r5rc_bI_1E8bZ3b_PHZJp97ICxboVc-eyDhXDj2UXLdwMYT6Q-cfFR7aUUWukw0gck_2KVPh1eRoDg3sBSYcNxvRtIXV0" class="kg-image"></figure><figure class="kg-card kg-image-card"><img src="https://lh6.googleusercontent.com/wBbV5Qu2xFGT8k6lPc4_kbijOsIBeCASFRE7yqybVXZVTKsmbXaLFHOPkYsquc9A5xxVU0aaldEdMdKTOiLrX2fvIX0oL_plfiJ1kwF6EJfamsQx_PnmEFx13La46tQpE3Va9qeJ" class="kg-image"></figure><figure class="kg-card kg-image-card"><img src="https://lh5.googleusercontent.com/C1EgRDqTAh9GKiS9vgVXIB4Dm7qyxAiybVXU-8gQLYfweG0-0UZfXP3p7uGBQL3bYnORolqWD7anvLMUCgEcYyeJE_p64gIAPd_e7i6-p7VmM9t6mupARdRCaLf7IOn_VlQq8aRl" class="kg-image"></figure>
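<p>If you're curious about what the Bayesian comparison mentioned above means in practice, here is a minimal, illustrative sketch (not Optimize's actual model) using Beta posteriors over two conversion rates to estimate the probability that the variant beats the original. The counts are made up:</p><pre><code class="language-python">
import numpy as np

rng = np.random.default_rng(42)

# Made-up results: (conversions, visitors) for each arm.
a_conv, a_n = 540, 12_000   # original
b_conv, b_n = 630, 12_000   # variant

# With a uniform Beta(1, 1) prior, the posterior of each conversion
# rate is Beta(conversions + 1, non-conversions + 1).
draws = 100_000
p_a = rng.beta(a_conv + 1, a_n - a_conv + 1, draws)
p_b = rng.beta(b_conv + 1, b_n - b_conv + 1, draws)

prob_b_beats_a = (p_b > p_a).mean()
expected_lift = (p_b / p_a - 1).mean()

print(f"P(variant beats original): {prob_b_beats_a:.1%}")
print(f"Expected relative lift:    {expected_lift:.1%}")
</code></pre>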
<p>A/B testing is hard, even though it doesn't look like it. Since here at Vinta we are huge fans of checklists, we've developed a <a href="https://devchecklists.com/ab-testing-checklist/">checklist</a> that provides an easy and quick set of things we recommend you keep in mind when A/B testing. The checklist is linked to <a href="https://github.com/vintasoftware/ab-testing-checklist">this GitHub repository</a>, so feel free to contribute as well.<br></p><p>Want to know more? There's a free course on Udacity for you: <a href="https://www.udacity.com/course/ab-testing--ud257">A/B testing by Google</a><br></p><p>Also, a big thanks to Amanda and Rob for reviewing this post!</p>