Developers Test for Success: How to Accelerate Growth Through A/B Testing
by AppLovin on May 24, 2022

A/B testing is a critically important tool for driving long-term growth and sustainable revenue for an app. It helps you understand how your users convert, how they behave within your app, what attracts new users, and what keeps them engaged.

So how can you implement this important optimization technique? Read on to learn proven tips and strategies for A/B testing to help accelerate growth and retention, and how to create your own A/B test in AppLovin MAX.

What is A/B Testing?

A/B testing allows you to divide an audience into two (or more) groups, change a single variable in the experience served to each group, and then observe how that change affects the test group. Armed with that data, you can make informed decisions to improve your users' experience, acquire new users, and better monetize your app.
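
To make the mechanics concrete, here is a minimal sketch (not AppLovin code; the user ID field and the 50/50 split are illustrative assumptions) of how an app might deterministically assign each user to group A or B by hashing a user ID, so the same user always sees the same variant:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically bucket a user into 'A' or 'B' for a given experiment.

    Hashing (experiment name + user ID) keeps the assignment stable across
    sessions and independent between experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    # Map the first 8 hex characters to a number in [0, 1].
    bucket = int(digest[:8], 16) / 0xFFFFFFFF
    return "A" if bucket < split else "B"

print(assign_variant("user-123", "cta_button_color"))  # e.g. 'A'
```

A deterministic split like this also means you don't have to store the assignment anywhere to keep it consistent for a returning user.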

Different Types of A/B testing

There are two primary types of A/B testing.

  • In-app, where you test UX/UI, onboarding, and other elements while tracking metrics like session time, retention, engagement, and any other app-specific behaviors you care about.
  • Marketing campaigns, which include ASO (App Store Optimization) and let you, for example, test different ads and designs to see which works best to acquire new users, i.e. drive installs.

In MAX, A/B tests can be created to test things such as:

  • ARPDAU impact of frequency cap changes, refresh rates, and bid floors.
  • Waterfall optimization.
  • Network additions and removals.
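
ARPDAU (average revenue per daily active user) is simply a day's ad revenue divided by that day's active users. As a rough sketch of how you might compare the control and test waterfalls, here is an illustrative calculation over hypothetical per-variant daily totals (the figures and data structure are made up):

```python
# Hypothetical daily totals per variant: (ad_revenue_usd, daily_active_users)
daily = {
    "control": [(1250.0, 48000), (1310.0, 50500), (1275.0, 49200)],
    "test":    [(1402.0, 47800), (1390.0, 50100), (1415.0, 49600)],
}

for variant, days in daily.items():
    # ARPDAU for each day = revenue / DAU, then averaged over the test window.
    arpdau_per_day = [revenue / dau for revenue, dau in days]
    average = sum(arpdau_per_day) / len(arpdau_per_day)
    print(f"{variant}: average ARPDAU = ${average:.4f}")
```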

A/B testing examples to get you started

  • User acquisition / App Store Optimization (ASO)
    A/B testing can help you better serve and monetize existing users, and it can also help you understand how to attract new users by testing different app store strategies. This includes all of the different elements - images, screenshots, descriptive text, etc. - that make up your app's page in an app store. It's a good place to test what gets users to install your app.
  • In-app testing
    Within your app itself, you can A/B test a variety of elements. Don't forget to check:
    • Design: Is your CTA (Call-to-Action) button placed in the best location? Does it get clicked more on the left or the right side? Do different colors attract more clicks?
    • Player Engagement: Does showing competitive leaderboards increase player engagement? A/B testing can tell you.
    • Monetization: Try different banners or test different in-app purchases to find out which receive the most clicks.
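
As a loose sketch of how the click data behind these questions could be tallied, here is an illustrative count of impressions and clicks per variant to compare click-through rates (the event names and log format are assumptions, not a specific analytics SDK):

```python
from collections import Counter

# Hypothetical event log: (variant, event_type)
events = [
    ("A", "cta_impression"), ("A", "cta_click"), ("A", "cta_impression"),
    ("B", "cta_impression"), ("B", "cta_impression"), ("B", "cta_click"),
    ("A", "cta_impression"),
]

counts = Counter(events)
for variant in ("A", "B"):
    impressions = counts[(variant, "cta_impression")]
    clicks = counts[(variant, "cta_click")]
    ctr = clicks / impressions if impressions else 0.0
    print(f"Variant {variant}: {clicks}/{impressions} clicks = {ctr:.1%} CTR")
```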

Where should I focus my testing?

With so many options and potential elements to test, what are the most important elements to look at? Here are some common ones:

  • Retention Rate. According to one report, 55% of mobile app users abandon an app one month after downloading, and 21% abandon an app after launching it only once. After someone has downloaded your app, you could test any of the following to help figure out what keeps your users coming back (a sketch of how to measure retention per variant follows this list):
    • Promotional content, sales, and upgrade offers.
    • Specific product features. What's your app's "killer" feature that users love?
    • Different types of in-app purchases, user rewards, and more.
  • Onboarding. A streamlined, easy-to-understand, and painless onboarding process is critical, and A/B testing can help you understand where your users might be struggling and where they are successful so you can improve the experience for them. Figure out what your users love and they'll keep coming back. Here are some quick A/B test ideas you might use to help improve your onboarding:
    • Test different signup timeframes. Do users have to immediately sign in or do you give them some breathing room before asking? Giving them time to decide gives you more time to showcase your app and could be viewed as less intrusive by your users.
    • Test different onboarding forms. After a user signs up, is the process of filling in any forms as quick and easy as it can be? What type of login is required (social media, email, etc.)? Overly complex or confusing steps in the onboarding process can result in lost users, so consider testing different forms and flows to see what works best.
    • Test different content flows. Show users how many screens or steps remain in the onboarding experience so they know when it will end. You could test different layouts: displaying the number of screens left, dots at the bottom of the screen, or a progress bar showing the percentage completed.
    • Test different messaging. A/B testing allows you to try different messaging and see what works best so you can better understand player psychology and motivations and fine-tune your messaging. Keep text clear, concise, informative, and easy to read. Use short, action-oriented words to connect with your audience, such as: easy, simple, free, love, new, proven, and save. Try different messages and message lengths, keep an eye on your CTA, and you'll learn what truly resonates with users.
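
Here is the retention sketch referenced above: a hypothetical calculation of day-N retention per variant from install and activity dates (the data and field names are invented for illustration):

```python
from datetime import date, timedelta

# Hypothetical records: assigned variant, install date, and days the user returned.
users = {
    "u1": {"variant": "A", "installed": date(2022, 5, 1), "active": {date(2022, 5, 2), date(2022, 5, 8)}},
    "u2": {"variant": "A", "installed": date(2022, 5, 1), "active": set()},
    "u3": {"variant": "B", "installed": date(2022, 5, 1), "active": {date(2022, 5, 8)}},
    "u4": {"variant": "B", "installed": date(2022, 5, 1), "active": {date(2022, 5, 8), date(2022, 5, 29)}},
}

def day_n_retention(users: dict, variant: str, n: int) -> float:
    """Share of a variant's installers who returned exactly n days after install."""
    cohort = [u for u in users.values() if u["variant"] == variant]
    retained = sum(1 for u in cohort if u["installed"] + timedelta(days=n) in u["active"])
    return retained / len(cohort) if cohort else 0.0

for variant in ("A", "B"):
    print(f"Variant {variant}: day-7 retention = {day_n_retention(users, variant, 7):.0%}")
```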

How long should I test?

Generally, one to four weeks is enough time to gather meaningful data, with an upper limit of around four weeks. Why four weeks? Because A/B testing is not a strictly controlled environment, and sudden, unexpected factors (outages, unexpected trends, etc.) can affect your data. Allow enough time to collect data, but not so much that you risk polluting it with external factors.

How many users should I test?

According to Nielsen Norman Group, statistical significance is the probability that an observed result could have occurred randomly, without an underlying cause; this number should be smaller than 5% for a finding to be considered significant. If you test two button colors (A and B), track the clicks (conversion rate) for each, and find that button B's conversion rate is significantly higher, you can be 95% confident that button B will also convert better across all of your users.
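
To make the 5% threshold concrete, here is a minimal sketch of a standard two-proportion z-test (not an AppLovin tool; the click and impression counts are made-up examples) that checks whether button B's conversion rate is significantly different from button A's:

```python
import math

def two_proportion_p_value(clicks_a: int, n_a: int, clicks_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    return math.erfc(abs(z) / math.sqrt(2))

p = two_proportion_p_value(clicks_a=120, n_a=4000, clicks_b=168, n_b=4000)
print(f"p-value = {p:.4f}")  # below 0.05 -> treat the difference as significant
```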

How do I create an A/B Test in MAX?

If you've never created an A/B test before in MAX, here's how:

  1. From the MAX dashboard, in the left navigation column, select Mediation > Manage > Ad Units.
  2. Click the ad unit for which you would like to set up an A/B test.
  3. In the Edit Ad Unit window, in the Default Waterfall tab, open the ⋯ menu and then select Create AB Test.
  4. Check the Copy existing ad unit configuration box, then click Create AB Test. This copies your existing waterfall into your new test.
    Note: MAX copies your existing waterfall structure into the test structure you create, so you only need to make the changes you want to test in your new Test Ad Unit; you don't have to build a completely new waterfall from scratch.
  5. When you create a new A/B test, MAX applies the Test configuration to 50% of your users by default, but you can change this percentage.

Congratulations! You've created an A/B test. Now what?

Now that you have an A/B test to work with, what should you begin testing, and what are some good testing strategies?

  • Define what you want to test. You can test and analyze nearly anything, but it's a good idea to start with a hypothesis, which could be as simple as "Players of a certain age prefer orange buttons instead of purple."
  • Test one change at a time. While you can make multiple changes (called multivariate testing) in an A/B test, it's best to start with a single change. This makes it easier to zero in on what specifically is improving (or harming) results.