How to fortify your UA strategy
with creatives testing:
value in KPIs, pro tips & tricks
Maria Shestakova
CMO & App Growth Evangelist at SplitMetrics
Lucia Mrvová
Head of UA at AppAgent
Agenda
01 Importance of creative testing in UA: real success stories with KPI boosts
02 How to test: the golden rules of creative testing today
03 How to build hypotheses
04 Where to test: optimal platforms and formats
05 How not to test: mistakes to avoid
06 How to win without a winner: A/B tests without a winning variant
07 What changes to expect with iOS 14 & IDFA deprecation
08 Collaboration between the UA and creative teams
09 Q&A
The importance of creative testing in UA: why
With automation taking over more and more of what used to be the UA manager's job, the two biggest levers we can pull today are:
01 Optimization events (data)
02 Creatives
Creative itself has become a real targeting lever that resonates with particular audience segments.
Example 1
Character A: 402 installs, CTR 4.32%, CVR 15.95%
Character B: 247 installs, CTR 2.80%, CVR 9.85%
Character C: 517 installs, CTR 2.49%, CVR 11.30%
Character D: 496 installs, CTR 1.62%, CVR 25.95%
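To show how meaningful such gaps are, the CVR difference between Character A and Character D can be sanity-checked with a standard two-proportion z-test. This is only a sketch: the click counts are back-derived from the installs and CVRs on the slide (clicks ≈ installs / CVR), so they are approximate.

```python
from math import sqrt, erf

def two_prop_z(x1, n1, x2, n2):
    """Two-proportion z-test: z statistic and two-sided p-value
    (normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                   # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Character A: 402 installs at 15.95% CVR -> ~2520 clicks (back-derived)
# Character D: 496 installs at 25.95% CVR -> ~1911 clicks (back-derived)
z, p = two_prop_z(402, 2520, 496, 1911)
print(f"z = {z:.1f}, p = {p:.2g}")
```

With these numbers the difference is far beyond the usual significance thresholds, which is why a character swap alone can behave like a targeting change.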
Example 2
Variant 1: CTR 1.72%, CVR 68.38%, ROI 89.64%
Variant 2: CTR 2.09%, CVR 45.90%, ROI 89.34%
Variant 3: CTR 1.21%, CVR 48.15%, ROI 138.52%
Variant 4: CTR 3.61%, CVR 39.63%, ROI 77.45%
Example 3: ZiMAD gets a 36% conversion uplift by optimizing App Store screenshots for Japan.
Example 4: Lab Cave's A/B testing of App Store screenshots brings a 45.8% CVR boost.
Example 5: App Family Kids doubles app downloads by changing the app icon.
How to Test
The golden rules of testing today:
01 Integrate creative testing into an agile process
02 Focus on proving a concept first
03 Allow for statistical significance to be reached
04 Eliminate bias
05 Evaluate accordingly
06 Connect the dots & create a consistent user experience on the UA journey
The Testing Process
Creative testing cannot be a standalone discipline: it needs to continuously feed insights from completed tests back into creative ideation/conceptualization, and thus build new hypotheses and/or variants of existing concepts.
→ Integrate creative testing into an agile process and close the loop.
Creative concepts vs iterations
It is, however, important to test a few (2-3) significant iterations in order to avoid missing out on a potential winner due to a "wrong" element, as the difference between a winner and a loser is often very small.
→ Focus on proving a concept first before moving into extensive iterations.
The iteration cycle
[Diagram: Concepts 1-5 move through a loop: identification of key variables → creation of automated ads → tagging variables in the file name (Var 1 … Var 46) → running a new test campaign.]
Allow for statistical significance to be reached
We have found that reaching ±300 installs per ad tested on MAI delivers over 90% statistical significance.
Make sure to adapt volumes when moving to an AEO/VO setup, and pass the learning phase again.
[Chart: impression-to-install rates (0.001-0.007) for the winning result, loser, control, and inconclusive variants.]
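The "±300 installs" heuristic can be cross-checked against the textbook sample-size formula for comparing two proportions. The baseline rate and detectable lift below are illustrative assumptions, not numbers from the deck:

```python
def sample_size_per_variant(p_base, lift, z_alpha=1.645, z_beta=1.282):
    """Sample units (e.g. clicks, when testing click-to-install CVR)
    needed per variant to detect a relative `lift` over a baseline rate
    `p_base`, at roughly 90% confidence (one-sided) and 90% power,
    using the normal approximation."""
    p_var = p_base * (1 + lift)
    var_sum = p_base * (1 - p_base) + p_var * (1 - p_var)
    return (z_alpha + z_beta) ** 2 * var_sum / (p_base - p_var) ** 2

# Illustrative: detecting a 50% relative CVR lift over a 15% baseline
n = sample_size_per_variant(0.15, 0.50)
print(round(n))
```

The required sample lands in the low hundreds per variant for effects of this size, which is consistent with the order of magnitude of the rule of thumb; smaller lifts need substantially more volume.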
Eliminate bias
Force equal delivery by putting ads into separate ad sets.
Have a separate account for testing, OR make sure to use a brand-new campaign (no control).
Two-layered testing: re-test the winners from the MAI campaign on AEO/VO for ROAS.
Evaluate accordingly
Different channels tend to bring different quality of users, hence you might need different volumes to reach statistical significance when testing for bottom-funnel metrics.
Consider platform-specific KPIs: e.g. on YouTube you want to look at View Rate as one of the top-funnel KPIs.
Connect the dots & create a consistent user experience on the UA journey
Video ad → Store creatives → Onboarding
Connect the dots
We have seen recent cases where maintaining consistency throughout the acquisition journey has led to:
+76% YoY increase in organic visibility
+122% continual increase in store conversion
+58% YoY relative difference between CPI vs eCPI
+17% increase in D1 retention
+9% increase in D7 retention
+33% YoY revenue growth
How to test
The golden rules of app store testing today:
Start with visuals; keep description A/B testing for later.
Cut your long video down to 10-12 seconds and run an A/B test.
When limited by time or budget, run a 2-variation test.
Use multivariate A/B experiments ONLY to test concepts or when rebranding the app.
Pre-launch A/B tests: redirect users who tap "Get" to a survey page with 2-3 short questions.
How to build hypotheses: looking for hypothesis ideas
One of the most essential steps in creative testing is creating a hypothesis.
The WHY = the research you have to perform before forming a hypothesis.
A hypothesis includes:
The element you're going to change in the new variation
The predicted result of the change
The rationale, or the WHY, behind the test
Study current design and industry trends.
Find out best practices for your app or game category.
Examine your competitors.
Reconsider your app store product page.
Where to test
Optimal platforms and formats
How not to test
Mistakes to avoid:
01 Forgetting to look at top-funnel KPIs: consider the full funnel
02 Interrupting the test prematurely
03 Creating limitations through counterproductive goals
04 Using the wrong tools
Consider full-funnel KPIs
Looking at primary top-funnel KPIs will not only allow for more reliable comparisons but also save a significant amount of budget, as success/failure can be seen much faster.
Top-funnel KPIs are important metrics for creative testing.
Primary KPIs to consider: scale, CTR, IPM / IR, ROI
Secondary KPIs to consider: relevance score & engagement (likes, comments, etc.)
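As a minimal sketch, the primary top-funnel KPIs above can be computed from raw campaign counts like this (the input numbers are invented for illustration):

```python
def funnel_kpis(impressions, clicks, installs, spend, revenue):
    """Top-funnel KPIs commonly used when evaluating a creative test."""
    return {
        "CTR": clicks / impressions,           # click-through rate
        "IPM": installs / impressions * 1000,  # installs per mille impressions
        "IR": installs / clicks,               # install rate (click -> install)
        "CPI": spend / installs,               # cost per install
        "ROI": (revenue - spend) / spend,      # return on investment
    }

# Hypothetical ad variant: 50k impressions, 900 clicks, 310 installs
kpis = funnel_kpis(impressions=50_000, clicks=900, installs=310,
                   spend=620.0, revenue=980.0)
print({k: round(v, 4) for k, v in kpis.items()})
```

Comparing variants on these rates rather than on raw install counts keeps the comparison fair when delivery volumes differ between ads.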
Allow enough time to pass
Ideally, you also want the creative to exit the learning phase, so as to allow for reliable, optimized results.
Given that all apps experience weekly seasonality, it is important to test creatives for 7 full days.
Creating limitations
Creative testing is a numbers game and, as a rule, includes losing many times before finding the next big win.
The trick is not to avoid losing at all costs but, on the contrary, to lose fast and find a winner that will be able to compensate for all the losses accumulated on the way.
Using the wrong tools
8 limitations of Google Play Experiments:
01 A/B test results via Google Play Experiments differ from the ones from paid traffic sources.
02 Lack of data on user behavior: no data on average time spent on a variation page, screenshot scroll-through, etc.
03 Only 5 A/B testing experiments at a time.
04 You can test only app page visuals.
05 No tests on the Search and Category pages.
06 You cannot compare the performance of your app to your competitors.
07 No pre-launch tests.
08 No tests for the iOS version of your app.
Using the wrong tools
10 limitations of Creative Sets in Apple Search Ads:
01 Test results are only applicable to Apple Search Ads, where user behaviour is quite specific.
02 You cannot experiment with product page elements: app icon, name, promo text, etc.
03 You can use only screenshots and app previews from the product page, with a limit of up to 10 images and in a fixed order.
04 Creative Sets are bound to keywords, which makes results interpretation tricky.
05 It's hard to measure the statistical significance of Creative Sets testing.
06 The sample size for testing within Apple Search Ads has to be calculated manually.
07 Lack of user behavior insights.
08 The number of tests at a time is limited by the number of existing Creative Sets.
09 It's quite time-consuming: to change a concept you need to update the product page.
10 It's impossible to run pre-launch tests.
What tools to use?
SplitMetrics: convert ideas into new levels of profitability with in-depth insights.
01 Great variety of A/B tests: icons, screenshots, video previews, multivariate combinations, etc.
02 Unlimited number of experiments for iOS and Android
03 3 testing environments: Product, Search & Category pages
04 In-depth behavioral analytics: every single click and scroll
05 Advanced segmentation to track the difference in conversion by gender, age, & more
06 Competitive intelligence: analyze your rivals' CVR and compare against their apps
07 Seamless integrations with all current tracking tools
08 Full guidance by ASO experts to ensure your app's growth
How to win without a winner
Getting a winner on the very first A/B test is highly unlikely.
No-winner A/B tests still gather lots of info on user behaviour, and these are valuable insights for new hypotheses and further A/B experiments.
Psychological trap: with no winner, a UA manager may be biased towards the new variation.
Way out: refer to the stats gathered throughout the A/B test.
What changes to expect with iOS 14 and IDFA deprecation
Impact of iOS 14 on ASO & creative testing:
Even less control over targeting
Even more leverage on the creative side
No more personalised retargeting
New App Store features to account for: app collections, typo correction in App Store search, updated product page details, and App Clips.
Source: developer.apple.com
IDFA deprecation
IDFA deprecation affects app monetization and UA, as the identifier is used for attribution, analytics, retargeting, and creating lookalike audiences.
Limited targeting = higher acquisition costs.
Conversion optimization through creative A/B testing is becoming really important, because optimized banners and product pages will help decrease CPI.
Collaboration between UA and creative teams
Appoint a main responsible person on each side.
Meet weekly.
Review and translate the data for the creative team.
Check in on progress early to avoid misalignment.
Look back at the month/quarter together + research and brainstorm.
Collaboration between UA and creative teams
Creative concept test workflow:
1. CREATIVE CONCEPT (UA & Design to define together): develop an idea for an asset based on a hypothesis.
2. A/B TEST (Design to produce and deliver, UA to test): run a test of 2-3 iterations of the new asset.
3. RESULTS (UA to run & deliver): measure and document the performance results.
4. CONCLUSION (UA to analyse & Design to give feedback): draw a conclusion based on the analysed data.
5. RE-ITERATE / MOVE ON: keep iterating on the concept, or run a new one.
Join Us
Mobile Marketing Lead
careers.appagent.co
9 Key Insights
01 Continuous creative testing is the pillar of successful UA; always produce new ads.
02 A good cadence to start with is 5 concepts/week, with 2-3 major iterations each.
03 Test over a full 7 days to cover weekly seasonality and exit the learning phase.
04 When possible, isolate the placement & always reach statistical significance.
05 Agile collaboration between the UA & creative teams is crucial.
06 Start with building strong hypotheses for your creative testing.
07 Choose the A/B testing tool that fits your goals & scale.
08 There are valuable insights in tests with no winner; don't overlook this information.
09 With IDFA deprecation, conversion optimization through creative A/B testing is becoming really important, as it will help decrease CPI.
Q&A
 
Thank You
Maria Shestakova, CMO & App Growth Evangelist at SplitMetrics: Maria@splitmetrics.com
Lucia Mrvová, Head of UA at AppAgent: Lucia@appagent.co