Good and bad practices in user segmentation, based on the Geewa case. Interconnecting monetization with player progression: how we maximized user experience and increased monetization as a result.
2. Miroslav Pikhart
Geewa a.s.
Head of BI
miroslav.pikhart@geewa.com
Using personalization to improve monetization
and what we learned working on Smashing Four
3. About Smashing Four - core game
● Free-to-play PvP game
● Teams of four heroes fight in the arena
● Characters resemble pool balls
● Each hero has an ability
4. About Smashing Four - economy
● Players get orbs for winning matches or buy
them in shop
● Orbs contain random hero cards and gold
● Players collect these resources to upgrade
heroes and reach higher arenas
● Higher arenas lead to higher rewards,
completing the loop
5. Why your game needs a data scientist
and what personalized content can do for your revenue
8. Special offers in Smashing Four
● With global launch, we introduced special offers
● Essentially a chance to buy bundled orbs and
currency at a discount
● Had positive impact on conversion
11. Personalization case study 01 - Hero orb
● Goal was to create a new product to improve monetization
● Instead of pushing richer special offers, we wanted to offer our
players something they would personally like
● To reach a wider audience, we started with a low price point
13. Hero orb - what it actually is
● There’s a relatively simple algorithm behind the
scenes that scores heroes based on:
○ How often players use the hero
○ How often they request the hero from their clan
○ How much they upgrade the hero
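The scoring idea above can be sketched roughly as follows; the counter names, weights, and hero data are purely illustrative, since the actual algorithm isn't described in the talk:

```python
def score_hero(stats, w_use=0.5, w_request=0.3, w_upgrade=0.2):
    """Score one hero for one player from simple engagement counters.

    The counter names and weights are illustrative, not Geewa's
    actual values.
    """
    return (w_use * stats["matches_used"]
            + w_request * stats["clan_requests"]
            + w_upgrade * stats["upgrades"])

def pick_hero_for_orb(player_heroes):
    """Return the hero this player engages with the most."""
    return max(player_heroes, key=lambda h: score_hero(h["stats"]))

heroes = [
    {"name": "hero_a",
     "stats": {"matches_used": 120, "clan_requests": 4, "upgrades": 2}},
    {"name": "hero_b",
     "stats": {"matches_used": 30, "clan_requests": 20, "upgrades": 9}},
]
```

A weighted sum like this is deliberately simple: as the takeaways later note, even a basic solution makes a noticeable difference, and the weights can be tuned afterwards.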
14. Lesson learned #1 - Define your tests carefully
● At the very beginning, we tried to A/B test the
feature:
○ One group got the hero orb offer for $5
○ The other got the very same offer for 500 gems
(an in-game currency worth exactly $5)
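A stable split like this is often implemented by hashing the user id together with an experiment name; a minimal sketch (group names are hypothetical):

```python
import hashlib

def ab_group(user_id, experiment, groups=("usd_offer", "gem_offer")):
    """Deterministically assign a user to an A/B group.

    Hashing the user id together with the experiment name keeps the
    split stable across sessions and independent between experiments.
    Group names here are hypothetical.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return groups[int(digest, 16) % len(groups)]
```

Because the assignment is a pure function of the inputs, no group membership needs to be stored server-side.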
15. Lesson learned #1 - Define your tests carefully
● Then we looked at the conversion to payer...
17. Lesson learned #1 - Define your tests carefully
● Based on the figure just shown, the product was
intended to go with gems, however...
19. Lesson learned #1 - Define your tests carefully
● The initial graph was significantly skewed by ‘old’ players
● These players only wanted a way to spend their ‘free’ gems
● This shows how important it is to set your metrics precisely
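One way to keep the metric clean is to compute conversion only over recently created accounts, excluding the veterans that skewed the graph; a sketch under assumed field names:

```python
from datetime import date

def conversion_to_payer(users, as_of, max_account_age_days=7):
    """Conversion to payer, restricted to recently created accounts.

    Filtering out veteran accounts avoids the skew described above,
    where long-time players just burn their accumulated free gems.
    The field names ('signup', 'is_payer') are assumptions.
    """
    cohort = [u for u in users
              if (as_of - u["signup"]).days <= max_account_age_days]
    if not cohort:
        return 0.0
    return sum(u["is_payer"] for u in cohort) / len(cohort)
```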
20. Hero orb - More testing
● After settling on running the offers for USD, we were
happy with the performance, but continued testing
other things:
○ Using different algorithms and tweaking them
○ Changing when users see the offer
21. Lesson learned #2 - Don’t test for the sake of testing
● This is the most common mistake I have encountered so far
● We spent weeks designing and evaluating A/B tests
● Most of them yielded almost no information
● It took us months to improve the hero orbs as a result
23. Lesson learned #2 - Don’t test for the sake of testing
● Specify the hypothesis you want to test before running
any experiment
● Come up with some KPIs you can measure that are both
relevant and not corrupted
● Always have a control group to measure against
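For the comparison against a control group, a standard two-proportion z-test on conversion rates is one option; a self-contained sketch using only the standard library:

```python
from math import erf, sqrt

def conversion_z_test(converted_a, n_a, converted_b, n_b):
    """Two-proportion z-test comparing conversion of test vs control.

    Returns (z, two_sided_p). Inputs are converted-user counts and
    group sizes; this is the textbook pooled-variance formulation.
    """
    p_a, p_b = converted_a / n_a, converted_b / n_b
    p_pool = (converted_a + converted_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal CDF via the error function; two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

For example, 60 payers out of 1,000 in the test group versus 40 out of 1,000 in control gives z ≈ 2.05 and p ≈ 0.04, so the uplift would be significant at the 5% level.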
24. Hero orb - the results
● In the end, the product turned into our best performer
25. Hero orb - takeaways
● Any type of content personalization beats simple offers
● You can make personalization work, even if your team is small
● It’s okay to start with a simple solution; the difference will be
noticeable
● There is always room for improvement
26. Hero orb - going forward
● We are able to deliver users the type of content they want
● That still leaves us with other problems to solve, such as:
○ What amount of content is just right
○ With what frequency and when should we offer
27. Personalization case study 02 - Starter pack
● We wanted to create a product that would increase
our day one conversion
○ Starter pack felt like the obvious choice
28. Starter pack - early blunders
● We created some starter packs and tested them against each other
29. Lesson learned #3 - Set relevant benchmarks
● However, compared to an offer we already had in the game, the results
looked less optimistic
○ ...and it did not perform well enough
30. Lesson learned #3 - Set relevant benchmarks
● Without looking at that comparison, we continued tweaking the product,
however:
31. Starter pack 2.0 - A new approach
● We abandoned the ‘one size fits all’ starter pack
● If every user is different, why offer them the same thing?
● With this approach, we created three different starter packs
○ $2 pack for users we consider hard to convert
○ $5 pack for ‘average’ users
○ $10 pack for users likely to spend more money
32. Starter pack 2.0 - on user segmentation
● In order to proceed, we needed to segment our audience
● To some extent, your user acquisition does this for you
● If you’re on your own, there are still things you know:
○ The phone model your players are using
○ The country they are from
○ Anything you can measure during onboarding
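A first pass at segmentation from these onboarding signals could be a simple rule-based mapping onto the three price points; the device names, country codes, and thresholds below are invented for illustration:

```python
# All device names, country codes, and thresholds here are invented
# for illustration; a real model would be fit to your own data.
PREMIUM_DEVICES = {"iPhone XS", "Galaxy S10"}
HIGH_SPEND_COUNTRIES = {"US", "JP", "DE"}

def starter_pack_price(user):
    """Map onboarding signals to one of three starter-pack tiers."""
    score = 0
    if user.get("device") in PREMIUM_DEVICES:
        score += 1
    if user.get("country") in HIGH_SPEND_COUNTRIES:
        score += 1
    if score == 2:
        return 10  # users likely to spend more
    if score == 1:
        return 5   # 'average' users
    return 2       # users we consider hard to convert
```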
33. Starter pack 2.0 - the effect on KPIs
● This time around we beat the benchmark
34. Starter pack 2.0 - the effect on KPIs
● Comparing the two starter packs together shows the absurd difference
35. Starter pack 2.0 - takeaways
● The change in approach salvaged an underwhelming
product for us
● Using UA channels alone, we could correctly assign
about half of the incoming audience
● The remaining cases needed to be segmented through a
custom algorithm
37. Summary
● Personalization is the first thing to look at when trying to increase revenue
● Even simple segmentation can lead to significant uplift
● A/B testing is great, but it needs to converge towards a specific goal
● For every game, the best variables for segmentation are different
● Don’t be afraid to drop what isn’t working for you
38. Thank you for listening!
Any questions?
geewa.com
Miroslav Pikhart
Geewa a.s.
Head of BI
miroslav.pikhart@geewa.com