Design with the Players

In F2P games, we hold a lot of assumptions about what players want or don't want, but we rarely verify them. We use data, reviews, and quantitative metrics to frame these assumptions, but we overlook the spectrum of qualitative feedback that explains why a feature works or doesn't. In this presentation, I want to share some powerful tools you can use in Live Ops or new game development to gather more qualitative feedback from your players.


  1. Rovio © 2017 Confidential. All rights reserved. BUILD WITH THE PLAYERS. Sophie Vo
  2. The timed levels. Our assumptions: "Timed levels are fun to play!" "With the time pressure, players are more likely to convert."
  3. "They're not relaxing, and I play Angry Birds Pop to wind down and relax." "I love playing AB Pop to relax and the timed levels stress me out." "I have multiple sclerosis. Muscle control is problematic and I find these levels very difficult... I have to wait until my daughter comes to do it for me. This can be weeks as she lives a long way from me."
  4. Top engaged players and spenders
  5. What if we actually talk to the players!
  6. What POP players want... "Gimme more money!" "The levels are too hard, can we make them more easy?" "More lives" "I don't have friends to play with in the game" "More EVERYTHING!" "AND FOR FREE :D"
  7. Players don't know what they want!
  8. QUALITATIVE (why? how to fix?) vs. QUANTITATIVE (how many / how much); BEHAVIORAL (what people do) vs. ATTITUDINAL (what people say). Behavioral + qualitative: supervised playtesting, observations. Attitudinal + qualitative: user interviews (pair/group interviews), focus groups, diary studies, customer feedback interviews. Behavioral + quantitative: unsupervised playtesting, game analytics, behavioral statistics, A/B testing. Attitudinal + quantitative: in-game surveys. (This matrix is encoded as a lookup table in the sketch after the transcript.)
  9. When do I use qualitative? 1. SELECT FEATURE
  10. 2. ITERATE
  11. User tests. Hungry Jelly, a new objective: ● User tests (LDs, artists, PM) ● Internal playtest
  12. 3. VERIFY & ADJUST
  13. In-game surveys. Piggy Puzzle Event: "Too difficult for the reward offered." "I liked it, but I didn't understand the benefit or purpose."
  14. Building a new game... Concept → Prototype 1 → Prototype 2 → Prototype 3
  15. If there's one thing...
  16. "I know that I don't know"
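
The methods matrix on slide 8 reads as a simple two-axis lookup: place your question in a quadrant, then pick a method from that quadrant. Here is a minimal sketch in Python; the names (RESEARCH_METHODS, pick_methods) are my own, hypothetical illustration, and the quadrant assignments follow the slide:

```python
# Minimal sketch (my illustration, not from the slides): the methods
# matrix from slide 8 encoded as a lookup table.
RESEARCH_METHODS = {
    # (what you ask, what you observe) -> suitable methods
    ("why", "do"):       ["supervised playtesting", "observations"],
    ("why", "say"):      ["user interviews (pair/group)", "focus groups",
                          "diary studies", "customer feedback interviews"],
    ("how_many", "do"):  ["unsupervised playtesting", "game analytics",
                          "behavioral statistics", "A/B testing"],
    ("how_many", "say"): ["in-game surveys"],
}

def pick_methods(question: str, evidence: str) -> list[str]:
    """question: 'why' (qualitative) or 'how_many' (quantitative);
    evidence: 'do' (behavioral) or 'say' (attitudinal)."""
    return RESEARCH_METHODS[(question, evidence)]

# "Why do timed levels frustrate players?" is a qualitative, attitudinal question:
print(pick_methods("why", "say"))
```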

Editor's Notes

  • The Secret…
    ... My Secret
  • Joined Rovio and took over the AB POP live game almost 3 years ago
  • I was prepared to take on that mission; I had a lot of experience in my career and felt completely ready for the battlefield
  • Worked in big companies before, such as GL
    8+ years of experience
    Live Ops ++
  • Also Wooga, a very analytically driven company where I learned a lot
    And which unlocked my potential as a data freak. I was working with a lot of data over there: A/B tests, projections and forecasts, etc.
  • All I had to do, with that great experience and knowledge, was execute my master plan
  • To give an example
    Let me illustrate a case with the timed levels that I stumbled on in AB POP when joining
  • Reaction... not a great impact on KPIs, and not a lot of excitement from players either
  • A mind-blowing and shocking revelation: I didn't know my players!
    For a data freak like me, a whole new world of possibilities had opened up for me to explore!
  • I realized there was something I had missed in the picture...
  • Crunching the data, it all made sense now!
  • In a spark of crazy genius, I wondered...
    What if.. What if we.. what if we talk to the players!
  • While I wish to fulfill player wishes
    The problem is that if we listen to all the player wishes without filtering the information a bit...
  • This is a guarantee of complete bankruptcy for the game, haha
  • PDCA cycle: I approach the development of new features this way a lot: based on an assumption, we get the learning for the next iteration. You use both sources of data in the learning process

    FRAME ASSUMPTION: use reviews and Live Ops feedback
    VERIFY: user tests, playtests, surveys
  • We introduced a new game event where you could play a single time-limited level called "Piggy Puzzle"
    It was A/B tested, and the KPI results we got were pretty mixed
    We wanted to understand WHY it didn't perform that well instead of jumping to the next assumptions
    We got additional quantitative data on how players liked it or saw it
    Problem: the reward was not perceived as valuable for the effort, and the UX of the event introduction was not good => next iteration
    Qualitative data on what could have been the problem (a sketch of pairing the two data sources closes these notes)
  • Rework the message? Final flow
  • Crazy random idea…
  • We are the Jon Snow of game development!
  • Adjust the development process around the fact that you don't know.
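
The Piggy Puzzle note above describes pairing quantitative A/B-test KPIs with qualitative survey answers. A minimal sketch of that pairing, assuming hypothetical data (ab_results, survey_answers; the numbers are invented, the two survey quotes are from slide 13):

```python
# Minimal sketch (my illustration, not from the slides): pairing the
# quantitative A/B-test KPIs with qualitative in-game survey answers
# for the Piggy Puzzle event. All numbers and names are hypothetical.
from collections import Counter

# Quantitative: HOW MANY converted in each variant (a mixed result)
ab_results = {
    "control":      {"players": 10_000, "converted": 420},
    "piggy_puzzle": {"players": 10_000, "converted": 431},
}
for variant, kpi in ab_results.items():
    print(f"{variant}: conversion {kpi['converted'] / kpi['players']:.2%}")

# Qualitative: WHY it underperformed, from free-text survey answers
survey_answers = [
    "Too difficult for the reward offered.",
    "I liked it, but I didn't understand the benefit or purpose.",
    "Too difficult for the reward offered.",
]
# The most common themes become the assumptions for the next iteration
print(Counter(survey_answers).most_common(2))
```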
  • ×