Make Your Games Play, Teach, and Socialize Better: Usability & Playability Techniques From the Field

These are the slides from a talk I gave at the 2010 Triangle Game Conference.

  • I’m new to the Triangle Game Community – I just moved back east from Seattle and am now residing in Baltimore. I’ve worked on lots of games over the years with lots of devs and publishers [images: games, devs, publishers]. I’ve also worked on statistical software and ecommerce [Amazon, SPSS]. My background and training are in social psychology; I studied judge and jury decision making [images: jury deliberations, survey, scientific evidence].
  • [Social features at Amazon; started to dive into Facebook and LinkedIn apps.] Get people to dip a toe in the water and start contributing; aggregate and surface relevant, helpful content to help people make decisions. [Currently working on some Zynga projects – can’t talk about them.]
  • I’ve evaluated a wide variety of games at the recent SG Showcase Challenge competition, ranging from a military helicopter recon/investigation sim to a fraud prevention training program to a game designed to teach language skills to kids. I’ve also done some client work on call center training/selection simulations. But I’m very interested in learning the taxonomy of serious games – what are the markets that need serving, what kinds of games are out there, and how are current offerings meeting or falling short of needs? I think the folks at SGC have an interesting community-based approach to classifying serious games according to Gameplay, Purpose, Market, and Audience. You should check it out and add your own games to the database.
  • I look forward to seeing the SG track talks and meeting lots of you. But I also think I have some useful tidbits for folks who make SGs, and I will spend a few moments at the end of my talk relating my work to SGs.
  • Across the UE spectrum (discover, try, buy, viral)... Talk about different kinds of pain: social “stranger danger”; ecommerce; demo versions; sharing with friends; sign-up/registration.
  • These are the feelings of pain you get at each stage. Embarrassment & Worry (how did I fail this poor user? how will we fix it given our schedule?). Angst & Fear (will our potential solutions even solve the problem? will they cause new problems?).
  • This is part of an actual task list from a developer that I’ve worked with for a long time. It was generated after a day in the usability lab observing participants. They called it the Wall of Pain, and it’s what crystallized my thinking about testing with users as a way to transfer pain from user to developer. The goal here was to get as many fixes as possible back into the game before testing with users again two days later.
  • The nice thing is that when you make fixes in time for the next user-testing study, you get a chance to pat yourself on the back for a job well done. This often yields a nice rush and can be somewhat addictive for developers.
  • Discovery -> Trial -> Conversion -> Evangelism -> Discovery. D: a clear message about what the value proposition is and an easy way to act on it. T: is the initial experience entertaining, engaging, and meeting expectations, or is it frustrating and not working as promised? C: does the registration/sign-up/purchase process flow naturally from the experience and seem reasonable rather than excessive? E: is there an easy way for the player to share his/her positive experiences with others? Identifying and fixing pain points along the cycle prevents stoppages.
  • A few key reasons/excuses are covered in the next slides
  • There are some fancy and expensive setups out there...
  • We don’t have time to read long reports on old code because we are already working on new code. We’ve chosen Speed & Price and will leave Quality to chance...
  • Image of the Homer Simpson car. Image of Frankenstein.
  • It’s inefficient (you’re getting your highly skilled and specialized folks to do things out of their comfort zone, detracting from what they do best). Regardless of good intentions, your game developers should focus on their true passion... They probably won’t light a usability lab on fire, but they probably will be happier, more effective, and more efficient when they are using their own tools.
  • The meat of my talk... Coming up in the next few slides
  • The developers were polite about it – they liked having a checklist of items to work on... but it was inefficient. They worked MUCH faster than our two-week build -> report process. I was new to the game and didn’t have nearly the depth of immersion and expertise in the game or genre. The developers really cared about the results, but I worried that I missed obvious things or reported things back that made no sense or would make the game worse.
  • Images: Airplane, Lots of luggage/equipment, Lots of people watching, Lots of computing (PST and EST)
  • Have you seen the Wired article that talks about all the testing that Halo 3 got – the lovely heat maps, player videos, etc.? We didn’t have the time and testing infrastructure to record tons of behavioral data and analyze heat maps and hours of replay data.
  • No fancy labs or heat maps generated. This is why I called it “Discount Extended Playtesting”. Equipment cobbled together -> the scoring card I used -> a group of us discussing and fixing in real time as we discovered issues over three days of testing. Participants moved their pennies around every 15 minutes – and we paid attention when they said “I want to stop playing”.
  • Interns + low-fi paper + quick iteration (in office) + raising the bar (any errors/fears these folks had were definitely severe issues – especially issues related to trustworthiness: would people create an account and sign in?).
  • We could rule OUT a design, but not rule it in. If our own employees feared it, then it would not work.
  • There often is no “process” for a novel peripheral-based game. Some things just couldn’t be conveyed in a written report, so standard reports from the publisher wouldn’t have worked even if the publisher had the resources. There was a sincere worry that testing with lots of different groups would pull the team in different directions. The team already did it (junior but well-intentioned designers; a crowded room where both usability participants and observers felt awkward).
  • Let the team do what they were most passionate about. Make sure observers are separated so they don’t influence participants and we can talk freely. Used tech like Skype, web cams, and office phones (very cheap – it just takes elbow grease). Convergence = what the core progression must accommodate; divergence = separate modes and options. Images: observing and improving; elbow grease (used existing tools and tech); convergence/divergence? [Or: all in one room; observers and participants separated.]
  • No one technique will fit every situation or answer every question. It’s a balancing act; they all have pros and cons. RITE Usability: working at the speed of game development. Discount Extended Playtesting: addressing a user experience emergency. Raising the Acceptance Bar: using convenience samples appropriately. Improving Existing Processes: making focus testing sessions more efficient and effective.
  • I’m going to finish with a few final takeaways...

    1. Make Your Games Play, Teach, and Socialize Better: Usability and Playability Techniques from the Field
       Jason Schklar, Game & User Experience Consultant
       Triangle Game Conference, April 7, 2010
    2. What I know a lot about
       Entertainment Games
       Human Computer Interaction
       Attitudes & Behavior
    3. What I know a bunch about
       Social Computing
       Social Games
    4. What I came here to learn about
       Serious Games
    5. Back to what I know a lot about
       I know a lot about Serious Game players: most of them are gamers, and they already play great games that are usable, playable, and social. For Serious Games to be effective, they need to be great games.
       I know a lot about teaching complexity through game play: I’ve worked on lots of games where players make complex decisions in real time, and on many UIs and tutorials that engage, teach, and train players quickly and effectively.
    6. I also know a lot about Game Improvement
       One way to make your game better is to identify where users experience pain and transfer that pain back onto the shoulders of your development team.
       Ideally this is an iterative process that involves testing with users.
    7. Game Improvement Via Pain Transference
       Embarrassment & Worry [image of participant(s) struggling]
       Angst & Fear [image of collaborative approach]
       Step 1: Measure the pain by studying your users.
       Step 2: Discuss causes and solutions.
    8. Game Improvement Via Pain Transference
       Resentment
       Step 3: Make fixes. Lots and lots of fixes.
    9. Game Improvement Via Pain Transference
       Resentment
       Relief / Euphoria
       Step 3: Make fixes. Lots and lots of fixes.
       Step 4: Lather, rinse, repeat. Validate fixes, find new issues.
    10. Testing with Users: Why Do It?
        Discovery -> Trial -> Conversion -> Evangelism
        Each link in the user experience chain represents a potential falling-off point in the virtuous cycle.
        The way to assess user pain is through user testing.
    11. User-Testing: Why NOT Do It?
        There are lots of barriers to testing with users efficiently and effectively...
        Sometimes they’re legitimate reasons
        Sometimes they’re excuses
    12. It costs too much…
        We can’t afford dedicated facilities, fancy equipment
    13. It doesn’t integrate with our process
        We can’t wait weeks for a long report on outdated code
        We don’t have time to incorporate feedback
    14. It dilutes/distorts/corrupts our vision
        We don’t design by focus group
    15. We already do it, thank you very much
        We bring in friends and family when we can, have our admins and HR folks play the game, run focus groups
    16. Testing with Users: Techniques from the field
        Techniques I’ve used with great results
        Techniques that address and overcome barriers
    17. Case #1: RITE Usability
        The Challenge: PC real-time strategy game. Developers were working too fast for reports to be relevant, and I worried that I wasn’t providing the best suggestions.
        The Barriers: Doesn’t fit their process; may distort their vision.
    18. Case #1: RITE Usability
        Created a remote dev studio within the usability lab
        Discussed issues and implemented fixes in real time
    19. Case #1: RITE Usability
        Pros:
        • Integrated very well with the team’s development process
        • Because we validated content and features as they went in, late additions and changes to the game were made with confidence
        • Great results in terms of critical review and commercial success
        Cons:
        • High $$$ cost (team travel)
        • High energy cost (team travel)
        • Some loss of efficiency (there’s no place like home)
    20. Case #2: Discount Extended Playtesting
        The Challenge: Console RPG, weeks to going gold. Unsure whether it was too hard or too easy; currently testing mostly with existing fans; no dedicated facilities.
        The Barriers: Costs too much; doesn’t fit our process; already do it.
    21. Case #2: Discount Extended Playtesting
        Used spare equipment
        Recruited non-fanboys
        Operated in real time
    22. Case #2: Discount Extended Playtesting
        Pros:
        • We found several progress-blocking issues – and fixed them
        • We validated core tutorials and progression with non-fanboys
        • The process took days (hours), not weeks: called up on a Friday afternoon, we were done by Wednesday
        Cons:
        • We didn’t get a very nuanced view of the user experience: this wasn’t a “polish” study, it was an “attrition reduction” study
        • We didn’t get a chance to validate our fixes
    23. Case #3: Raise the Acceptance Bar
        The Challenge: Several small widget development teams and not enough time and money to test them. Some projects were highly confidential (internal testing only); varying fidelity of mocks.
        The Barriers: Costs too much; doesn’t fit our process; distorts our vision.
    24. Case #3: Raise the Acceptance Bar
        Weekly opportunistic tests with interns, low-fi mocks
        Less “acceptance” testing and more “rejection” testing
    25. Case #3: Raise the Acceptance Bar
        Pros:
        • Fit within existing process (weekly sprint-based design reviews)
        • We caught LOTS of issues while at the pencil-and-paper stage
        • Cheap
        Cons:
        • Our test users were still more savvy than the typical customer
        • You can only learn so much testing with non-interactive mocks
    26. Case #4: Improving Existing Processes
        The Challenge: Peripheral-based game that needs “broad appeal”. The publisher didn’t have resources, so the developer was doing it themselves.
        The Barriers: Doesn’t fit our process; distorts our vision; already do it.
    27. Case #4: Improving Existing Processes
        Offloaded user-testing work from devs/designers
        Created a makeshift lab using existing offices and tech
        Tested with different audiences to find convergence and divergence
    28. Case #4: Improving Existing Processes
        Pros:
        • Able to have the team observe how players of all backgrounds approached, failed, and succeeded at the game
        • Improved their process and results using a combination of their own space and equipment (and some elbow grease)
        Cons:
        • Still not entirely hands-free (although it moved the load from design and development to production and IT)
        • Some offices/studios can’t spare even two rooms
    29. Summary: Techniques > Barriers
    30. Testing with Users: General Principles
    31. Getting the most out of testing with users
        User experience lead has some form of independence backed by key stakeholders and project leaders
        Collaboration yields group therapy, not group think
        Think about and discuss fixes in real time as you discover issues (because no one reads the full reports or goes back and watches the videos)
        Feed fixes back into the game as quickly as possible
        Test again
    32. Oh, yeah...
        Keep your mouth closed and your eyes and ears open
        Empathize with the users to come up with meaningful stories that explain their (often) puzzling behavior
        Make sure you address underlying user experience issues, which are not always what users complain about
        Give players what they need, not what they ask for
    33. Implications for Serious Games
    34. Applying findings to Serious Games
        Magnitude of errors: what is a “Sev 1 Issue”?
        • Loss of life
        • International incident
        • Higher drop-out rate
        Cost savings still apply
        • Reduce training time (people may even train on their own time)
        • Reduce CSS costs and overhead
        • Broaden your candidate pool cheaply
        Harness the power of social and community
        • People already talk about stuff: track it and surface it
        • Integrate into existing social and community workflows
    35. Questions & Answers
        My Contact Info:
        e: jschklar@gmail.com
        t: @jackalshorns
        w: www.InitialExperience.com
        Resources:
        IGDA – Games User Research SIG
        • LI Group: http://www.linkedin.com/groups?gid=1873014
        • IGDA Page: http://www.igda.org/user-research
        • SIG Resources: http://sites.google.com/site/gamesuserresearch/
        Microsoft Game Studios User Research: http://mgsuserresearch.com/default.htm
