Making Usable Software - at Agile Indy / UXPA Joint Meeting

This session kicked off with a 45-minute working session in which UX practitioners, programmers, project managers, and others worked side by side to take story cards to a programming-ready state. The group then held a brief retrospective on the workshop.

The slides are primarily from the second half of the session. Carol presented work that Kaleb Walton and Brian Anderson introduced in their recent webinar, "Experience Driven Agile: Developing Up to an Experience, Not Down to a Feature." This was followed by a discussion of best practices for usability testing in Agile environments.

  1. 1. @Carologic - Making Usable Software - September 10, 2012 - Indiana UXPA and AgileIndy
  2. 2. "The biggest waste of all is building something no one wants" - @ericries #LeanStartupMI in 2011 via @MelBugai
  3. 3. Create a great, usable, accessible, and relevant experience
  4. 4. Workshop
  5. 5. Best Practices
  6. 6. Integrating with Agile [Diagram: two parallel tracks from Phase/Sprint 0 through Sprint 3. In the UX track, user research and UX design in Sprint 0 prepare Sprint 1; in each later sprint, usability testing of delivered work, design for the following sprint, and user research for the sprint after that run concurrently. In the development track, pre-dev work fills Sprint 0 and development fills Sprints 1-3. Understanding of users increases over time.]
  7. 7. Agile Integration [Diagram: the same staggered model extended from Sprint 0 through Sprint 6. Each usability study is used to pick up information, and additional user research (observations, interviews, surveys, personas) is done in parallel. In every sprint the UX track tests, designs for the next sprint, and researches the sprint after that, while the development track runs pre-dev in Sprint 0 and development in Sprints 1-6. Understanding of users increases over time.]
  8. 8. Experience Driven Agile: Developing Up to an Experience, Not Down to a Feature. Rehash of a webinar by Kaleb Walton & Brian Anderson
  9. 9. The “Pitch”
     Quickly conveys the background of the problem, the proposed solution, and a statement of value.
     Shirt-size estimates make for easy prioritization (story points are fine too).
     Sprinkle in risk and value to make prioritization even easier.
     Prioritize dozens of experiences, not hundreds.
     General format: The problem is <problem>. Imagine if <solution>. This solution would result in <value statement>.
     Lightweight precursor to...
     Example pitch, “Effective Prioritization and Assignment of Work Items”: The problem is that systems managers spend too much time prioritizing and assigning their teams' daily work efforts. Imagine if Systems Manager Plus offered better prioritization capabilities and automated assignment based on definable business rules. This solution would result in reduced cost for systems managers by enabling more efficient work assignment, leading to better response times.
     Copyright © 2012 Kaleb Walton, Brian Anderson, Michael Hughes and Terri Whitt
  10. 10. The “Scenario”
      Borrowed from the UX discipline.
      Paints a clear picture of an entire experience.
      Extremely versatile and ready for use outside development.
      Our definition: “A real-world example of a person's experience with a product, describing context with a problem and a proposed solution.”
      Copyright © 2012 Kaleb Walton, Brian Anderson, Michael Hughes and Terri Whitt
  11. 11. Scenarios Are Agile Just Barely Good Enough and Just in Time: Fidelity naturally matches immediate need. Ya Ain't Gonna Need It: Does it enable the scenario? Minimum Viable Product: What is the minimum experience someone would pay for? Lightweight: Low cost to develop, flexible and quick to communicate. Better Contract: More reliable as it's written in terms of Experience rather than Features. Copyright © 2012 Kaleb Walton, Brian Anderson, Michael Hughes and Terri Whitt
  12. 12. Telling a Story Copyright © 2012 Kaleb Walton, Brian Anderson, Michael Hughes and Terri Whitt
  13. 13. Example Scenario: EFFECTIVE PRIORITIZATION AND ASSIGNMENT OF WORK ITEMS
      PROBLEM: Mary, a systems manager at ABC Health, is responsible for a team of 12 system administrators who handle steady-state support of their health care systems and network. One of her biggest time sinks is prioritizing and assigning her team's daily work efforts. The tool she uses, Systems Manager Plus, doesn't give her any prioritization features except for the ability to sort on a priority field when reviewing work items. As she spends half of her time prioritizing, she ends up working overtime to tend to her other duties.
      SOLUTION: After a major update Mary signs into Systems Manager Plus, heads to the work items area, and is pleasantly surprised to see a number of new prioritization capabilities. There are more fields available to sort and filter, as well as a “smart assignment” system that enables her to specify rules that will result in automatic assignment to specific members of her team. Mary creates a few rules, applies them to existing work items, and is excited to see that over a quarter of the items were automatically assigned. She proceeds to sort and filter the remaining work items to prioritize and assign to her team. As more work items trickle in she notices that many of them are being auto-assigned. These improvements have enabled Mary to focus less on prioritizing and more on doing.
      Copyright © 2012 Kaleb Walton, Brian Anderson, Michael Hughes and Terri Whitt
  14. 14. Easily Pull Out Stories and Epics
      Additional sorting capabilities: As a systems manager I want to sort work items by additional fields such as created date, severity and platform so that I can more effectively prioritize them.
      Additional filtering capabilities: As a systems manager I want to filter work items by additional fields such as created date, severity and platform so that I can more effectively prioritize them.
      Smart assignment system (epic): As a systems manager I want to specify assignment rules for the system to use to automatically assign work items so that I don't have to assign every work item manually.
      Apply new smart assignment rules to existing work items: As a systems manager I want to apply new smart assignment rules to existing work items so that I can use smart assignment on work items created after the smart assignment process has executed.
      Copyright © 2012 Kaleb Walton, Brian Anderson, Michael Hughes and Terri Whitt
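      Purely as an illustrative aside (not part of the original deck), the scenario-to-stories decomposition above can be captured in a very small data structure. The sketch below is an assumption: the Scenario and Story classes, their field names, and the sample data are invented here simply to mirror the example scenario and the "As a ... I want ... so that ..." stories on the slides.

```python
from dataclasses import dataclass, field


@dataclass
class Story:
    """One backlog item in the "As a ... I want ... so that ..." format."""
    title: str
    role: str
    want: str
    benefit: str
    epic: bool = False  # epics, like the smart assignment system, get split later

    def as_card(self) -> str:
        # Render the story in the canonical user-story sentence form.
        return f"{self.title}: As a {self.role} I want {self.want} so that {self.benefit}."


@dataclass
class Scenario:
    """A pitch-sized scenario plus the stories pulled out of it."""
    name: str
    problem: str
    solution: str
    stories: list[Story] = field(default_factory=list)


# Hypothetical data mirroring the example scenario on the slides above.
scenario = Scenario(
    name="Effective prioritization and assignment of work items",
    problem="Mary spends half her time prioritizing and assigning daily work.",
    solution="More sort/filter fields plus rule-based smart assignment.",
    stories=[
        Story("Additional sorting capabilities", "systems manager",
              "to sort work items by additional fields",
              "I can more effectively prioritize them"),
        Story("Smart assignment system", "systems manager",
              "to specify assignment rules for the system to use",
              "I don't have to assign every work item manually", epic=True),
    ],
)

for story in scenario.stories:
    print(story.as_card())
```

      The epic flag here is just one convention for marking items that, like the smart assignment system, will be split further before they enter an iteration backlog.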
  15. 15. Scenarios Are Agile Just Barely Good Enough and Just in Time: Fidelity naturally matches immediate need. Ya Ain't Gonna Need It: Does it enable the scenario? Minimum Viable Product: What is the minimum experience someone would pay for? Lightweight: Low cost to develop, flexible and quick to communicate. Better Contract: More reliable as it's written in terms of Experience rather than Features. Copyright © 2012 Kaleb Walton, Brian Anderson, Michael Hughes and Terri Whitt
  16. 16. Basic Experience Driven Agile [Diagram: involvement over time by role. Product Owners, UX Analysts, and Product Mgt lead the early activities (Pitches and Scenarios, which are estimated, valuated, assessed, and prioritized into the Product Backlog); Scrum Masters, Architects, and Stakeholders stay involved as Scenarios become Stories; Developers and Testers take over as Stories are estimated into the Iteration Backlog.] Copyright © 2012 Kaleb Walton, Brian Anderson, Michael Hughes and Terri Whitt
  17. 17. Experience Driven Agile At Scale [Diagram: the same involvement-over-time model with an added Portfolio Backlog. Pitches and Scenarios are estimated, valuated, and prioritized into a Portfolio Backlog, which feeds multiple Product Backlogs; Scenarios and Stories from those are assessed, prioritized, and estimated into Iteration Backlogs.] Copyright © 2012 Kaleb Walton, Brian Anderson, Michael Hughes and Terri Whitt
  18. 18. Contact Us: Kaleb Walton kalebwalton@gmail.com; Brian Anderson user.experience.guy@gmail.com. Thanks to other Experience Driven Agile contributors: Michael Hughes, Ph.D. michaelhughesua@gmail.com; Terri Whitt tw30306@yahoo.com. http://experiencedrivenagile.com
  19. 19. Usability Testing in Agile Environments
  20. 20. Any Method Can be Adapted Quick Bare minimum of effort Get needed feedback Provide recommendations Repeatable
  21. 21. Scope Effort Consider budget, resources Time  Recruiting  Facilitating  Analyzing Adding participants increases budget & time
  22. 22. Paper, Clickable or Real Code? Always start with paper  Guerrilla / hallway test  Users may misunderstand Clickable prototypes  Easier to understand  Can easily change Real Code  Great if it's the right solution
  23. 23. Paper or Clickable Prototype Rapid Iterative Testing & Evaluation (RITE) Traditional Testing  In-Person  Remote more challenging
  24. 24. Rapid Iterative Testing & Evaluation Qualitative user feedback  actions + comments Series of small usability tests 3 participants each day Minimum of 3 days of testing  Iteration between testing days  Total of 5 days
  25. 25. RITE Process [Diagram: a Test → Update → Test cycle. Priority vs. level of effort for updates: 1 = High, 2 = Medium, 3 = Low.]
  26. 26. Recap Sessions: end of each day, after the last session. Room with a whiteboard. About 30 minutes. Discuss: trends seen, concerns, recommendations; prioritize changes for the next round; list lower-priority changes for future iterations.
  27. 27. RITE Results Final prototype  Vetted with users  Base for recommendations Light Report: “Caterpillar to Butterfly”  Screenshots show progressions  What changes were made and why
  28. 28. What Works for RITE: best used early in the project lifecycle, for early concepts that need to be vetted with users; can assist in quickly shaping designs.
  29. 29. General Testing: traditional testing, in-person or remote (moderated or unmoderated). Fewer users, shorter sessions: analyze at lunch. Recommend 3 or more users, half an hour to 1 hour each.
  30. 30. Regular Testing (Yes, this is an old idea; a great one!)
  31. 31. Bring it On! Small focused tests Reduce waiting for recruitment Once per week or per Sprint Same day mid-week (not Monday or Friday)
  32. 32. User Testing Day! Make team aware Invite everyone  Watch remotely  Recurring meeting invites for stakeholders
  33. 33. What could I test? Identify what to test at start of Sprint  Work in Progress  Multiple projects  Prototypes  Concepts, rough ideas, brainstorming  Competing designs (A/B testing)  Comparative studies across market  Conduct interviews to inform research  More…
  34. 34. “Teams should stretch to get work into that day's test and use the cadence to drive productivity.” - Jeff Gothelf - http://blog.usabilla.com/5-effective-ways-for-usability-testing-to-play-nice-with-agile/
  35. 35. Why Regular? The team becomes accustomed to a steady stream of qualitative insight, and that insight ensures quick decisions line up with business and user goals. Adapted from Jeff Gothelf - http://blog.usabilla.com/5-effective-ways-for-usability-testing-to-play-nice-with-agile/
  36. 36. Include PWD (people with disabilities). “We are all only temporarily able-bodied. Accessibility is good for us all.” - @mollydotcom at #stirtrek 2011 via @carologic. Get to the spirit of the law (Section 508, WCAG 2.0).
  37. 37. Make it Repeatable
  38. 38. Pre-Book Your Rooms Test & Observation Rooms Any location will do  Conference rooms  Offices  Quiet corner of cafeteria  Remote
  39. 39. Create Reusable Templates Screener Technology use/experience Knowledge of topic Scripts/Guides Consent Forms Data Collection
  40. 40. Debriefing After Testing
  41. 41. Find Patterns Quickly
      Issue         | P1              | P2              | P3
      Search Used   | Yes             | No              | No
      Widget 1 Used | N/A             | Used            | Used – unsure about it
      Task 1 Notes  | 3 – easy        | 2 – needed help | 3 – easy
      Task 2 Notes  | 2 – needed help | 2 – easy        | 2 – needed help
      Task 3 Notes  | 2 – needed help | 3 – easy        | Ran out of time
      Task 4 Notes  | 2 – needed help | 3 – easy        | Ran out of time
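      Purely as an illustration (not from the deck), a grid like the one above can be tallied with a few lines of code so the most problematic tasks surface before the debrief. The participant IDs, the 1-3 difficulty scale, and the sample ratings below are assumptions copied from the example table, not a prescribed tool.

```python
from collections import defaultdict

# Hypothetical observation log mirroring the example table above:
# one entry per participant, task ratings on the deck's 1-3 scale (3 = easy),
# None where the participant ran out of time.
observations = {
    "P1": {"Task 1": 3, "Task 2": 2, "Task 3": 2, "Task 4": 2},
    "P2": {"Task 1": 2, "Task 2": 2, "Task 3": 3, "Task 4": 3},
    "P3": {"Task 1": 3, "Task 2": 2, "Task 3": None, "Task 4": None},
}

# Collect the ratings that exist for each task.
ratings = defaultdict(list)
for participant, tasks in observations.items():
    for task, score in tasks.items():
        if score is not None:
            ratings[task].append(score)

# Print tasks worst-first so the debrief starts with the most serious problems.
for task in sorted(ratings, key=lambda t: sum(ratings[t]) / len(ratings[t])):
    avg = sum(ratings[task]) / len(ratings[task])
    print(f"{task}: average {avg:.1f} from {len(ratings[task])} participants")
```

      Ranking by average difficulty is only one convention; counting how many participants needed help works just as well and matches the "most serious problems first" guidance on the Krug slides that follow.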
  42. 42. True Statements All interfaces have usability problems Limited resources to fix them More problems than resources Less serious problems distract Intense focus on fixing most serious problems first Adapted from: Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems. By Steve Krug
  43. 43. Debrief with Team Assumes stakeholders watched tests  If not, wait for UX analysis Quick analysis to quick decisions All decision makers MUST be present
  44. 44. Goal Identify top 5 or 10 most serious issues  Top 3 from each list  Prioritize from lists  Commit resources for next sprint  Stop Adapted from: Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems. By Steve Krug
  45. 45. Guidelines Stay on Topic Be Constructive Don’t get distracted by small problems Intense focus on fixing most serious problems first
  46. 46. Make Useful & Usable Recommendations - Quickly
  47. 47. Transform Data Look for patterns Read “between the lines” Know what you’ve got  Sort, reorganize, review, repeat  What refutes your expectations?  Surprises?  Outliers?
  48. 48. Short and Direct Communication: email or one-pager. Think about the audience and how it will be used. Include: goal of study; what will be fixed and who it is assigned to; tasks attempted; who observed; future research/enhancements.
  49. 49. Tweak, Don’t Redesign Small iterative changes  Make it better now  Don’t break something else Take something away  Reduce distractions  Don’t add – question it Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems. By Steve Krug
  50. 50. Do UX Early & Often Make users visible Information radiators  Test findings  Artifacts  Personas  Word Clouds - IA
  51. 51. Recommended Readings: Gothelf, Jeff. Lean UX: Getting Out of the Deliverables Business. (Anticipated in Feb. 2013)
  52. 52. Contact Carol @Carologic Email: Carologic@gmail.com SlideShare.net/Carologic SpeakerRate.com/speakers/15585-CarolJSmith
  53. 53. Tool Considerations • In-person or remote? • Lab or on-site? • Prototype limitations (can it be online? is it a document or a clickable site?) • Number of observers, number of participants? • Number of facilitators? • Logging and video editing needs (time on task, highlight video creation)? • Surveys before or after? • Eye tracking?
  54. 54. Usability Testing Software • Morae • Ovo • SilverBack (Mac only) • UserWorks • Noldus • Tobii (Eye-tracker) • SMI (Eye-tracker) • SurveyMonkey
  55. 55. Screen Sharing Software: GoToMeeting – http://www.gotomeeting.com • Lotus Sametime Unyte – http://www.unyte.com • YuuGuu – http://www.yuuguu.com • WebEx – http://www.webex.com • Yugma – https://www.yugma.com/ • Troubleshooting: CoPilot – https://www.copilot.com/
  56. 56. Recommended Sites Usability.gov W3C Web Accessibility Initiative  http://www.w3.org/WAI/ Accessibility Standards in US (Section 508)  http://www.access-board.gov/sec508/508standards.htm Jakob Nielsen  http://www.useit.com UPA – professional UX association  http://www.upainternational.org/
  57. 57. References
      Albert, Bill, Tom Tullis, and Donna Tedesco. Beyond the Usability Lab.
      Beyer, Hugh. User-Centered Agile Methods (Synthesis Lectures on Human-Centered Informatics).
      Gothelf, Jeff. http://blog.usabilla.com/5-effective-ways-for-usability-testing-to-play-nice-with-agile/
      Henry, S.L. and Martinson, M. Evaluating for Accessibility, Usability Testing in Diverse Situations. Tutorial, 2003 UPA Conference.
      Krug, Steve. Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems.
      Ratcliffe, Lindsay and Marc McNeill. Agile Experience Design: A Digital Designer's Guide to Agile, Lean, and Continuous.
      Rubin, Jeffrey and Dana Chisnell. Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. John Wiley & Sons, Inc.
