1
What to do when:
Informing design at every phase
Dana Chisnell
Web Design World
Seattle
July 2009

2
The team did a great job.
Something is wrong. But what?

3
Business requirements. Check.
IT constraints. Check.
Input from users. ...

4
How do you fit user research into the process?

5
There are simple things you can do
to inform design decisions:

6
Observe and listen to users
through 3 key questions

7
These phases
Requirements gathering
Design & test
Launch

8
Map to these questions
What should the design do?
How should it do it?
Does it do what we want it to do?

9
What does that look like?
What should the design do?
How should it do it?
Does it do what we want it to do?

10

11
What should the design do?

12

13
Learn who the users are

14
Brokers or administrators?

15
Doctors or patients?

16
Road warriors or vacationers?

17
๏ Who are the users?
๏ What are they like?
๏ How are they reaching task goals?

18
Knowing user goals ensures
the design will be successful

19
Ask, listen
Surveys
Focus groups

20
Observe related tasks
Usability test the previous release
Ethnographic methods
Field usability tests

21
Learn what is useful to users

22
Benefit
Value
Usefulness

23
If benefits aren’t clear,
uptake will be a struggle

24
Test competitors

25
Test inherent value
Current users
  rave or complain
  interview-based tasks

26
Test inherent value
New users
  tasks from current users
  verbalize benefits

27
Test inherent value
Analyze
  current and new users match?
  anyone match Marketing?

28
BobAdvisor

29
Proof the concept

30
Qualitative data validating ideas

31
Reality checkpoint:
translating research to requirements

32
Show early designs
Focus groups
Participatory design sessions

33
Test against benchmarks
Goals
Previous performance

34
How should the design work?

35

36
Prototype

37
Flow
Steps
Information architecture
Interaction

38
Minimize design risks,
experiment with ideas

39
Low fidelity
Paper-and-pencil drawings
Wireframes

40
High fidelity
Comps
Real data
Realistic look
Realistic interaction

41
Iterate (rapidly)

42
Experimenting = options, refinements

43
Engage the team in design,
shorten time to market

44
RITE
Rapid
Iterative
Testing
Evaluation

45
BobAdvisor

46
[Sample screen: BobStar* announcement of enhanced Quarterly Performance Statements]

47
Compare designs
Tough questions
Contentious decisions
* Must be radically different

48
Test in context

49
Insights beyond the lab

50
Environmental factors
Holistic view of task goals

51
Go to users
Narrow scope
Narrow task
User-documented data
Informal or formal

52
Does it do what we want it to do?

53

54
Follow up on previous releases
BTO

55
Error rates
Hard-to-solve problems
Remedies for the next release

56
Start somewhere

57
Classic usability test
Holistic and summative
OR
Examine localized problems

58
Compare to benchmarks

59
Compare, contrast quantitative data

60
Measure improvement
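Comparing a round of usability-test results against a benchmark from a previous release can be done with very little tooling. The sketch below is a minimal illustration, not part of the talk; the metrics (completion rate, time on task) are common choices, and all names and numbers are hypothetical.

```python
# Minimal sketch: compare usability-test metrics to benchmarks from a
# previous release. All data here is hypothetical.

def summarize(results):
    """Return (completion rate, mean time-on-task in seconds) for one round."""
    completed = [r for r in results if r["completed"]]
    rate = len(completed) / len(results)
    mean_time = sum(r["seconds"] for r in completed) / len(completed)
    return rate, mean_time

# Benchmark from the previous release (hypothetical numbers).
benchmark = {"rate": 0.70, "mean_time": 95.0}

# Current round: one record per participant attempting the task.
current = [
    {"completed": True,  "seconds": 62},
    {"completed": True,  "seconds": 75},
    {"completed": False, "seconds": 120},
    {"completed": True,  "seconds": 58},
]

rate, mean_time = summarize(current)
print(f"completion rate: {rate:.0%} (benchmark {benchmark['rate']:.0%})")
print(f"mean time on task: {mean_time:.0f}s (benchmark {benchmark['mean_time']:.0f}s)")
```

The same summary run against each release gives the trend line the slides call "measure improvement."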
61
Classic usability test
Against benchmarks
Defining benchmarks
Summative

62
Verify and validate

63
Quantitative data about success measures

64
Ensure goals are being met
Set priorities

65
Classic usability test
End-to-end
Refining tasks
How usable is the design?

66
[Sample screen: BobStar* announcement of enhanced Quarterly Performance Statements]

67
[Sample screen: Logo, Division name, Text]
Duane Chisnell shares his thoughts on investment management and trust.

68
What to do when

69
How do you fit user research into the process?

70
There are simple things you can do
to inform design decisions

71
Observe and listen to users
through 3 key questions

72
Take-aways
What should the design do?
  observe and listen to users
  test value
  proof the concept
  set benchmarks and goals
How should it work?
  prototype and iterate design
  test in context
Does it do what we want it to do?
  compare to benchmarks
  validate against usability goals

73
Where to learn more
Dana’s blog: http://usabilitytestinghowto.blogspot.com/
Download templates, examples, and links to other resources from www.wiley.com/go/usabilitytesting

74
Me
Dana Chisnell
dana@usabilityworks.net
www.usabilityworks.net
415.519.1148