Design Assignment Part B

Speaker notes

  • To refresh the audience on the current state of ConRes: we have already completed the high-level requirements, the exploratory studies, and the persona/scenario development.
  • We utilised the DECIDE framework to structure and pinpoint our evaluation process:
    - Determine goals: the high-level goals of the evaluation process (e.g. ensuring that users can navigate the system)
    - Explore the questions: the main tasks we would ask participants to complete, and the questions we as evaluators would be asking
    - Choose evaluation methods: prompts consideration of appropriate methods of evaluation, which we will explain within the presentation
    - Identify practical issues (e.g. sourcing appropriate participants, equipment, etc.)
    - Decide on ethical issues (e.g. privacy and security of submissions)
    - Evaluate the data: how we intend to analyse and utilise the data collected
  • We broke the system design into three main iterations, planned so that we would not encounter issues that could break the schedule:
    - Information architecture (IA) testing, to ensure the IA is appropriate before design commences
    - Ad-hoc evaluation, also known as 'quick and dirty', to ensure that the low-fidelity prototypes are appropriate
    - Various forms of usability testing, to ensure that the system meets user requirements and is appropriate for their use
    - Heuristic analysis, for which we used Nielsen's heuristics
  • We brainstormed the information architecture using process-oriented modelling, focussing on the main activities performed by users to generate the IA. Treejack is an IA evaluation tool provided by Optimal Workshop, a usability evaluation company based in New Zealand.
  • We can’t outline all results here, but here is a brief overview
  • These evaluations:
    - Allowed the group to pinpoint sections of the design users were having trouble with
    - Identified issues with the menu and title-bar structure which would otherwise have gone unnoticed until implementation
    - Also prompted fine-tuning of the terminology used within the system
  • Compliance with usability principles had been considered since Part A of this report, and we factored that in.
Transcript

    1. ConRes: A Conflict Resolution System
       • Simon N. Reynolds, @simonnreynolds, [email_address]
       • Matthew Ryan, @mattieryan, mattie.ryan@gmail.com
    2. Refresh
       • Mediation and prevention of workplace conflict
       • Developed high-level requirements
       • Completed exploratory studies
       • Produced user personas and scenarios of use
         - Extrapolated detailed requirements of the system for the prototype designs
    3. Design Requirements
       • Utilised personas and scenarios of use
         - Detailed enough that further analysis was not required
       • Storyboard developed to provide an additional perspective on requirements
    4. DECIDE Framework
       [diagram of the framework's six components: Goals, Questions, Evaluation Methods, Practical Issues, Ethical Issues, Analysis of Data]
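For concreteness, the speaker notes above amount to a filled-in DECIDE checklist for ConRes. The minimal Python sketch below restates that plan as data; the dict structure and field wording are added here for illustration and are not part of the original report.

```python
# The ConRes evaluation plan expressed against the DECIDE checklist.
# Entries are paraphrased from the speaker notes; the structure itself
# is illustrative only.
decide_plan = {
    "Determine goals": ["Ensure users can navigate the system"],
    "Explore the questions": ["Main tasks participants will complete",
                              "Questions the evaluators will ask"],
    "Choose evaluation methods": ["IA testing", "Ad-hoc evaluation",
                                  "Usability testing", "Heuristic analysis",
                                  "Field testing"],
    "Identify practical issues": ["Sourcing appropriate participants", "Equipment"],
    "Decide on ethical issues": ["Privacy and security of submissions"],
    "Evaluate the data": ["How the collected data will be analysed and utilised"],
}

for step, items in decide_plan.items():
    print(f"{step}: {'; '.join(items)}")
```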
    5. Evaluation Process
       [diagram: three design iterations (Initial Design, Revised Design, Final Design) with their evaluations: IA Testing, Ad-hoc Evaluation, Usability Testing, Heuristic Analysis, and Field Testing]
    6. Initial Design
       • Re-visited and re-analysed user requirements
       • Brainstormed information architecture
       • Researched various designs
       • Modelled on existing social networks
         - Reduction in learning curve
         - Simple, clean design
    7. Evaluation Procedure
       • Process-oriented design and brainstorming outlined the tasks/elements
       • Treejack IA evaluation (sketched below):
         - Upload the site map
         - Specify tasks and the expected answer
         - Participants navigate and select an appropriate option
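Treejack itself is a hosted tool, but the tree-test method it implements is easy to sketch: participants see only the IA hierarchy and pick where they would expect to find something. The sketch below uses a hypothetical site map, tasks, and participant path, invented for illustration; they are not taken from the actual ConRes design.

```python
# A tree test (the method behind Treejack) in miniature.

SITE_MAP = {
    "Home": {
        "Disputes": {"Lodge a dispute": {}, "My open disputes": {}},
        "Mediation": {"Schedule a session": {}, "Session history": {}},
        "Account": {"Profile": {}, "Notifications": {}},
    }
}

def path_exists(tree, path):
    """Validate that a navigation path is actually present in the site map."""
    node = tree
    for label in path:
        if label not in node:
            return False
        node = node[label]
    return True

# Each task pairs an instruction with the path evaluators expect to be taken.
TASKS = {
    "Report a new conflict": ["Home", "Disputes", "Lodge a dispute"],
    "Find your next mediation session": ["Home", "Mediation", "Schedule a session"],
}

def direct_success(task, participant_path):
    """A participant directly succeeds when their path matches the expected one."""
    return participant_path == TASKS[task]

# Sanity-check the expected answers against the uploaded site map...
assert all(path_exists(SITE_MAP, path) for path in TASKS.values())
# ...then score one participant's recorded navigation.
print(direct_success("Report a new conflict",
                     ["Home", "Mediation", "Session history"]))  # False: an IA problem signal
```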
    8. Evaluation Procedure
       • Ad-hoc evaluation of the low-fidelity prototype candidates
         - Team members explain and justify each design
         - Evaluated against usability principles
         - The most suitable design continues into development
    9. Findings
       • Initial information architecture was ineffective
         - Terminology inappropriate
         - Hierarchy illogical for use
         - The tools helped to rectify this
       • Prototypes benefited from a proven IA
         - Simplified the design and evaluation process
    10. Low-Fidelity Prototype
    11. Revised Design
       • Developed high-quality graphical designs within Photoshop
       • Implemented all changes recommended within previous evaluations
       • Better represented the system's interaction style
       • Used to complete usability testing and heuristic evaluation
    12. Evaluation Procedure
       • Task-driven system walkthrough
       • Participant completes various tasks whilst the evaluator observes
         - User is able to ask questions
         - Evaluator informally interviews the participant about the system's suitability
       • Followed by a structured questionnaire
    13. Evaluation Procedure
       • Interaction logging
       • Chalkmark evaluation tool (sketched below):
         - Specify a task to be completed
         - Upload the prototype
         - User selects the next logical option to complete the task
         - The tool outlines user clicks and response times
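What a first-click tool such as Chalkmark fundamentally records per task is the participant's first click position and a response time. A minimal sketch of such an interaction log follows; the field names are invented for illustration, and Chalkmark's own data format will differ.

```python
import time
from dataclasses import dataclass, field

@dataclass
class ClickRecord:
    """One first click for a task: where the participant clicked and how long they took."""
    task: str
    x: int
    y: int
    seconds: float

@dataclass
class FirstClickLog:
    """Accumulates first-click records across tasks."""
    records: list = field(default_factory=list)
    _started: float = 0.0

    def start_task(self):
        # Called when the prototype screenshot is shown to the participant.
        self._started = time.monotonic()

    def record_click(self, task, x, y):
        # Response time is measured from when the screenshot appeared.
        self.records.append(ClickRecord(task, x, y, time.monotonic() - self._started))

# Usage: start the timer when the prototype appears, then log the first click.
log = FirstClickLog()
log.start_task()
log.record_click("Lodge a dispute", x=412, y=88)
print(log.records[0])
```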
    14. Evaluation Procedure
       • Nielsen's heuristics (see the checklist sketch below)
         - Ten principles which define an effective, usable system
       • Analysis conducted by the design team
       • The ConRes mid-fidelity prototypes generally met all principles
         - Recommended improving terminology consistency and visibility of system status
       [chart: success of Nielsen's heuristics against the number of evaluators]
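The ten heuristics referred to here are Nielsen's well-known 1994 set. One simple way to run the team analysis this slide describes is a per-heuristic severity checklist, sketched below: the 0-4 severity scale follows Nielsen's convention, and the example ratings are invented, though they flag the same two problem areas the team reported (terminology consistency and visibility of system status).

```python
# Nielsen's ten usability heuristics used as a severity checklist.
# Severity scale: 0 = no problem ... 4 = usability catastrophe.
NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognise, diagnose, and recover from errors",
    "Help and documentation",
]

def report(ratings):
    """Flag any heuristic whose worst rating across evaluators is 3 or higher."""
    for heuristic in NIELSEN_HEURISTICS:
        worst = max(ratings.get(heuristic, [0]))
        if worst >= 3:
            print(f"NEEDS WORK (severity {worst}): {heuristic}")

# Example: two evaluators' ratings for the two problem areas the team found.
report({
    "Visibility of system status": [3, 2],
    "Consistency and standards": [3, 3],
})
```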
    15. Findings
       • Many evaluations identified similar problems
       • These were all analysed and rectified within the final high-fidelity prototypes
       • The mid-fidelity designs were found to comply with the various usability principles
    16. Mid-Fidelity Prototype
    17. Final Design
       • Implemented all recommendations from previous evaluations
       • Developed an HTML shell of the system
         - Outlines navigation
         - Mirrors the interaction style of the final system
         - Assists in visualising the system and its role in the resolution of workplace conflict
    18. Evaluation Procedure
       • Field testing
       • Ensures all previous evaluations were effective
       • Occurs within the normal environment of use
       • Conducted using a functionally incomplete prototype
         - Simulated use of the system
    19. Findings
       • Participants felt the prototype had evolved appropriately for their everyday needs
       • Determined that no further revision was required
       • Recommended proceeding with functional development and testing
    20. High-Fidelity Prototype
       • Demonstration
    21. Thank you
       • Simon N. Reynolds, @simonnreynolds, [email_address]
       • Matthew Ryan, @mattieryan, mattie.ryan@gmail.com
