Design Assignment Part B

Speaker notes
  • To refresh the audience on the current situation with ConRes: we have already completed high-level requirements, exploratory studies, and persona/scenario development
  • Utilised the DECIDE framework to structure and pinpoint our evaluation process:
    - Determine goals: the high-level goals of the evaluation process (e.g. ensure that users can navigate the system)
    - Explore the questions: the main tasks we would ask participants to complete and the questions we as evaluators would ask
    - Choose evaluation methods: promotes consideration of appropriate methods of evaluation, which we will be explaining within the presentation
    - Identify practical issues (e.g. sourcing appropriate participants, equipment, etc.)
    - Decide on ethical issues (e.g. privacy and security of submissions)
    - Evaluate: how we intend to analyse and utilise the data collected
  • Broke the system design into three main iterations, planned so that we did not encounter issues which would break the schedule:
    - Information architecture testing: ensures that the IA is appropriate before design commences
    - Ad-hoc evaluation, also known as 'quick and dirty', to ensure that low-fidelity prototypes are appropriate
    - Various forms of usability testing to ensure that the system meets user requirements and is appropriate for their use
    - Heuristic analysis: we used Nielsen's Heuristics
  • Brainstormed information architecture using process-oriented modelling (focussed on the main activities performed by users to generate the IA). Treejack is an IA evaluation tool provided by Optimal Workshop, a usability evaluation company in New Zealand
  • We can’t outline all results here, but here is a brief overview
  • Allowed the group to pinpoint sections of the design users were having trouble with
    - Identified issues with the menu and title bar structure which would otherwise have gone unnoticed until implementation
    - Also prompted fine-tuning of the terminology used within the system
  • Compliance with usability principles had been considered since Part A of this report; we factored that into the evaluation
  • Transcript

    • 1. ConRes: A Conflict Resolution System
      • Simon N. Reynolds
      • @simonnreynolds
      • [email_address]
      Matthew Ryan @mattieryan mattie.ryan@gmail.com
    • 2. Refresh
      • Mediation and prevention of workplace conflict
      • Developed high level requirements
      • Completed exploratory studies
      • Produced user personas and scenarios of use
        • Extrapolated detailed requirements of system for prototype designs
    • 3. Design Requirements
      • Utilised personas and scenarios of use
        • Detailed enough that no further analysis was required
      • Storyboard developed to provide additional perspective on requirements
    • 4. DECIDE Framework
      [Diagram: Goals, Questions, Evaluation Methods, Practical Issues, Ethical Issues, Analysis of Data]
    • 5. Evaluation Process
      [Diagram: Initial Design (IA Testing, Ad-hoc Evaluation) -> Revised Design (Usability Testing, Heuristic Analysis) -> Final Design (Field Testing)]
    • 6. Initial Design
      • Re-visited and re-analysed user requirements
      • Brainstormed information architecture
      • Researched various designs
      • Modeled on existing social networks
        • Reduction in learning curve
        • Simple, clean design
    • 7. Evaluation Procedure
      • Process-oriented design and brainstorm outlined tasks/elements
      • Treejack IA evaluation
        • Upload site map
        • Specify tasks and expected answer
        • Participants navigate and select an appropriate option
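      To make the tree-test mechanics concrete, here is a minimal sketch of how one task can be scored, assuming a hypothetical ConRes site map; the node labels and the scoreTask helper are invented for illustration and are not the project's actual IA or Optimal Workshop's API.

```ts
// Hypothetical ConRes site map; labels are illustrative only.
interface TreeNode {
  label: string;
  children?: TreeNode[];
}

const siteMap: TreeNode = {
  label: "Home",
  children: [
    {
      label: "My Cases",
      children: [{ label: "Open a dispute" }, { label: "Case history" }],
    },
    { label: "Mediation", children: [{ label: "Book a mediator" }] },
    { label: "Help" },
  ],
};

// A participant's navigation is the sequence of labels they clicked.
// The task succeeds if the path exists and ends at the expected node.
function scoreTask(root: TreeNode, path: string[], expected: string): boolean {
  let node: TreeNode = root;
  for (const label of path) {
    const next = node.children?.find((c) => c.label === label);
    if (!next) return false; // clicked a label that doesn't exist here
    node = next;
  }
  return node.label === expected;
}

// Example task: "report a new workplace conflict"
console.log(scoreTask(siteMap, ["My Cases", "Open a dispute"], "Open a dispute")); // true
console.log(scoreTask(siteMap, ["Help"], "Open a dispute")); // false
```

      Treejack itself aggregates this kind of per-path success across participants and shows where they first went wrong, which is the sort of evidence behind the terminology and hierarchy findings later in the deck.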
    • 8. Evaluation Procedure
      • Ad-hoc evaluation of low-level prototype candidates
        • Team members explain and justify design
        • Evaluated against usability principles
        • Most suitable design to continue development
    • 9. Findings
      • Initial information architecture ineffective
        • Terminology inappropriate
        • Hierarchy illogical for use
        • Tools helped to rectify
      • Prototypes benefited from proven IA
        • Simplified design and evaluation process
    • 10. Low-Fidelity Prototype
    • 11. Revised Design
      • Developed high-quality graphical designs within Photoshop
      • Implemented all changes recommended within previous evaluations
      • Better represented system interaction style
      • Used to complete usability testing and heuristic evaluation
    • 12. Evaluation Procedure
      • Task-driven system walkthrough
      • Participant completes various tasks whilst evaluator observes
        • User able to ask questions
        • Evaluator to informally interview about system suitability
      • Followed by structured questionnaire
    • 13. Evaluation Procedure
      • Interaction logging
      • Chalkmark evaluation tool
        • Specify a task to be completed
        • Uploaded prototype
        • User selects next logical option to complete task
        • Tool outlines user clicks and response times
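      As a rough sketch of what such interaction logging captures, the snippet below records a participant's first click and its response time in the browser; the ClickRecord shape and logFirstClick helper are assumptions for illustration, not Chalkmark's actual API.

```ts
// Shape of one logged interaction; invented for this sketch.
interface ClickRecord {
  target: string;    // tag name plus id of the clicked element
  x: number;
  y: number;
  elapsedMs: number; // time from task start to the click
}

// Record only the first click after the task is shown, which is
// what a first-click tool such as Chalkmark is interested in.
function logFirstClick(onRecord: (r: ClickRecord) => void): void {
  const taskShownAt = performance.now();
  document.addEventListener(
    "click",
    (e) => {
      const el = e.target as HTMLElement;
      onRecord({
        target: el.tagName + (el.id ? `#${el.id}` : ""),
        x: e.clientX,
        y: e.clientY,
        elapsedMs: performance.now() - taskShownAt,
      });
    },
    { once: true }
  );
}

// Usage: start logging when the prototype screen is displayed.
logFirstClick((r) => console.log("first click:", r));
```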
    • 14. Evaluation Procedure
      • Nielsen’s Heuristics
        • Ten principles which define an effective, usable system
      • Analysis conducted by design team
      • ConRes mid-fidelity prototypes generally met all principles
        • Recommended to improve terminology consistency and visibility of system status
      [Chart: success of Nielsen's heuristic evaluation against the number of evaluators]
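      For illustration, a heuristic analysis like the one above can be recorded as a simple checklist; the sketch below lists Nielsen's ten heuristics with his 0-4 severity-rating scale, and the problemsAbove helper is an invented convenience, not part of any tool used in the project.

```ts
// Nielsen's ten usability heuristics.
const nielsenHeuristics = [
  "Visibility of system status",
  "Match between system and the real world",
  "User control and freedom",
  "Consistency and standards",
  "Error prevention",
  "Recognition rather than recall",
  "Flexibility and efficiency of use",
  "Aesthetic and minimalist design",
  "Help users recognise, diagnose, and recover from errors",
  "Help and documentation",
] as const;

type Heuristic = (typeof nielsenHeuristics)[number];

// Nielsen's severity scale: 0 = no problem ... 4 = usability catastrophe.
type Severity = 0 | 1 | 2 | 3 | 4;
type Evaluation = Record<Heuristic, Severity>;

// Invented helper: list the heuristics one evaluator rated at or above
// a severity threshold, e.g. to prioritise fixes between iterations.
function problemsAbove(evaluation: Evaluation, threshold: Severity): Heuristic[] {
  return nielsenHeuristics.filter((h) => evaluation[h] >= threshold);
}
```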
    • 15. Findings
      • Many evaluations identified similar problems
      • These were all analysed and rectified within final high-fidelity prototypes
      • Mid-fidelity designs found to comply with various usability principles
    • 16. Mid-Fidelity Prototype
    • 17. Final Design
      • Implemented all recommendations of previous evaluations
      • Developed HTML shell of system
        • Outlines navigation
        • Mirrors interaction style of final system
        • Assists in visualisation of system and its role in the resolution of workplace conflict
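      A minimal sketch of the kind of static HTML shell described above, assuming hash-based navigation that swaps placeholder views; the section names and markup are invented for illustration, not the actual ConRes shell.

```ts
// Placeholder views keyed by hash fragment; names are invented.
const views: Record<string, string> = {
  home: "<h1>ConRes</h1><p>Placeholder home view</p>",
  cases: "<h1>My Cases</h1><p>Placeholder case list</p>",
  help: "<h1>Help</h1><p>Placeholder help content</p>",
};

// Swap the main view whenever the hash changes, so nav links such as
// <a href="#cases"> click through the way the final system would.
function navigate(): void {
  const view = location.hash.replace("#", "") || "home";
  const main = document.querySelector("main");
  if (main) main.innerHTML = views[view] ?? "<p>Not found</p>";
}

window.addEventListener("hashchange", navigate);
window.addEventListener("DOMContentLoaded", navigate);
```

      Hash routing keeps the shell a single static file, which suits a functionally incomplete prototype used only to demonstrate navigation and interaction style.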
    • 18. Evaluation Procedure
      • Field Testing
      • Ensures all previous evaluations were effective
      • Occurs within normal environment of use
      • Conducted using functionally incomplete prototype
        • Simulated use of system
    • 19. Findings
      • Participants felt prototype had evolved appropriately for everyday needs
      • Determined no further revision was required
      • Recommended to proceed with functional development and testing
    • 20. High-Fidelity Prototype
      • Demonstration
    • 21. Thank you
      • Simon N. Reynolds
      • @simonnreynolds
      • [email_address]
      Matthew Ryan @mattieryan mattie.ryan@gmail.com
