Design Assignment Part B
Speaker Notes

  • To refresh the audience on the current situation with ConRes: we have already completed high-level requirements, exploratory studies, and persona/scenario development.
  • Utilised the DECIDE framework to structure and pinpoint our evaluation process:
    - Determine goals: the high-level goals of the evaluation process (e.g. ensure that users can navigate the system)
    - Explore the questions: the main tasks we would ask participants to complete, and the questions we as evaluators would be asking
    - Evaluation methods: promotes consideration of appropriate methods of evaluation, which we will be explaining within the presentation
    - Identification of practical issues (e.g. sourcing appropriate participants, equipment, etc.)
    - Consideration of ethical issues (e.g. privacy and security of submissions)
    - How we intend to analyse and utilise the data collected
  • Broke the system design into three main iterations, planned so that we did not encounter issues which would break the schedule:
    - Information architecture testing: ensures that the IA is appropriate before design commences
    - Ad-hoc evaluation, also known as 'quick and dirty', to ensure that low-fidelity prototypes are appropriate
    - Various forms of usability testing, to ensure that the system meets user requirements and is appropriate for their use
    - Heuristic analysis: we used Nielsen's heuristics
  • Brainstormed the information architecture using process-oriented modelling (focussed on the main activities performed by users to generate the IA). Treejack is an IA evaluation toolset provided by Optimal Workshop, a UI evaluation house in New Zealand.
  • We can’t outline all results here, but here is a brief overview
  • Findings:
    - Allowed the group to pinpoint sections of the design users were having trouble with
    - Identified issues with the menu and title-bar structure which would otherwise have gone unnoticed until implementation
    - Prompted fine-tuning of the terminology used within the system
  • Compliance with usability principles had already been considered in Part A of this report, and we factored that into the analysis.

Design Assignment Part B: Presentation Transcript

  • ConRes: A Conflict Resolution System
    • Simon N. Reynolds
    • @simonnreynolds
    • [email_address]
    • Matthew Ryan
    • @mattieryan
    • mattie.ryan@gmail.com
  • Refresh
    • Mediation and prevention of workplace conflict
    • Developed high level requirements
    • Completed exploratory studies
    • Produced user personas and scenarios of use
      • Extrapolated detailed requirements of system for prototype designs
  • Design Requirements
    • Utilised personas and scenarios of use
      • Detailed enough that further analysis was not required
    • Storyboard developed to provide additional perspective on requirements
  • DECIDE Framework
    (diagram: Goals, Questions, Evaluation Methods, Practical Issues, Ethical Issues, Analysis of Data)
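The DECIDE elements above map naturally onto a simple checklist. The following is a minimal sketch, in Python, of how an evaluation plan can be lined up with the six steps; the field names and example entries are our own illustration drawn from the notes above, not a published format.

```python
# Hypothetical sketch of a DECIDE-structured evaluation plan for ConRes.
# Keys follow the six DECIDE steps; values are example entries only.
decide_plan = {
    "determine_goals":          ["Ensure users can navigate the system"],
    "explore_questions":        ["Can a user lodge a submission unaided?"],
    "choose_evaluation_methods": ["IA testing", "ad-hoc evaluation",
                                  "usability testing", "heuristic analysis",
                                  "field testing"],
    "identify_practical_issues": ["Sourcing appropriate participants",
                                  "equipment"],
    "decide_ethical_issues":    ["Privacy and security of submissions"],
    "evaluate_analyse_data":    ["Task success rates",
                                 "questionnaire responses"],
}

# Print the plan as a flat checklist.
for step, items in decide_plan.items():
    print(f"{step}: {', '.join(items)}")
```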
  • Evaluation Process
    (diagram: Initial Design → Revised Design → Final Design, evaluated via IA Testing, Ad-hoc Evaluation, Usability Testing, Heuristic Analysis, and Field Testing)
  • Initial Design
    • Re-visited and re-analysed user requirements
    • Brainstormed information architecture
    • Researched various designs
    • Modelled on existing social networks
      • Reduction in learning curve
      • Simple, clean design
  • Evaluation Procedure
    • Process-oriented design and brainstorm outlined tasks/elements
    • Treejack IA evaluation
      • Upload site map
      • Specify tasks and expected answer
      • Participants navigate and select an appropriate option
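Treejack itself reports results such as task success and directness. As a rough illustration of the underlying idea (this is not Optimal Workshop's API or data format, just a hypothetical sketch), a tree-test task can be scored by checking where a participant's chosen path through the site map ends, and whether they backtracked along the way:

```python
def score_task(expected: str, chosen_path: list[str]) -> str:
    """Classify one participant's navigation for one tree-test task."""
    if chosen_path and chosen_path[-1] == expected:
        # A node visited more than once indicates backtracking,
        # so distinguish direct from indirect success.
        direct = len(chosen_path) == len(set(chosen_path))
        return "direct success" if direct else "indirect success"
    return "failure"

# Hypothetical task: find the conflict-reporting page in the IA.
print(score_task("Report a Conflict",
                 ["Home", "Conflicts", "Report a Conflict"]))
# -> direct success
```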
  • Evaluation Procedure
    • Ad-hoc evaluation of low-level prototype candidates
      • Team members explain and justify design
      • Evaluated against usability principles
      • Most suitable design to continue development
  • Findings
    • Initial information architecture ineffective
      • Terminology inappropriate
      • Hierarchy illogical for use
      • Tools helped to rectify
    • Prototypes benefited from proven IA
      • Simplified design and evaluation process
  • Low-Fidelity Prototype
  • Revised Design
    • Developed high-quality graphical designs within Photoshop
    • Implemented all changes recommended within previous evaluations
    • Better represented system interaction style
    • Used to complete usability testing and heuristic evaluation
  • Evaluation Procedure
    • Task-driven system walkthrough
    • Participant completes various tasks whilst evaluator observes
      • User able to ask questions
      • Evaluator to informally interview about system suitability
    • Followed by structured questionnaire
  • Evaluation Procedure
    • Interaction logging
    • Chalkmark evaluation tool
      • Specify a task to be completed
      • Uploaded prototype
      • User selects next logical option to complete task
      • Tool outlines user clicks and response times
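Below is a minimal sketch of the kind of summary a first-click tool such as Chalkmark produces from its interaction logs: the proportion of participants whose first click landed on the expected control, plus a median response time. The log format and values here are hypothetical, assumed only for illustration.

```python
from statistics import median

# Hypothetical first-click log: (participant, clicked element, seconds).
clicks = [
    ("p1", "Report a Conflict", 2.4),
    ("p2", "Report a Conflict", 3.1),
    ("p3", "My Cases", 5.0),
]
expected = "Report a Conflict"

# Success rate: first clicks that hit the expected element.
hits = [t for _, target, t in clicks if target == expected]
print(f"first-click success: {len(hits) / len(clicks):.0%}")
print(f"median time to first click: {median(t for *_, t in clicks):.1f}s")
```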
  • Evaluation Procedure
    • Nielsen’s Heuristics
      • Ten principles which define an effective, usable system
    • Analysis conducted by design team
    • ConRes mid-fidelity prototypes generally met all principles
      • Recommended to improve terminology consistency and visibility of system status
    (chart: success of heuristic evaluation vs. number of evaluators)
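As a hypothetical illustration (our own format, not taken from the report), findings from a heuristic analysis can be tallied per heuristic to show where a design is weakest; the example entries below mirror the two recommendations above.

```python
from collections import Counter

# Hypothetical findings: (Nielsen heuristic violated, severity 0-4).
findings = [
    ("Consistency and standards", 2),   # terminology consistency
    ("Visibility of system status", 2),
    ("Consistency and standards", 1),
]

# Count issues per heuristic, worst-offending heuristics first.
by_heuristic = Counter(h for h, _ in findings)
for heuristic, count in by_heuristic.most_common():
    print(f"{heuristic}: {count} issue(s)")
```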
  • Findings
    • Many evaluations identified similar problems
    • These were all analysed and rectified within final high-fidelity prototypes
    • Mid-fidelity designs found to comply with various usability principles
  • Mid-Fidelity Prototype
  • Final Design
    • Implemented all recommendations of previous evaluations
    • Developed HTML shell of system
      • Outlines navigation
      • Mirrors interaction style of final system
      • Assists in visualisation of system and its role in the resolution of workplace conflict
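Since the shell is static HTML, one plausible way to put it in front of field-test participants is to serve it locally with the Python standard library; the report does not specify any tooling, so the approach and directory name below are assumptions.

```python
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Serve the directory holding the prototype's HTML shell on port 8000,
# so participants can click through the navigation in a browser.
handler = partial(SimpleHTTPRequestHandler, directory="conres_shell")
HTTPServer(("localhost", 8000), handler).serve_forever()
```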
  • Evaluation Procedure
    • Field Testing
    • Ensures all previous evaluations effective
    • Occurs within normal environment of use
    • Conducted using functionally incomplete prototype
      • Simulated use of system
  • Findings
    • Participants felt prototype had evolved appropriately for everyday needs
    • Determined no further revision was required
    • Recommended to proceed with functional development and testing
  • High-Fidelity Prototype
    • Demonstration
  • Thank you
    • Simon N. Reynolds
    • @simonnreynolds
    • [email_address]
    • Matthew Ryan
    • @mattieryan
    • mattie.ryan@gmail.com