To refresh the audience on the current situation with ConRes, we have already completed the following tasks: - High-level requirements - Exploratory studies - Persona/scenario development
Utilised the DECIDE framework to structure and pinpoint our evaluation process: - Determine goals: the high-level goals of the evaluation process (e.g. ensuring that users can navigate the system) - Explore the questions: the main tasks we would ask participants to complete and the questions we as evaluators would be asking - Choose the evaluation methods: prompts consideration of appropriate methods of evaluation, which we will be explaining within the presentation - Identify practical issues (e.g. sourcing appropriate participants, equipment, etc.) - Decide how to deal with ethical issues (e.g. privacy and security of submissions) - Evaluate, analyse, and present: how we intend to analyse and utilise the data collected
Broke the system design into three main iterations, planned so that we would not encounter issues that would break the schedule: - Information architecture (IA) testing: ensures that the IA is appropriate before design commences - Ad-hoc testing, also known as 'quick and dirty' testing, to ensure that low-fidelity prototypes are appropriate - Various forms of usability testing to ensure that the system meets user requirements and is appropriate for their use - Heuristic analysis: we used Nielsen's heuristics
Brainstormed the information architecture using process-oriented modelling (focussed on the main activities performed by users to generate the IA). Treejack is an IA evaluation (tree-testing) tool provided by Optimal Workshop, a UX evaluation company in New Zealand.
We can't outline all the results here, but here is a brief overview:
- Allowed the group to pinpoint the sections of the design that users were having trouble with - Identified issues with the menu and title-bar structure that would otherwise have gone unnoticed until implementation - Also prompted fine-tuning of the terminology used within the system
Compliance with usability principles was considered from Part A of this report onwards; we factored those principles into the evaluation.