This talk was given at the InfinIT event "Usability-evaluering i softwareudvikling" (usability evaluation in software development), held on 16 September 2010. Read more about the event here: http://infinit.dk/dk/hvad_kan_vi_goere_for_dig/viden/reportager/usability-evaluering_paa_forkant_er_rigtig_god_business.htm
4. Reducing resource consumption: Procedure
Tests (4-6 hours)
• Conduct 4-6 think-aloud sessions with the test monitor and a data logger (who takes notes) present
Analysis (2-2½ hours)
• Conduct a 1-hour brainstorming and data analysis session
• Articulate and discuss the most critical problems of the system
• Rate the severity of the problems (e.g. as critical, serious or cosmetic) and categorize them into themes (as they emerge)
• The discussion is managed by the IDA facilitator, who asks clarifying questions and writes the problems on a whiteboard or flip chart
• Use printed screenshots and written notes to support the overview
• Spend 1-1½ hours writing up the content of the whiteboard into a ranked list of problems with clear references to the system
• Review the problem list together to reach final consensus
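The write-up step above can be sketched in code. This is a minimal illustration, not part of the talk; the example problems, field names and severity labels are invented for the sketch:

```python
# Hypothetical sketch: turning problems noted on the whiteboard into a
# ranked list, ordered by severity as in the IDA write-up step.
SEVERITY_RANK = {"critical": 0, "serious": 1, "cosmetic": 2}

# Invented example problems; each keeps a clear reference to the system.
problems = [
    {"desc": "Search field ignores the Enter key", "severity": "serious", "ref": "search page"},
    {"desc": "Error message shows internal codes", "severity": "critical", "ref": "login dialog"},
    {"desc": "Inconsistent link colours", "severity": "cosmetic", "ref": "front page"},
]

# Sort critical problems first, then serious, then cosmetic.
ranked = sorted(problems, key=lambda p: SEVERITY_RANK[p["severity"]])
for i, p in enumerate(ranked, 1):
    print(f"{i}. [{p['severity']}] {p['desc']} ({p['ref']})")
```

The severity labels double as the ranking key, so the printed list is the ranked problem list the procedure asks for.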
5. Reducing resource consumption
           Instant Data Analysis   Video Data Analysis   Total
Critical            11                     12              13
Serious             15                     15              22
Cosmetic            15                     19              27
Total               41                     46              62
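To put the totals above in perspective, a few illustrative lines of Python (not from the talk) compute the share of all 62 identified problems that each analysis method uncovered:

```python
# Counts taken from the comparison table; 62 is the total number of
# distinct problems found across both analysis methods.
found = {"Instant Data Analysis": 41, "Video Data Analysis": 46}
total = 62
for method, n in found.items():
    print(f"{method}: {n}/{total} = {n / total:.0%}")
```

Instant Data Analysis finds roughly two thirds of the problems at a fraction of the analysis cost of video data analysis.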
7. Training of developers
• Usability evaluation and user interaction design are two separate activities in software development
• This separation is often carried through as a complete separation of work between evaluators and developers
8. Training Course
• Teach software developers and designers to conduct usability evaluations
• Provide participants with skills in formative usability evaluation
• No prerequisites
• It is done in a week
• Result: a usability report

Lectures:
1. Introduction to the course and basic website technology
2. Basic introduction to usability issues and guidelines for interaction design
3. The think-aloud protocol and how to set up a test scenario. User groups and their different needs
4. Application of questionnaires for collecting data and how to use different kinds of questions
5. Computer architecture and website technology
6. Describing the usability testing method and how to collect and analyze empirical data
7. Other usability evaluation methods and how to conduct a full-scale usability test session
8. Website structures, information search and web surfing
9. Guidelines for website design and principles for orientation and navigation
10. Principles for visual design and different interaction styles

Exercises:
• Pilot test: Each team conducts simple pilot usability tests of websites to train their practical skills in usability evaluation. The teams choose the website themselves. Experience with conducting tests and the results achieved are discussed afterwards.
• Usability evaluation: The teams conduct a usability evaluation of the Hotmail website according to a specification provided by the course instructors. The usability evaluations are conducted at the university in assigned rooms for each team. After the usability test sessions, the teams analyze the empirical data and make a usability report that describes the identified usability problems.
9. Results (1)

Evaluation
Teams                Conducting the   Task quality     Questionnaire/
                     evaluation       and relevance    Interviews
Student (N=36)       3.42 (0.73)      3.22 (1.05)      2.72 (1.00)
Professional (N=8)   4.38 (0.74)      3.13 (1.64)      3.50 (1.69)

• The students did quite well in conducting the evaluation
• The professionals did significantly better
• On task quality and relevance the students seem to do better than the professionals (but the difference is not significant)
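The talk does not state which statistical test underlies the significance claims. Purely as an illustration, Welch's t statistic can be computed from the reported means, standard deviations and group sizes, here for the "Conducting the evaluation" scores:

```python
import math

# Summary statistics from the table above: (mean, standard deviation, N).
student = (3.42, 0.73, 36)
professional = (4.38, 0.74, 8)

def welch_t(a, b):
    """Welch's t statistic for two groups given as (mean, sd, n)."""
    (m1, s1, n1), (m2, s2, n2) = a, b
    return (m2 - m1) / math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)

t = welch_t(student, professional)
print(f"t = {t:.2f}")
```

A |t| of this size (about 3.3) is consistent with the professionals scoring significantly higher on this measure, though the original analysis may have used a different test.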
10. Results (2)

Report
Teams                Test          Data quality   Clarity of     Executive     Clarity of    Layout of
                     description                  problem list   summary       report        report
Student (N=36)       3.03 (0.94)   3.19 (1.33)    2.53 (1.00)    2.39 (0.80)   2.97 (0.84)   2.94 (0.89)
Professional (N=8)   4.00 (1.31)   2.13 (0.83)    3.50 (0.93)    3.38 (1.06)   4.25 (0.71)   3.25 (0.71)

• The students did well on describing the test and providing underlying data in appendices
• The worst student performance was in clarity of the problem list and the executive summary
11. Results (3)

Results
Team                 Number of     Problem          Qualitative   Quantitative   Practical     Use of        Evaluation    Conclusion
                     problems      categorization   results       results        relevance     literature    of test
                                                    overview      overview
Student (N=36)       2.56 (0.84)   2.06 (1.22)      3.03 (1.00)   3.03 (1.00)    2.28 (1.14)   3.08 (0.81)   2.64 (0.90)   2.44 (1.08)
Professional (N=8)   4.13 (1.13)   3.25 (1.75)      4.25 (1.49)   3.75 (1.16)    2.00 (1.51)   3.13 (0.35)   3.88 (0.64)   2.88 (1.13)

• The students did poorly on problem identification and description
• Both groups did poorly in describing the quantitative results (efficiency and effectiveness)