Chounta avouris limassol2011
Published: Presentation slides from the 1st European Workshop on HCI Design and Evaluation, April 8th, Cyprus

Transcript of "Chounta avouris limassol2011"

  1. Groupware Evaluation: An Overview
     Irene-Angelica Chounta, Nikolaos Avouris
     University of Patras, Greece
  2. Overview
     • A brief review of groupware evaluation
     • Lessons learned from evaluation studies
     • Discussion
  3. Groupware Applications
     Software systems that support people involved in a common task to achieve their goals.
  4. Groupware Evaluation
     The assessment of the strengths and weaknesses of groupware applications and systems.
     Groupware systems are expected to support:
     – communication between partners
     – awareness of others' actions
     – the establishment of shared understanding and goals
  5. Objectives of groupware evaluation
     • the groupware application used as an interactive system
     • the mechanics of group interactions or communication means used
     • the collaborative experience
     • the group activity outcome
  6. Methods based on single-user evaluation practice
     • use of heuristics
     • user testing
     • interviews & questionnaires
     • focus groups
     • ethnographic methods
     Question: Are these enough to capture the groupware case completely? How easily are they applied to groupware (e.g. groupware evaluation in a real-case scenario)?
  7. Groupware-specific methods
     • Mechanics of collaboration: collaboration, communication and awareness aspects are analyzed and mapped through small-scale actions (Gutwin & Greenberg, 2000)
     • Collaboration Usability Analysis (CUA): high- and low-level representation of collaborative activity and interactions through field studies and task analysis techniques (Pinelle et al., 2003)
     Gutwin, C., Greenberg, S. "The Mechanics of Collaboration: Developing Low Cost Usability Evaluation Methods for Shared Workspaces" (2000)
     Pinelle, D., Gutwin, C., Greenberg, S. "Task analysis for groupware usability evaluation: Modeling shared-workspace tasks with the mechanics of collaboration" (2003)
  8. Analytical methods
     • Groupware Task Analysis (GTA): hierarchical task analysis in combination with human information processing models, focusing on the triplet people, work, situation (Van der Veer et al., 1996)
     • Distributed GOMS (DGOMS): representation of group activity to predict execution time, distribution of workload and other performance variables (Min et al., 1999)
     Van der Veer, G.C., Lenting, B.F., Bergevoet, B.A.J. "GTA: Groupware task analysis -- Modeling complexity" (1996)
     Min, D., Koo, S., Chung, Y.H., Kim, B. "Distributed GOMS: an extension of GOMS to group task" (1999)
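The core idea behind GOMS-style prediction is summing primitive operator times per member, with the group bounded by its slowest member when members work in parallel. A minimal sketch of that idea; the operator names and durations here are invented for illustration and are not taken from the DGOMS paper, which defines a much richer model:

```python
# Illustrative GOMS-style execution-time prediction for a group task.
# Operator names and durations below are hypothetical placeholder values.

OPERATOR_TIME = {      # seconds per primitive operator (assumed)
    "point": 1.1,
    "click": 0.2,
    "type_word": 0.6,
    "read_message": 2.0,
    "verify": 1.35,
}

def member_time(operators):
    """Predicted execution time for one member's operator sequence."""
    return sum(OPERATOR_TIME[op] for op in operators)

def group_time(assignments):
    """Members work in parallel, so the group finishes with the slowest member."""
    return max(member_time(ops) for ops in assignments.values())

assignments = {
    "member_a": ["read_message", "point", "click", "verify"],
    "member_b": ["type_word", "type_word", "click"],
}
print(group_time(assignments))  # the slowest member dominates
```

A real DGOMS analysis would also model communication operators between members and the distribution of workload, not just the critical path.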
  9. Groupware-specific frameworks
     • Participatory evaluation (PETRA): a framework designed to address both theoretical concerns and practical design issues of groupware evaluation (Ross et al., 1995)
     • Modeling and mapping awareness within a collaborative setting, focusing on the central relationships underlying the processes of distributed group work (Neale et al., 2004)
     • Breakdown analysis: classifying different breakdown and repair scenarios to highlight design implications (Hartswood & Procter, 2000)
     Ross, S., Ramage, M., Rogers, Y. "PETRA: participatory evaluation through redesign and analysis" (1995)
     Neale, D.C., Carroll, J.M., Rosson, M.B. "Evaluating computer-supported cooperative work: models and frameworks" (2004)
     Hartswood, M., Procter, R. "Design guidelines for dealing with breakdowns and repairs in collaborative work settings" (2000)
  10. Examples of Evaluation Studies
     • traditional HCI methods were combined with groupware evaluation methodologies
     • special emphasis on the effectiveness of alternative awareness mechanisms
     • qualitative analysis of video to assess the collaborative activity
  11. Study A
     • Evaluation of a web-based argumentation tool used by communities of practice [1]
     • heuristic evaluation for the single-user interface was combined with Groupware Heuristic Evaluation
     • both synchronous and asynchronous collaboration were studied
     • overall, fifty (50) participants took part in the study
     [1] Chounta, I.A., Avouris, N. "Heuristic Evaluation of an Argumentation Tool used by Communities of Practice" (2009)
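Combining two heuristic sets in practice amounts to collecting findings against both sets and aggregating severity per heuristic. A hypothetical sketch of that bookkeeping; the heuristic names and severity ratings are invented for illustration (the 0-4 scale follows Nielsen's convention), not data from Study A:

```python
# Hypothetical merging of findings from two heuristic sets
# (single-user heuristics + groupware heuristics) into one severity report.

from collections import defaultdict
from statistics import mean

findings = [
    # (heuristic set, heuristic, severity 0-4)
    ("single-user", "Visibility of system status", 3),
    ("single-user", "Consistency and standards", 2),
    ("groupware", "Provide awareness of others' actions", 4),
    ("groupware", "Support verbal and gestural communication", 3),
    ("groupware", "Provide awareness of others' actions", 2),
]

def severity_report(findings):
    """Mean severity per heuristic, grouped by heuristic set."""
    by_heuristic = defaultdict(list)
    for hset, heuristic, severity in findings:
        by_heuristic[(hset, heuristic)].append(severity)
    return {key: mean(vals) for key, vals in by_heuristic.items()}

for (hset, heuristic), avg in sorted(severity_report(findings).items()):
    print(f"[{hset}] {heuristic}: mean severity {avg:.1f}")
```

Keeping the heuristic set in the key lets the evaluators see whether the dominant problems are interface-level or collaboration-level, which is exactly the distinction the study's results turn on.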
  12. Results of Study A
     • Various aspects need to be studied and evaluated separately for an overall assessment
     • Expert-based inspection methods need to be combined with user observation
     • We need to focus not only on the collaborative functionality but also on user interface design issues
     • Most of the issues observed in a collaborative session were due to flaws in the interface design rather than communication and awareness problems
  13. Study B
     • Qualitative study of synchronous collaboration for problem-solving, on the allocation of attention resources during different collaborative sessions [1]
     • The practice of three dyads was monitored with an eyetracker and analyzed
     • The dyads were formed in order to study different group dynamics
     • The logfiles of the collaborative activity were combined with the logfiles of the eyetracker to analyze the interplay between task, awareness mechanisms and collaborative practice
     [1] Chounta, I.A., Avouris, N. "Study of the effect of awareness on synchronous collaborative problem-solving" (2010)
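Combining activity logfiles with eyetracker logfiles typically means aligning two time-stamped streams. A minimal sketch of one common approach, attaching to each activity event the gaze sample nearest in time; the field names and the nearest-sample rule are assumptions for illustration, not details reported in the study:

```python
# Hypothetical alignment of collaborative-activity log entries with
# eyetracker samples by timestamp (nearest-sample rule).

from bisect import bisect_left

def align(events, gaze_samples):
    """Attach to each activity event the gaze sample closest in time.

    events: list of (timestamp_ms, actor, action), sorted by timestamp
    gaze_samples: list of (timestamp_ms, area_of_interest), sorted by timestamp
    """
    gaze_times = [t for t, _ in gaze_samples]
    aligned = []
    for t, actor, action in events:
        i = bisect_left(gaze_times, t)
        # pick whichever neighbouring sample is closer in time
        candidates = [j for j in (i - 1, i) if 0 <= j < len(gaze_samples)]
        j = min(candidates, key=lambda k: abs(gaze_times[k] - t))
        aligned.append((t, actor, action, gaze_samples[j][1]))
    return aligned

events = [(1000, "A", "add_node"), (2500, "B", "chat")]
gaze = [(900, "workspace"), (1800, "chat_panel"), (2600, "chat_panel")]
print(align(events, gaze))
```

Once aligned, each action carries the area of interest the actor was attending to, which is what makes questions about the interplay of task, awareness mechanisms and collaborative practice answerable from the data.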
  14. Results of Study B
     • awareness and communication failures are often interpreted as unwillingness to collaborate or a gradual loss of interest in the collaborative activity
     • the lack of adaptive awareness mechanisms that help users set priorities leads users to withdraw from the joint activity
     • partners remain visible in the common workspaces and are aware of the actions of their partners, but take no actual role in the collaborative activity
  15. Discussion
     • The complexity of the group activity setting makes it difficult to identify the source of observed problems
     • The outcome of collaborative activities relies on many factors, such as the quality of collaboration, the context of the activity and the tools that mediate the activity
     • There are indications that combinations of single-user evaluation methodologies and groupware-specific methods should be used
     • There is still a long way to go in establishing an evaluation framework for a wide range of collaborative applications
     • Open issue: how to include the context of use and the quality of the outcome in the evaluation process
  16. Thank you
      hci.edu.gr
      http://hci.ece.upatras.gr
