Towards Task Analysis Tool Support 
Suzanne Kieffer1 
Nikolaos Batalas2 Panos Markopoulos2 
1Université catholique de Louvain 
Louvain School of Management 
Louvain-la-Neuve, Belgium 
2Eindhoven University of Technology 
Industrial Design 
Eindhoven, The Netherlands
Task Analysis 
- User goals, tasks and work environment 
- User errors, breakdowns in the task and workarounds
Task Analysis 
Usability Goals Setting 
Work Reengineering 
User Interface Design 
Other 
Usability Engineering Tasks
Data collection 
- Face-to-face interaction 
- User observation 
- Note taking 
- Audio/video recording and transcribing 
→ Task Analysis remains resource intensive
Room for improvement 
Analyst efficiency 
Analyst workload 
User time and effort 
In situ data collection 
Ambulatory Assessment methods
Ambulatory Assessment (AmA) 
Purpose: to assess the ongoing behaviour, knowledge and experience of people during task execution in their natural setting 
Examples: experience sampling, repeated-entry diaries, ecological momentary assessment, acquisition of ambient signals
To what extent can AmA methods support in situ data collection during task analysis procedures?
Method 
1. Task model hypothesis 
   - Analysis of procedures and artefacts 
   - Setting of questions and experimental design 
2. Tool-supported in situ data collection 
   - Users: expertise and responsibility 
   - Tasks: frequency, criticality and complexity 
   - Problems and errors 
3. Contextual observations/interviews
Case study
Hot-Dip Galvanizing on Continuous Lines
Step 1 – Task model hypothesis
Artefact: paper checklist
Setting of questions 
Q1. Please indicate your degree of familiarity with this task 
Q2. How frequently is this task executed? 
Q3. Please indicate when it was last executed 
Q4. Please indicate when it will next be executed 
Q5. Please select all the possible contexts where it takes place 
Q6. Why does it have to be executed? 
Q7. Please indicate a means to facilitate or improve this task 
Q8. Please give an example of a possible problem during its execution 
Q9. Please give an example of an error committed during its execution 
Q10. Please select from the list all the participants in this task 
Q11. Please indicate who asks for its execution 
Q12. Please indicate to whom the result is communicated
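These twelve questions correspond to the data-collection dimensions of step 2 of the Method (users, tasks, problems and errors). The grouping below is an interpretation of that mapping, not stated question-by-question on the slides, sketched as a small Python structure:

```python
# Grouping of the 12 questions by task-model dimension.
# The dimensions come from the Method slide (users, tasks, problems/errors);
# the question-by-question assignment is an interpretation.
QUESTION_DIMENSIONS = {
    "users":         ["Q1"],                                 # familiarity/expertise
    "tasks":         ["Q2", "Q3", "Q4", "Q5", "Q6", "Q7"],   # frequency, timing, context, rationale
    "problems":      ["Q8", "Q9"],                           # problems and errors
    "collaboration": ["Q10", "Q11", "Q12"],                  # participants and communication flow
}

# Sanity check: every question Q1..Q12 is covered exactly once.
covered = sorted(
    (q for qs in QUESTION_DIMENSIONS.values() for q in qs),
    key=lambda q: int(q[1:]),
)
assert covered == [f"Q{i}" for i in range(1, 13)]
```

A structure like this makes it easy to cross-tabulate responses against the task model hypothesis during analysis.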
Experimental setup 
30 items · 4 key users · 3 shifts · 12 questions · 12 participants 
29 items x 12 questions + 1 question 
4200 questions → 350 questions per participant 
9 days → 40 questions a day per participant
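The figures above follow from a short calculation; the slide's 350 and 40 appear rounded, and treating the "+ 1 question" as covering the 30th item is an assumption:

```python
# Worked check of the sampling load; the slide rounds 349 to ~350
# and 38.8 to ~40. Treating the "+ 1 question" as covering the
# 30th item is an assumption.
items = 29
questions_per_item = 12
extra_questions = 1
participants = 12
days = 9

per_participant = items * questions_per_item + extra_questions  # 349, ~350
total = per_participant * participants                          # 4188, ~4200
per_day = per_participant / days                                # ~38.8, ~40

print(per_participant, total, round(per_day, 1))  # 349 4188 38.8
```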
Step 2 – In situ data collection
TEMPEST 
1. Prepare your material (questions and protocol) 
2. Program sequences of questions 
3. Create participants 
4. Fire questions 
5. Analyze answers
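The five steps can be pictured with the following minimal in-memory sketch. TEMPEST itself is a web-based platform; every class and method name here (`Study`, `fire`, `record`, ...) is hypothetical, not TEMPEST's actual API:

```python
# Minimal in-memory sketch of the five-step workflow.
# All names are hypothetical; TEMPEST's real interface is web-based.
from dataclasses import dataclass, field

@dataclass
class Question:
    qid: str
    text: str

@dataclass
class Study:
    questions: list = field(default_factory=list)     # steps 1-2: material and sequence
    participants: list = field(default_factory=list)  # step 3: participants
    answers: dict = field(default_factory=dict)       # step 5: collected responses

    def fire(self):
        # Step 4: push the programmed sequence to every participant.
        for p in self.participants:
            self.answers[p] = {q.qid: None for q in self.questions}

    def record(self, participant, qid, answer):
        self.answers[participant][qid] = answer

    def completion_rate(self):
        # Step 5: a first analysis pass over the collected answers.
        cells = [a for per_p in self.answers.values() for a in per_p.values()]
        return sum(a is not None for a in cells) / len(cells)

study = Study()
study.questions.append(Question("Q1", "How frequently is this task executed?"))
study.participants.extend(["operator-A", "operator-B"])
study.fire()
study.record("operator-A", "Q1", "once per shift")
print(study.completion_rate())  # 0.5
```

The point of the sketch is the separation of concerns: authoring, scheduling, participant management, delivery and analysis are distinct steps, which is what lets the analyst monitor and adapt the protocol remotely.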
Step 3 – Contextual 
observations/interviews
Observations/interviews 
- Key functions 
  - Team leader 
  - Mobile operator 
  - Bath operator 
- Key aspects 
  - Communication flow 
  - Countdown of items 
  - Intra-team collaboration 
  - Problems or errors
Results & discussion
Challenge 
- Unfriendly work environment 
- Complex work organization 
  - Collaborative 
  - Distributed in space and time 
  - Rotating shifts
With vs. without tool support 

|                      | With TEMPEST | Without TEMPEST |
|----------------------|--------------|-----------------|
| Analyst's efficiency | Increased productivity; increased accuracy | Limited productivity; risk of mistakes |
| Analyst's workload   | Automated & remote; safe & comfortable; structured process | Manual & face-to-face; difficult & tedious; unstructured process |
| User's time & effort | 38 hours overall in 9 days; 20 minutes a day per user | 36 hours overall (estimated); 3 hours per user (estimated) |
| Questions            | Timely, with snooze option; rather unintrusive | Disruptive; intrusive |
| Answers              | Complete results | Fragmented results |
Requirements 
- Supporting tools 
  - Analyst configurability 
  - Real-time monitoring and traceability of responses 
  - On-the-fly adaptation of the sampling protocol 
  - Data collection across platforms (responsiveness) 
- Task model hypothesis 
  - Guidelines for analysts 
  - Mapping with the sampling protocol 
  - Mapping with the responses
Take away 
- Task Analysis Tool Support (TATS) 
- Method and TEMPEST 
- Feasibility and cost-efficiency of TATS 
- Requirements for conducting TATS
Thank you! 
Contact details 
suzanne.kieffer@uclouvain.be 
n.batalas@tue.nl 
p.markopoulos@tue.nl 
TEMPEST survey 
http://goo.gl/DTgdqC
Definition of the key users
Divergences
Convergences 
- Reasons to execute a task (Q6): instructions, cleanliness and quality 
- Means to improve the tasks (Q7): automation, better care of the zinc bath and new equipment 
- Problems (Q8): technical problems and accidents 
- Errors (Q9): related to manipulation of the zinc bath and lack of time
“The questions interfered with my schedule” 
- Satisfaction questionnaire, 5-point Likert scale 
  - Shift A = 3.50, equally split between “neutral” and “agree” 
  - Shift B = 2.67 
  - Shift C = 2.25 
- Most participants (10/12) thought they had answered between 15 and 30 questions a day, while they actually answered about 40
