Altered State of Consciousness: Confessions of Instructional Designers Who Conduct Usability Testing — Wachovia Human Resources – Michael Rukstelis & Yun Zhou
What’s in it for me?
Designer vs. User [four-panel cartoon: What the user really needs / What the designer thinks the user needs / How the design looks to the user / How the user thinks it is intended to work] Source: http://www.jroller.com/resources/behrangsa
Usability Testing
Our Case Study
The Challenge
You are in the driver's seat!
First Prototype
First Prototype Analysis
First Test Findings
Second Prototype
Second Prototype Test [diagram: Test Room and Observer Room separated by a one-way mirror; the user and facilitator sit in the Test Room, observers watch from the Observer Room]
Second Test Findings
Implications: The Two Lenses
Third Prototype
Third Test Findings
What Have We Learned?
What Have We Learned?
What Have We Learned? Usability testing engages your true end user. It is a powerful practice that helps you design and fine-tune an e-learning program until it successfully supports learning.
Questions and Comments?
Thank You for Being Here!

Editor's Notes

  1. Michael: Welcome to our presentation: Altered State of Consciousness: Confessions of Instructional Designers Who Conduct Usability Testing. My name is Michael Rukstelis, and this is my colleague Yun Zhou. We are both instructional designers from the Learning Strategy Group at Wachovia Bank. During the past few years we have made usability testing part of our design process. Today we'd like to tell you a story about how our perception of usability testing evolved through the practice of it. In particular, this story reflects our own journey as instructional designers, which is why we titled this presentation "confessions." It is a progression from simple attempts to get feedback about the interface, such as the font, color, type size, general visual appeal, and basic navigation, to a more mature attempt to sort out the value and meaning of usability as it applies to instructional design. Now let's look at what you'll get out of this session and why you should stay.
  2. Michael: In this session, we'll look at a case study with three iterations of a web-based training program we built a year ago. We'll role-play and put you in the driver's seat as the user, so that you'll experience "live" tests in these different phases. By doing these very practical, down-to-earth activities, you will understand the key roles of usability testing and learn key skills you can easily apply to your own instructional design. We'll also share our test findings, reflections, and lessons learned, which you can also take away. To make it even easier, you'll receive our test tool template: a test plan documenting the entire process, from planning to executing, with all the instruments attached. We'll distribute this after the second test. Last but not least, you'll have the chance to participate in highly interactive discussions, share your feedback, and learn from each other, and together we'll make this a very engaging experience for all. Having said that, I hope you have a clearer picture of what you'll get out of this session today. Now let me turn to Yun to get us started.
  3. Yun: Before we dive into the case study, we want to examine our reality. Take a look at these four graphics and think about whether anything like this has ever happened in your world. Does anyone want to share an experience? I have a basketful myself. (Audience response) What we can see from these graphics and your experience is that our world is never ideal. For example, between the first graphic and the second, what the user really needs is often a) filtered and distorted through layers of communication, b) or maybe a needs assessment was not done with the right people, c) or maybe training objectives were based on previous training, and so on. And between the second and the third, an instructional designer often gets a) so preoccupied with the subject matter and learning objectives, to the exclusion of how that learning happens in the users' world, that you only see the trees, not the forest; b) opinions from subject matter experts or program sponsors sometimes reinforce this exclusion. So when the product is delivered, the user looks at it and tries to make sense of it. It's not exactly what they need, so they end up exerting a tremendous amount of effort to make it work, such as putting a sofa on a swing to bring it closer to what they originally needed. Although the graphics are just illustrative, the point is that a gap is almost unavoidable between the user's true need and what they actually receive. In your experience, what can we do to address this gap? (Audience response) You have pointed out many good practices. Today we're going to focus on usability testing as the practice that we find helpful in bridging this gap.
  4. Yun: Usability testing can be defined differently depending on the discipline and context. In our group, we generally apply the following criteria: 1) It engages the real end user, the person who will actually be using your training. 2) It tests not a full-fledged program but a representative portion of it during the early instructional design phase; in e-learning it could be a couple of screens with limited functionality. 3) It's an end-user try-out, or test drive, of this early design. 4) The purpose of such tests is to validate the design and to identify areas of design improvement. Now let's reflect for a moment on the role of usability testing in instructional design. Is this something optional in your world? How many of you are doing it? How are you doing it? (Audience response) Thank you for your responses. Now let me turn to Michael for our case study, and get ready to give our approach a "test drive."
  5. Michael: This case study is about the design of a two-hour WBT program that educates new managers to: 1) Identify situations where HR policies should be consulted or assistance should be engaged. 2) Locate, understand, and apply HR policies and assistance. Most policies are on our internal website, and there are also HR assistance teams such as our HR support center, HR advisors, etc. 3) In applying the policies, recognize and address the challenges they might face, such as the challenges in coaching and negotiating. 4) Master a problem-solving strategy that always encourages users to think through the situation, identify the main issue, assess risks and impacts, and conduct research to develop a solution to any situation. Even though we were able to summarize the learning objectives into these four brief bullets, there is a lot of information to absorb and a lot to learn in two hours. So here is the challenge:
  6. Michael: 1) How do you design a WBT that has a high degree of complexity, while making the learning program as meaningful and as easy to use as possible? 2) Next we are going to look at the first quick stab the designer took. We call this a "low-fidelity prototype": a quick, cost-effective way to pull together a web design and get feedback. In this case it's only a couple of screens that give you the look and feel of the interface, with limited functionality. We often turn this around in the very early design stage. 3) By the way, my role in this project was project manager, and Yun's role was usability testing lead. So what you'll look at next was not designed by either of us…
  7. Michael: Now let's have our first fun activity. We'll look at this prototype together, and we'll put you in the driver's seat as the users. Be yourself and react to what you see. Since we are doing this as a group and the prototype is not fully functional, Yun and I will be your drivers. We will facilitate by asking you some questions, and you tell us how you think this thing should work: what you would click, what path you would take if you were a real user using this program. We ask everyone to think out loud and share your thoughts; don't edit them. Be candid and critical. Don't be concerned at all about saying anything wrong, because this is a test of the system and how well it helps you learn, not of your individual knowledge or intelligence. When we test-drive this prototype in a second, Yun will capture your reactions on a flipchart to show you later how your experience compares to the designer's intention. We'll then also look at our usability test findings and the lessons we learned last year, and how they helped shape the next iteration. Are you ready? Next steps: View the first prototype. Ask the audience to take a minute to look at the interface. Restate the learning purpose. Ask the audience what they would do first and focus on expectations. Note: remember that the main emphasis here for learnability concerns the problem/solution methodology, while the interface concerns are visibility issues.
  8. Yun: The designer's intention is this: The user would read the scenario fully; in this case you need to scroll to read it all. After that, they would read the question in the Your Response box and click on Enter/Edit My Response. While developing this response, users should: a) consult the background of the team member involved, by clicking on Virtual Team to read their profile; b) click on Information to go to our internal HR policy website to consult the policies; c) click on Tools to utilize any checklist, quick reference guide, or job aid provided in the program to resolve the situation; d) click on Ask the Expert to see if there are any tips or best practices they can learn in developing or applying their solution. When they are satisfied with their response, they click the I'm Finished button. Their response then displays in the Your Response box, under the Submit My Response button. If they are not yet happy with their response, they click Enter/Edit My Response. If they are satisfied, they click Submit My Response. When they submit their response, some text displays in the gray area on the right. This is the solution or feedback recommended by the program. The user is supposed to compare this recommendation with their own response, and take the learning. Along the way: if they run into questions, they should click on Help. If they want to go to the next or previous screen, they use Next or Back. Menu takes them to the main menu where they can select another scenario. This is how the designer intends for things to work. You probably have your own judgment by now on whether there are any gaps between that and your way of approaching it. When we tested this prototype last year, we visited five end-user representatives in their cubicles and observed and probed with them, just as we did with you, except it was one-on-one. 1) First we found out about the users' general reactions: they did not like the color, and they thought the screen layout was cluttered and the space wasn't used wisely. 2) What we found more important, though, was the pattern of their behaviors. At the first screen, they took off in different directions, not knowing what was expected of them. Some people read the Your Response box first. Some read the scenario. Some glanced over the entire interface and clicked on Help or Ask the Expert. Some practically clicked on everything to see how it worked before they really worked on the scenario. So there was no sense of a clear path and sequence, and users couldn't tell what was going on just by looking at the interface. 3) When people clicked on I'm Finished after developing their answer, they considered it submitted and expected feedback. When the program didn't give them feedback, and displayed their response in a barely visible area, they were confused and experienced a loss of control. 4) We also observed that hardly anyone had a sense of when and how to use the buttons hidden in the lower right, such as Ask the Expert, Virtual Team, Information, and Tools. They seemed encrypted. And users had very little structure for developing their response to the question (what do you do to resolve the situation); they just typed a few words. Then, when they got very extensive feedback, the concept of "compare to learn" didn't make sense, because the difference was just too drastic. Although there were many problems, positive things came out of the test. We concluded from it why the design wasn't successful, and the direction of our improvement for the next iteration. Let me turn to Michael to share these with you.
  9. Michael: From the usability tests on this first prototype, we concluded that beyond the general reactions, such as dislike of the color, the volume of text, and the scrolling required to read, this design interface wasn't successful mainly because of the following design elements. I'm summarizing these concepts in language partially borrowed from Donald Norman's The Design of Everyday Things: 1) Visibility: by looking, the user can tell with very little thinking what the interface is for and how to interact with it. To put it another way, users know what is expected of them. In the example Yun just gave, users looking at this prototype's interface did not know what was going on and took off in different directions; that is an indication of poor visibility. Because the screen was overly complex and cluttered, and what users were supposed to do wasn't apparent, the problem-solving design was not visible. 2) Feedback: the user should receive full, continuous, and meaningful responses to their actions, which reinforce their sense of control. In this prototype, as Yun described, when our users clicked on I'm Finished, they expected feedback from the program. But the program did not consider the response submitted yet, and didn't give them the expected feedback. This confused the users and impaired their sense of control. 3) Predictability: users can reliably and consistently predict the outcome of their actions. In this prototype, because of the failures in visibility and feedback, it was very hard for the users to predict anything. Yun pointed out that users didn't know when and how to use the buttons in the lower right, such as Information, Tools, and Ask the Expert, and some users clicked on everything to figure out how the program worked before actually working on it. These are all indications of people failing to predict the program's behavior. So the first prototype didn't seem to be such a hit. As a result, (transition to next slide)
  10. Michael: We felt that the design wasn't executing many of the learning objectives, particularly the problem-solving methodology that encourages users to analyze, assess, and research to develop a solution to any given situation. We also found that the problem-solving methodology was more important than originally conceived. Because this methodology wasn't apparent in the design, users brought their own world view to the problems and developed responses based only on their personal experience. Because they didn't solve problems in a structured way, when they compared their responses with the program's feedback, the difference was too drastic for them to generate any meaningful learning. So making the problem-solving methodology apparent became critical, because it provided a structure that end users could use to think through and develop a solution that would meet the program's objectives. To remedy the first prototype, we decided to do the following: 1) Improve the visual appeal, including the layout, color, and the way text is displayed. 2) Simplify the interface; make it more visible what to do first, second, and third. 3) Accentuate the problem-solving strategy by making the learning sequence apparent. We incorporated these findings and quickly turned around a second prototype. Yun will share that with you in a second. But before we move on, do you have any questions on the first prototype? (Audience response)
  11. Yun: A second prototype was quickly designed to simplify the interface and accentuate the problem-solving model. Here we'll briefly repeat the activity we did on the first prototype, but in a more accelerated fashion. We'll 1) figure out what you as a user would do first, second, and third, and 2) how you think this design compares to the first one; then 3) I'll walk you through the designer's intention and point out the key improvements that were intended, and you can compare that against your perception. Michael will share our test findings after that. So now let me ask you first: what would you do when you come to this screen? (Audience response) Key improvements: Let's look at the key improvements and the design intention now. Obviously there is better use of screen space, color, and so on, but more importantly: 1) There are two main areas of navigation, clearly laid out. The tabs on top provide the tasks and the content that you need to work with in this scenario. The buttons in the lower right provide global navigation: exit, help, and access to the main menu. 2) The problem-solving model is broken down into piecemeal steps, including Situation, Issue, Goal, and Risks, and users are given feedback continuously along the way. I'll show this as I explain the design intentions. Design intention: (Click on Situation tab) Read the situation thoroughly. After that, the user goes from left to right, clicking the tabs in order to proceed. (Click on Issue tab) When you come to the Issue tab, recall the situation you read and type your answer to this question: What is the main issue/problem that you need to address in this situation? While developing a response, there is a virtual advisor function offering optional tips. Submit your response when finished. The program then displays your response and gives you the program's recommendation. The designer intends for you to compare these and learn before proceeding to the Goal tab. (Click on Goal and Risks tabs) The Goal tab and Risks tab work the same way. You will be asked questions to respond to, and you submit your response to compare. (Click on Resources tab) The designer intends for you to form a better understanding of the situation, issue, risks, and goal before you develop a solution. One more thing you need to do within the problem-solving model is research the HR policies that apply. When you come to the Resources tab, you will see your directions for conducting research. When users are done with their research, the designer intends for them to come back to this screen and continue by clicking Solution. (Click on Solution tab) After the policies are researched, the Solution tab is activated, and the designer intends for the user to develop a solution to the situation and then compare their solution with the program's recommendation. This works the same way as the other three tabs: Issue, Goal, Risks. (Click on Best Practices tab) The last tab is activated only after you have submitted a solution. After comparing solutions, you can learn about some of the best management practices associated with such situations. There will be a sub-menu of three to four optional questions/concerns that apply most to managers who encounter situations like the one they have worked through, and each question opens a new screen within the tab. When you reach this tab, the Next button lights up and you can click it to proceed to the next scenario. So does this design intention connect better with the way you would approach it? (Audience response)
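To make the gating behavior described above concrete, here is a minimal sketch of how a prototype like this might track tab state, with the Solution tab unlocking only after the Resources step and Best Practices only after a solution is submitted. This is purely illustrative; the identifiers (TabId, ScenarioTabs, completeStep) are our own inventions, not the actual course code.

```typescript
// Hypothetical sketch of the tab-gating logic described above.
// Tab names mirror the second prototype; all identifiers are invented.
type TabId = "Situation" | "Issue" | "Goal" | "Risks" | "Resources" | "Solution" | "BestPractices";

const tabOrder: TabId[] = ["Situation", "Issue", "Goal", "Risks", "Resources", "Solution", "BestPractices"];

class ScenarioTabs {
  // Index of the furthest tab the user has unlocked so far.
  private unlockedUpTo = 0;

  isEnabled(tab: TabId): boolean {
    return tabOrder.indexOf(tab) <= this.unlockedUpTo;
  }

  // Called when the user completes the current step,
  // e.g. submits a response or finishes research on Resources.
  completeStep(tab: TabId): void {
    const i = tabOrder.indexOf(tab);
    if (i === this.unlockedUpTo && i < tabOrder.length - 1) {
      this.unlockedUpTo = i + 1; // unlock the next tab in sequence
    }
  }
}

// Usage: the Solution tab stays disabled until Resources is done.
const tabs = new ScenarioTabs();
["Situation", "Issue", "Goal", "Risks", "Resources"].forEach(t => tabs.completeStep(t as TabId));
console.log(tabs.isEnabled("Solution"));      // true
console.log(tabs.isEnabled("BestPractices")); // false until a solution is submitted
```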
  12. Yun: The second prototype was an almost fully functional prototype, and we conducted much more systematic, formal usability tests with six user representatives identified by our client from various business units. The entire test process is documented in your handout, as are the instruments we used, so you can refer to that for details. Here is the layout of the test setup: The test is conducted one-on-one in a test lab, which includes a Test Room and an Observer Room separated by a one-way mirror. The facilitator and the user sit in the Test Room. Up to six people can sit in the Observer Room to see and hear what's happening in the Test Room. The computer monitors and audio equipment in the two rooms are connected, but the user can't hear what's going on in the Observer Room. Users are observed going through the prototype with no intervention except occasional probing from the facilitator. This is very important. You'll probably have noticed by now that in our activities we have never asked a closed-ended question such as "Do you like this color?" We always ask open-ended questions and let the user lead our way to see their reality. We never express our own opinion of the design, which is why we don't have the designer or project manager conduct the tests. Anyone who is too close to the subject is not going to be objective. And when you work with the users, the minute you open your mouth to comment on the design, you are shutting theirs. So observe, and probe only when needed: that is the key to conducting these tests. After the user goes through the prototype, I have them complete a questionnaire to rate various aspects of the course on a 1-5 scale. Then we debrief with additional open-ended questions to draw out the rest of the user's experience and reactions. The questionnaire and the debriefing guide are both in your handouts on page X.
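As a side note, tabulating a small 1-5 questionnaire like this is straightforward. The sketch below is our own illustration, not the team's actual tooling; the question labels and ratings are invented placeholders showing the shape of the calculation.

```typescript
// Hypothetical tabulation of 1-5 questionnaire ratings across six users.
// Keys are question labels; each array holds one rating per participant.
const ratings: Record<string, number[]> = {
  "Screen layout is intuitive": [4, 5, 4, 3, 5, 4], // placeholder data
  "I knew what to do next":     [3, 4, 4, 4, 3, 5], // placeholder data
};

for (const [question, scores] of Object.entries(ratings)) {
  const mean = scores.reduce((sum, s) => sum + s, 0) / scores.length;
  console.log(`${question}: mean ${mean.toFixed(2)} (n=${scores.length})`);
}
```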
  13. Michael: Here is what we found from the second prototype test: The reactions to the improved visual appeal and navigation were much more positive. The layout is intuitive, with the labels on top going from left to right and the global navigation in the lower right. Very clear. Most people understood that left to right indicates a sequence. However, the labels weren't so intuitive to some users. While the designer tried to break the problem-solving model down into piecemeal steps, with Situation, Issue, Risks, Goal, and so on, many people think of problem solving in a more holistic fashion and did not recognize a sequence in these labels. Also, it's a folder metaphor, and a set of folders, whether on the web or in real life, doesn't suggest a sequence either. Consequently, in the test we found: 1) People jumped around without realizing there was an intended problem-solving model. They viewed the tabs as unrelated to one another and challenged the intended sequence. 2) Because of such jumping around and exploring, progress through the scenario slowed down, negatively affecting the pace. There is one general HR policy and services overview module plus eight scenarios to be completed within the two hours, and in our test this single scenario was taking 30 minutes. 3) As it took the users longer to figure things out, they became distracted and exhausted, and were able to learn less. At this point of the project, Yun and I (transition to next slide)
  14. Michael: … came to the realization that when usability is applied to instructional design, it has two aspects: 1) Ease of use. This relates to the visual appeal, layout, navigation, and all the things we were able to take care of after the first prototype test. We did, as a result, make the prototype much easier to use after the first test. 2) Ease of learning. This relates to a deeper layer of the interface: the things we realized after the first prototype but had not yet been able to completely take care of in the second prototype design. For example, although the visual appeal and navigation were improved, users still encountered obstacles in mastering the problem-solving model. After the first prototype test, we recognized that a more structured problem-solving model was critical, so our conception of the interface's structure, layout, and process flow also changed. We tweaked the interface according to the conclusion that a problem-solving strategy would drive how the objectives, content, and activities were created. Then we recognized that even after clarifying the structure, another level of interaction needed to be thought through. This level was more narrowly defined by the users' perception of the interface. The issue is about how users think they should use the interface, as opposed to the intent of the design (i.e., using the tab folders in any order vs. in a required sequence). What happened next was more of an intervention by the designer. The test showed that the user's perception and tendency to explore the interface needed to be constrained in order for the intended instruction to take place successfully. What's important is that the difference between the two levels is now clear, and so is how the instructional designer behaves at each. At the first level, we behaved more like usability engineers (the screen is too cluttered, there's not enough feedback); at the second level, we were behaving as ID practitioners, making decisions about how a learning strategy would be structured and sequenced through the interface, according to the needs of the audience. Next, Yun will walk us through the third prototype to give you a sense of how we worked through the challenge at the level of learning. Before we do that, are there any questions on the second prototype? (Audience response)
  15. Yun: Taking what we had learned from the second prototype usability tests, we fine-tuned the design to turn around the third prototype. Now, it probably took you less than a second to figure out what is expected of you here. So can you tell me, or can I have a volunteer? (Audience response) There were a lot of minor refinements, but one of the major changes is that we relabeled the tabs as Steps 1 through 6 and added more action-oriented, goal-oriented instruction within each tab. For example, Step 1 is "Analyze the Situation." The instruction explicitly tells the user to read the situation and think about the issue and impact. This clearly directs users' actions, gives their reading more purpose, reduces guesswork, and enables them to focus on the content. In the second prototype, when the tabs were labeled Issue, Goal, and Risks, people challenged the sequence even though the layout indicated one, because the labels didn't intuitively suggest a sequence to them, and they didn't feel compelled to respect it. When the tabs were labeled Step 1, 2, 3, 4, 5, people no longer challenged the sequence, because numbered steps indicate such an intuitive order that users followed it without distraction. (transition to next slide)
  16. Yun: We conducted our third round of prototype usability tests quickly with another six users after making these changes. The test was done in a similar format to the second round, except that we targeted the key problem areas we had identified in the second round. We specifically timed how long users spent on each tab, and tracked whether they followed the suggested sequence and whether they were able to learn better through this structure. Our test findings show that people recognized the designer's intention much sooner and followed it with ease. They had a clear sense of direction, read with more purpose, had fewer distractions, and were able to focus better. The pace improved, and they were better able to achieve the program's learning objectives. So that is the story of our three design iterations of the case study, our three rounds of tests, and what we learned from them (transition to next slide).
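For readers curious how such time-on-tab measurements might be captured in a web prototype, here is a minimal sketch under our own assumptions; the TabTimer class and its methods are invented for illustration, not the team's actual instrumentation.

```typescript
// Hypothetical time-on-tab logger for a web-based prototype.
// Records how long the user spends on each tab before switching.
class TabTimer {
  private currentTab: string | null = null;
  private enteredAt = 0;
  readonly totals = new Map<string, number>(); // tab -> total ms

  switchTo(tab: string): void {
    const now = Date.now();
    if (this.currentTab !== null) {
      const elapsed = now - this.enteredAt;
      this.totals.set(this.currentTab, (this.totals.get(this.currentTab) ?? 0) + elapsed);
    }
    this.currentTab = tab;
    this.enteredAt = now;
  }

  report(): void {
    for (const [tab, ms] of this.totals) {
      console.log(`${tab}: ${(ms / 1000).toFixed(1)}s`);
    }
  }
}

// Usage: call switchTo from each tab's click handler, then report at the end.
const timer = new TabTimer();
timer.switchTo("Step 1: Analyze the Situation");
// ... user works ...
timer.switchTo("Step 2: Identify the Issue");
```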
  17. Michael: So now let's reflect for a moment on what we have learned from the first prototype to the third (transition to next slide).
  18. Michael: After looking at these three prototypes and hearing about our experience with the three usability tests, I hope you agree with Yun and me that in an e-learning program, usability testing is a big deal in bridging the gap between the designer's intention and the user's perception. Usability testing helps us increase the visibility of the design and decrease distraction. When people take training at work there is already more distraction than we want, and the e-learning system image should never become one more barrier to learning. We should design system images that speak a common language with the users, so that they only have to battle with the subject matter, not with us. And since we have different users speaking different languages, the only way for us to learn their perception is to work with them. This is why usability testing that engages your true end user is an extremely important and powerful practice to help us design the system image and fine-tune the e-learning program until it's successful.
  19. Michael : Read slide.