Event ID: 1516472
Event Started: 3/23/2010 6:30:00 PM
Please stand by for real-time captioning.
>> Good afternoon, everybody. This is Darryl Diamond with Web Manager University.
It's 2:25, Eastern Time, on Tuesday the 23rd; we will begin in about five minutes. If
you have technical difficulties you can contact us at 202-208-2688 or
webmanageruniversity@GSA.gov. We will begin in about five minutes. Thank you very much.
>> We are about two minutes out from the start of today's Webinar. Contact us via
e-mail at webmanageruniversity@GSA.gov or call 202-208-2688. We will begin in
just a few minutes. Thank you very much.
>> This is Darryl Diamond, with Web Manager University. We will start in one or two
minutes. If you have technical difficulties please contact us at 202-208-0668 or
webmanageruniversity@GSA.gov. We will start in just a minute. Thank you very much.
>> Good afternoon, everyone. I am the special projects manager. We want to thank you
for being here for today's Webinar, Optimize Your Top Tasks With Remote Usability
Testing. Call Darryl at 202-208-0668 if you are having technical difficulty.
>> If you have a question, please type it into the chat area of the Webinar. We
will take questions at the end, so during the Webinar, please type them in.
There's a small double arrow that will minimize that panel. Today's Webinar will be recorded,
and you will receive a link to download the recording, transcript, and the electronic
version of the presentation in an e-mail following the Webinar. Before we begin I would like to
introduce Nicole Burton, usability specialist on best practices here at GSA.
>> Nicole Burton: It's wonderful to be here with you, and it is my pleasure to
introduce Cliff Tyllick this afternoon. He is an active member of the user
experience subcouncil and a usability specialist for the Texas Commission on
Environmental Quality. He's been there 14 years. He helped form the Texas
Commission on Environmental Quality's accessibility coordination group a couple of
years ago and leads agency-wide training on how to do usability and accessibility at
his agency. He recently taught a training on how to make PowerPoint presentations
accessible. In addition to being a dog enthusiast, which you will hear more about later, he's a
mushroom enthusiast, and recently picked 108 morel mushrooms. It's my
pleasure to introduce him to speak on how to optimize your top tasks with remote usability testing.
>> Cliff Tyllick: It's a pleasure. Today we will describe a highly effective way
to optimize your top tasks using remote usability testing. I should be on the next slide already. We
will get this -- let's see -- there we go. I guess I just ruined this slide's
suspense, but people often seem skeptical that remote usability testing can really
work. Here at the Texas Commission on Environmental Quality, we have done enough of
this kind of testing to realize its value. We plan to do even more in the future
and I would like to encourage you to do the same for your agency.
>> Before we go further, though, I would like to know more about you. Alicia, do we
have a quick poll for them?
>> Yes, just give me one second.
>> Okay, folks, do you have access to Webinar software such as GoToWebinar, WebEx,
or that sort of thing?
>> We will close that poll in just another minute.
>> It looks like 65% of the respondents said yes, 20% said no, and 15% said they didn't know.
>> That's good to know.
>> Second question: Have you ever done remote usability testing? Yes, no, or I
don't know. Please answer now.
>> It looks like 9% of the respondents said yes, they have done remote usability
testing. 88% said no, and 3% said I don't know.
>> One more question. Have you ever had a phone conversation with a customer while
they were on your website? Yes, no, or I don't know. Go ahead and answer.
>> 83% of the respondents said yes, 17% said no and there were no I don't knows.
>> That's good to know, at least the people knew what the person on the other end of
the phone was doing. That's good. Well, I am glad to hear we are getting in touch
with a lot of people who haven't done this at all. I would like to point out, those
of you who have been on the phone with a customer, that actually is a form of remote
usability testing. I won't go into it too far now, we will explore it a little
later, but let's start out, then, getting to the questions that people ask all the
time. Can I really do it with three to five people? Where should I find these
people? And how do I set up and run the test?
>> We will start with whether three to five participants can tell you anything.
What you need to keep in mind is that with three to five tests you can identify
widespread problems. Jeff Sauro posted a great article on this -- that's what you need to
remember, and we have a link in the slide presentation.
>> You certainly won't find every problem, and there's no absolute guarantee you will
find every severe problem, but you'll find a lot of them. The more customers a problem
affects, the more likely it is you will run across that problem with at least one of
your participants. So with this type of testing we stay on alert for two kinds of
problems: the ones that absolutely break the site, and the ones that might not be
that severe but seem to give trouble to everyone who tries to do a certain task --
the task you are testing.
>> So, three to five is enough. For a problem that affects every customer, you are
going to need just one test. And for a problem that affects one out of three of your
customers, you will have about a five-out-of-six chance -- roughly 85% likely -- of
finding it by the time you run five tests.
>> So let's go on. Where will we find these three to five people? The short answer
is anywhere. The longer answer is just about anywhere. The test we are about to
show you is one example. In the past we got our participants from e-mailed or
phoned-in complaints. I would have the software running, and take notes. If you
have walk-in customers, you can talk with them, meet people at conferences, check
with them. In this case I met the participant, Walker Santiago through Austin Boxer
Rescue. Austin Boxer Rescue tries to find new homes for boxers, usually after folks
find out what is cute in a puppy isn't so cute in a 60-pound dog. When I was
walking our own dog I found an eight to 10-month old boxer. We tried finding the
owner, after a few months had no luck. We pretty well figured out somebody must
have dumped her, and Posey, as we named her, is really sweet, but the dog we already
had wasn't eager to have a sister. They were fine on their own, but together they
were a terror, at least for my wife. So we contacted Austin Boxer Rescue. A couple
of months later Christmas was coming, and we still had Posey, and travel plans, and
no place to kennel Posey. It turned out one of the volunteers with Austin Boxer
Rescue, Erin, agreed she and her husband would take care of Posey for us while we
were out of town. I gave them a business card for contact information. When her
husband, Walker, usually just there hanging around at Boxer Rescue events, saw where
I worked, he got really excited. You see, Walker works with a company that has to
get permits from us and make sure the companies they work with have valid
permits from us. Walker and I talked, and before too long I was asking if he would
be interested in doing this kind of test. He really jumped at the opportunity; he
was thrilled to be asked. Last week, when we got around to setting up this
particular test -- checking with his people at work about whether his computer had what
it takes to participate, whether there would be problems with the firewall, that type of
thing -- his boss got wind of it and was even more thrilled than he was. He
wholeheartedly backed Walker's participation in this test and in future
tests. I mention this because the question of whether we have to pay participants comes
up. Although that's probably a good idea, we don't have a budget to do that. We
can't. So we rely on people like Walker. It turns out we really can get them.
>> You just need to make sure that the people you get are typical people who would
do the task you are testing. You don't want someone who works at your agency,
unless you are testing a feature on your own intranet. You don't want an intern,
unless you are testing a task only interns would do, or would be among the people who will
do it. You want people who would actually have to do the task you are testing.
Notice I said "would have to do," not "have to do." Thinking of the website as a
place they would go -- it isn't necessary for them to know the task can be done online. For
example, maybe your state allows people to renew drivers licenses online. Not
everybody would think of doing that. But if you are looking for people to do a
usability test on your site where you allow them to renew a license online, anybody
who drives in your state would be a valid participant.
>> Okay, so we have the participants. How are we going to do this test? The short
answer is however you can. If you have Go to Meeting, Live Meeting, some other
virtual meeting application, that's going to be great. Live Meeting is a Microsoft
product; GoToMeeting is from Citrix. These products have made remote testing
simpler and a lot more powerful because, as I was doing the remote test with
Walker, I didn't have to jot much down; the software was recording everywhere his
mouse went. It was recording every click. We even got audio added to the recording.
>> I think, not absolutely sure, but I think if you have the bandwidth you can even
add a webcam and capture facial expressions. That's beyond the scope of what we
will demonstrate here. Nicole, is it right that all federal agencies have access to GoToMeeting?
>> Nicole: No, I don't think everyone has GoToMeeting. A lot of agencies have
GoToMeeting, GoToWebinar, or Sametime -- so not everyone, but we could see from the polls
that many agencies have them.
>> At least 65% have it, so that's good. If you don't have either of these
applications and can't get them, you can always do what I started out doing, do it
over the phone. You won't be able to record what they are doing on screen without
special software, but you will still be able to find and document most of your most
troublesome problems.
>> The way I got into this was answering complaints that came into our webmaster
e-mail. If people were complaining about some feature of the website, I e-mailed
them, or called if they gave me the number, to find out what it's like to experience
our site from their perspective. They really appreciated that in most cases. They
at least respected it. About half the time, I would say, going through what really
was an over-the-phone usability test with them turned anger or resentment into
gratitude. We at least got to civility.
>> The only drawback is on the phone you have to take a lot of notes, but you can
use screen shot software. You're following along on your computer, doing on your
website exactly what they tell you they are doing on their end of the connection.
So you can get a screen shot. I use FireShot with Firefox, for example, to make a
quick screen capture, add comments, and continue with the test. It's not hard to get
good documentation even with over-the-phone tests.
>> Okay. So we know what can work, the next thing is to find a task, and you have
to define task pretty narrowly. This is Nicole's working definition. And it works
for me too. An action you can start, complete, and then go to lunch.
>> Find, complete, and submit a W-4 withholding form. That could be a task. If
that can be done online -- I'm not sure it can, but if it could, maybe on your intranet --
that could be a usability test. Get a 1040 and its instructions; register to
attend the next Webinar; file a complaint; pay a fee. Your top tasks depend
on your customers and website. If you haven't read or heard the transcript of Gerry
McGovern's Webinar, check it out. If you need help figuring out the top tasks, you
can't do better than getting Gerry's advice.
>> Testing doesn't have to be expensive, doesn't take lots of time, and any
well-designed testing is better than no testing at all. By well-designed, the main
things to remember, you need a good task, the right participants, the right ways of
measuring what they do, the right metrics, and a way of observing unobtrusively.
You know, when you are working with the participant with the Internet or phone
connection between you, it's a lot easier to be unobtrusive, even if you just have
to push the mute button. One last thing to remember is: keep it simple. Don't overcomplicate it.
>> This might be a good place to stop for questions. So, Nicole? Would you like to take over?
>> Just to remind people, if you have questions, do please enter them into the chat
box in the Webinar controls on your screen. Alicia, do we have questions yet?
>> Yes, I think we do.
>> Okay. No, we don't.
>> Well, one thing people tend to ask is whether there is a risk of people seeing confidential
information on the participant's computer. Y'all are seeing my slides, right? You
didn't see a game of FreeCell pop up right at the beginning when I was being
introduced, did you?
>> Good. This is the same interface. I can choose what you see from my computer.
When you allow a participant to use the same software -- and by the way, we had
absolutely no problems with the firewall at Walker's business or here at the TCEQ --
you just download a free client. The people who are setting up the meetings are the
ones that have to have the software; that will be you. The participant downloads
the free client, you turn control over to them, and you're only going to see what
they decide to show you on their screen.
>> Our top task was pretty simple. Report a smoking vehicle. You can do that
online in Texas. Could Walker find the right web page? (Should I click this and
show people the page? We have links in the slides; I will try to show it to
you later.) Can he find that page? Once there, can he find the right link to the
right form? Can he complete it in a certain time period? I picked three minutes;
maybe five is better. Not sure. One thing that's important to me: can he tell
whether he's succeeded? If you just complete a form and don't get anything that lets
you know you failed, you might think you have done everything right. If he fails,
what's the cause?
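Cliff's success criteria and time goal could be captured as a simple checklist before the session. A hypothetical sketch -- the structure and field names here are illustrative, not from TCEQ's actual test plan:

```python
# Illustrative test plan for one top task; all names are hypothetical.
test_plan = {
    "task": "Report a smoking vehicle online",
    "time_limit_minutes": 3,  # Cliff picked 3; he notes 5 might be better
    "success_criteria": [
        "Finds the right web page",
        "Finds the right link to the right form",
        "Completes the form within the time limit",
        "Can tell whether the report succeeded",
    ],
}

# Print a blank checklist to fill in while observing the participant.
for criterion in test_plan["success_criteria"]:
    print(f"[ ] {criterion}")
```

Writing the criteria down before the test keeps the observer focused on the task being measured instead of everything else the participant happens to do.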
>> I need to say a little about the task. We don't have millions of vehicles with
bad exhaust systems in Texas, but anybody who sees a smoking vehicle can report it.
We have that set up because we know that a vehicle that puts out a lot of smoke -- even
one of those -- produces more pollution than many well-tuned cars. This is a way to get
an alert. We send a letter to the registered owner of the car, letting them know
the car was observed smoking and giving tips on what they can do about it. I picked
this task because I felt it would demonstrate useful points for this Webinar.
>> Let's see how Walker did. I need to give control to Nicole.
>> Nicole: Just one second, we will get this going.
>> Let me see if there are any questions. There is a question: Have you ever
validated the effectiveness of this remote testing method to more classical methods
in a controlled environment?
>> Cliff Tyllick: No, and I don't think that's really necessary. You use the two
in tandem. You know you're not going to pick up, as I mentioned -- you are not
going to catch every problem; but if you do catch problems, you can do something
about them. If you catch problems this way, they are big problems. At least in the
number of people they affect. The validity of the problem comes in more with what
you do with the information and sometimes what you do with the information is decide
not to do anything until you can gather more information. If you have the
resources, maybe after you have done a few of these tests you might feel like you
have the experience to do a more controlled test with a larger number of
participants, to check out that one problem. Then you don't wind up investing all
those resources, all those precious resources, for that well-controlled test on
something you could have found out by watching one person try to use your site over the phone.
>> Does that make sense?
>> Yes. Great, thank you. I think we are ready for the video now.
>> With the help of Walker Santiago, we're going to be testing whether people can
report smoking vehicles to the state agency in Texas responsible for air quality.
Walker, before we begin, I would like to point out to you, we are not testing you,
we are testing our website. Anything you find, anything you can't find, is not your
shortcoming, it's our website's short-coming, and the information you help uncover
will help us make our website better. So let's just go through the normal mouse
clicks, as you would, and do the things that make sense to you. We will see how
well our website works for you.
>> Santiago: Should I go ahead and search for TCEQ as if I am a person who has never
been to the website?
>> Cliff: Let me ask you first, to be sure you are a customer for whom this will be
important: do you drive in the Austin area?
>> I do.
>> Do you often see vehicles that are smoking excessively and wish you could report them?
>> I do, yes.
>> If we hadn't contacted you about this test, would you know the TCEQ is the place
to report them?
>> No.
>> Would you know you could report them?
>> No, I would not have known. I thought the police were pretty much the only people
who were enforcers, or would be on the lookout to report them themselves, I guess. I
would have thought the Department of Public Safety might have more jurisdiction over
smoking cars.
>> Cliff: So that helped us already. Let's imagine you have thought there's a way
to report it online. Let's start at a search page and search for a place to report
that you see a vehicle that is smoking excessively.
>> Okay, right now I am on the Google search engine, I will type smoking car, Austin
report, see what comes up. First one, TCQ does come up right there, with a phone
number and link to the website. I will click that.
>> Right away it gives information -- telling people in different counties to report at this
other website. Let's see, I scroll down: how reporting works. If it smokes more
than 10 seconds, it's a real problem; report within 30 days; English, Spanish --
great. Doesn't apply to diesel. To report a smoking vehicle you would want to
remember the license plate number, date, time, city, and I guess where in the city.
Within 30 days, and here it is -- report a smoking vehicle online. I will click that. I select
Austin. Let's put in 000000, so we know it's a test.
>> Today's date? Fine.
>> We will say I saw it at 12 o'clock, 30 minutes ago.
>> Location -- any place on your drive to work.
>> A good intersection. I will submit this. License plate number -- you have to put
an official license plate?
>> Cliff: Right.
>> Santiago: Of course. All right, should I put my own vehicle?
>> Cliff: Let's not do that. We could do that, but what I would like you to do -- you
found the form, you filled it out -- let's imagine you are convinced that's the
correct license number as you remember it. How will you contact us?
>> Santiago: If this isn't the right number -- one thing: there's no way for me to
get back to the previous page that had that phone number, which is the way I would
try to contact you guys with the information. I will go to the top right, where
there's another Google search engine, or home, back to the engine. Say I go to the
TCEQ website -- now I know it's TCEQ. I will just go to the search engine on the
right-hand side and type smoking car again. There it is again; the same search came
up. I click that, and I'm at the page I was at before. I had to go in a round-about
way, but I got back to it. There's a number right here. I can also fax in the
information, or even mail it in.
>> Cliff: That's very good. Do you feel that will accomplish the task of reporting
the smoking vehicle?
>> Santiago: Yes.
>> Cliff: Let me ask another couple of questions I should have asked up front. How
experienced are you with using computers? Is it something comfortable, or new to you?
>> It's my job; I am pretty comfortable.
>> Cliff: Okay. Good to know. What exactly is your job?
>> I am a civil engineering technician at a civil engineering company in Austin, so
we actually use the TCEQ website and other federal and state agency websites to do
our job. Some of these websites are a lot more familiar than others. I had never
reported a smoking car, so this is the first time I used it.
>> So this was new.
>> Let me have you do one more task for us -- we are on this web page, and let's
say you were motivated -- actually, first: did you find this experience satisfactory,
or was there something missing?
>> Santiago: I think it was great. When I was in the report mode, it didn't let me
report the license, even though I thought it was the right number, and there was no
way for me to -- even if I go to the upper left, I couldn't use the back-to-search,
or the back button in the browser, to get back to the original page. There was no
other link to that original page to get the phone number. Even providing the phone
number, or some of the information that is here about reporting a smoking vehicle,
on that web page would have been helpful. I was stuck on the web page, couldn't go
forward or back.
>> Cliff: That's very useful.
>> Nicole: Okay, back to Cliff.
>> Cliff: Let's see how Walker did. He did find the right page, did find the right
form, completed the form, but he couldn't submit the completed form and couldn't get
back to the page with the contact information. So he did quite well until I gave
him that bogus license plate number. I wonder if anyone noticed anything else about
what Walker came up with.
>> Let's see -- Nicole: Alicia, were there questions or observations?
>> Doesn't look like any just came in.
>> Cliff: Be thinking about that, and what I would like to know is how you think
our website did. I have my own observations on that, but what did you note? The
people watching this Webinar? What observations did you notice that are worth
noting in this usability test?
>> Folks, if you would like, enter your observations; we have a few. One of the
attendees said it was a round-about way to find the right form. The tester was
willing to do a lot of scrolling.
>> Cliff: Yeah, and on that one, I wonder how much of that was because there was
somebody at the other end of the phone. It's hard to know; the fact we have given
him a task makes him look more carefully than if he were doing it himself. But
let's go ahead.
>> He jumped to search when frustrated.
>> This attendee says: my observation is that you gave him the term smoking car;
would others use this term? It can mean different things to different people. With a
different term, things would be different.
>> Cliff: That's valid. In my defense, we did this recording when I came right out
of a meeting, and I had trouble getting online to begin with. I know I wasn't at the
top of my game, but that is a good catch. You don't want to put words in their mind
that you know are going to be the perfect term for them to find it. You want them
to come up with that term on their own. There was a point -- I'm drawing a blank --
where I asked him to do another task and tried hard not to give words to lead him.
You have to be careful about that in the design of your task. So it sounds like
everybody came up with what we came up with. Nicole and I thought the search engine
optimization was great, even though maybe I did cue him a little.
>> We have found that, in general, however we ask the question -- you see a car ahead
of you belching a lot of smoke; what are you going to call it? -- people were able to
find this page through the search engine. This test reproduced that. He followed a
round-about path to the form. Once he got to the landing page, he passed up two
links to the form. One says English, another says en Español as the link text; they
take you to the reporting form, but Walker didn't connect to that at all. That's a
good case where, if we see this pattern repeat itself over two to four more
customers, we probably would want to do a more focused study or, with that
information, look at the link text. It looks like it's not working. We might run a
few more tests first. It's a good idea to produce a quick report, like this one,
right at the end of a test. It even caught the big failure -- Walker mentioned it -- we
broke the back button. The neat thing about recording a test: you would be
surprised, maybe not, how hard it is to get programmers to understand that you don't
want to break the back button. When the user wants to go to the page they were just
on, unless they intentionally opened a new tab or window, they are expecting the
back button to take them there. Walker is a pretty savvy guy, he knows computers
quite well, he was in AutoCAD drawings just before we started this test, so he knows
a lot about computing, but even he couldn't deal with the fact the back button was
broken, didn't think about using the browser history.
>> So, moving forward to what we should report: I reported basically that. The SEO
looks great, and I made a couple of points about the back button. We need to recode
this form so the back button doesn't get overridden, or we at least need to add a
link into the form or into the error message that lets people get back -- if somebody
can't report through the form and gets some kind of error, display the phone number
so they can call if they want to. That's a serious problem. Even though it might not
affect every user -- we did catch it on the first person -- it will affect every person
who enters an invalid license plate number, and it's the kind of thing we should fix
right away, before we run another test.
>> One of these patches should be there before we do two to four more tests, if
at all possible.
>> The problem with the link text falls in the middle. It's one of the things
where we want to monitor the performance; we want to see if we need to get our
people writing web pages to attend that upcoming Webinar, Helping Your Site
Visitors Engage and Succeed. As I pointed out, sometimes you find problems
that will make you decide: this is something I need to check out with 25 people.
You will solve the big problems before you get to the ones that need close scrutiny.
>> Nicole, did you have a checklist that shows us how to do this? Set things up?
>> Nicole: Yes, I do. If you would go to the next slide -- it's as easy as one, two,
three. We will review what Cliff did. The first thing is we set up the test,
prepare for it. Then we run the test. Then we review and iterate, repeat the process.
>> Preparing for the test: the first step is to identify your three to five top tasks
and which of those tasks you are going to choose -- one task at a time, three to five
customers. We didn't go into at length how you identify your top tasks, but we have
done other Webinars on that, and as Cliff mentioned, the Webinar from Gerry McGovern
in January was fabulous. It's a free Webinar, in the archive; check it out.
There's a tool called the quick and easy customer profile, on WebContent.gov. I will
be doing a free demo Webinar on April 6 on how to fill out the quick and easy
customer profile. We will work with the Department of Justice and you will
actually see how they use web analytics and this customer profile tool to identify
their top customers and identify their top tasks. You have to get the tasks, then set a
goal for the test; Cliff identified that he wanted to find out what barriers there were
to reporting a smoking vehicle. It's good to add a time goal. It could be three to
five minutes, just so you know what a reasonable period of time is, beyond which, if
your participant goes, you have a problem. You recruit the three to five customers.
Cliff talked about wonderful ways to recruit customers, they are all around us.
You find a time slot and schedule it at their and your convenience. If you are going to
use Webinar software, you set it up; if you don't have Webinar software, that
doesn't matter. You arrange when to call them, and send the link if you are using a
Webinar. You think about return on investment when you set up the test. In other
words, sometimes fixing a task can have a powerful effect on the business function that is
connected to the task we are testing. It might lower help desk calls, worth $20 to
$30 per call, or lower the time the webmaster spends.
>> So after you have done the test, there is a way to roughly quantify the value the
test brought you.
>> Next, we prepared for the test. Now you are going to run the test. You heard
Cliff do it. You introduce yourself, you present the task, and in this case we
suggest people, if you are just starting out, do one task. The danger -- you may be
tempted to do more tasks, but the more tasks, the more data you have coming in.
Doesn't take long to get confusing. We recommend people start with three to five
customers, one task. You will ask them to do the task, you observe and listen. One
of the challenges when starting out is -- we managers and usability specialists are
helpful people. You don't want to be too helpful during a usability test. You want
to listen to the person and see where they go. As you can see from the recording,
you don't have to be perfect. Cliff came running in from a meeting, did this test,
might have done things a little differently, that's true for all of us. The
important thing is you do it. I have a friend who is a horticulturalist, she says
you are not a real gardener unless you kill things on a regular basis. The
important thing is you keep doing it, the more you do it, the more you learn.
>> There is a reference I recommend to people, I will have Darryl send this out with
the follow-up e-mail to people, it's a book I have on my book shelf called the
Handbook of Usability Testing. It's a really great handbook for doing usability
testing, reminding you of all these things and a few more.
>> I am not doing very well running these slides, am I?
>> You are.
>> I recommend you watch the recording as soon as possible, while it's still fresh.
You will be surprised -- I have been -- how much you lose if you allow too many
meetings or other projects between the test and watching the recording. Watch it as
soon as you can, while it's fresh.
Here we recommend going at your findings in a really simple, direct way. Identify a
few things that were barriers to success. A few things that worked, because every
website has something, usually, that works. That's very helpful when you are
talking to the developers and designers: give them a jam sandwich. Say, hey, the SEO
is great, but there are a few things we want you to work on with the form. Having the
things that work is good to know and communicate.
>> Then there might be a few other things that are possible improvements you can
make. Keeping it scoped nice and simple.
>> You will repeat this process with two to four additional customers already
scheduled and optimize this task. That's the point of this whole exercise, to make
the task easier to do. The low-hanging fruit, the things that are really clear will
make a difference. Go ahead and make those changes and move on to the next task.
Most people don't have a staff or an array of usability specialists to work endlessly on
small parts of their website. You have a lot to do, other hats you wear; optimize
top task number one and move on to top task number two. By the time you are
finished you can start again -- you'll have new top tasks. They do change over time.
>> Now, in terms of using the remote usability testing, use Webinar, telephone and
computer, as Cliff said, you can easily use the telephone and computer with
Internet. I have colleagues that used a new program called [indiscernible], they
liked it, it is free for minimal use, try it out. There are other ways. Go ahead,
if you would like to suggest them, type those into the chat box.
>> Here are the three take-aways. Test one top task with three to five customers;
set a time within the realm of what's reasonable to ask of a customer. Identify at
least one barrier, at least one success -- one thing that worked -- and one improvement
you would like to work on.
>> Optimize the task, move on to the second top task.
>> Cliff has a couple of wonderful resources here. They are great documents: Jeff's article and Gerry's Webinar. You are welcome to join us for the demo of the quick and easy customer profile in two weeks. In the April 8 course, she will use the same case study that you will be seeing in the Webinar demo, the Department of Justice, helping them identify their top tasks.
>> Do we have other questions?
>> Yes, there are lots of questions. It looks like, since we only have two more minutes left in the scheduled time frame, if folks want to stay on five to 10 minutes, we can get some of these answered. Otherwise they will be captured in the transcript; when we send that around, people can see them.
>> What sort of consent will you get from participants?
>> That's something I have been a little concerned about. We did this very informally, and we find people are willing to share. As we start moving toward recording more, we might need to be a little more formal about the consent. When the testing was just Cliff's notes jotted down during a phone conversation, that was one thing. I guess that's the take-home, I think.
>> Nicole: That's a really, really good question. This was quite informal, but as you can see, we used a clip of the session with Walker. That will get posted up on the web, so we probably need to go back and have him sign a plain-English consent form saying it's okay to post this on the open Internet.
>> When you are doing usability tests and capturing the likeness of someone in a video, it is very important to get a consent form. There are forms on usability.gov. That's a good point. It's a good thing to err on the side of being cautious.
>> Cliff: I am jotting a note right now. I will get a form from usability.gov.
>> Another question?
>> How do you estimate acceptable time on task?
>> I think it really depends on the task. Filling out an online job application should take longer than filling out the form to report a vehicle pouring out smoke.
>> It's common sense. If you are not sure, in a specific domain you are not familiar with, usually the subject matter experts in that particular domain can tell you what a reasonable amount of time should be. They deal with the customers on a regular daily or weekly basis. There's someone in your organization who is closer to the task who can give you pretty solid information. It can be a range, three to five minutes, five to 10 minutes; you don't have to narrow it down too much.
>> One main thing about having the recording: we can go back and see, to the second, how long it took Walker to find the form after he got on the page, and to complete the form once he got there. So even if we are way off with our guess up front, we can refine that after doing a few tests and seeing, with the stopwatch, where people actually doing the task start getting frustrated.
>> That leads to the next question: would it be better to use users that you know visit the site and provide a list of tasks for them to choose from?
>> Cliff: We have done that, similar to what we do at our conferences; we have an environmental trade fair once a year, and we did testing of our homepage that way. Our focus was: can you get to a task from our homepage and complete it? Since we were looking at redesigning the homepage, that approach worked really well. But, I have to say, that's the point at which you need to start looking at larger groups. Once you start letting the user pick the task, well, if you have five tasks and five users, and each picks a different task, you don't have three to five per task. We found in testing the homepage that way, when we tried to make changes too soon, after five or ten results, we started going around in circles a lot. We really needed data from 50 to 100 people before we could see a consistent pattern that helped us do a better job of redesigning the homepage. Focus on one task for this kind of approach. If you are broadening it out, looking at site-wide behavior, I think you are beyond the scope of this kind of test.
>> Okay, great. Anything to add, Nicole?
>> Nicole: No, I agree with that, and I think that is a very good approach. Here what we are looking at is the top tasks, a really small basket, one at a time. That's the way to go.
>> Here's a question about policy, basically. Wouldn't the script have to go through the OMB clearance process to comply with the PRA?
>> I am so glad somebody asked that question. No! I was waiting for it. The beauty of only testing with three to five users is that it is not subject to the OMB clearance process. You can do as many tests that are limited to three to five users as you want, without clearance.
>> What tips do you have for selecting top tasks?
>> Watch Gerry McGovern's Webinar. He's the master of identifying top tasks, and you can also come to the demo for the quick and easy customer profile. You want to look at some of your web analytics, especially search terms: the top 50 search terms and the top 50 most visited links. Then you kind of walk backwards from those artifacts, if you will, to piece together what your customers were doing when they came to your website, visited pages, and put in search terms, and you will see patterns that will lead you to what your top tasks are.
>> I recommend doing the quick and easy customer profile because it ties in with your web analytics. A lot of people have opinions and preferences; when you look at analytics, it tells you what people have actually done on your website.
>> Did you want to add something, Cliff?
>> Cliff: Yes, just that you might add to the analytics by looking at things pragmatically, forward-looking. What are the regulatory implications of certain tasks? Is it a very important task that people get done right? That might be more important to test than something that is done a lot but has no regulatory implications to it at all.
>> Right. It's definitely true that if there's a task where a failure means your agency head's picture is going to show up on the front page of the Washington Post or New York Times, that might be a pretty important task.
>> Right, or from the customer's perspective, if they fill out this form incorrectly, it puts them in a position where one of our investigators a couple of years later may ticket them and they may get fined. Think of tasks in terms of the implications for the customer of a failure that you can avoid by having a more usable form.
>> Great. You mentioned you can do these sessions with folks who call in with problems. Do you normally try to do the test when they call, or schedule a time with them? Which do you feel is more effective?
>> I think it would be better to call them back. Just off the top of my head, if I were to call in to complain about something and somebody were trying to enroll me, to get me to do a usability test right then, I think I would tend to feel used. I would need a little more time to cool down and come back, so have someone else contact me. I think it would come across as better customer service that way, and that is the way we happen to do it typically -- our agency is so large, I don't hear about complaints unless we go ask the people who answer the phones, have you gotten anything about the website? We do get information that comes in to various e-mail boxes on the website, and there's a customer satisfaction survey we found out is almost impossible to find, but some people find it and fill it out. If it has to do with the web, I will contact them, and I have gotten very good test results that way. They made the complaint, and a few days later somebody they weren't expecting to hear from calls them back, interested in hearing why the site didn't work for them, so that feature can be fixed. That really does help people feel better about it. When I do that, I call, let them know we would like to do a test, and ask if now is a good time. Most of the time they say yes; if it's not a good time, we schedule another time to call and take 20 or 30 minutes at the very most.
>> Great. Thank you. There are a few more questions, but we have run about 11 minutes over, so I will gather the questions that haven't been answered and pass them along to you and Nicole, and we can post them with the slides and the transcript, if that's okay with the both of you.
>> Fine with me.
>> Before I pass it back to Nicole to wrap up, I want to thank you all for attending. Please fill out the evaluation to provide us with feedback about the Webinar today. Nicole will host another Webinar for Web Manager University, the quick and easy customer profile. Registration for the 2010 Government and New Media Conference opens on April 1. Be sure to reserve your space. Nicole?
>> Nicole: Thank you very much, Alicia and Darryl, for your help on this Webinar. I want to thank Cliff as well, and let everybody know that Cliff and I are both on the user experience sub-council, which meets on the second Tuesday of every month from 2:30 to 3:30. Sometimes we have Webinars and information sessions; sometimes we are working on projects and have regular organizational meetings. You can enroll in the group by going to forum.webcontent.gov and joining the user experience sub-council group, where you can find out more about usability practices, like remote testing. Thank you very much, everybody. We appreciate you coming on the Webinar. Thank you.
>> That concludes today's Webinar. Thank you. >
>> [event concluded]