The document discusses lightweight and guerrilla usability testing methods for digital humanities projects. It outlines how to plan tests with minimal resources, including recruiting 1-3 participants and testing with paper or early prototypes. Tests can be run quickly in public spaces to get feedback. The key steps are deciding what to test, writing tasks and scenarios, conducting the test while taking notes, then prioritizing fixes. The goal is to improve designs through an iterative process of quick, low-cost testing with real users.
Slides from a workshop put on by UX Champaign Urbana (http://www.meetup.com/UXBookClubCU/) describing the process and benefits of usability testing. Workshops took place from October 12 to November 1, 2014, at the [co][lab]. For more information: http://usabilitypopuplab.com/
Content Strategy and Product Management (in science education), by Roger Hart
Presentation from Content Strategy Applied 2017
When your product is mostly content, product management looks a lot like content strategy. The Royal Society of Chemistry is an academic publisher, and a major provider of educational resources for schools and teachers. So that's certainly true here. Having worked in content strategy and product management, and now helping the RSC develop its product management function, I'll talk about how the disciplines interact.
We'll cover:
- What makes a good strategy, and what it means to be a product
- Innovation, roadmapping, and thinking about services
- Measurement and value when your goals are both charitable and commercial
This is a talk I gave to students of the Manukau Institute of Technology, focusing on key usability heuristics, and giving them tips on how to run their own user research or usability testing.
Slides from a 5/10/2017 talk at the Nasdaq Entrepreneurial Center (@theCenter) about a lean research mindset, the mechanics of learning from users, and the structure of a research prototype test session.
Remote Fieldwork: How observational studies elevated usability at AutoTrader.com, by Emily Schroeder
While traditional task-based usability research provides invaluable insights, sometimes expanding your practice to include additional methodologies allows usability to have greater influence in an organization. In this session, you will learn how adding remote observational studies enabled the team at AutoTrader.com to become more involved in projects from the beginning.
EdUI 2016: How to Implement Low-Tech, High-Impact Usability Testing, by Melissa Eggleston
You already know the value of usability testing. But how do you convince everyone else?
This mini-workshop will explain what has worked for facilitators Julie Grundy, Information Architect and UX Designer, Duke University, and Melissa Eggleston, Consultant.
We will help attendees gain confidence in their ability to bring usability testing into their organization—despite limited resources and time.
Our goal is to arm each attendee with a practical guerrilla testing approach and a feeling that they can beat down the bureaucracies of higher ed.
Participants will learn how to sell and conduct a usability test with minimal resources. They will also create a script in the workshop, customized to their institution.
No test script survives contact with the software.
That’s where scripted tests fail. Scripts rely heavily on assumptions, inhibit investigative work, and cost too much. Automating tests won’t cut it either; it may be efficient, but still won’t dive wide and deep where the problems lie.
This is where exploratory testing adds the most value; however it is still largely, albeit incorrectly, perceived as an undisciplined, ineffective test technique.
In this talk, I discussed why exploratory testing works better than scripted tests, what critical gap it addresses, and how to do it well.
Don't solve the wrong problem, RocketConf May 2016, by Ben Sauer
Zengenti and Clearleft recently partnered on a redesign, which began with user research that defined the trajectory of the whole project. In this talk you’ll learn about why user research should never be treated as a ’nice to have’, how the team did it, and how to get research to stick in peoples' minds.
Day 2 slides from a two-day workshop on UX Foundations by Meg Kurdziolek and Karen Tang. Day 2 covered research methods that can be used throughout the design process to evaluate and validate design.
The Europeana Newspapers Project held a workshop in Amsterdam in September 2013. This presentation from Channa Veldhuijsen of the National Library of the Netherlands explains some principles of usability testing for historic newspapers presented online.
The Art of APPlication: Using Apps to Engage Students as Collaborators, Creat..., by sewilkie
How often do you leave a workshop brimming with ideas and anxious to put them into action? Following our first session, Apps Task-onomy, we will dig even further as we investigate ways to implement recommended apps into YOUR practice. Join this "make session", where participants will create lessons and app-tivities for immediate use in their class(es).
Please provide a link back to our BalancEdTech wiki if you use part/all of our resources: http://balancedtech.wikispaces.com/BLC13+-+The+Art+of+APPlication
What is Lean UX? Come get introduced to the topic of Lean UX and learn the fundamentals of this approach, and how it is revolutionizing the field of UX with UserTesting. Discover how constant iterating through cycles and learning from each cycle can create products which can overcome business challenges and meet customer needs, while saving big bucks, resources, and time.
We will cover the basic principles of Lean UX, and how UserTesting fits into this model of research.
These slides were used during a workshop on Usability Testing - an intro.
We covered the following topics:
1) What is user experience?
2) Why is usability important?
3) How do we evaluate usability?
Introduction to usability and usability testing as a discipline, followed by how to do guerrilla usability testing. Presented at Duke Tech Expo April 13, 2018 with co-author Lauren Hirsh, with content from a prior collaborative presentation of hers.
Remote usability testing and remote user research for usability, by User Vision
From User Vision's presentation on remote usability testing describing some of the main methods, challenges, tools and tips for successful remote usability testing for user experience
Slides from a session at the American Alliance of Museums 2014 annual meeting, "Tech Tutorial: User Testing on a Shoestring (Beginners)."
Session presenters:
Christina DePaolo
Dana Mitroff Silvers
Charlotte Sexton
http://www.aam-us.org/events/annual-meeting/program/sessions-and-events?ID=2353
User testing is a fantastic method to discover problems. But why is it such a great user research method? How to make sure you recruit the right participants? How to write the right questions and tasks for your usability test? And what is your job as a moderator? This slide deck answers all your questions on usability testing!
User Experience Basics for Product Management, by Roger Hart
User Experience (UX) has matured as a discipline and radically changed how products are delivered. It touches workflows, usability, customer needs, and of course visual design and UI. Product managers can't ignore it, even if they want to... and if they want to, they're probably wrong. The tools of User Experience can help us get closer to our customers and differentiate our products.
2 hours training on Mobile UX with Farah Nuraini, Interaction Designer at Traveloka, Indonesia
45 min theory: Research, Analysis, Design solutions and Testing
+ 1h15 min of hands-on exercises with the 5 facilitators from Traveloka.
Have you ever wanted to test your website or web app with users, but haven’t been sure which are the best tools and techniques to use to ensure you get the most out of it? Chris Bush walks you through the core concepts of guerrilla usability testing, shows how you can use it in your own projects and shares some of his favourite ideas, tips and tricks for making most of your users’ time.
This slideshow covers:
+ Making your users comfortable and setting up a room for testing;
+ Testing on laptops and devices;
+ Tips and tricks for capturing your notes that can save you hours of analysis time;
+ Bonus section: Testing out early concept work with users.
This was a 4-hour workshop that was given at World Usability Day Colombia. #wudco14
Summary:
Now more than ever, it's survival of the easiest. Whether the product is a website or a handheld device, success depends largely on how easy it is to use. Usability testing is one of the most effective methods for creating an intuitive product. By observing actual people as they use the product, you can get valuable insight into whether your design is easy to use. Attendees will learn how to conduct a usability test with end users of a product. This workshop is highly interactive and includes several hands-on exercises to give participants practical experience.
You will learn:
- How to plan a usability testing study
- How to define the goals and objectives
- Explore options (moderated vs. unmoderated, remote vs. in-person)
- How to recruit the right participants
- How to create tasks (Interview-based vs. predefined tasks)
- How to moderate a usability test
- How to analyze and report the results
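The final step in the list above, analysing and reporting the results, can be sketched in code. This is a minimal illustration only, not part of the workshop; the session data, task names and field names are all hypothetical.

```python
# Sketch: summarising usability test results per task.
# The session data below is hypothetical, for illustration only.
sessions = [
    {"task": "find contact info", "completed": True,  "seconds": 42},
    {"task": "find contact info", "completed": True,  "seconds": 95},
    {"task": "find contact info", "completed": False, "seconds": 180},
    {"task": "check parking",     "completed": True,  "seconds": 30},
]

def summarise(sessions):
    """Group sessions by task; report completion rate and mean time on task."""
    totals = {}
    for s in sessions:
        stats = totals.setdefault(s["task"], {"done": 0, "total": 0, "time": 0})
        stats["total"] += 1
        stats["time"] += s["seconds"]
        if s["completed"]:
            stats["done"] += 1
    return {
        task: {
            "completion_rate": stats["done"] / stats["total"],
            "mean_seconds": stats["time"] / stats["total"],
        }
        for task, stats in totals.items()
    }

results = summarise(sessions)
```

With the hypothetical data above, "find contact info" has a 2/3 completion rate, which would flag it for closer review.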
Advocating for usability: When, why, and how to improve user experiences, by Sarah Joy Arnold
This webinar will include a short description of why user experience matters, an exploration of when, why and how to do surveys, interviews and usability testing, and conclude with a discussion on how to be an advocate for users at your library.
How to Effectively Lead Focus Groups: Presented at ProductTank Toronto, by Tremis Skeete
Topic: How to Effectively Lead Focus Groups
Tremis Skeete, NexTier Innovations
Talking to users can be a challenge and running a focus group is one of those tasks which most Product Managers would say is essential in getting real insights. Whether you want to test your user group's response to a new product or changes to features within an existing product, as a product person you need to have a creative set of analytical skills and strategies for how to steer the group toward productive discussions. In this presentation, Tremis will discuss how focus groups can truly work well for you, and how you can organize, coordinate, and effectively lead focus group sessions.
User Experience Design Fundamentals - Part 2: Talking with Users, by Laura B
#2 in a 3-part series on UX Fundamentals: Talking with Users
Understand why you should talk to users to uncover, validate and/or understand their goals.
Learn how and when to talk with your users:
User research methods
Planning
Best practices for interviews
Rethink research, illuminate history with the British Library, by Mia
Join Dr Mia Ridge, Digital Curator for Western Heritage Collections at the British Library, to discover how research and technology can create a richer picture of our past. Living with Machines is a collaborative project between the Alan Turing Institute, universities and the British Library – home to the world’s most comprehensive research collection. Together, they are using data science and digital history methods to analyse millions of historical documents and understand the impact of mechanisation in the 19th century. Their initial approach has focused on specific regions like Yorkshire that will help tell us the story of industrialisation in Britain.
The 'Living with machines' project is a collaboration between the British Library and the Alan Turing Institute for Data Science and Artificial Intelligence. This presentation introduces the project and highlights some early explorations and work.
Festival of Maintenance talk: Apps, microsites and collections online: innova..., by Mia
Talk for the Festival of Maintenance in Liverpool https://festivalofmaintenance.org.uk/ My talk notes http://www.openobjects.org.uk/2019/09/festival-of-maintenance-talk-apps-microsites-and-collections-online-innovation-and-maintenance-in-digital-cultural-heritage/
Hopes, dreams and reality: crowdsourcing and the democratisation of knowledge..., by Mia
Crowdsourcing projects have generated millions of data points through volunteer contributions of classifications, tags and other information about cultural heritage and scientific collections. However, to what extent have crowdsourcing and citizen science projects democratised knowledge about the past within 'official' collections and knowledge management systems? And how would infrastructures and policies in cultural heritage organisations need to change to allow deeper integration with knowledge captured through citizen science projects?
Infrastructural Tensions: Infrastructure, Implementation, Policies
The event is a collaboration between Digital Humanities Uppsala, Uppsala University Library, the Department of Archives, Museums and Libraries (ALM), and Uppsala Forum on Democracy, Peace and Justice.
In search of the sweet spot: infrastructure at the intersection of cultural h..., by Mia
A short paper for a panel on 'Data Science & Digital Humanities: new collaborations, new opportunities and new complexities' at Digital Humanities 2019, Utrecht.
Living with Machines at The Past, Present and Future of Digital Scholarship w..., by Mia
Short paper on the Living with Machines project for a panel at the Digital Humanities 2019 conference in Utrecht, Netherlands. Living with Machines is a research project using data science with historical sources and questions at scale to rethink the impact of technology on the lives of ordinary 19thC people
Enabling digital scholarship through staff training: the British Library's ex..., by Mia
A talk at the DH Lab at the University of Exeter in February 2019.
The British Library's Digital Scholarship Training Programme provides colleagues with the space and support to
develop the necessary skills and knowledge to support emerging areas of modern scholarship. Their familiarity with the foundational concepts, methods and tools of digital scholarship in turn helps promote a spirit of innovation and creativity, encouraging digital initiatives within the Library and with external partners. Finally, the programme of events helps nourish and sustain an internal digital scholarship community of interest/practice.
In this talk, Digital Curator Dr. Mia Ridge will share some of the lessons the team have learnt about delivering Digital Scholarship training in a library environment since it began several years ago, and some of the challenges they still face.
A modest proposal: crowdsourcing in cultural heritage benefits us all, by Mia
Projects like In the Spotlight http://playbills.libcrowds.com encourage people to pay close attention to historic playbills while transcribing text to help make them more discoverable. Crowdsourcing cultural heritage tasks can create new relationships between cultural organisations and the public, while creating moments of curiosity that help people understand the past and present. Isn't it time you tried crowdsourcing?
A provocation for the British Library Labs 'Building Library Labs around the world' event, with folk from national, state and university libraries with existing or planned digital 'Labs-style' teams.
Crowdsourcing at the British Library: lessons learnt and future directions, by Mia
Digital Humanities Congress, University of Sheffield, September 2018.
The British Library has been experimenting with crowdsourcing since it launched the Georeferencer (http://www.bl.uk/georeferencer/) in 2012. It launched an updated platform for crowdsourcing in late 2017. Currently the platform supports two projects, In the Spotlight (http://playbills.libcrowds.com/, transcribing information from the Library's historic collection of theatre playbills) and Convert-a-Card (https://www.libcrowds.com/collection/convertacard, converting printed card catalogues into digital records).
This presentation will provide a case study of the implementation of this crowdsourcing platform, considering how the design of behind-the-scenes processes such as metadata workflow, and visible outputs such as the user experience and conversations with participants, were informed by lessons learnt from past projects. The platform is integrated with new Library infrastructure that publishes images in IIIF (International Image Interoperability Framework, http://iiif.io/about/) and has pioneered the use of web annotations for crowdsourced data.
It will discuss how and why In the Spotlight was designed with a balance between productivity (the number of tasks completed) with enjoyment and opportunities for engagement (whether discussing interesting playbills on the forum or social media, or investigating aspects of theatre history) in mind. It will also look at the integration of crowdsourced data into the Library's catalogues, and how the project has changed in response to requests and feedback from participants.
The presentation will include a progress update on the project, and discuss how we applied best practices like usability testing and Europeana's Impact model (https://pro.europeana.eu/what-we-do/impact). It will finish with a preview of future plans for the platform, including the ability for library staff to build their own projects with digitised collections in compatible formats. Reducing the technical overhead for launching a pilot project could be immensely valuable - but how will we ensure that anyone starting a project understands that crowdsourcing is more about people than it is about technology?
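As a rough illustration of the IIIF integration mentioned above, the IIIF Image API addresses images with a fixed URL pattern, {identifier}/{region}/{size}/{rotation}/{quality}.{format}. The base URL and identifier in this sketch are placeholders, not real British Library endpoints.

```python
# Sketch: constructing a IIIF Image API URL. The base URL and identifier
# below are placeholders; the path segments follow the Image API pattern
# {region}/{size}/{rotation}/{quality}.{format}.
def iiif_image_url(base, identifier, region="full", size="max",
                   rotation=0, quality="default", fmt="jpg"):
    return f"{base}/{identifier}/{region}/{size}/{rotation}/{quality}.{fmt}"

url = iiif_image_url("https://iiif.example.org/image", "playbill-001")
# -> https://iiif.example.org/image/playbill-001/full/max/0/default.jpg
```

Because every IIIF-compliant server follows this pattern, the same client code can request crops, thumbnails or rotations from any participating institution.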
Crowdsourcing 'In the Spotlight' at the British Library, by Mia
Presentation for Discovery/Participation Panel: User Generated & Institutional Data Transcription projects at EuropeanaTech https://pro.europeana.eu/page/europeanatech-2018-programme
A talk for the CILIP MMIT group at their 'The wisdom of the crowd? Crowdsourcing for information professionals' event, Heritage Quay, University of Huddersfield, March 2018
Museums+Tech conference 2017: Museums and tech in a divided world, Imperial War Museum London
Friday November 3 2017
http://museumscomputergroup.org.uk/events/museumstech-2017/
Historical thinking in crowdsourcing and citizen history projects, by Mia
The TL;DR version: repeated exposure and active attention to primary materials can develop some historical skills; more learning happens through observing and participating in discussion.
Presentation for Creating Historical Knowledge Socially: New Approaches, Opportunities and Epistemological Implications of Undertaking Research with Citizen Scholars
Washington DC, October 2017
Abstract: This 20-minute presentation examines the extent to which crowdsourcing and 'citizen history' projects and discussion platforms enable and encourage the practice of historical thinking. It takes the definitions of historical thinking set out by scholars and institutional bodies and the American Historical Association's 'core competencies' for students in history courses and degree programs as cues for an extensive trace-ethnographic analysis of participant discourse on crowdsourcing and digital community history platforms. This analysis found evidence for the development of historical thinking, situated learning and collective knowledge creation through participation in online communities of practice. Crowdsourcing project forums support many of the behaviours considered typical of communities of practice, including problem solving, requests for information, seeking the experience of past behaviours, coordinating actions, documenting shared knowledge and experiences, and discussing developments. This paper draws on research undertaken for my 2015 PhD, Making digital history: The impact of digitality on public participation and scholarly practices in historical research, in which I explored the ways in which some crowdsourcing projects encourage deeper engagement with history or science, and the role of communities of practice in citizen history.
Cross-sector collaboration for digital museum and library projects, by Mia
I provide some examples of cross-sector collaboration from the UK, and include some examples of different models for international collaboration. Invited presentation for the Chinese Association of Museums, Taipei, Taiwan, August 2017
Connected heritage: How should Cultural Institutions Open and Connect Data? By Mia
Keynote for the International Digital Culture Forum 2017, Taichung, Taiwan, August 2017
I approach the question by describing the mechanisms organisations have used to open and connect data, then I look at some of the positive outcomes that resulted from their actions. This is not a technical talk about different acronyms, it's about connecting people to our shared heritage.
Wish upon a star: making crowdsourcing in cultural heritage a reality, by Mia
Keynote for the Digikult 2017 conference. The success of crowdsourcing projects that have transcribed, categorised, linked and researched millions of cultural heritage and scientific records has inspired others to try it in their own organisations. We can look to 'star' projects for ideas, but what is it really like to run a crowdsourcing project?
For Beyond the Black Box, University of Edinburgh, February 2017
As the datasets used by humanists become ever larger and more readily accessible, the ability to render and interpret overwhelmingly large amounts of information in graphically literate ways has become an increasingly important part of the researcher’s skillset. In this workshop, participants will be introduced to the core principles of scholarly data visualisation and shown how to use a variety of visualisation tools.
Visualisations may sound like the opposite of a black box, as they display the data provided. However, aside from 'truthiness' of things on a screen, lots of invisible algorithmic decisions affect what appears on the screen. Data used in visualisations is increasingly generated algorithmically rather than manually. What choices is software making for you, and whose world view do they reflect? Algorithms are choices - if you can't read the source code or access the learned model, how can you understand them?
Lightweight and ‘guerrilla’ usability testing for digital humanities projects
1. Lightweight and ‘guerrilla’
usability testing for digital
humanities projects
Digital Humanities at Oxford Summer School
DHOxSS 2014
Mia Ridge, Open University
@mia_out
2. Overview
• Usability testing in context
• How to plan lightweight usability tests
• How to run lightweight usability tests
• Putting it into practice: live example
• Finding out more
3. Knowing me, knowing you
https://twitter.com/dhoxss/status/488631220159676416/photo/1
7. How do you know which designs work?
https://www.flickr.com/photos/rooreynolds/11979470446
Test them!
8. What is usability?
The quality of a user's experience when interacting with a product or system
•Learnability
•Efficiency
•Memorability
•Errors
•Satisfaction
9. What is usability testing?
Watching people try to use your prototype/site/app/product in order to:
• make it easier to use
• measure how easy it is to use
10. Lightweight usability testing
• Bare minimum:
– one participant similar to your target audience,
– one facilitator,
– one laptop, tablet, mobile or paper prototype,
– one task and scenario,
– ten minutes.
• Review notes and debrief, prioritise fixes.
• Repeat 3-7 times for each type of user.
11. Guerrilla usability testing
• Recruit in cafes, libraries, queues, train stations
• Test whatever you can in the time
• Be nice, move fast, don't get in anyone's way
14. Ideal testing vs ‘guerrilla’ testing
Ideal testing:
• Carefully recruit participants as close as possible to the target audiences’ demographics, motivations and expectations
• Test in contexts of use as close as possible to the real situation, e.g. when and where used
• Test until you get no new data
‘Guerrilla’ testing:
• Test with any accessible group of people
• Test wherever you can find people
• Test 3-5 people
15. Things you can test
• Paper or PowerPoint prototypes
• Clickable PDF or HTML wireframes
• Alpha or beta sites
• Similar sites (e.g. competitor sites, projects with similar materials)
21. When and why to do lightweight testing?
• Any testing is always better than no testing
• See your product with fresh eyes
• Address usability concerns
• Test early and often
• Bonus: reminds you why you’re doing the project
23. Planning usability testing...
Work out what to test,
choose tasks that’ll test that,
write scenarios to provide some context,
recruit participants,
pilot the test (updating it if needed) and do ‘pre-flight’ checks,
debrief, prioritise and report.
25. Designing tasks
• What do you want the user to do to help answer your question?
– e.g. find contact information for DHOxSS organisers; find out whether parking is available
• You can also reality-check early designs or content with a ‘first impressions’ task
26. Writing scenarios
• Give participants a brief story that provides context and parameters for the task
– e.g. you have a nut allergy and want to make sure any caterers have that information
• Make them flexible where possible
• Outline what has to be done, not how
27. Activity: deciding what to test
• Suggest one of your projects, or a site you find difficult (or hate) to use
• Suggest some questions about that site
– e.g. learnability, efficiency, memorability, satisfaction or error rate/severity
– You can also test the ‘critical path’ (key task for that site)
28. Activity: tasks and scenarios
• Think of one task for each usability question
– Think about what you’re measuring, how we’ll use the results
– Think about test logistics and how the participant might feel about doing the task
• Write a scenario for one of your tasks
– What context does the participant need?
– Can they adapt it?
– Are there any fixed parameters?
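One lightweight way to keep the output of this activity organised is a small table pairing each usability question with its task and scenario. The sketch below is illustrative only; the questions, tasks and scenarios are hypothetical examples in the spirit of the slides.

```python
# Sketch: pairing each usability question with one task and one scenario,
# following the structure of the activity. All text is illustrative.
test_plan = [
    {
        "question": "Can new visitors find contact information? (learnability)",
        "task": "Find contact information for the organisers.",
        "scenario": ("You have a nut allergy and want to make sure "
                     "the caterers have that information."),
    },
    {
        "question": "How quickly can users check practical details? (efficiency)",
        "task": "Find out whether parking is available.",
        "scenario": "You are driving to the venue and need somewhere to park.",
    },
]

for item in test_plan:
    # Sanity check: every usability question has both a task and a scenario.
    assert item["task"] and item["scenario"]
```

Keeping the mapping explicit makes it easy to spot a question with no task, or a task with no scenario, before the first participant arrives.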
29. Recruiting participants
• Be hospitable!
• Krug: 'recruit loosely and grade on a curve'
• 3 participants is enough for one round of testing
• Recruitment takes time (and energy)
• Reward appropriately
https://twitter.com/mattkohl/status/488919059594219520/photo/1
31. Activity: looking for participants
• Think of three ways to recruit suitable participants
• How could you reward them?
32. Preparing test scripts
• Ensures consistency and fairness
• Download sample scripts and forms at http://www.sensible.com/downloads-rsme.html
• Always pilot your script, update if necessary
33. Steve Krug’s sample script
Hi, ___________. My name is ___________, and I’m going to be walking you through this session today.
Before we begin, I have some information for you, and I’m going to read it to make sure that I cover everything.
You probably already have a good idea of why we asked you here, but let me go over it again briefly. We’re asking people to try using a Web site that we’re working on so we can see whether it works as intended. The session should take about an hour.
The first thing I want to make clear right away is that we’re testing the site, not you. You can’t do anything wrong here. In fact, this is probably the one place today where you don’t have to worry about making mistakes.
As you use the site, I’m going to ask you as much as possible to try to think out loud: to say what you’re looking at, what you’re trying to do, and what you’re thinking. This will be a big help to us.
Also, please don’t worry that you’re going to hurt our feelings. We’re doing this to improve the site, so we need to hear your honest reactions.
If you have any questions as we go along, just ask them. I may not be able to answer them right away, since we’re interested in how people do when they don’t have someone sitting next to them to help. But if you still have any questions when we’re done I’ll try to answer them then.
And if you need to take a break at any point, just let me know.
34. Pre- and post-test questions
• Ease into the test
• Provide information to interpret session results
• Get a sense of overall satisfaction with the site
• Always pilot your questions, update if necessary
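One common post-test instrument for getting a sense of overall satisfaction is the System Usability Scale (SUS). The slides don't name SUS, so using it here is an assumption; the sketch below follows the standard SUS scoring formula.

```python
# Sketch: scoring the System Usability Scale (SUS).
# SUS has ten statements answered 1 (strongly disagree) to 5 (strongly agree);
# odd-numbered items are positively worded, even-numbered negatively worded.
def sus_score(responses):
    """Return a 0-100 SUS score from a list of ten 1-5 responses."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs exactly ten responses in the range 1-5")
    total = 0
    for i, r in enumerate(responses):
        # Items 1, 3, 5... (index 0, 2, 4...) contribute (r - 1);
        # items 2, 4, 6... contribute (5 - r).
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5  # raw 0-40 total scaled to 0-100

# Hypothetical answers from a fairly satisfied participant:
score = sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1])  # 85.0
```

A single SUS number is most useful for comparing rounds of testing on the same site, not as an absolute verdict.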
35. Recording tests
• Scribble quotes, errors on post-it notes
• Have a scribe take notes
• Screen capture
• Video and/or audio
• Photographs to document the test scenario
• Always explain and get consent in advance
36. Reporting tests
• Focus on agreeing and prioritising fixes
• If you have to report more formally, illustrate with juicy quotes and key moments
• Involve stakeholders in the process if it'll help convince them
38. Running usability tests
• You will need...
– Testable prototypes, internet connection, logins
– Facilitator (i.e. you) and participants
– Scenarios and introduction script
– Print-outs of consent forms
– Something to take notes on
• Optional:
– Note-taker/scribe/meet-and-greeter
– Video/audio recorder
– Screen recording/sharing software
39. Running usability tests
• Look after your participants
• Pre-flight checks
• During a test…
• After the tests
40. Pre-flight checks
• Pilot the entire test at least once
• Check the computer/prototype, network, logins
• Test voice, screen recorders
• Clear browser history and previous test data
• Save any URLs, shortcuts to desktop
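Parts of the pre-flight checklist can be scripted. This sketch only covers one automatable check, confirming that each test URL responds before a session; the approach is an assumption, not from the slides, and any URLs passed in are placeholders.

```python
# Sketch: the automatable part of a pre-flight check - confirm that each
# prototype URL responds before the participant arrives.
from urllib.request import urlopen
from urllib.error import URLError

def preflight(urls, timeout=5):
    """Return a dict mapping each URL to True if it responded OK, else False."""
    status = {}
    for url in urls:
        try:
            with urlopen(url, timeout=timeout) as response:
                status[url] = response.status < 400
        except (URLError, ValueError):
            # HTTPError (4xx/5xx) is a subclass of URLError; malformed URLs
            # raise ValueError. Either way, flag the URL for a manual check.
            status[url] = False
    return status

# Example usage with a placeholder URL:
# preflight(["https://example.com/prototype"])
```

Run this a few minutes before each session; logins, recorders and browser history still need a human check.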
41. Optional 'First impressions' task
• Bring up the home page/start screen
• Ask the participant to think aloud:
– what they think the site is/does
– who it's for
– what content or functionality they think is available
42. During the test
• Meet and greet
• During scenarios:
– watch and listen
– note key points
– probe if questions arise after a task
– look after participants
• Thank the participant, note the most important issues, reset devices
44. Finding out more
• Usability 101: Introduction to Usability
http://www.useit.com/alertbox/20030825.html
• Steve Krug’s ‘Rocket Surgery Made Easy’ book
http://www.sensible.com/rsme.html
• US Government usability site
http://usability.gov/
Peter Morville notes that in order for there to be a meaningful and valuable user experience, information must be:
Useful: Your content should be original and fulfill a need
Usable: Site must be easy to use
Desirable: Image, identity, brand, and other design elements are used to evoke emotion and appreciation
Findable: Content needs to be navigable and locatable onsite and offsite
Accessible: Content needs to be accessible to people with disabilities
Credible: Users must trust and believe what you tell them