Intranet Usability Testing
Intranet Usability Testing: Presentation Transcript

  • Intranet Web Site Usability Testing. Justice, E-Government and the Internet Conference. By John Sorflaten, PhD, Certified Professional Ergonomist (CPE), Project Director. 26 June 2000. www.humanfactors.com, 800-242-4480
    • Why do usability testing?
    • Types of usability testing
    • How to test functions before designing your pages
    • How to test your navigation scheme
    • How to test detailed designs
    • How to do an expert review
    • Questions from the audience
  • Will Your Design Be Correct? Being smart, educated, and/or well informed does not mean your first guess will be right.
  • The Telephone “This ‘telephone’ has too many shortcomings to be seriously considered as a means of communication. The device is inherently of no value to us.” Western Union internal memo, 1876
  • The Wireless Music Box “The wireless music box has no imaginable commercial value. Who would pay for a message sent to nobody in particular?” David Sarnoff’s associates in response to his urgings for investment in the radio in the 1920s
  • Data Processing “I have traveled the length and breadth of this country and talked with the best people, and I can assure you that data processing is a fad that won't last out the year.” the editor in charge of business books for Prentice Hall, 1957
  • What Works for Developers May Not Be Viable for Users
  • Los Angeles Times Man Held in Fire at his Psychotherapist’s Home —Los Angeles Times
  • The Burlington Free Press Kicking Baby Considered to be Healthy — The Burlington (VT) Free Press
  • The New Jersey Herald Flier to Duplicate Miss Earhart’s Fatal Flight — The New Jersey Herald
  • Differences Between Developers and Users? _ T T F F S S E _
  • Developers Are Not Like Users
    • Different perspectives
    • Different mental set
    • Different training
    • Different language
  • Summary of Functional Preferences
    • Developers and user representatives:
      • Esoteric
      • Powerful
      • Complex
      • Error prone
      • System knowledge related
      • Impressive
    • Typical users:
      • Common
      • Simple
      • Practical
      • Sometimes wild
  • Testing. HFI test: In a test of a facility for printing tabs, only one test subject was able to complete the task properly. That test subject was a rocket scientist. Survey of 2,000 adults in Oregon: only 18% could find the time of departure of a bus on the published schedule. Why? Study in Florida: only 22% of elderly users could use an ATM correctly.
  • Why Test? Find the Crayon
  • Discover the User’s Experience http://www.webpagesthatsuck.com/
  • Discover Affordances. Slide callouts: the underlined words “Back” and “Fore” give a strong but false affordance; they can NOT be selected (NOT a button!).
    • Tell test participants “Circle the things you think you can click on”
    • Count the responses
    Affordance Test
    • Use representative users
    • Use representative tasks
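The counting step of the affordance test can be as simple as a tally across participants. A minimal sketch in Python; the element names and responses are invented for illustration:

```python
from collections import Counter

# What each participant circled as "clickable" (hypothetical data).
responses = [
    ["logo", "Back", "Search button"],
    ["Back", "Search button"],
    ["Search button"],
]

# Tally how many participants perceived each element as clickable.
votes = Counter(element for circled in responses for element in circled)
n = len(responses)
for element, count in votes.most_common():
    print(f"{element}: {count}/{n} participants perceived it as clickable")
```

Elements that real users rarely circle, but that the design intends to be clickable, are the affordance failures to fix first.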
  • Usability Testing Strategies
    • “Discount Testing”
    Feedback loops (from before coding to after coding):
    1. Open loop: decide on one plan and implement it
    2. Long loop: occasional feedback
    3. Tight loop: continual feedback
  • Usability Testing ROI (chart: investment in testing vs. usability return on investment, from High to Low, across High-Level Architecture, Detailed Design, GUI, and System Integration)
    • Questionnaire: value of functions
    • Group focus: task flow practicality
    • Performance test: high- level navigation
    • Expert review (heuristic evaluation)
    • Protocol simulation trial
    Typical Test Strategy for an Important Site (testing timeline)
  • Video of Detailed Design Test
    • Select subjects to match your market segments
    • Test the Web site—not the subject (tell them)
    • Please “think out loud” (to learn problems)
    • Give representative tasks (exercise site)
    • Have time limit goals (to establish task “failure”)
    • Revise time limit goals if subjects go over, but remain satisfied
    • Let subject continue trial and error, until you no longer learn
    • “One size fits all” instruction…
      • “Read out loud, talk out loud, and tell me what you are thinking”
    • Consider formal test for critical applications
      • tighter controls
      • task time measurement
      • precise error tracking
      • video record
    Protocol Simulation
  • Follow With Satisfaction Questionnaire
  • Consider Two-Stage Protocol Simulation 1. Test functional design, redesign, and then… 2. Test with complete graphics and thematic material. Functional Prototype (no graphics yet)
  • How Many Test Participants? 1 − (1 − P)^n = R, where P = probability of one subject catching an error, n = number of subjects, and R = reliability (overall chance of catching problems through testing)
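The formula can be turned into a quick participant-count calculator by solving for n. A minimal sketch in Python; the example value P = 0.31 is an assumption for illustration (a per-subject detection probability around 0.3 is often cited in the usability literature):

```python
import math

def reliability(p: float, n: int) -> float:
    """Overall chance of catching a given problem: R = 1 - (1 - P)^n."""
    return 1 - (1 - p) ** n

def subjects_needed(p: float, r: float) -> int:
    """Smallest number of subjects n such that 1 - (1 - P)^n >= R."""
    return math.ceil(math.log(1 - r) / math.log(1 - p))

# With P = 0.31, five subjects already catch roughly 84% of problems,
# and six are enough to reach an 85% reliability target.
print(round(reliability(0.31, 5), 3))   # 0.844
print(subjects_needed(0.31, 0.85))      # 6
```

The steep diminishing returns visible here are why small-sample "discount" tests are so cost-effective.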
  • Severe Problems Even With Few Subjects
    Test of High-Level Structure: Can users understand your site structure, concept, and navigational methods? (slide worksheet with numbered blanks 1–8)
  • Card Sort Test: Check “Mental Model”. Cards: Create a trip; Download current databases; Find a location on the map; Find a region on the map; Find an address on the map; Enter origin of a trip; Enter destination of trip; Enter stops along the way; Find the quickest route; Find the scenic route; Find the shortest route; One-way trip; Round trip; Trip duration; Dates of the trip; Locate gas stations; Locate food; Locate rest stops; Locate hotels; Locate camping; Locate special services; Zoom in, zoom out; Change map and list views; Clear the map; Print the map; Print the itinerary; Save the trip plan; Exit the application; Show services on map; Select services to show; List of services; Show itinerary; Show map; Set up traveler preferences; Create a trip budget; Show trip costs; Check weather; Check road construction; Make emergency call; Locate cool things nearby; Daily meal budget; Target price of gas; Distance willing to travel in a day; Hotel budget
  • Improving Your Menu Structures
    • Method: let users show their “mental model” by sorting cards with one option per card
    • Will users “jump” into the middle of the site often?
    • If so, keep high-level group names visible
    • Research says breadth is better than depth (but we have practical limits to top-level menus)
    • Aid comprehension of breadth with grouping of second-level menus (see next example)
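One common way to turn card-sort results into candidate menu groups is a co-occurrence count: pairs of cards that many participants place in the same pile belong together. A minimal sketch; the two participants' sorts are invented, using cards from the trip-planner example above:

```python
from collections import defaultdict
from itertools import combinations

# Each participant's sort: a list of piles, each pile a list of cards
# (hypothetical data).
sorts = [
    [["Enter origin of a trip", "Enter destination of trip"],
     ["Print the map", "Print the itinerary"]],
    [["Enter origin of a trip", "Enter destination of trip", "Round trip"],
     ["Print the map"], ["Print the itinerary"]],
]

# Count, for every pair of cards, how many participants piled them together.
cooccurrence = defaultdict(int)
for participant in sorts:
    for pile in participant:
        for a, b in combinations(sorted(pile), 2):
            cooccurrence[(a, b)] += 1

# Pairs grouped by more participants are stronger candidates for the
# same menu category.
for pair, count in sorted(cooccurrence.items(), key=lambda kv: -kv[1]):
    print(count, pair)
```

With a realistic number of participants, the same matrix can feed a hierarchical clustering step to suggest the whole menu tree.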
  • Use Meaningful Groups
  • Heuristic Evaluations
    • Best suited for advanced prototypes
    • One or more independent reviewers perform the evaluation and consolidate findings into formal document
    • Look for design issues that may impact product success
    • Use a set of design guidelines
    Cons: Does not involve real subjects. May not find problems in task organization that may be critical to overall success. Pros: Can be done quickly. Experts will identify major problems in layout and general presentation.
  • Heuristic Evaluations (continued) (source: http://www.useit.com/papers/heuristic/heuristic_list.html)
    • Visibility of system status: The system should always keep users informed about what is going on, through appropriate feedback within a reasonable amount of time.
    • Match between system and real world: The system should speak the users’ language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
    • User control and freedom: Users often choose system functions by mistake and will need a clearly marked “emergency exit” to leave the unwanted state without having to go through an extended dialog. Support undo and redo.
  • Heuristic Evaluations (continued)
    • Consistency and standards: Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
    • Error prevention: Even better than good error messages is a careful design which prevents a problem from occurring in the first place.
    • Recognition rather than recall: Make objects, actions, and options visible. The user should not have to remember information from one part of the dialog to another. Instructions for use of the system should be visible or easily retrievable whenever possible.
    • Flexibility and efficiency of use: Accelerators—unseen by the novice user—may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
  • Heuristic Evaluations (continued)
    • Aesthetic and minimalist design: Dialogs should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialog competes with the relevant units of information and diminishes their relative visibility.
    • Help users recognize, diagnose, and recover from errors: Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
    • Help and documentation: Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user’s task, list concrete steps to be carried out, and not be too large.
    • Reviews are difficult
    • We concentrate on the negative
    • Every review finds “obvious” problems (that are NOT really obvious)
    • Our focus is to ensure that Web sites:
    • Attract users
    • Capture their attention
    • Provide value
    • Encourage users to return
    • Important reiteration:
    • We are Web site usability experts
    • We are not necessarily representative of actual users
    EPA Case Study: Expert Review
  • New Initially, the new home page design seems creative and cool. It also allows immediate access to many underlying functions without numerous transactions or levels. Both things are good. However, the interface technique, being somewhat unconventional, creates a set of challenges. Of even greater concern, the underlying structure may not work well as a funnel to help users find their desired content within this far-reaching site. Current Within the existing Web site, inconsistencies in the high-level architectures of sub-sites, as well as inconsistent design details, require users to learn many different sites within www.epa.gov. The site has a wealth of important information, but users may not be able to find it all, or to find it within a reasonable amount of time. Recommendations to improve the usability of the site will be discussed. New Home Page
  • Current design has very little dynamic information to catch the user’s attention. New Home Page Only a few words of news Consider providing more hot news and ideas on the home page.
  • Current design does not give an immediate hook to draw people in to learn more. New Home Page Lots of things to select and view. Alternative: use an immediate hook that relates to most people. Learn about YOUR Environment! Consider featuring a catchy facility on the home page.
  • The example works. But it will be very hard to routinely fit a meaningful headline in the space provided. Also, you might need several headlines. New Home Page Consider enlarging the news headline area. Try to make the amount of space more flexible.
  • News and Logo are logically reversed. Tell the users where they are before giving news. New Home Page We are aware of the EPA logo placement standard. However, it might be better to use the top left (like most sites).
  • If “About EPA” is the top item, it is more conventional to default the top item to be selected. New Home Page Top left default as selected. Consider making “About EPA” default, or reorganizing page to put “by topic” first.
  • The major breakdown for the site structure is “Explore” vs. “Go to EPA:”. What is the difference between exploring and going? New Home Page The distinction seems very subtle. Consider a primary site structure based on more concrete categories.
  • The connections between the first and second-level buttons have a lot of visual noise. New Home Page Consider a cleaner graphic connection like this…
  • About EPA section. Sequence is unclear. Grouping is not shown. New Home Page Consider more logical sequence and grouping. Avoid acronyms. Remove unnecessary words. “Glossary” would be more specific and short.
  • Your community section New Home Page Avoid EPA jargon. What is “Envirofacts”, “Environmental Profiles”, and “EnviroMapper”?
  • Why force users to go to a separate search page? A simple search facility can easily fit on the home page. New Home Page Unusually prominent placement. Suggest FAQ at bottom. Hopefully a site guide will not be needed.
  • This mission statement excerpt is excellent. Obviously a lot of work went into it. Why force the user to discover it by holding the mouse over the image while not clicking? New Home Page Consider simply showing the excerpt from the mission and the link.
    • Private citizen - community
    • High school student - education
    • Teacher - education
    • Local official - programs
    • Researcher - science
    • Lawyer - regulation
    • Engineer - technical
    Using Personae and Representative Tasks
    • Private citizen - what is the water quality of the Chesapeake Bay and should I not swim in certain areas?
    • High school student - what is the history of Love Canal?
    • Teacher - can I get a lesson plan on ozone depletion and its effects?
    • Local official - what programs to reduce air pollution exist in my area?
    • Researcher - what are the current hot research areas being funded by EPA?
    • Lawyer - where can I find the latest Federal Register notices dealing with guidelines for industrial releases into water?
    • Engineer - what are the latest developments in scrubbers for fossil fuel power plants?
    Persona and Representative Tasks
    • Used to navigate the site looking for information to answer relatively specific questions that might be posed by typical users
    • Not exhaustive set of personae or tasks
    • Provided enough structure to identify “show stoppers” and many examples of usable and not-so-usable design details
    Persona and Representative Tasks
  • Starting Point for Task Walkthroughs
  • Private Citizen Not particularly well organized or grouped Does this include NGOs? (yes) Why just Regions 1, 6, and 8? What is the water quality of the Chesapeake Bay and should I not swim in certain areas?
  • Private Citizen So decided to try here. Different high-level architecture and design details (2 clicks {+ a scroll} from Home Page).
  • Private Citizen Is Search for CEIS or EPA? (EPA) Icons require rollovers. Not sure where to go for Chesapeake Bay. Tried Tree icon, looking for county-level data. Later realized I could have used the side bar to get to the same place.
  • Private Citizen Scrolled down to map and selected Maryland, even though I wanted information on the Virginia portion too. Scrolled down to Anne Arundel County even though I wanted information on the entire Bay.
  • Private Citizen Selected surface water. Result when “Submit” was hit. On a different WIN98 IE browser, got Security Alert and Internet Redirection warnings to click through.
  • Private Citizen Then selected Anne Arundel again plus surface water, hit “Submit”, and got two more of the same warnings. Resultant table had four choices that said the same thing: UPPER CHESAPEAKE BAY 02060001. Selected one of them and got to Surf Your Watershed. Map HUC codes do not match table. Different high-level architecture and design details.
  • Private Citizen Scrolled looking for information. Decided to pursue first link on page, Index of Watershed Indicators. Wasn’t sure what to make of much of these data...
  • Private Citizen “... latest overall score (October 1999, Version 1.3)…”? More serious and less vulnerable? Wasn’t sure what to make of these data. Swimming near Kent Narrows looks like a bad idea? Is downtown Baltimore really better? “...You can view STORET, fish advisories, and other IWI data as well as Envirofacts information on a watershed level. You can visit other geographical levels such as state and county…”
  • Private Citizen Wasn’t sure what to make of these data (numbers, scores, years, no benchmarks, etc.)
  • Private Citizen
    • In conclusion, for the task of a private citizen:
    • After about 20 “clicks”, was more confused than informed
    • Experienced several different high-level architectures and numerous detailed design differences
    • Was concerned not only about usability, but also about utility or value of data accessed (e.g., warnings of statistical quality, no comparison numbers such as better or worse than x years ago)
    • A research-based set of heuristics was proposed by Gerhardt-Powals (1996):
      • 1. Automate unwanted workload
        • - Free cognitive resources for high-level tasks
        • - Eliminate mental calculations, estimations, comparisons, and unnecessary thinking
      • 2. Reduce uncertainty
        • - Display data in a manner that is clear and obvious
      • 3. Fuse data
        • - Reduce cognitive load by bringing together lower-level data into a higher-level summation
      • 4. Present new information with meaningful aids to interpretation
        • - Use a familiar framework, making it easier to absorb
        • - Use everyday terms, metaphors, etc.
      • 5. Use names that are conceptually related to function
        • - Context-dependent
        • - Attempt to improve recall and recognition
    Heuristics to Guide Future Design
  • Heuristics to Guide Future Design
    • 6. Group data in consistently meaningful ways to decrease search time
    • 7. Limit data-driven tasks
      • - Reduce the time spent assimilating raw data
      • - Make appropriate use of color and graphics
    • 8. Only include in the displays the information needed by the user at a given time
      • - Allow users to remain focused on critical data
      • - Exclude extraneous information that is not relevant to current tasks
    • 9. Provide multiple coding of data when appropriate
    • 10. Practice judicious redundancy (to resolve the possible conflict between heuristics 6 and 8)
  • Approaches to Testing at Earlier Design Phases
  • Types of Function Tests (Test Your Task Definitions)
  • How Much Would You Pay? (Best Test)
    • How much would you pay for Item 1?
    • a. $0
    • b. $1 - $5
    • c. $6 - $10
    • d. $11 - $20
    • e. $21 - $50
    • f. $51 - $100
    • g. More than $100
    • How much would you pay for Item 2?
    • Etc.
  • Limited Funds:
    • Your development budget is limited to $100,000.
    • Please allocate the budget between the following items:
    • 1. Item One $_______
    • 2. Item Two $_______
    • 3. Item Three $_______
    • 4. Etc.
    How much would you spend on each?
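Aggregating the allocations across participants gives a direct priority ranking for the candidate functions. A minimal sketch; the item names and dollar figures are invented for illustration:

```python
# Each participant splits a $100,000 development budget across candidate
# functions (hypothetical data).
allocations = [
    {"Item One": 60000, "Item Two": 30000, "Item Three": 10000},
    {"Item One": 40000, "Item Two": 40000, "Item Three": 20000},
]

# Sum each item's allocation across all participants.
totals: dict[str, int] = {}
for allocation in allocations:
    for item, dollars in allocation.items():
        totals[item] = totals.get(item, 0) + dollars

# Report each item's share of all budget dollars, highest first.
grand_total = sum(totals.values())
for item, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{item}: ${total:,} ({100 * total / grand_total:.0f}% of budget dollars)")
```

Because the budget is fixed, participants are forced to trade functions off against each other, which is what makes this instrument more informative than unconstrained ratings.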
  • Limited Number: Which Would You Pick?
    • Item 1 ___
    • Item 2 ___
    • Item 3 ___
    • Item 30 ___
    Typical approach: Please check the ten most important items. Alternative approach: Please check the ten items you would be most willing to do without.
  • Rate on a Numerical Scale
    • Item 1:
    • Not valuable - 1 2 3 4 5 6 - Extremely valuable
    • Item 2:
    • Not valuable - 1 2 3 4 5 6 - Extremely valuable
    • Item 3:
    • Etc.
    Please circle a number from 1 to 6 that best matches your opinion of the value of each of the following items. (Unreliable and very general, but OK with a small number of subjects.)
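Summarizing the circled numbers is simple descriptive statistics; reporting a spread alongside the mean helps flag how noisy small samples are. A minimal sketch with invented ratings:

```python
from statistics import mean, stdev

# 1-6 value ratings from four participants per item (hypothetical data).
ratings = {
    "Item 1": [5, 6, 4, 5],
    "Item 2": [2, 3, 2, 4],
}

# Report mean, sample standard deviation, and sample size per item.
for item, scores in ratings.items():
    print(f"{item}: mean {mean(scores):.2f}, sd {stdev(scores):.2f} (n={len(scores)})")
```

With only a handful of subjects, treat these means as rough ordering information rather than precise measurements, as the slide's caveat suggests.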
  • Test the Practicality of Your Task Design
  • Advantages of Walkthroughs
    • Allow early task flow evaluation
    • Can be set up quickly
    • Inexpensive
    • Evaluate competing solutions before substantial commitment of resources
  • Task Selection
    • Start with simple tasks
    • Select a range of tasks
    • Cover several core functions
    • Cover area of concern
    • Wilder individuals
    • Off-beat is OK
    • Wider range
    User Selection
  • Create a Storyboard
    • Outline the task flow
    • Represent each step graphically:
      • Basic page layout
      • Basic functions
      • Basic navigation
    • Use a specific task example
  • Extra Bonus!
    • You will often find problems just by completing the storyboards
  • I would like to have an alarm clock on my laptop...
    • It would let me:
    • set time
    • set alarm
    • turn on alarm
    • read the time
    • hear an alarm signal at appropriate time
    • respond to the alarm signal (snooze or turn off)
  • To set the time...
    • You are in a hotel room
    • You have changed time zones
    • Battery is out and the time was lost
    • There has been a change to daylight savings time
  • Walkthrough Techniques
    • Informal setting
    • Present the basic structure and design
    • Present alternatives
    • Encourage questions
    • Encourage comments
    • Be flexible
    • Be solution-oriented
  • Walkthrough Pitfalls to Avoid
    • Failing to listen
    • Criticizing
    • Dwelling on detailed design
    • Being defensive
    • Overuse of technical jargon
  • Never Ask a User How They Would Design It!
    • If they knew, they would have your job
  • Medical Research Organization
    • User complaints and outdated look drove redesign
    • Redesign failed due to
      • Many user types missing
      • Content of user types unpredictable
      • Poor navigation at the document level, where 84% of users entered via an independent search engine
  • Initial Deployment Weak
    • The menu framework enforces a modal task flow, increasing steps to other tasks while within a task
    • Confusing menu
    • Client redesigned site to address user complaints
    National Medical Research Organization
  • Redesign Without User Research
    • New design has improved look
    • Content divided into confusing user categories
    • Users failed to find what they wanted
    • Users resisted labeling themselves by these names
    • Users got lost in the site - hard to tell where you are and where to go
    National Medical Research Organization
  • Redesign With User Research
    • A taskbar presents content by common user tasks
    • Users found what they wanted quickly
    • Consistent navigation bar on every page helps users know where they are and where they can go
    • Meaningful, useful content available on the home page
    National Medical Research Organization