PROCEEDINGS of the HUMAN FACTORS and ERGONOMICS SOCIETY 56th ANNUAL MEETING - 2012

Expanding the Usability Toolkit: Using PowerPoint™ to Perform Website Analysis and Testing

Haneen Saqer, Brian Kidwell, Craig Stoudt, & Robert J. Youmans
George Mason University, Fairfax, VA

Copyright 2012 by Human Factors and Ergonomics Society, Inc. All rights reserved. DOI 10.1177/1071181312561125

Many usability software packages exist to serve the needs of user experience practitioners. However, these options are often expensive and possess steep learning curves. The purpose of this paper is to provide novice practitioners a usability toolkit that is easy to use, versatile, and affordable. Using basic presentation software, PowerPoint™, graduate students in a usability and redesign course performed card sorting tasks with several users and used the results to create website prototypes for usability testing. The detailed methods for deploying these usability techniques via PowerPoint™, as well as the benefits of these methods, will be explored.

INTRODUCTION

A common requirement often cited on human factors psychology employment opportunity listings is that new applicants have experience conducting usability analyses. Potential employers rightly expect some minimum level of experience or proficiency with a range of basic usability techniques so that the employee is ready to tackle domain-specific tools or advanced analysis methods once they are hired. But an increasingly common refrain among graduate students and novice usability practitioners with limited industry experience is that getting basic experience with usability analysis and testing is difficult. The software and hardware packages that are considered gold standards in the usability field, they argue, require high upfront purchase costs, monthly subscriptions, or expensive user licenses (WebSort, OptimalSort, CardZort). It would seem that many new members of the human factors community need reassurance that the path to the advanced methods used in university laboratories and well-funded private usability testing facilities still begins by acquiring experience conducting well-designed tests that are facilitated by creatively applying widely available technology.
It has long been documented in the user experience community that analysts can learn a great deal through inexpensive paper-and-pencil analysis techniques, including card sorting (Capra, 2005) and paper prototyping (Lim, Stolterman, & Tenenberg, 2008; Snyder, 2003). Because these methods require no specialized technology, are inexpensive to use, and are easy to learn, they are excellent at providing usability analysts with critical insight into problems with interactive system design very early in the conceptual design process. But for all the valuable information that paper prototyping can provide, there are limitations to these methods. One challenge is that it can be difficult to use paper to prototype complex dynamic systems where user interactions happen quickly. Many forms of mobile technology now also allow users to interact with multiple systems at the same time, which can be a challenge to render in paper (Sefelin, Tscheligi, & Giller, 2003). Finally, paper prototype testing is also very difficult to conduct with users at a distance because the analyst is not physically present to control the behavior of the paper prototype.

For these and other reasons, we outline here several new or updated methods that might be of value to new and seasoned analysts alike who seek to gain experience conducting usability analysis, but who do not have access to sophisticated analysis equipment. Specifically, we have outlined here how card sorting, prototyping, and basic usability testing can all be facilitated using the ubiquitous Microsoft PowerPoint™ presentation software. We do so by describing a recent usability analysis that was conducted during the redesign of the George Mason University College of Visual and Performing Arts (CVPA) website. Our goal in this paper is to demonstrate how analysts can use PowerPoint™ software to conduct somewhat sophisticated card sorting procedures via email, to prototype multiple potential versions of an interactive product like a live webpage, and to support high fidelity usability testing conducted in later stages of development. Our hope is that all analysts will find some utility in the procedures that we describe here, but we especially wish to demonstrate methods that can be useful for newcomers to the human factors profession or those on limited budgets who are seeking ways to get more experience with analysis work.

The CVPA Redesign Project

The George Mason University CVPA was tasked with redesigning the college website for the dual purpose of use as a recruiting tool, as well as for an updated, modern-day aesthetic. The updated website is part of an ongoing focus on using the internet as an outreach tool for advertising the college to potential students, as well as conveying information to current students and alumni. Contact with the CVPA was facilitated between the graduate students and a professor in the CVPA, and the collaboration was further supported by administrative officials in the college. In order to better assess group goals, graduate students from both departments participated in class and group discussions. This facilitated a common understanding of the intended website for all individuals involved.

The original CVPA website (Figure 1) contained much of the same information as the intended redesign, but was not designed with strong usability considerations in mind. The focus behind the original website, distributing relevant material for prospective and current students, remained the same during the redesign process. Usability and subjective measures were taken concerning the original website for comparison to the redesigned version. Anecdotal opinions of the original website indicated that individuals felt it looked dated, disorganized, and boring.

Figure 1. Current CVPA homepage.
METHOD

Card Sorting

Participants. The card sorting exercise was administered over 4 days to a total of 33 participants: 22 students in a graphic design class and 11 students in a psychology class on usability analysis. All participants were students at George Mason University; the graphic design students were well acquainted with the CVPA, while the psychology students were less familiar with this school.

Procedure. Card sorting is a simple yet effective technique for creating an information architecture that seems natural to users. It reveals how various kinds of users view the similarities and differences between terms in the information architecture, how users intuitively group those terms into a hierarchical structure, the number of groups that may be necessary, and what those groups should be named. It is particularly useful when there is no accepted or standardized taxonomy for organizing the content, when there is great variety in the number of terms, or when it is difficult to assign the terms to clearly defined groups. In addition, card sorting provides insight into the ways users interpret the labels assigned to terms in the information architecture.

The basic methodology of the card sorting technique is to develop a list of terms, present these terms on individual cards, and then ask the participants to sort the cards into categories and to assign labels to those categories. Card sorting can be performed manually with index cards, or it can be performed on a computer.

For this exercise the participants were provided with a PowerPoint™ presentation. The first slide of the file included a brief overview of card sorting and specific instructions. Participants were instructed to view the terms (i.e., the cards) on the subsequent slide and group them into four to seven distinct categories by cutting and pasting the terms onto the appropriate slides. Blank slides labeled as categories 1-7 were provided with the instruction "Rename Me!" This heading served to remind participants that, in addition to grouping terms into the categories they deemed most appropriate, they were also required to name each category. For this website, 45 terms were derived from a proposed site map developed by a graduate class of graphic design students. The labels of these 45 terms were presented as equally sized and dispersed green tiles arranged in random order on a slide. Participants were instructed to spend 15 to 20 minutes on the task and to return the results by email. The responses from the card sorting exercise were compiled and analyzed in the USORT and EZCalc software packages developed by IBM Corporation.

Figure 2. Sequence of slides for card sorting task.
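For analysts who prefer to script the deck rather than assemble it by hand, the sketch below reproduces the same slide sequence with the python-pptx library. This is a minimal sketch under stated assumptions rather than the authors' procedure (the study's file was built directly in PowerPoint™), and the short term list stands in for the 45 site-map terms.

```python
# Sketch: generate an instructions slide, a slide of shuffled green term
# tiles, and seven "Rename Me!" category slides, mirroring the deck
# described above. Requires python-pptx; TERMS is a placeholder list.
import random
from pptx import Presentation
from pptx.util import Inches, Pt
from pptx.dml.color import RGBColor
from pptx.enum.shapes import MSO_SHAPE

TERMS = ["Academics", "Admissions", "Faculty", "Events Calendar"]  # 45 terms in the study

prs = Presentation()
blank = prs.slide_layouts[6]  # blank slide layout

# Slide 1: brief overview of card sorting and specific instructions.
intro = prs.slides.add_slide(blank)
box = intro.shapes.add_textbox(Inches(0.5), Inches(0.5), Inches(9), Inches(6))
box.text_frame.text = (
    "Cut and paste the terms on the next slide onto the category slides, "
    "forming 4 to 7 groups. Rename each category, spend 15-20 minutes, "
    "and return this file by email."
)

# Slide 2: equally sized tiles dispersed in random order.
tiles = prs.slides.add_slide(blank)
random.shuffle(TERMS)
for i, term in enumerate(TERMS):
    row, col = divmod(i, 5)
    tile = tiles.shapes.add_shape(
        MSO_SHAPE.ROUNDED_RECTANGLE,
        Inches(0.2 + col * 1.9), Inches(0.3 + row * 0.8),
        Inches(1.8), Inches(0.6),
    )
    tile.fill.solid()
    tile.fill.fore_color.rgb = RGBColor(0x4C, 0xAF, 0x50)  # green tile
    tile.text_frame.text = term
    tile.text_frame.paragraphs[0].font.size = Pt(12)

# Slides 3-9: blank category slides that participants must rename.
for n in range(1, 8):
    cat = prs.slides.add_slide(blank)
    head = cat.shapes.add_textbox(Inches(0.5), Inches(0.2), Inches(9), Inches(0.8))
    head.text_frame.text = f"Category {n}: Rename Me!"

prs.save("card_sort.pptx")
```

Emailing the resulting file and asking participants to cut and paste tiles between slides reproduces the procedure described above.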
Analysis Tools. The results of the card sorting exercise were analyzed with Hierarchical Cluster Analysis (HCA), a statistical method for finding clusters of objects with similar characteristics. The EZCalc software provides utilities for performing HCA; statistical software such as SPSS and R is also capable of creating dendrograms and cluster diagrams.

However, identifying appropriate categories based on participant responses can also be performed in Excel™. The first step in analyzing the results in Excel™ is to review the category names created by all participants. Next, compile an inclusive list of these categories and create a spreadsheet in which the category names serve as column headings and the card names as row headings. The number in each cell represents the number of participants who included that card in the corresponding category; the sums across each row should equal the total number of participants. By visually scanning each row, it should be apparent which category was most frequently selected for each card. Once the counts have been tallied, they can be converted into percentages by dividing each cell by the total number of participants. Cells with low percentages (i.e., 10% or lower) can be removed so that focus can be given to the strongest relationships. Starting with the first category, cards are then sorted by descending percentage, revealing which cards are most commonly sorted into that category. The process is repeated for each category until all cards are assigned a grouping. If certain cards appear to consistently fall into more than one category, designers should consider combining the categories or renaming them to reflect intended differences.
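The spreadsheet procedure just described is mechanical enough to automate. Below is a minimal sketch in Python with pandas, assuming the returned decks have already been transcribed into (participant, card, category) records with category names normalized; the count matrix, row-sum check, percentage conversion, 10% cutoff, and descending sort mirror the Excel™ steps above. The sample records are hypothetical.

```python
# Sketch of the tally described above: rows are cards, columns are the
# compiled category names, and cells count how many participants placed
# each card in each category.
import pandas as pd

# (participant, card, category) records transcribed from returned decks.
records = pd.DataFrame(
    [(1, "Faculty", "People"), (1, "Apply Now", "Admissions"),
     (2, "Faculty", "People"), (2, "Apply Now", "Admissions"),
     (3, "Faculty", "Welcome"), (3, "Apply Now", "Admissions")],
    columns=["participant", "card", "category"],
)

counts = pd.crosstab(records["card"], records["category"])
n = records["participant"].nunique()
assert (counts.sum(axis=1) == n).all()  # each row should sum to the participant total

pct = counts / n             # convert counts to proportions
pct = pct.where(pct > 0.10)  # drop weak (10% or lower) relationships

# For each category, list its cards by descending percentage.
for category in pct.columns:
    members = pct[category].dropna().sort_values(ascending=False)
    print(category, members.to_dict())
```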
In card sorting, similarity is measured by the number of times two or more objects are grouped together. Distance to their common join point can be depicted graphically in a tree diagram, or dendrogram. The threshold (distance) between clusters can be determined via maximum, minimum, or average methods. The group average method appears to be the most common method used for analyzing card sorting data and was used to analyze the data collected in this study.
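For readers without access to EZCalc, group-average HCA can be reproduced with SciPy. The sketch below is an illustration under assumed data rather than the study's analysis: the hypothetical co-occurrence matrix counts how many participants grouped each pair of cards together, distance is defined as one minus that proportion, and the tree is cut at a 0.78 distance threshold like the one reported in the Results.

```python
# Sketch: group-average (UPGMA) clustering of card-sorting data, standing
# in for EZCalc. The co-occurrence counts here are hypothetical.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

cards = ["Faculty", "Staff", "Apply Now", "Tuition"]
n_participants = 33
# cooc[i][j] = number of participants who placed cards i and j together.
cooc = np.array([[33, 30,  2,  1],
                 [30, 33,  3,  2],
                 [ 2,  3, 33, 28],
                 [ 1,  2, 28, 33]])

dist = 1.0 - cooc / n_participants  # similar pairs get small distances
np.fill_diagonal(dist, 0.0)

Z = linkage(squareform(dist), method="average")  # group average method

# Cut the dendrogram at a distance threshold to obtain flat categories;
# scipy.cluster.hierarchy.dendrogram(Z, labels=cards) draws the tree.
labels = fcluster(Z, t=0.78, criterion="distance")
print(dict(zip(cards, labels)))  # e.g. {'Faculty': 1, 'Staff': 1, ...}
```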
Interactive Prototyping

Within PowerPoint™, we created a website prototype by structuring images representative of a planned site and creating links for advancing slides only in positions necessary for completing basic tasks. To mimic website functionality, transparent shapes created in the presentation software were placed atop the static images and given hyperlink functionality. For example, a clear rectangle was placed over the university logo appearing in the top left-hand corner and was linked to a slide with a static image of the university homepage. Anytime this logo was clicked, the presentation would transition to the corresponding slide. (It should be noted that when using presentation software in this way, the default transitions for slide progressions should be deactivated.) This method works much the same way as other website design software (Balsamiq, Axure). With proper instruction of tasks, participants can perform simple functions in pursuit of a larger goal, while researchers can focus on measuring performance-based metrics of usability (time, accuracy, errors).

Design students within CVPA created the visual features for the redesigned website. The static image of the design provided the base for the dynamic prototype created in PowerPoint™. The new website focused on streamlined text, a coherent color motif, and creative use of white space.

Figure 3. The redesigned website image incorporated into the digital prototype developed for usability testing.
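The transparent-hotspot wiring can likewise be scripted. Below is a minimal sketch with python-pptx, assuming a version that supports click_action.target_slide; the screenshot file names and hotspot coordinates are hypothetical placeholders, and the default click-to-advance behavior should still be deactivated in PowerPoint™ itself, as noted above.

```python
# Sketch: wire up the transparent-hotspot technique described above.
# The screenshot file names below are hypothetical placeholders.
from pptx import Presentation
from pptx.util import Inches
from pptx.enum.shapes import MSO_SHAPE

prs = Presentation()
blank = prs.slide_layouts[6]

home = prs.slides.add_slide(blank)   # static image of the homepage
child = prs.slides.add_slide(blank)  # static image of a child page
home.shapes.add_picture("homepage.png", Inches(0), Inches(0), width=Inches(10))
child.shapes.add_picture("child_page.png", Inches(0), Inches(0), width=Inches(10))

# Invisible rectangle over the logo in the top-left corner; clicking it
# jumps to the child-page slide, mimicking a live link.
hotspot = home.shapes.add_shape(
    MSO_SHAPE.RECTANGLE, Inches(0.2), Inches(0.2), Inches(1.5), Inches(0.8)
)
hotspot.fill.background()        # no fill
hotspot.line.fill.background()   # no outline, so the shape is invisible
hotspot.click_action.target_slide = child

prs.save("prototype.pptx")
```

Because only scripted hotspots are clickable, any click outside them does nothing, which is exactly the behavior the study used to detect navigation errors.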
Usability Testing

Participants. Six adults (3 women and 3 men) with an average age of 32.17 years participated in the usability test. Three of the participants were graduate students currently enrolled at George Mason University.

Procedure. A prototype of the redesigned website was created by incorporating static images of the proposed redesigned homepage and child pages. Participants completed three navigation scenarios. The scenarios were designed to represent the needs of various users of the CVPA website: prospective graduate students, current undergraduate students, and prospective undergraduate students. Only the specific links needed to successfully complete each scenario were included in the prototype (i.e., if a link was not necessary to complete the task, it was not active). This allowed usability testers to quickly identify when users made errors. In this study, the scenarios were presented in sequential order, but future prototypes with more complex scenarios may be counterbalanced.

Users were presented the PowerPoint™ prototype of the redesigned website on a laptop while the usability tester read specific instructions regarding each scenario. As the participant clicked through the prototype, the tester noted reaction times and errors. Following each scenario, participants were asked to provide subjective feedback about their experience.
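Completion times and error counts of the kind reported below can be captured with nothing more elaborate than a stopwatch script running beside the prototype. The sketch below is a hypothetical helper rather than part of the study's tooling: the tester presses Enter when a scenario ends and types 'e' for each observed error, and the results are written to a CSV file.

```python
# Sketch: a minimal stopwatch the tester can run beside the prototype to
# record per-scenario completion times and error counts (hypothetical).
import csv
import time

def run_scenario(name: str) -> dict:
    input(f"Press Enter when the participant starts '{name}'...")
    start = time.monotonic()
    errors = 0
    while True:
        key = input("Enter = task complete, 'e' + Enter = log an error: ").strip().lower()
        if key == "e":
            errors += 1
        else:
            break
    return {"scenario": name,
            "seconds": round(time.monotonic() - start, 1),
            "errors": errors}

rows = [run_scenario(s) for s in ("Scenario 1", "Scenario 2", "Scenario 3")]
with open("session_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["scenario", "seconds", "errors"])
    writer.writeheader()
    writer.writerows(rows)
```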
RESULTS

Card Sorting

The results from the two groups of students (graphic design and psychology) were compared to determine if the two groups exhibited any differences in the way they organized the terms in the site map. For both groups, an HCA distance threshold of 0.78 produced five distinct groups. Key results from this exercise reveal: frequently identified categories; multiple interpretations of labels; differences in category labeling; and labels that were difficult to categorize.

Frequently identified categories. At the 0.78 threshold, the categories created by the two groups of students exhibited many similarities. Both groups created categories that were most often labeled 'Academics,' 'About CVPA,' 'Admissions,' and 'Welcome.' The most noticeable difference was the fifth category; the psychology students created a 'People' category while the graphic design students created a 'News' category. Table 1 depicts the categories created by the two groups.

Table 1
Categories created by design and psychology students

Graphic Design Students    Psychology Students
Academics                  Academics
About CVPA                 About CVPA
Admissions                 Admissions
Welcome                    Welcome
News                       People

Figure 4. Dendrogram for CVPA student card sorting data reveals that the .78 threshold results in five distinct categories.

Multiple interpretations of labels. Analysis of the card sorting task reveals that the two groups interpreted some of the labels differently. Labels for a performing arts center and an arts academy within CVPA created the most confusion. Psychology students interpreted these labels as part of the 'Academics' category, while the graphic design students associated these links with the 'About CVPA' category. This finding points to an opportunity for the client to rename the labels or provide some kind of definition for users of the website.

Differences in category labeling. The major difference in the labeling of categories occurs in the fifth category: 'People' in contrast to 'News.' By asking participants to rename each of the categories, these differences in labeling became apparent. With this information the client can make an informed decision about the focus for this category, or whether two categories should be created to include different information.

Difficult to categorize labels. 'Center for the Arts Events Calendar' and 'Facilities Rental' had the furthest distances from other terms in their categories, which indicates that participants found it difficult to categorize these terms. This may be a potential issue for users because there is no clear category associated with these terms. In other words, users specifically looking for these terms will not know which headings to click to access these pages.

Usability Test

Navigation scenarios. Completion times, error frequencies, and subjective ratings (7-point Likert scales) were collected for each of the three scenarios (Table 2). Participants rated Scenario 2 (current undergraduate student seeking information on course requirements) as the easiest to complete, and it had the fastest completion time with the fewest errors. Scenario 3 (prospective undergraduate student seeking information about the theater department) had the worst overall performance. Errors on this scenario were due to users not clicking the "About Us" link, but rather searching for a link labeled "Academics" or "Departments." The results for Scenario 1 (prospective graduate student seeking admission information) were better than for Scenario 3, but worse than for Scenario 2.

The most common error across all scenarios was due to the fact that hyperlinks were placed on small orange arrows at the end of lines of text. The majority of users attempted to click on the words within the lines to jump to the desired page, and it took several missed attempts before they realized that the hyperlinks were found at the end of the line. This finding became apparent because the high fidelity prototype incorporated actual screen shots of the design. Had the prototype been made with paper materials, this fine design detail would not have been incorporated and a major usability issue would have gone unnoticed.

Additionally, usability testing revealed that many users were unaware that certain words and logos were clickable links. The minimalist design and font choice of the website garnered favorable subjective feedback regarding aesthetics from participants, but resulted in user confusion during navigation. Likewise, the small font with low contrast was regarded highly during subjective feedback, but usability testing made it apparent that this design resulted in difficulty when attempting to select the correct links. Again, without incorporating the specific font and color choices of the design into this high fidelity prototype, a lower fidelity paper prototype would have missed these usability findings.

Table 2
Performance Data and Subjective Ratings

             Completion   Number      Ease of      Overall
             time (sec)   of errors   Navigation   Satisfaction
Scenario 1      47.7          5          4.5           4.2
Scenario 2      29.5†         1†         5.5†          5.8†
Scenario 3      57.7          7          3.3           3.5
† Best performance time or subjective rating.
DISCUSSION

Although the categories produced by the two groups were similar, closer inspection revealed some interesting differences. Psychology students did not associate the terms 'Portfolio Review Guidelines' and 'Preparing a Portfolio' with the category 'Admissions.' This indicates that the psychology students lack familiarity with the application process for an art school. Psychology students associated the links 'Hylton Performing Arts Center' and 'Potomac Arts Academy' with the 'Academics' category, while CVPA students associated these links with the 'About CVPA' category. This suggests that users from outside CVPA are unclear regarding the role of the Performing Arts Center and Arts Academy within CVPA. In general, multiple interpretations highlight the need to rename labels so that they are intuitively clear to the user. There were other interesting differences in labeling. CVPA students created a separate 'People' category, while psychology students included these terms in the 'Welcome' category. This reveals the need for evaluation to determine whether 'People' contains enough content to warrant its own category. Other labels were difficult to categorize for both groups; for example, the 'Center for the Arts Events Calendar' and 'Facilities Rental' terms exhibited the furthest distances from their categories. Large distances emphasize a need to assess content pages to determine if they should be renamed, eliminated, or consolidated.

Users had a fairly easy time navigating the site in search of information regarding the graduate admissions process and course requirements for a specific program. However, when participants were instructed to find specific information regarding theater department faculty, it was not readily apparent that this information would be included in the "About Us" section. Some participants felt that this information would be better suited under an "Academics" or "Departments" title within the primary navigation menu. Several users also felt that the secondary navigation specific to user type (i.e., current students, prospective students, etc.) was redundant and distracting. Although there is potential for the secondary navigation to provide shortcuts for frequent visitors and user-specialized content, we feel that if the primary navigation is made more salient and follows the recommended information architecture, the secondary navigation becomes unnecessary. The placement of the secondary navigation in the top right-hand corner of the screen was also distracting because it drew the viewer's eyes away from the central portion of the page.

The specific usability recommendations gleaned via the PowerPoint™ card sorting presentation and digital prototype may have been missed with lower fidelity paper versions. One significant benefit of the card sorting presentation was the ability to email the file to multiple users and receive the responses via email. (It also eliminated any issues with software/hardware compatibility because PowerPoint™ is compatible with both Windows™ and Apple™ operating systems.) This afforded the usability testers the opportunity to gather information quickly from more than 30 users remotely. This quick access to information allowed for feedback from two different groups of users, which provided insight into unique usability issues for specific types of users. Additionally, the digital record of the card sorting results allowed the testers to revisit the files often and analyze results for specific user types. With paper card sorting tasks, videotapes or pictures of the physical cards would have taken much longer to analyze because of the time required for coding and transcription. However, it should be noted that in certain instances, particularly in early stages of design, testers may want to invest the time in face-to-face interaction while conducting card sorting tasks. By using the remote electronic version, testers were unaware of situations in which users may have been confused about the instructions of the task, or in which the user had specific questions regarding the meaning of the content pages. It also prevented the testers from conducting any think-aloud or verbal protocol analyses of the users as they sorted the cards.

For the electronic prototype, it was evident that using a high fidelity, semi-functional site via PowerPoint™ resulted in usability findings that would have been missed with paper prototypes. The specific font color, size, and contrast choices used in the design generally resulted in favorable user feedback regarding aesthetics; however, they also caused user confusion during navigation. These details would have been difficult to create in paper prototypes. Additionally, by creating a self-sufficient prototype, a single tester was able to run each session. In paper prototype testing, the tester is often focused on manually navigating the paper prototype for the user with each "click"; user responses are typically coded by another tester in the room, or the session is video recorded and coded afterwards. The use of this electronic prototype allowed the testers to observe user responses more closely and record reaction times and error rates instantaneously. Clearly, this benefit in tester time is worth the up-front time necessary to create prototypes in PowerPoint™. By using an electronic prototype, testers can also implement screen capture software to capture responses for more complex tasks. However, there are limitations to this method, which include the inability to capture loading times of websites once hosted on a live site and to incorporate embedded dynamic components from websites (e.g., interactive chat windows, Flash slideshows, Java applets). Considering that these elements are most likely finalized in later stages of the design cycle, we recommend testing them very close to when the site is ready to go live.
The usability findings culled from the use of electronic card sorting tasks and high fidelity prototypes created in PowerPoint™ guided specific design recommendations for the client. These methods significantly reduced the tester time needed to perform the analyses and provided the client with tangible digital records of the user testing sessions. Although specific software currently exists to address these needs for experienced practitioners, PowerPoint™ offers a low-cost, easy-to-implement solution that can be adapted to a multitude of situations.

REFERENCES

Capra, M. (2005). Factor analysis of card sort data: An alternative to hierarchical cluster analysis. In Proceedings of the Human Factors and Ergonomics Society 49th Annual Meeting.
Lim, Y., Stolterman, E., & Tenenberg, J. (2008). The anatomy of prototypes: Prototypes as filters, prototypes as manifestations of design ideas. ACM Transactions on Computer-Human Interaction, 15, 1-27.
Sefelin, R., Tscheligi, M., & Giller, V. (2003). Paper prototyping - what is it good for? A comparison of paper- and computer-based low-fidelity prototyping. In CHI '03 Extended Abstracts on Human Factors in Computing Systems (pp. 778-779). New York, NY: ACM Press.
Snyder, C. (2003). Paper prototyping: The fast and easy way to define and refine user interfaces. San Francisco, CA: Morgan Kaufmann.
USORT and EZCALC software accessed from http://web.archive.org/web/20040205000418/http://www-3.ibm.com/ibm/easy/eou_ext.nsf/Publish/410