VIPEr Usability Study
  • Hello – I’m Michael Gough, and I’ll be talking about the usability report for ionicviper.org.
  • In this presentation, I will review the evaluation strategies used to evaluate the site, describe the testing methods used in the usability testing phase, explain the testing results, and link the defects found in the previous inspection report with the results from usability testing. Then I will offer my recommendations for the site. Feel free to email me at michaelgough@depauw.edu if you have any questions about these results.
  • This is a review of the evaluation strategies employed in this study. Previously, I gave you a report of my scenario and heuristic inspections. These inspections were used to find all the possible problems on the site from the perspective of the site’s goals and the users’ goals. During the heuristic inspection, I used industry-standard rules of thumb for evaluating the usability of websites. Review the usability inspection report for the problems that were found during that stage of the study. This talk will focus on the usability testing stage of the study. However, the three types of evaluation are connected and come together to form a full evaluation of your site. That final outcome will also be discussed during the recommendations portion of this presentation.
  • From the usability inspection, from talking with the stakeholder, and from learning about the site’s goals, I was able to determine 5 key goals users have for this site. <Read list> These tasks were used to form a testing script that was given to the testers.
  • Because Ionic’s users are scattered across the country, testing for this site was very difficult to do in a traditional testing lab environment. To facilitate synchronous testing at a distance, I used Skype and its screen-sharing feature to view testers’ screens. I recorded users’ screens and timed their progress at each step. I also asked testers some questions up front about their backgrounds, and then gathered some general feedback about the site at the end. This method was not without its challenges: I lost the video for two of the tests when the software malfunctioned, and bandwidth was a factor in some of the tests, which forced me to rely on the tester to verbalize every detail as they worked. Overall, this method was successful, and I was able to collect rich data that can help improve your site for its target audience.
  • I did have some trouble finding 10 inorganic chemists who were able to participate. However, 12 of the 13 participants were affiliated with a higher education institution: 5 were instructors or professors, 4 were undergraduates, 2 were librarians, and 1 was a non-teaching graduate student.
  • 9 of the 13 testers had at least an advanced undergraduate background in chemistry; however, only 5 were at the graduate level in inorganic chemistry specifically. Three had taken some kind of general chemistry class, and one librarian participant had taken no chemistry classes. It is important to note that, with such a small study, I wasn’t able to find any conclusive trends between the level of chemistry taken and the problems users encountered, with the exception of the task asking users to find content for the 2nd year of study. More advanced testers tended to use their own knowledge to guess which learning objects would be appropriate for the 2nd year, where others looked for some kind of indicator on the site.
  • The first testing scenario was to learn about the site. Since many of the testers were using the site for the first time, this was an opportunity for me to watch as they learned how to use the site. The first task was to find the site goals. Users had to guess where the goals might be, and they didn’t always know to look at the “Who we are” page. The three who failed at this task felt it was not obvious that the goals would be on this page; they expected “Who we are” to have contact information. Users also had some trouble defining a learning object. Users could either read a definition to me or define it in their own terms; they had a difficult time finding the definition, and the term was sometimes confused with “learning objective.” One professor even noted that this is not a common term he would use, though he could figure it out. Finally, part way through the test, one tester discovered that the site’s security certificate had expired.
  • Task 2, defining a learning object, had the lowest success rate in this scenario at 69%. Users also tended to visit a lot of pages when searching for the definition. Task 3 tended to take a while as well, but much of that time was spent reading pages out loud to answer the question. In this task, I was looking for testers to say that they had to create an account and get faculty privileges in order to share content. If possible, you might consider making sure this information is near the top of the “How to contribute” page so it takes less time to find.
  • In Scenario 2, users had to create an account, comment on a news post, and then delete their comment. Most users didn’t have any trouble creating their account. Because the previous task asked testers to tell me that they needed faculty privileges to share content, most of the users understood what the request-for-additional-privileges checkbox was for. This is still a problem that should be addressed, as someone who starts creating an account without reading this information first might be confused. Another minor problem users encountered was with the email notification: a couple of users didn’t see the notice at first and didn’t know they had to check their email to complete the account creation process. Finally, when the first couple of testers tried to delete their comments, they found it was impossible to do so. I later tested and confirmed this, and I had to delete their comments for them after testing.
  • Notice that 0% were able to complete task 3, which was deleting their comment. The other tasks had a high success rate: 100% were able to accomplish them, with 1 tester needing some assistance. The account creation process seemed straightforward for most users, and most took around 4 minutes to complete the task. The commenting task (task 2) varied a bit in the number of steps, but this was in part due to communication/bandwidth troubles I had with a couple of the users.
  • For the next scenario, users had to find 3 in-class activities that were appropriate for the second year of study. They then had to find a lab activity in the Main Group category that had no prerequisite or only general chemistry as a prerequisite. Finally, they had to log out. The purpose of these tasks was to see how well the site allows users to find the specific content they need to build a course. As predicted in the usability inspection, users had some trouble finding resources by course level. A couple of users pointed out that the course levels are a bit ambiguous: does “second year” mean the second year of chemistry or of inorganic? Does “upper level” include second year? These were common questions I received. Those who failed this task often didn’t find a learning object labeled as second year. Users had fewer troubles with the next task, as they had by now figured out how to find content on the site. One tester commented that it would be helpful if visited hyperlinks changed color on the site so you don’t have to rely on memory for which objects you had clicked on previously. Finally, logging out was far from obvious for many testers. Every one of the testers first looked to the upper right corner, where they logged in, to try to log out. When they didn’t find it there, they scanned the page and had to scroll down to find the log out button.
  • Task 1 had a low success rate, with only 43 percent completing it without help. This is because users tended to guess whether a learning object was appropriate for the second year using their own knowledge, or they assumed 2nd year and upper level were synonymous. Tasks 1 and 2 have a large amount of variance in time and number of steps because users were really hunting around for the learning objects that were appropriate for the second year. Those who were lucky and guessed the right ones faster tended to have fewer steps and a faster time.
  • Once users were comfortable working on the site, they had to attempt to share some content on the site, which they would later delete. The first task was really just to get them logged in as a faculty member. Next, they had to upload a 5-slide presentation and place it into a specific category with a specific prerequisite and course level. Finally, they deleted their content. This process was fairly straightforward for most of the testers. The biggest problem encountered was that the site doesn’t accept .pptx files. Also, users didn’t entirely understand the draft approval process. This process should be better explained at the top of the form, along with some instructions on how to find your content once it is submitted as a draft. When users left their content page, they had a difficult time finding their content to delete it. Two users first clicked on “My account” before looking in the draft content area, or failing at the task.
  • Finding the form and filling it out was fairly straightforward for most users, as nearly 80% completed this task without help. The users who failed at deleting their slides were not able to find their content; once I showed them where it was, they were able to delete it easily. Task 2 took the longest of all the tasks, as users often had to convert their PowerPoint document to an older version to upload it. The one tester who failed did so because the site was not accepting any type of files.
  • The final scenario tested how easily users could find a forum topic and comment on it. With the exception of deleting their comment, this was the easiest set of tasks for the users: by now they had learned how the site works, its menus, and its general interface. Users did find that the forum search is broken; however, most were able to find the “iPad in teaching” forum without it.
  • The success rates for these tasks were all very high, with the exception of deleting comments. The number of steps to complete the tasks didn’t vary all that much, as users were able to find the forum pretty easily. The main cause of the remaining variance was the broken search function. Note how logging out was much easier the second time; this shows that the task is at least learnable in the short term and could perhaps be remembered over a longer period of time.
  • This is a table of the problems that were found and how the usability defects from the inspection report might have been an underlying cause. I also have a recommendation for how to improve or fix each problem. Those problems marked very high or high should be fixed as soon as possible to improve the usability of the site. <Read each problem>
  • Some new problems were found that did not tie back to a defect found in the inspection process. These problems were missed due to the limited scope of the inspection process, or because something changed between the two evaluation methods. Nonetheless, some of them are high priority problems. <List them>
  • Finally, there are a number of defects listed in the usability inspection that did not seem to cause problems for users. <List>
  • Here are the high priority recommendations for improving the site from a usability perspective. First, for the content-level problems, you might consider using a different term besides “learning object”; perhaps “learning resources” is an option. Also, you might make a specific page for course goals, or create an “About Ionic” page as recommended in the usability inspection report. Many users were confused as to what happens to their draft content; you might explain this either by linking from the form or by adding the explanation at the top of the content sharing form. The technical issues that should be fixed right away include allowing .pptx and other Office 2007 file types, allowing users to delete comments, and making sure the site is accepting file uploads properly.
  • At the design level, you might consider adding the course level to the browse content tables. This would make it easier for users to find content for the specific class level they are looking for. You might also make the email notification easier to see when a user creates an account; perhaps the notice could have a red background so it stands out against the blue-grey site design. Finally, I should remind you that there are a number of minor problems listed in the full report and in the usability inspection report that should be tackled when it is easy to do so, such as when upgrading to a new version of the site or during a site redesign.
  • Finally, I wanted to include some of the positive comments I got from users about this site. Reporting all the problems tends to make this a rather negative report, but the site really is a neat resource for its audience and is certainly valuable. If you have any questions about this evaluation study, feel free to email me at michaelgough@depauw.edu.
  • Transcript

    • 1. Usability Report for Ionicviper.org
      Michael Gough – michaelgough@depauw.edu
    • 2. Presentation Overview
      Overview of usability evaluation strategies
      Usability testing methods
      Usability testing results
      Linking inspection defects
      Recommendations
      References
    • 3. Evaluation strategies
    • 4. Usability testing methods
      Tested 5 key goals
      Learning about the site
      Creating an account and comment
      Looking for content
      Sharing content
      Discussion
    • 5. Usability testing methods
      Synchronous testing at a distance
      Testers first asked some basic background questions
      Used Skype’s screen sharing feature
      Testers were recorded and timed
      Debriefing and post interview
      Technical issues caused a few glitches
      Lost video for two tests
      Bandwidth varied
    • 6. About the testers
      13 testers from 12 colleges and universities and 1 high school
    • 7. About the testers
    • 8. Scenario 1 - Learn about the site
      Tasks
      1. Find out what the goals of the site ionicviper.org are. List one of the goals.
      2. Learn what a learning object is based upon the site’s definition. List the definition.
      3. Find out how to contribute learning objects to the site. What do you need to do before you can contribute to the site?
      Problems found
      Users had to guess that goals would be on the “Who we are” page
      “Learning object” is not a common term
      Also hard to find definition on the site
      Site’s security certificate expired
    • 9. Quantitative analysis
    • 10. Scenario 2 – Create an account and comment
      Tasks
      1. Sign up for a new account.
      2. View the site’s Recent News, click on the 2010 Nobel Prize in chemistry article, and leave a short comment.
      3. You decided you didn’t want to comment after all. Delete your comment.
      Problems found
      Additional privileges should be explained
      Email notification was missed by some testers
      Users cannot delete comments
    • 11. Quantitative analysis
    • 12. Scenario 3 – Looking for content
      Tasks
      1. Find 3 learning objects that are both in-class activities and appropriate for the second year of study.
      2. Find a lab activity on Main Group Chemistry that has no prerequisite or only general chemistry listed as a prerequisite.
      3. Log out
      Problems found
      Not convenient to sort by course level
      Course levels are ambiguous
      Visited hyperlinks don’t change color
      Logging out is not obvious
    • 13. Quantitative analysis
    • 14. Scenario 4 – Sharing content
      Tasks
      1. Log in using the following test account
      2. Create and share a 5 slide presentation…
      3. Delete your slides from the site
      Problems found
      Site does not accept .pptx files
      Site stopped allowing file uploads
      Users confused about the submit for publishing option
      Users couldn’t find their content to delete it
    • 15. Quantitative analysis
    • 16. Scenario 5 - Discussion
      Tasks
      1. Check to see if there is already a discussion on the topic (iPad in teaching).
      2. Add a thread or comment about your iPad app.
      3. You learned that someone already shared information about your app, so you want to remove your comment.
      4. Log out
      Problems found
      Forum search is broken
      Cannot delete comments
    • 17. Quantitative analysis
    • 18. Linking problems with inspection report
    • 19. New problems found
    • 20. Inspection elements did not affect users
    • 21. High priority Recommendations
      Content level
      “Learning object” change to learning resources
      Create a page to list course goals
      Link from the create content page to a page that explains draft management
      Technical problems
      Configure site to allow .pptx files
      Allow users to delete comments
      Make sure the site is accepting file uploads
    • 22. High priority Recommendations
      • Design level
      Add course level to browse content tables
      Make the email notification easier to notice – perhaps use a red background
    • 23. What users said about the site
      All users said they would use the site in the future (for non-chemists it was a hypothetical question)
      Users in general felt they liked the site and that it was a good idea
      Most users felt that the site could be improved, but the site is still useful
      Some felt the site took a while to learn, but once you learn your way around it works well
    • 24. References
      Nielsen, J., & Mack, R. L. (1994). Usability Inspection Methods. John Wiley and Sons.
      Stone, D., Jarrett, C., Woodroffe, M., & Minocha, S. (2005). User Interface Design and Evaluation. San Francisco, CA: Elsevier.
      Triacca, L. (2005). MILE+ (Milano-Lugano Evaluation method). Annex A_1: Library of Technical Heuristics.
