BetterEvaluation - Webinar number 7: Report and support use - Simon Hearn
Part of a series of American Evaluation Association coffee-break webinars presented by BetterEvaluation. This is the seventh, on the 'Report and Support Use' cluster of the BE Rainbow Framework.
Webinar: Feel the Love from Your Students: Brightspace Tools for Increasing E... - D2L Barry
Brightspace Teaching & Learning Community Webinar Series.
Feb 7, 2017
Presenter: Thomas J. Tobin, PhD, MSLS, PMP, MOT
Description: Many campuses have moved entirely or partially to electronic survey instruments for end-of-course student ratings of teaching effectiveness. Because online surveys don’t provide the “captive audience” of the old pencil-and-bubble-sheet days, response rates on eSurveys are often lower than when we used paper. This webinar will share four specific tactics that are proven to increase the response rates on electronic end-of-semester e-surveys. We will also shatter myths about four common practices that do not help response rates at all.
Participants in this webinar will learn how to apply four techniques to increase the response rates on e-survey student ratings of teaching effectiveness.
After attending the webinar, participants will be able to
a) identify four common e-survey strategies that do not actually help to increase response rates,
b) implement four specific strategies that do increase e-survey response rates, and
c) apply survey-lifecycle techniques using Brightspace tools to help e-survey adoption rates.
Rethinking Search Results from a UX Perspective - Brian Frank
Post-secondary education websites have evolved a lot over the past decade. Search results pages have hardly changed. We’re long overdue to envision better ways to help users find what they’re looking for, faster and with fewer frustrations.
By looking at tested examples of user interfaces from ecommerce and other industries, we’ll explore ideas for radically rethinking the search experience on post-secondary websites. We’ll also discuss tips for using research to guide these decisions and avoid copying design patterns that aren’t suited to post-secondary information or user needs.
Presentation from the popular Fast Track Impact training on how to evaluate and prove impact claims from your research. Find out more at www.fasttrackimpact.com/resources
Presentation about collaborative social media projects between students at Florida A&M University in Tallahassee, Fla., and at Midwestern State University in Wichita Falls, Texas. Students used wiki, blog, e-mail and videoconferencing to create, promote and analyze results of an online survey about the 2008 presidential election.
Using Web 2.0 Technologies to Facilitate Learning - larae9411
Presentation on the collaboration between students at Florida A&M University in Tallahassee, Fla., and Midwestern State University in Wichita Falls, Texas. Students used a wiki, blog, e-mail and videoconference to create, promote and analyze the results of an online survey about the 2008 presidential election. This presentation was given at the 2009 AEJMC national conference.
Integrating impact into your UKRI case for support - Mark Reed
Webinar slides by Prof Mark Reed.
View the video at: https://www.youtube.com/channel/UCvr-7zuEcX-8dEsIZsFoMyg.
View the full guide at: https://www.fasttrackimpact.com/post/how-to-integrate-impact-into-a-ukri-case-for-support.
Primo Usability: What Texas Tech Discovered When Implementing Primo - Lynne Edgar
This presentation discusses the usability study of Primo, an Ex Libris discovery tool, immediately after its implementation by Texas Tech University Libraries. Problems and potential solutions are explored by four librarians.
Does the field of user-centered design mystify you? Does user research seem like the last thing you have time to think about?
Any team can look at analytics to understand what users are doing and how often they’re doing it. What analytics won’t tell you is *why* users are doing certain things — sometimes you need more context. That’s where user research comes in. This session will map out a framework for incorporating user research into your development cycle.
The two key project objectives were to develop a better approach to the LDA’s sponsorship program and to identify potential improvements to program content and processes.
Are you looking to gather insights from your potential customers? When it comes to your prospects, do you really know what they want? Many startup teams tell us they are missing the key information they need to get inside their users' minds. Without this information, their products often fall short of delighting users.
There are those who believe that user research and usability testing must be a complex, scientific process that takes lots of time, money, and resources. In the real world, however, most startups don't have the luxury of spending weeks or months on user research. That's where guerrilla research techniques come into play.
How to ask better questions and how to assess UX using surveys.
This workshop at UXLX 2014 in Lisbon was a deep dive into two important topics in survey design for user research.
We used the four-step model of how people answer questions to work on better questions, then we focused on two special uses of questionnaires in user research: the post-test assessment of satisfaction, and then how to gather information from users for redesign.
Thanks to all the attendees for making this workshop a lot of fun.
Caroline Jarrett @cjforms
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Prototype Version 1:
Tasks for interviewees
Task #1
Try to get in contact with a research project related to your degree
Task #2
Find a FAQ/forum where other students have asked questions
Task #3
Find research project associated with a certain professor
Study Protocol
- Introduction to project
- (Undergraduate Research Projects)
- (Help support undergraduates in finding relevant research opportunities)
- Provide instructions
- Administer relevant demographics to determine interviewee attributes
- Ask if we can record
- Ask them to speak their thoughts out loud
- Show them the paper prototype they are going to interact with
- Ask them to try to perform task 1
- Don't provide help
- Ask what they expected/are looking for
- Tell them to say when they have finished and found what they are looking for
- Repeat for tasks 2 and 3
- Ask questions depending on their past actions (if relevant)
- Administer UX questionnaires
Quantitative (Scale of 1 to 5 if applicable)
- I found the system unnecessarily complex
- I thought the system was easy to use
- I felt confident about using the system
- I did not feel stressed or lost while navigating the interface
- # of errors across all the tasks
Qualitative (descriptive)
- Thoughts on the process
- Did anything feel unnecessary but helpful?
- Did you feel as if you reached your goal too quickly?
- Did you feel as if you required some technical background?
- Ask follow up interview questions if their answers warrant it
- Wrap up
- Ask if they have any questions they want to ask
- Give group code
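The quantitative questionnaire above can be tallied straightforwardly. A minimal sketch of that tally is below; the item names and sample responses are invented for illustration, and the 1-5 ratings follow the scale stated in the protocol.

```python
# Hypothetical sketch: averaging the 1-5 questionnaire items from the protocol.
# Item keys and sample responses are invented; note the first item is
# negatively worded (lower is better), unlike the other three.
from statistics import mean

ITEMS = [
    "unnecessarily_complex",   # lower is better
    "easy_to_use",             # higher is better
    "confident_using",         # higher is better
    "not_stressed_or_lost",    # higher is better
]

def summarize(responses):
    """responses: list of dicts mapping item name -> rating (1-5)."""
    return {item: round(mean(r[item] for r in responses), 2) for item in ITEMS}

responses = [
    {"unnecessarily_complex": 2, "easy_to_use": 4,
     "confident_using": 4, "not_stressed_or_lost": 5},
    {"unnecessarily_complex": 3, "easy_to_use": 3,
     "confident_using": 4, "not_stressed_or_lost": 4},
]
print(summarize(responses))
```

Error counts per task would be tallied separately, since they are a count rather than a rating.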
Executive Summary
Topic
Our topic for this research project is Undergraduate Research Opportunities.
Top Three Design Goals
Our top three design goals centered on making sure that the website had relevant links, as many of the links on the website currently lead to dead pages. Our second design goal was making the primary reason most users visit the website the main focus of the website: currently, the site does not center itself on research opportunities for undergraduate students, and the user has to navigate through the website for a while before finding them. Finally, our third design goal was making sure that the user was aware of where they were at any given time; currently the user can easily get lost trying to find specific information, unaware of whether or not they are on the right track.
Discoveries during research
1. Dead links riddle the website, making it unreliable as a source for much of the information it claims to provide
2. The website itself does not feel like a branch of the school's main website; it does not look aesthetically pleasing or fitting
3. The website could be streamlined considerably to focus primarily on research opportunities, with everything else secondary
4. Many of the p ...
Usability testing: rapid results when you need them. Have a question about whether a new feature or design idea works for users? It's easy to find out early, so your design process is as responsive as your code. We'll look at ways to run quick usability tests, how to find users in the wild, and when to add testing to your project plan. Yes, it can be fast, good, and cheap.
Presentation at the dotgov design conference - March 27, 2015
The Influence of Participant Personality in Usability Tests - CSCJournals
This paper presents the results of a study investigating the impact of participant personality on usability testing. Data were collected from 20 individuals who participated in a series of usability tests. The participants were grouped into 10 introverts and 10 extroverts, and were asked to complete a set of four experimental tasks related to the usability of an academic website. The results of the study revealed that extroverts were more successful than introverts in terms of finding information as well as discovering usability problems, although the types of problems found by both groups were mostly minor. It was also found that extroverts spent more time on tasks but made more mistakes than introverts. From these findings, it is evident that personality dimensions have significant impacts on usability testing outcomes, and thus should be taken into consideration as a key factor of usability testing.
Mapping the Digital Preservation Wilderness: What you need to know - Jody DeRidder
A comparison of three well-known "maps" of the territory, to identify the areas where we need best practices... including a quick review of the status in each area. Then: the patterns of experience we and others are undergoing, in facing the wilderness of Digital Preservation.
Did we get the cart before the horse? (faculty researcher feedback) - Jody DeRidder
We've spent a great deal of time and money developing great digital content, captured to high specifications with the best quality metadata we could afford, and the best search engines and interfaces we could develop. But something is missing: it's not reaching our users!
Developing a digital library doesn’t end when content goes online.
You need to know whether what you are doing is effective; whether you’re reaching your users, whether you’re providing them with what they need in the form they need it, and whether you are doing this in the most cost-effective way that you can. This presentation examines the challenges inherent in assessing three different aspects of digital libraries: costs, user needs, and benefits.
2024.06.01 Introducing a competency framework for language learning materials ... - Sandy Millin
http://sandymillin.wordpress.com/iateflwebinar2024
Published classroom materials form the basis of syllabuses, drive teacher professional development, and have a potentially huge influence on learners, teachers and education systems. All teachers also create their own materials, whether a few sentences on a blackboard, a highly-structured fully-realised online course, or anything in between. Despite this, the knowledge and skills needed to create effective language learning materials are rarely part of teacher training, and are mostly learnt by trial and error.
Knowledge and skills frameworks, generally called competency frameworks, for ELT teachers, trainers and managers have existed for a few years now. However, until I created one for my MA dissertation, there wasn’t one drawing together what we need to know and do to be able to effectively produce language learning materials.
This webinar will introduce you to my framework, highlighting the key competencies I identified from my research. It will also show how anybody involved in language teaching (any language, not just English!), teacher training, managing schools or developing language learning materials can benefit from using the framework.
The latest issue of The Challenger is here! We are thrilled to announce that our school paper has qualified for the NATIONAL SCHOOLS PRESS CONFERENCE (NSPC) 2024. Thank you for your unwavering support and trust. Dive into the stories that made us stand out!
How to Make a Field Invisible in Odoo 17 - Celine George
It is possible to hide fields in Odoo, most commonly by using the "invisible" attribute in the field definition. This slide deck shows how to make a field invisible in Odoo 17.
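As a rough illustration of the attribute that description refers to: in Odoo view XML, a field carrying an `invisible` attribute is hidden in the rendered form. The snippet below is an assumption-laden sketch (the field and form names are invented, and it only parses the XML rather than running Odoo), just to show where the attribute sits.

```python
# Illustrative sketch only: a minimal Odoo-style form view arch with one
# field hidden via the `invisible` attribute. Field names are invented;
# we merely parse the XML to locate the hidden field, not render it in Odoo.
import xml.etree.ElementTree as ET

arch = """
<form>
    <field name="name"/>
    <field name="internal_note" invisible="1"/>
</form>
"""

root = ET.fromstring(arch)
# Collect the names of all fields that carry an `invisible` attribute.
hidden = [f.get("name") for f in root.iter("field") if f.get("invisible")]
print(hidden)  # ['internal_note']
```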
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...Dr. Vinod Kumar Kanvaria
Exploiting Artificial Intelligence for Empowering Researchers and Faculty,
International FDP on Fundamentals of Research in Social Sciences
at Integral University, Lucknow, 06.06.2024
By Dr. Vinod Kumar Kanvaria
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...Levi Shapiro
Letter from the Congress of the United States regarding Anti-Semitism sent June 3rd to MIT President Sally Kornbluth, MIT Corp Chair, Mark Gorenberg
Dear Dr. Kornbluth and Mr. Gorenberg,
The US House of Representatives is deeply concerned by ongoing and pervasive acts of antisemitic
harassment and intimidation at the Massachusetts Institute of Technology (MIT). Failing to act decisively to ensure a safe learning environment for all students would be a grave dereliction of your responsibilities as President of MIT and Chair of the MIT Corporation.
This Congress will not stand idly by and allow an environment hostile to Jewish students to persist. The House believes that your institution is in violation of Title VI of the Civil Rights Act, and the inability or
unwillingness to rectify this violation through action requires accountability.
Postsecondary education is a unique opportunity for students to learn and have their ideas and beliefs challenged. However, universities receiving hundreds of millions of federal funds annually have denied
students that opportunity and have been hijacked to become venues for the promotion of terrorism, antisemitic harassment and intimidation, unlawful encampments, and in some cases, assaults and riots.
The House of Representatives will not countenance the use of federal funds to indoctrinate students into hateful, antisemitic, anti-American supporters of terrorism. Investigations into campus antisemitism by the Committee on Education and the Workforce and the Committee on Ways and Means have been expanded into a Congress-wide probe across all relevant jurisdictions to address this national crisis. The undersigned Committees will conduct oversight into the use of federal funds at MIT and its learning environment under authorities granted to each Committee.
• The Committee on Education and the Workforce has been investigating your institution since December 7, 2023. The Committee has broad jurisdiction over postsecondary education, including its compliance with Title VI of the Civil Rights Act, campus safety concerns over disruptions to the learning environment, and the awarding of federal student aid under the Higher Education Act.
• The Committee on Oversight and Accountability is investigating the sources of funding and other support flowing to groups espousing pro-Hamas propaganda and engaged in antisemitic harassment and intimidation of students. The Committee on Oversight and Accountability is the principal oversight committee of the US House of Representatives and has broad authority to investigate “any matter” at “any time” under House Rule X.
• The Committee on Ways and Means has been investigating several universities since November 15, 2023, when the Committee held a hearing entitled From Ivory Towers to Dark Corners: Investigating the Nexus Between Antisemitism, Tax-Exempt Universities, and Terror Financing. The Committee followed the hearing with letters to those institutions on January 10, 202
Safalta Digital Marketing Institute in Noida provides comprehensive programs that cover a wide range of digital marketing components, including search engine optimization, digital communication marketing, pay-per-click marketing, content marketing, web analytics, and more. These courses are designed for students seeking a thorough understanding of digital marketing strategies. The institute is a first choice for young people looking to start a career in digital advertising, offering specialized courses and certification for beginners, with training in areas such as SEO, digital communication marketing, and PPC in Noida. After finishing the program, students receive certifications recognised by top universities, setting a strong foundation for a successful career in digital marketing.
Francesca Gottschalk - How can education support child empowerment.pptx - EduSkills OECD
Francesca Gottschalk from the OECD’s Centre for Educational Research and Innovation presents at the Ask an Expert Webinar: How can education support child empowerment?
A review of the growth of the Israel Genealogy Research Association Database Collection over the last 12 months. Our collection has now passed the 3 million mark and is still growing. See which archives have contributed the most, the different types of records we have, and which years have had records added. You can also see what we have planned for the future.
3. Methodology
- Efficiency: Time on task, number of clicks
- Effectiveness: Successful task completion
- Satisfaction: Ranking of perceived difficulty and positive vs. negative comments
- Learnability: Improvement in time, clicks, and success over 4 tasks in a single session
5. Procedure
- Task 1: Locate a piece of legal correspondence.
- Task 2: Locate an estate document.
- Task 3: Locate some kind of deed.
- Task 4: Locate a family history document or some other family item.
6. Results: Efficiency, Effectiveness, Satisfaction of item-level described content, compared to finding aid access:
- Efficiency: 35% less time, 48% fewer clicks. Not surprising: the finding aid provides more context.
- Effectiveness: Success rates 7.5% higher. Not surprising: no EAD search function or navigation box.
- Satisfaction: Preferred by a ratio of 3:1. Not surprising: these are novice users.
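The relative differences reported on this slide (e.g. "35% less time") are percent reductions of one interface's mean relative to the other's. A small sketch of that arithmetic follows; the mean values below are invented placeholders chosen only to reproduce the stated percentages, not the study's actual data.

```python
# Hypothetical illustration of the percent-reduction arithmetic behind
# figures like "35% less time". The means below are invented, not the
# study's measurements; only the formula is the point.
def pct_less(item_level, finding_aid):
    """How much smaller item_level is, relative to finding_aid, in percent."""
    return round(100 * (finding_aid - item_level) / finding_aid, 1)

mean_time = {"item_level": 65.0, "finding_aid": 100.0}   # seconds (invented)
mean_clicks = {"item_level": 5.2, "finding_aid": 10.0}   # clicks (invented)

print(pct_less(mean_time["item_level"], mean_time["finding_aid"]))     # 35.0
print(pct_less(mean_clicks["item_level"], mean_clicks["finding_aid"])) # 48.0
```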
8. 41% more time and 13% less success in the finding aid interface
12. Further Research Indicated
More tests on the finding aid interface to determine what actually improves usability. Suggestions from the research include:
- replacing archival terminology
- providing a search-in-page feature
- providing navigation links for sections of the finding aid on the left
Then: learnability tests for novice users that span multiple sessions.
13. Conclusions
Don't compare item-level access to finding aid access; they aren't comparable. Find ways to make the EAD more user-friendly. EAD delivery works for us; let's make it work for our users!
14.
15. Cory Nimer and J. Gordon Daines III, “What Do You Mean It Doesn’t Make Sense? Redesigning Finding Aids from the User’s Perspective,” Journal of Archival Organization 6, no. 4 (2008), http://dx.doi.org/10.1080/15332740802533214
16. Wendy Scheir, “First Entry: Report on a Qualitative Exploratory Study of Novice User Experience with Online Finding Aids,” Journal of Archival Organization 3, no. 4 (2006), http://dx.doi.org/10.1300/J201v03n04_04
17. Tom Tullis and Bill Albert, Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics (Burlington, MA: Morgan Kaufmann, 2008), 92-94.
18. Tim West, Kirill Fesenko, and Laura Clark Brown, “Extending the Reach of Southern Sources: Proceeding to Large-Scale Digitization of Manuscript Collections,”Final Grant Report for the Andrew W. Mellon Foundation, Southern Historical Collection, University Library, University of North Carolina at Chapel Hill, June 2009, http://www.lib.unc.edu/mss/archivalmassdigitization/download/extending_the_reach.pdf
Hi, I’m Jody DeRidder from the University of Alabama Libraries. I’m here to talk about a usability study on access to digitized content via the finding aid, without hand-created item-level metadata.
In this time of reduced funding and increased demand for access to primary source materials online, a number of institutions have turned to online access to digitized items via the finding aid. The intent is to leverage the EAD descriptions for search and retrieval, while simulating the user experience in the reading room. Content is presented in the order encountered in the boxes and folders. This approach enables low-cost digitization of even large manuscript collections, providing online access to material that otherwise may never be digitized. In a grant project partially funded by the NHPRC, the University of Alabama Libraries developed a low-cost model and supporting open-source software for implementation. The grant project included a usability test which compared the resulting interface to a similar collection delivered with item-level descriptions accessed outside the finding aid. At an estimated cost of 79.5 cents per page, our mass-digitization method costs less than a third of our usual item-level description access. However, we needed to know how useful this interface is for our patrons.
We sought to measure efficiency, effectiveness, satisfaction and the critical element of learnability. Efficiency was defined by the two measures of time on task and total number of steps (clicks) required for a participant to successfully complete a task. We counted double clicks as single clicks, and a string of text entered as a single interaction. Effectiveness was measured by whether or not a task was completed successfully. We measured satisfaction by both the users' overall perceived difficulty of each interface (on a 1-5 scale), as well as the total number of positive versus negative comments about an interface recorded during testing. Negative comments were assigned a value of -1 and positive comments assigned a value of +1; duplicate comments by the same user were disregarded. For learnability, we examined improvements from task 1 to 4 for both interfaces, in the following variables: time to first click (which may indicate indecision), total time to locate content, number of steps (as defined above), and success in completing a task.
Experienced researchers have already been shown to prefer the finding aid as an interface. Both Scheir and Chapman found that novice users experienced a learning curve during exposure to finding aids, gaining confidence and ease with time. For our study, participants were primarily novice users. The study included twenty participants: eight undergraduate students, ten graduate students, one post-graduate volunteer, and one college-educated staff member. Subsets of these participants included those with and without digital library experience, special collections experience, a background in history, and English as a second language.
The test consisted of four known-item searches, repeated for each of two similar collections accessible in different ways. The item-level described digitized items in the Jemison collection could be searched using a search box, while the Cabaniss items were accessible via links embedded in the EAD finding aid. The order in which we presented the collections was alternated so that one collection would not always benefit from the participant's experience with the previous collection.

Prior to performing the tasks in each collection, participants were given short introductions to both the collection and the interface. The Jemison collection was presented as a result list of items, with the search options set to isolate queries to this collection. To locate items within the Cabaniss collection, participants had to navigate the EAD finding aid, as no "search within page" option was available. Participants with no previous special collections experience were told that a finding aid is a guide to the collection created by the archivists and, in this instance at least, is similar to a table of contents for the collection. Morae software was used to record the sessions. One researcher sat with the participant and provided verbal directions while another observed and captured data. After completing the tasks, participants were asked to rank the interfaces and to comment on their experiences and preferences. After completing the survey, participants were given a 1 GB flash drive in exchange for their assistance.
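The alternating presentation order amounts to a simple counterbalanced assignment, which can be sketched as follows; the participant labels are hypothetical.

```python
from itertools import cycle

# Alternate which collection each participant encounters first,
# so neither interface always benefits from prior experience
# with the other.
orders = cycle([("Jemison", "Cabaniss"), ("Cabaniss", "Jemison")])
assignment = {f"P{i:02d}": next(orders) for i in range(1, 21)}  # 20 participants

print(assignment["P01"])  # ('Jemison', 'Cabaniss')
print(assignment["P02"])  # ('Cabaniss', 'Jemison')
```

Odd-numbered participants see Jemison first; even-numbered participants see Cabaniss first, giving ten of each order.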
Overall, participants required an average of 35% less time and 48% fewer interactions with the item-level described collection than with the finding aid as web interface. This reduced efficiency in the finding aid interface is to be expected in the absence of a search option on the finding aid page, as results can only be obtained by browsing. Success rates in the item-level search interface were 7.5% higher. Participants as a whole clearly preferred the item-level interface, by a factor of 3:1, though 40% of those with a background in history, and a third of those with special collections experience or without digital library experience, preferred the finding aid interface.
Marked differences in efficiency were evident in both interfaces for participants for whom English is a second language (51% more time and 10% less success in the item-level interface; 41% more time and 13% less success in the finding aid interface), with the gap in success rate more pronounced in the finding aid interface. 80% of the participants for whom English is a second language preferred the item-level interface. Interestingly, those without previous digital collection experience found the finding aid interface significantly easier than those who claimed familiarity with the more traditional digital library interface: the EAD interface required 42% less time and 27% fewer clicks, and yielded 12% more success. This bodes well for future acceptance of this method of web delivery, and corroborates Chapman's finding that "groups that showed the most significant improvement over time were novice participants and Internet users with a beginning proficiency level."
For the learnability comparisons between interfaces, we reduced the sample to 14 by removing six users from the Jemison-first group. This was necessary to control for an experience effect, in which some aspect of the first interface encountered could affect performance in the second. Unfortunately, the resulting sample contained three fewer non-native speakers. If an interface is indeed more learnable, we would expect to see statistically significant improvements in effectiveness and efficiency from task 1 to task 4 for all users in each interface separately. Although improvement from task 1 to task 4 did occur 62.5% of the time in favor of Jemison, these differences were not statistically significant.
We need more tests on the finding aid interface to determine what actually helps users. Suggestions from the research include:
• replacing archival terminology
• providing a search-in-page feature
• providing navigation links to sections of the finding aid on the left

Then we need learnability tests for novice users that span multiple sessions. Tullis and Albert make an excellent argument for comparing multiple sessions with the same participants, capturing the same metrics, as this would more closely simulate real-life experience.
Finding aids present digital materials in the context of the collection, and hence provide far more information to be sifted than content described solely at the item level. Efficiency and effectiveness measures should not be applied in comparing the EAD interface with item-described content; the result is a comparison of apples and oranges. What is truly at issue here is learnability, particularly for novice users and those for whom English is a second language. Modifications to the display and terminology should be tested to verify that these changes increase access and learnability. By increasing the ease of use and verifying the learnability of the finding aid interface, we will be better positioned to leverage this low-cost digitization method to provide online access to large manuscript collections.
I’ve included a bibliography for the research to which I’ve referred, as well as links to our wiki, project site and display. An article about this project has been submitted to American Archivist and hopefully will soon be available there.