Presented at LibTech Conference, March 15, 2018. Creating and maintaining research guides that students use and find helpful is an ongoing challenge. On deciding that our subject guides were due for a significant update, librarians at Kennesaw State University realized we needed to learn what our students wanted from these resources. We conducted a study focused on learning what information students expect to find on research guides, as well as how they would organize it. During this presentation I will share the study results and how that information was used to design a new subject guide template in LibGuides CMS. Additionally, I will explain how we beta-tested the new template prior to updating all subject guides for the fall 2018 semester. Attendees will take away recommendations for transferable design characteristics for their own guides, as well as card sorting and usability testing methodologies they can use to learn what their own communities are looking for.
2. Background Information
• Kennesaw State University
– 35,000 students
• Library System
– Two libraries
– 22 librarians managing 60 undergraduate subject guides
3. Background Information
• Why update the design?
– An accessibility audit had revealed significant issues with library content
– Some thoughts of moving away from a template
– Anecdotally, librarians disliked the template
– And…
5. Why Do a Study?
• Not much information out there on version 2 of LibGuides, especially the side-navigation layout option
• Needed to find support for continuing to use a template
• When using a template, making significant changes is a big project – wanted to get it right
6. Research Team
• Current:
– Ashley Hoffman, eLearning Librarian
– Michael Luther, Assessment Librarian
– Adam Olsen, Web Services Librarian
– Jon Hansen, Director of Virtual Services
• Former:
– Rita Spisak, Strategic Marketing Librarian
– Xueying Chen, Donations Coordinator
– Ashley Dupuy, former Director of Research and Instructional Services
7. Project Timeline
• Study design, IRB approval: May – June
• Phase 1: Card sorting: July – November
• Beta design, IRB approval: December – January
• Phase 2: Usability testing: January – April
9. Methodology
• Individual, open card sort
• 60-minute session
– 40-minute card sort
– 10-minute discussion
• Discussion focused on design (navigation and visual preferences) and predicted use (mobile, multiple subjects)
10. Card Sorting: Issues and Tips
• Recruitment! (or rather, attendance)
• If doing individual sorting, add a think-aloud protocol and record
• Consider time carefully – 40 cards in 40 minutes was probably the upper limit
• Streamline data collection/entry – number cards, use envelopes
• Take photos immediately when finished
11. Card Sorting Analysis
• Complicated, messy data!
• Basic process:
1. Identify similar categories (e.g. "Search Effectively," "Search Tips")
2. Look for correlation: which cards were frequently sorted into that category? Which categories did a specific card get sorted into?
3. Decide on a structure you'll use
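The correlation step described above – merging similar participant-invented category labels and then tallying which cards landed where – can be sketched as a small Python tally. This is a minimal illustration with made-up card IDs and category labels, not the team's actual analysis tooling; the `merge` mapping stands in for the judgment call of deciding which labels count as "the same" category.

```python
from collections import Counter, defaultdict

# Hypothetical data: one dict per participant, mapping the category name
# that participant invented to the card IDs they placed in it.
sorts = [
    {"Search Tips": [12, 14], "Citations": [30, 31]},
    {"Search Effectively": [12, 15], "Citing Sources": [30, 31]},
    {"Search Tips": [12, 14, 15], "Citations": [31]},
]

# Step 1: merge similar category labels. In practice this mapping is
# built by hand after reviewing all the labels participants created.
merge = {
    "Search Effectively": "Search Tips",
    "Citing Sources": "Citations",
}

# Step 2: count how often each card landed in each merged category.
card_by_category = defaultdict(Counter)
for sort in sorts:
    for label, cards in sort.items():
        category = merge.get(label, label)
        for card in cards:
            card_by_category[category][card] += 1

# Which cards were most frequently sorted into "Search Tips"?
print(card_by_category["Search Tips"].most_common())

# Which categories did card 30 get sorted into, and how often?
print({cat: tally[30] for cat, tally in card_by_category.items() if tally[30]})
```

The resulting counts answer both questions on the slide: reading a category's `Counter` gives its most typical cards, and scanning one card's counts across categories shows where participants disagreed about its placement.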
12. Card Sorting Results
• Process-oriented structure
• About 1/3 of students put search tools first
• 50% of students preferred tabbed navigation
• Preferred consistent layout across subjects
• Average of 6 categories
13. New Template Design: Version 1
Major Design Choices
• More images, less text
• Clean, uncluttered feel
• Process-oriented
• Less jargon
• Larger fonts
• Adaptable to mobile
https://libguides.kennesaw.edu/engresearchguidebeta
15. Methodology
• Iterative process with 4 rounds of testing, with updates to the design between each round
• 15-minute task-based, think-aloud protocol
• One-on-one
• Screen and audio recorded for each session using Blackboard Collaborate Ultra
16. Tasks
• Your English professor has assigned you a research paper on Virginia Woolf's novel To the Lighthouse. Using this site, how would you get started?
• Your professor is requiring you to use a scholarly article for the paper. Where would you go on this site to get started?
• Your professor suggests you use literary criticism as well. If you had never heard of this before, could you use this site to find out more about it?
• You will need to cite your sources, but you aren't familiar with MLA style. Using this site, what would you do?
17. Usability Testing: Tips
• Emphasize that the participant can't do anything wrong
• Don't answer participant questions during testing
– If they can't complete a task, the fault is with the design
• Create tasks that mimic actual use, align to the different areas you want to test, and don't use leading phrasing
• Consider varying the order in which you ask the tasks
• 15 minutes really is enough if you plan right!
• Recruiting "live" has worked well!
18. Round 1 Results
• Links to citation guides needed to be moved off the help page
• Search page was too cluttered
• Page names were not always clear
• In general, needed to reduce the amount of text and remove some content
• Navigation was sometimes confusing, partly due to page length
20. New Template Design: Version 2
Major Changes:
• Deleted content
• Renamed pages
• Moved citation info to "Find Sources"
• Reorganized information from "Search" and "Sources" into "Find Sources" and "Search Tips"
• All boxes now display without outlines
• Updated navigation
21. Round 2 Results
• Mostly positive feedback on the visuals, layout, and navigation
• Citation guide links were easier to find, but some students still had problems
• All tasks could be completed on the "Find Sources" page – which students seemed to like
• Definite preference for the layout with the tabbed box, instead of a long page
22. Usability Testing: Where We're at Now
• Conducting Round 3 on Wednesday
– Updating design to use tabbed boxes more, continuing to reduce length
– Creating new tasks to collect feedback on specific pages
• Planning for Round 4 – final round
– Creating subject-specific content for Political Science
– Creating the new CMS System Template (as opposed to a Template Guide)
23. Transferable Design Characteristics
• Identify your target population – who are you designing for?
• Process-oriented organization
• Minimize scrolling
• Keep it clean and simple – but not too simple!
• Avoid text-only sections
• Provide navigation options
24. Future Plans @ KSU
• Launching the completed design across all undergraduate subject guides in July
• Embedding guides into D2L and marketing to faculty and students
• Collecting usage data, etc. to assess impact
• Peer training on how to apply the lessons learned here to other KSU guides
25. Recommended Resources
• Krug, S. (2010). Rocket surgery made easy: The do-it-yourself guide to finding and fixing usability problems. Berkeley, CA: New Riders.
• Krug, S. (2014). Don't make me think, revisited: A common sense approach to Web usability (3rd ed.). San Francisco, CA: New Riders, Peachpit, Pearson Education.
• Kuniavsky, M. (2003). Observing the user experience: A practitioner's guide to user research. San Francisco, CA: Morgan Kaufmann.
• Sinkinson, C., Alexander, S., Hicks, A., & Kahn, M. (2012). Guiding design: Exposing librarian and student mental models of research guides. portal: Libraries and the Academy, 12(1), 63–84.
• Spencer, D. (2009). Card sorting: Designing usable categories. New York: Rosenfeld Media.
• Spencer, D. (n.d.). Card sorting: Resources. Retrieved December 13, 2017, from http://rosenfeldmedia.com/books/card-sorting/
• Thorngate, S., & Hoden, A. (2017). Exploratory usability testing of user interface options in LibGuides 2. College & Research Libraries, 78(6), 844–861. https://doi.org/10.5860/crl.78.6.844
Good morning! As she said, my name is Amy Gratz, and I’m the Learning & Teaching Services Librarian at Kennesaw State University. One of my main duties there is to coordinate the content and design of our online guides.
Before I jump into my main topic, I just wanted to give you a little bit of context since I imagine most of you aren’t familiar with Kennesaw. We’re currently the 3rd largest university in Georgia with around 35,000 students; we have 32 librarians working to provide them with services, and 22 of us manage at least one undergraduate subject guide, which is the type of guide the project I’m talking about is focused on.
I started at KSU a year ago January, and one of the first things I decided to tackle was updating these guides. Partly because of accessibility issues, partly because I was hearing from my new colleagues that they wanted to move away from having a template at all. KSU has a template because it facilitates maintenance, ensures consistency of design and quality, and it helps those librarians who don’t have enough time or subject expertise to create one on their own. Regardless, though, I heard from several of my new colleagues that they disliked our current template.
Personally, I agree with them – I think our current design looks dated. This is our current English subject guide, which uses a design created in LibGuides v1 back in 2011 – and similar to what Audrey and Heather mentioned in their presentation yesterday, not everything transferred well with the change! This design is a template that librarians don't have a lot of say in, and although guide owners can make changes with approval, almost all information is identical from one guide to the next, except for the different databases listed under "Find Articles" and different featured books. I have included the URL there if anyone would like to have a look.
I decided that before we could start making changes, we needed to do a study, for three main reasons. First, I wasn't seeing much in the literature about version 2 of LibGuides, especially the side-navigation option, although there was plenty about guide design and website design in general. Second, I wanted to find support for continuing to use a template, since I think that makes the most sense for our situation at KSU – although if it turned out students really didn't like or need a template, I was open to that. And finally, since we do have a template, making significant changes is a big project, and I wanted to get it right.
Since this is a big project, there’s no way I could do it alone! That said, you might not need such a large team, but I think you need at least 3 people, just to provide different perspectives.
Although they’re not here with me, I wanted to take a minute to acknowledge everyone else who has been involved in the project. The current Research Team has been involved the entire way through; those you see listed as former members had to leave the team for various reasons, but were very helpful during the first phase of the study.
So this study has had two main phases, both of which I’ll be talking about in more detail, but to give you a sense of the overall timeline: we started planning this project almost a year ago and decided to break it into two main phases, each of which went through the IRB approval process separately. We completed the first phase in late fall, and we’re currently in the second phase. The goal is to launch our new guides this summer.
Okay, so – Phase 1. How many of you are familiar with card sorting? It’s a testing method that lets you derive an overall organizational structure from how your users group information. Participants were given a deck of cards with information on them and asked to create groups of similar cards – basically, to organize the information in ways that made sense to them. This gives us a theoretical structure for the guide.
We decided to do an individual open card sort with decks of 40 cards – you can see an example on the right. Each card described a LibGuides box: the title and a brief description of its content. Most of these were from our current template, with some additions from course guides. “Open sort” means that each student was able to create their own categories – as many or as few as they wanted, with whatever names they wanted. These were 60-minute sessions: 40 minutes of card sorting, 10 minutes of discussion, and 10 minutes for consent, instructions, etc. The discussion at the end focused on the guide design and predicted use.
Recruitment: We used a one-question survey (What would you expect to find on a “research guide”?) with the option of signing up for the focus groups. Many students who said they would attend failed to show up – enough that we had to turn group sorting into individual sorting and add an additional round of testing.
Think-Aloud: We couldn’t do a formal think-aloud protocol, but an accidental addition turned out to be a helpful substitute: we asked participants to share anything they thought we should know while we analyzed their data.
Timing: During analysis, it appeared that many cards were sorted solely based on the card’s title, not its description, so more time might have helped.
Streamline data collection/entry: we numbered the backs of the cards so we could quickly enter data into the spreadsheet, and had students write their group names on envelopes so we could put all the cards in that group into the envelope, in case we needed to check anything later
Photos – SO helpful for going back to the original! We took photos exactly as the students left everything, then follow-up shots with the card numbers showing.
The data you get is complicated and messy, and it will take time to work through – you can get a sense of that from the screenshot here. I found an Excel template available as a free download (link at end of presentation), which I strongly recommend using, as it was very helpful. Ended up also creating alternatives so we could sort the data about 100 different ways
Basic process has 3 steps:
Identify similar categories and give them a standardized name (this name can change later). Four of us worked on this – after familiarizing ourselves with the data, we found it very effective to do this first step together as a group. You will need to make adjustments – it can be easy to misinterpret what participants meant.
Look for correlations – which cards ended up in each category, and which categories each card was placed in.
Decide on a structure you’ll use
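Our actual analysis used an Excel template, but the correlation step in the process above can be sketched in a few lines of code. This is a hypothetical illustration, not our real workflow: it assumes each participant’s sort has been recorded as a dictionary mapping their (standardized) category names to the card numbers placed in them, and it counts how often each pair of cards landed in the same group – the pairs with the highest counts are the cards your participants most agree belong together.

```python
from itertools import combinations
from collections import Counter

def cooccurrence(sorts):
    """Count how often each pair of cards was grouped together.

    sorts: one dict per participant, mapping a standardized
    category name to the list of card numbers placed in it.
    Returns a Counter keyed by (low_card, high_card) pairs.
    """
    pairs = Counter()
    for participant in sorts:
        for cards in participant.values():
            # Every pair of cards in the same group co-occurs once
            for a, b in combinations(sorted(cards), 2):
                pairs[(a, b)] += 1
    return pairs

# Hypothetical data: three participants sorting five numbered cards
sorts = [
    {"Search": [1, 2], "Help": [3, 4, 5]},
    {"Finding stuff": [1, 2, 3], "Citations": [4, 5]},
    {"Start": [1], "Search": [2, 3], "Help": [4, 5]},
]

pairs = cooccurrence(sorts)
print(pairs[(4, 5)])  # 3 – all three participants grouped cards 4 and 5
```

From a matrix like this you can read off clusters (or feed it to a hierarchical clustering routine) to arrive at a candidate structure like the one we chose.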
We ended up with the structure you see on the left, which I’ll explain more in a moment – I know it’s too small to read! The general themes from our 18 participants in this phase were (read points). …we decided to use 4 categories, although two of these had sub-categories, as well.
Start
Sub-groups: Learn about the Research Process, Find & Develop a Topic, Get the Basics. Basic library and research information, definitions, and information on working with a research topic.
Search
Sub-groups: Start Searching, Search Tips. Strategies for working with keywords and other search tips, database and other resource links.
Sources
Information about different types of sources and places to search for them. Intended to vary by discipline.
Help
Contact information for the library and writing center, FAQ and citation guides.
From that, we created this beta guide design. We decided on side navigation because there wasn’t a strong preference among participants, it mimics the KSU website, and it requires our librarians to think less about mobile design. Other major design choices: more images and less text; a clean, uncluttered feel; a process-oriented structure; less jargon.
At this point we were ready to move on to the next phase of our study:
Usability testing! This is our current phase – and, in my opinion, the more fun part!
Before I get into this, I do just want to say that I don’t think you have to do the card sorting – it was useful in giving us some overall guidance, but I think the usability testing is the more useful of the two. And there are plenty of other ways to prepare for usability testing!
We’re using an iterative design process, where we tweak the design between each round of testing. Doing 1 round of testing a month, about 3 weeks between for analysis and updates.
Our testing is short – just 15 minutes with one participant and the moderator. Because only the moderator is in the room, we are recording these for analysis.
Could really use any software that lets you share a screen and audio, and record a session.
These are the tasks we used in Round 1 of testing – essentially, we give the students an example assignment to get started on, then ask them to start finding a scholarly article, learn about literary criticism, and find information about MLA citations.
We did change the first task in Round 2 to remove the specific novel, but otherwise these are the tasks we’ve tested so far. Our goal was to create tasks that mimic actual use of the guide.
Before I start getting into the results, I want to share some tips on conducting usability testing (read from slide)
15 minutes – most students have only needed 10
Recruitment – we’re in a high-traffic area – everyone walks past as they enter or exit the building. One of the research team is actively soliciting participation – I’ve found it effective to look for students who are alone, will make eye contact, and don’t look like they’re in a rush. You do have to be very accepting of “no,” though!
All of that said, I imagine you want to know what we’ve found!
11 participants in this round – more than enough to see consistent trends. Our goal here is NOT a representative sample!
Citations – only about 2 students found those on their own, and it wasn’t the first place they looked
Search page – students had a hard time finding what they wanted
Page names were too ambiguous (Sources)
Too much text – which we expected. I’m going to get this quote wrong, but Steve Krug, whose books we’ve used a lot in this process, suggests writing what you want to say, then cutting it in half, and in half again. And then a third time. We’ve got a few more halves to go!
Navigation – length of pages and those ambiguous names were both issues
So, before I show you what version 2 looked like, I wanted to refresh your memory of the first version of the guide. Which turned into…
…This – (slide info). Tried to avoid changing some content we hadn’t received feedback on.
Additionally, we created 2 versions of the “Find Sources” page due to its length, the second of which had the same content in a tabbed box. We asked participants to view this second version during the Q&A at the end rather than mixing it into the actual testing – a decision made partly based on the amount of time we had left before testing.
13 participants in this round. Usability seemed to improve, so we’re headed in the right direction!
So I don’t have another version to show you, because we’re still making changes. We’ll be doing Round 3 next week – our timetable is slightly condensed so we can finish testing before students get too caught up in finals. For this round, we’re focusing on reducing scrolling by using tabbed boxes, editing content, etc. Two librarians are working on content – we switched which pages we’re each looking at this time so we can edit one another’s work. We’re also updating our tasks to make sure we get feedback on the entire guide – a slight flaw in our methods for Round 2!
We’re also starting to prepare for Round 4, scheduled for April 11. We’ll be testing a different subject, since this design is intended to work across disciplines. We’re also planning to use the actual CMS System Template option, instead of creating custom code for this guide, since that is what we’ll be doing going forward. We couldn’t start creating the System Template until we were fairly confident about which design changes we were going to keep.
Even though we’re not done, we have seen some trends I don’t think will change in terms of overall design.
I know, the first point is more about content than design. But it’s important to identify this at the beginning! Who you’re creating content for will dictate the design to an extent.
Process oriented structure – our students don’t necessarily think the way we do!
Minimizing scrolling – there are a lot of ways to do that; I really like the accordion structure Audrey and Heather showed in their presentation yesterday. We’re making choices that work based on what LibGuides gives us, but you can certainly be more creative!
Keep it clean and simple – our students expect to be able to find answers in seconds, and whatever we might think about it, we have to work with that expectation. But do make sure your labels actually have meaning!
Avoid text-only sections – the students aren’t reading them, anyway!
Provide navigation options – our students really used the menu a lot in the last round of testing, and the option to turn on the next/previous page links was handy, too.
Once we finish our project, we’ll be training colleagues and creating updated versions of all 60 guides, to be launched in July. Hopefully buy-in won’t be a big issue for us!
We’re also working on plans to embed these directly into D2L and market them more directly to faculty and students to improve visibility. We’ll be collecting some data to assess the impact – I can’t be more specific because I don’t know exactly what that will look like!
Finally, I’m planning to do some peer training with my colleagues on those transferable design characteristics, because the rest of our LibGuides also need a change!
I really used all of these, but I want to particularly draw attention to Steve Krug’s books on DIY usability testing and web design, Donna Spencer’s work if you’re interested in the card sorting, and finally, the article by Thorngate and Hoden – their model really inspired a lot of what we’ve done.