Teaming up to teach a cohort of incoming freshman athletes during the Fall 2015 semester, a librarian and a faculty member designed a research study to examine the short- and long-term effects of embedded librarianship on incoming student-athletes’ GPA, information literacy skills, and perceptions of research. In this non-credit Life Skills course required for all incoming athletes, the librarian was charged with infusing critical thinking and information literacy into the context of the weekly topics. Both formative and summative assessments were conducted in order to chart student learning. In-class exercises were designed to teach students to approach problems and gaps in their knowledge like researchers, and frequent checks for learning with formal and informal assessments were used in nearly every class. This presentation will not delve into the larger research project but instead will detail the sustainability, practicality, and effectiveness of attempting to measure student learning in the weekly classroom activities.
Score! Using Competitive Assessment Approaches to Chart Growth in Critical Thinking and Information Literacy with Incoming First-Year Student Athletes
1. Using Competitive Assessment Approaches to Chart Growth in Critical Thinking and Information Literacy With Incoming First-Year Student-Athletes
Colleen Mullally and John Watson
Pepperdine University, Malibu, CA, USA
2. The Study and Assessment
Aim of the Study
Improve first-year student-athletes’ knowledge and skills relative to critical inquiry and information literacy
The Participants
Randomly selected first-semester scholarship student-athletes at NCAA Division I institution
Formative Assessment
Weekly in-class assignments related to the topic introduced at the beginning of class
Summative Assessment
Project SAILS Information Literacy Assessment - Individual Scores Test (Pretest & Posttest)
Student Perception of Research (Pretest & Posttest)
Student Reflection on Challenges during First Semester of College
6. Formative Assessment Types: Design & Administration, Data Generated, Data Analysis, and Uses

Online polling
Design & Administration: Create questions with answer types (yes/no, choose one, brief free-text responses); students answer in class; anonymous
Data Generated: Wide variety based on poll; quantitative or qualitative data
Data Analysis: All results can display immediately in Poll Everywhere
Uses: Overall understanding; engage students

Scavenger hunts
Design & Administration: Created by another colleague; groups select questions related to library space and resources and use the library to answer them
Data Generated: Learning related to the library; total number correct; question types attempted
Data Analysis: Arduous scoring of paper-based game
Uses: Self-directed learning; competitive activity

Online quizzes
Design & Administration: Created by K-State University librarians; each student completes the New Literacies Alliance tutorials and assessments
Data Generated: Quantitative; performance-based
Data Analysis: Easy to interpret; time to receive data depends on librarians at K-State
Uses: Introduce/refresh concepts; mid-stream individual data; determine follow-up needed

Final individual assignment and group assignment
Design & Administration: Created by professor and librarian; completed over several classes; students must think critically about information and sources and justify their thinking based on the text provided
Data Generated: Comparative individual and group data
Data Analysis: Evaluated by one rubric; time-consuming to score
Uses: Evaluate student work
A Few Examples of Formative Assessment Artifacts
11. Assessing Students’ Reflective Work:
Identifying Areas for Further Discussion and Learning
Our Question:
What did you learn from tonight’s session on scanning large bodies of text
for content, context, audience, and purpose?
Students Identified These Key Areas About Their Learning:
Utility in locating relevant sources (2 out of 5 groups)
Specific strategies for scanning text (1 out of 5 groups)
Process of skimming (1 out of 5 groups)
Our Question:
What proved challenging during tonight’s session on scanning large
bodies of text for content, context, audience, and purpose?
Students Identified a Variety of Challenges They Experienced:
Time limit (all 5 groups)
Process of scanning (versus reading) (4 out of 5 groups)
Determining meaning from scanning (2 out of 5 groups)
Length of text (2 out of 5 groups)
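Group tallies like those above can be reproduced with a small script once reflections are coded by theme. A minimal sketch follows; the per-group coding is invented for illustration (chosen to match the reported challenge counts), not the study's actual coding sheet:

```python
# Hypothetical sketch: tallying themed group reflections.
# The group labels and per-group theme codes below are invented
# to illustrate the counting, not taken from the study's data.
from collections import Counter

coded_responses = {
    "Group 1": ["time limit", "scanning vs. reading"],
    "Group 2": ["time limit", "scanning vs. reading", "length of text"],
    "Group 3": ["time limit", "determining meaning"],
    "Group 4": ["time limit", "scanning vs. reading", "determining meaning"],
    "Group 5": ["time limit", "scanning vs. reading", "length of text"],
}

# Count how many groups mentioned each theme (set() so a theme
# counts at most once per group).
theme_counts = Counter(
    theme for themes in coded_responses.values() for theme in set(themes)
)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} out of {len(coded_responses)} groups")
```

The same approach works for the "what did you learn" responses; only the coding dictionary changes.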
13. Thank you!
We welcome your questions, feedback, or comments!
Colleen Mullally mullallyc@northandoverpublicschools.com
John Watson john.g.watson@pepperdine.edu
Editor's Notes
Our 8 minutes will focus on a discussion of the data collected in a beta class where both formative and summative assessment data were collected and used to evaluate the growth of the student-athletes. [Optional text: We’ll detail the effectiveness of, and issues with, using SAILS as an information literacy test for measuring student growth and identifying student weaknesses. We’ll also highlight some of the assessment artifacts most useful in formally measuring learning, and others that proved more effective as a dialogue for modifying the remaining weeks’ lesson plans.]
{John} - this slide is about the good news of our summative assessment findings; like the previous slide, it also has two points (and a discussion):
(1) findings from SAILS from beginning to end of the semester
(2) findings from pre and post survey
(3) discussion of what we’ve learned as a result (and why formative assessments give us more evidence than just the survey and SAILS … and how these assessments helped us gauge what students were learning).
Image credit: latimes.com http://www.latimes.com/sports/more/la-sp-pepperdine-academics-20140427-story.html
Summary of Pre-Test and Post-Test results using Doctorate Benchmark Data.
Student Perception of Research Survey - Most significant findings
The assessment process involving individual class sessions was more revealing of learning progress than the survey data alone.
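For readers curious how paired pretest/posttest results might be summarized, here is a minimal sketch. The scores below are invented for illustration (SAILS reports individual scaled scores, but these values and the resulting gain are not the study's data); it simply computes per-student differences and a summary:

```python
# Hypothetical sketch: summarizing paired pretest/posttest scores.
# All score values are invented for illustration only.
from statistics import mean, stdev

pretest  = [512, 478, 530, 495, 560, 470, 505, 540]  # invented
posttest = [548, 510, 545, 530, 590, 505, 520, 565]  # invented

# Per-student gain (posttest minus pretest).
diffs = [post - pre for pre, post in zip(pretest, posttest)]

print(f"Mean gain: {mean(diffs):.1f} (SD {stdev(diffs):.1f})")
print(f"Students improving: {sum(d > 0 for d in diffs)} of {len(diffs)}")
```

A paired-difference summary like this is one plausible first step before any formal significance testing against benchmark data.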
We will highlight how a few types of assessments were used in the coming slides. This slide represents a few additional assessment approaches we took toward evaluating student learning and their performance and understanding of the concepts. We frequently depended upon gathering and evaluating the learning with student reflective papers, worksheets, and Google docs-based exercises. We really dove into assessment …
… and next you’ll get to hear about how we used formative assessments to inform our teaching. We hope that our examples will provide a basis for starting a conversation with librarians who provide classroom instruction on how they can use assessment for more effective classes.
Photo credit: usatoday.com http://www.usatoday.com/sports/services/photos/USATSI-238872/
Despite having an outline for the class fully developed prior to the start of the semester, the librarian and faculty member changed and adapted upcoming lessons in response to results of the in-class activities. Not all classes went well, and fumbles early on enabled us to see what we needed to change.
Very early in the semester, we set out to evaluate how well students could synthesize the guest speaker’s topic on time management for athletes into relevant keywords and phrases and organize their terms into broader, narrower, and related keywords. What we ended up learning was much more about students’ comfort with technology and our need to provide additional guidance before introducing a classroom activity.
Let’s take a look.
What could we glean from this? There was a wide range in students’ abilities!
1.) their ability to use Google technology
2.) their familiarity with spreadsheets
3.) their concept of broader, narrower
4.) their ability to generate any related terms and synthesize their learning
What did we learn and how did we change?
1.) We provided a class practice example for all remaining activities.
2.) We allotted more time for setup of all remaining Google Docs exercises.
3.) We built in more practice throughout the semester in generating keywords and broader, narrower, and related terms.
4.) We were more strategic in using technology for the remainder of the class assignments.
TRANSITION: Next, let’s take a look at how we tackled formative assessment from another lesson.
In another class, we wanted students to learn how to approach a large body of text armed with tools that would allow them to critically evaluate the author’s content, context, audience, and purpose …. all without having to do more than scan the article.
We wanted to know whether practicing critical thinking on visual materials would be an appropriate means for introducing this concept of quickly scanning for information. We collaborated with the librarian for Special Collections who pulled largely visual artifacts from the University Archives. In small groups, students were put to work practicing these skills on images and newspaper clippings like you see in the slide but also some interesting University Archives ephemera such as a ticket from the 1984 Summer Olympics sporting event held on the Malibu campus, and a team roster or program.
We found that students were very engaged in this exercise and enjoyed competing against each other in listing as many properties about the picture as they could. After class, we evaluated their responses to the questions on our worksheets to see how well they could synthesize the properties of the artifacts with their prior knowledge to answer: What is the main topic of the item? What other information can you gather about the item, such as date, author, or type of item? If you were going to write a history of the topic you listed, what do you learn from these items that would help you in your research?
While we didn’t “score” the students on their writing or class participation, we did evaluate the effectiveness of the archival activities to introduce and apply the concept of critical inquiry to different types of information sources.
But that’s not all we set out to assess during that evening’s class. We also wanted to know how students had processed their practice of scanning large bodies of text for content, context, audience, and purpose so that we could follow up in future classes. Before leaving class that night, we asked groups to write brief responses about what they learned from the technique of scanning articles for CCAP, what was challenging about this, and what questions they still had.
Images courtesy of Pepperdine University Digital Archives
http://pepperdine.contentdm.oclc.org/cdm/singleitem/collection/p271401coll15/id/11678
Insert other image of football injury stored on NAPS drive
We used this information to directly respond to students and address their concerns in the very next class.
Post-Game Review:
On Evaluating
--Sustainability
--Practicality
--Effectiveness
Not every class activity needs to be assessable!
How do we measure success? Sure, the SAILS and survey results were largely indicative of improved outcomes for the class. We also conducted follow-up interviews with students about their experiences. Looking back, what was (and wasn’t) sustainable, practical, and effective?
Sustainable: (positive) assessing progress via in-class exercises with an assessment component
(negative) because of the level of collaboration and reflection required afterwards, not necessarily sustainable given staffing and other constraints
Practical: (positive) assignments that let students practice learning and let us measure learning of a topic both early and later
(negative) not always enough time to evaluate student work immediately following the classes
Effective: (positive) seeing the scores, evaluating the summative data
(negative) questions of motivation and engagement on in-class assignments; longer, in-depth assignments were hard to evaluate due to the lack of a grade
Summer 1984 Olympics images from Pepperdine University Archives Digital Collections (keyword: olympicshighlights) http://pepperdine.contentdm.oclc.org/cdm/search/collection/p271401coll15/searchterm/olympicshighlights/field/all/mode/all/conn/and