SAIL away: comparing the cohort test to the Build Your Own Test version of Project SAILS in a Canadian context - Eva & Graham

IL Group (CILIP Information Literacy Group)
SAIL AWAY: COMPARING THE COHORT TEST TO THE BYOT VERSIONS OF PROJECT SAILS IN A CANADIAN CONTEXT
Rumi Graham, Nicole Eva & Sandra Cowan
University of Lethbridge, Alberta, Canada
CONTEXT
WHAT IS SAILS?
Skills tested:
 Developing a research strategy
 Selecting finding tools
 Searching
 Using finding tool features
 Retrieving sources
 Evaluating sources
 Documenting sources
 Understanding economic, legal, and social issues
Test versions:
• Individual scores test (U.S.)
• Cohort test (U.S.)
• International cohort test (our only option in 2015)
Based on the older ACRL competency standards
WHY SAILS?
Luck of the draw – literally!
Teaching Development Fund
After the fact, few alternatives for Canadian institutions
Image: Yosemite~commonswiki, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=520257
THE RESEARCH: 2015
Research purpose:
 reliable, objective data on information literacy (IL) levels of first-year undergrads before and after librarian-created IL instruction
Key questions:
 What levels of IL do incoming first-year students possess?
 Do students’ IL abilities improve after IL instruction?
 Do students’ IL attainment levels correlate with:
 their year of study?
 the number or format of IL instruction sessions?
Image: https://pixabay.com/en/question-speech-bubbles-speech-1828268/
ACCESSING THE TEST
Web-based; custom link for CMS (Moodle); custom web consent page
Unique SAILS ID# for each consenting student
No identifying data gathered/stored on SAILS server – total student anonymity
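A minimal sketch of how such an anonymous-ID scheme can work (purely illustrative; this is not the actual SAILS implementation): each consenting student receives a random test ID, and nothing that identifies the student is ever stored alongside it.

```python
import secrets

# Illustrative sketch only (not the actual SAILS implementation): issue a
# random test ID to each consenting student. Because no name, email, or
# student number is stored with the ID, results stay fully anonymous.

issued_ids = set()  # only the IDs themselves are retained

def issue_anonymous_id():
    """Return a unique random test ID for a student who has consented."""
    while True:
        test_id = secrets.token_hex(4).upper()  # e.g. '9F3A0C21'
        if test_id not in issued_ids:           # enforce uniqueness
            issued_ids.add(test_id)
            return test_id

# The consent page would call issue_anonymous_id() once per student and
# display the ID for the student to carry into the test session.
```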
THE SAMPLE
Academic Writing (10 sections) = 250 students
Liberal Education 1000 = 87 students
THE INTERVENTIONS
In-class instruction
Online modules
THE INCENTIVES
Writing: draw for 1 of 2 $100 bookstore gift cards (must complete both tests)
Liberal Education: draw + 3% bonus (1% pre-test + 2% post-test)
THE RESULTS
71.3% of respondents came from LBED 1000
68% were first-year students
Approx. 80% completed both the pre-test and post-test
THE COHORT
Cohort: 14 institutions; 6,370 students
Doctoral cohort
U OF L RESULTS AGAINST BENCHMARK
(skill areas listed in order of how well students performed)

Pre-test:
 Above benchmark: Retrieving sources; Evaluating sources; Developing a research strategy; Searching
 At benchmark: Selecting finding tools; Documenting sources; Using finding tool features
 Below benchmark: none

Post-test:
 Above benchmark: Developing a research strategy; Searching
 At benchmark: Documenting sources; Retrieving sources; Using finding tool features; Evaluating sources
 Below benchmark: Selecting finding tools
[Chart: U of L results against benchmark (doctorate institutions) – pre-test, benchmark, and post-test scaled scores (approx. 420-580) across the seven skill areas]
THE LESSONS LEARNED
Limitations of cohort test:
 We were the only international participant
 Our own benchmark; comparing ourselves to ourselves
 Swamped by large-institution results (Ashford U accounted for 43%)
 Cannot track individual results from pre-test to post-test
 Cannot choose which questions students get
Limitations of separating sections of Academic Writing in the cohort report
Importance of incentives!
Time of semester – competing obligations
2016: NEW SAILS TESTING OPTION
SAILS Build Your Own Test (BYOT) launched January 2016
 Customizable version of “individual scores test”
 Test scores tracked for each test-taker
 Hand-picked test questions
 No minimum number of test questions; maximum of 50
 Available to international institutions
2016: THE RESEARCH
Ran the study again using BYOT and a modified set of courses
THE SAMPLE
 Library Science (LBSC) 0520 ~ 30 students
 First Nations Transition Program course taught by a librarian
 Library Science (LBSC) 2000 ~ 30 students
 Arts & Science full-credit course taught by a librarian
 Liberal Education (LBED) 1000 ~ 95 students
 4 labs taught by a librarian; embedded in full-credit course
2016: INCENTIVES & INTERVENTIONS
THE INCENTIVES
 LBSC 0520 and LBSC 2000
 In-class time to write pre-test and post-test
 Bonus marks: 2% pre-test; 3% post-test
 Draw for $100 gift card (one per course)
 LBED 1000
 Bonus marks: 1% for pre-test; 2% for post-test
 Draw for $100 gift card
THE INTERVENTIONS
 In-class instruction; online modules (LBED 1000 only)
Image: Alan O’Rourke, http://bit.ly/2mjCzBQ, CC BY 2.0
2016: BUILDING THE BYOTS
 IL instructors identified questions on topics not covered in the courses (eliminating 50 questions)
 From the remaining 112 questions, researchers not involved in grading any coursework selected 2 matching, non-overlapping sets of questions (see the sketch below)
 Both sets had the same number of questions in each skill area, reflecting the same range of difficulty
 Pre-test and post-test each contained 26 SAILS questions (42% shorter than the 45-question cohort test)
Image: http://bit.ly/2mx4iOu, CC0
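To illustrate this matching step, here is a minimal sketch (a hypothetical data model and helper, not the researchers’ actual selection tool) that partitions a question bank into two non-overlapping test forms with identical skill-area and difficulty counts:

```python
import random
from collections import defaultdict

# Hypothetical sketch of the matching step: split a question bank into two
# non-overlapping test forms with the same number of questions per
# (skill area, difficulty) cell. Question IDs and the quota are assumptions.

def build_matched_forms(bank, quota, seed=0):
    """bank: list of (question_id, skill, difficulty) tuples.
    quota: dict mapping (skill, difficulty) -> questions needed per form.
    Returns two disjoint lists of question IDs with identical mixes."""
    rng = random.Random(seed)
    cells = defaultdict(list)
    for qid, skill, difficulty in bank:
        cells[(skill, difficulty)].append(qid)

    pre, post = [], []
    for cell, n in quota.items():
        pool = cells[cell]
        if len(pool) < 2 * n:
            raise ValueError(f"Not enough questions in cell {cell}")
        picks = rng.sample(pool, 2 * n)  # draw 2n, split evenly between forms
        pre.extend(picks[:n])
        post.extend(picks[n:])
    return pre, post

# Example with a made-up quota for one skill area:
# quota = {("Searching", "easy"): 1, ("Searching", "moderate"): 2,
#          ("Searching", "difficult"): 1}
```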
2016: TEST QUESTION MIX

Skill area                       # Questions   Easy   Moderate   Difficult
Developing Research Strategy          4          2        1          1
Selecting Finding Tools               3          1        1          1
Searching                             4          1        2          1
Using Finding Tool Features           3          1        1          1
Retrieving Sources                    3          1        1          1
Evaluating Sources                    4          1        2          1
Documenting Sources                   3          1        1          1
Economic, Legal, Social Issues        2          1        1          –
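As a small worked check of the mix above, each row’s difficulty counts should add up to its question total, and the totals should add up to the 26-question form length. (The difficult-question count for “Economic, Legal, Social Issues” is assumed to be 0, as the slide lists only two values for that row.)

```python
# Sanity check of the test question mix. The difficult count for the
# "Economic, legal, social issues" row is an assumption (slide shows 2, 1, 1).

mix = {  # skill: (total, easy, moderate, difficult)
    "Developing research strategy":   (4, 2, 1, 1),
    "Selecting finding tools":        (3, 1, 1, 1),
    "Searching":                      (4, 1, 2, 1),
    "Using finding tool features":    (3, 1, 1, 1),
    "Retrieving sources":             (3, 1, 1, 1),
    "Evaluating sources":             (4, 1, 2, 1),
    "Documenting sources":            (3, 1, 1, 1),
    "Economic, legal, social issues": (2, 1, 1, 0),
}

for skill, (total, easy, moderate, difficult) in mix.items():
    assert easy + moderate + difficult == total, skill  # each row adds up

assert sum(total for total, *_ in mix.values()) == 26   # 26-question form
```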
2016: PARTICIPATION RATE

Course         # Enrolled   % of All Enrolled   Wrote Pre-test or Post-test   Participation Rate
LBED 1000          95             60.5%                     80                      84.2%
LBSC 0520          32             20.4%                     31                      96.9%
LBSC 2000          30             19.1%                     30                     100%
All Students      157            100%                      141                      89.8%
2016: PRE-TEST QUESTION ON PRIOR IL INSTRUCTION (n=124)

[Chart: distribution of responses to the pre-test question on prior IL instruction]
2016: MEAN SCORES IMPROVED IN ALL COURSES?

                     LBED 1000   LBSC 0520   LBSC 2000
Pre-test lowest        15.4%       23.1%       23.1%
Pre-test highest       80.8%       61.5%       80.8%
Pre-test mean          56.4%       41.7%       56.9%
Post-test lowest       15.4%       15.4%       23.1%
Post-test highest      84.6%       73.1%       80.8%
Post-test mean         59.9%       48.2%       59.8%

*Pre-test: n=124; post-test: n=126
2016: MEAN SCORES IMPROVED IN ALL YEARS?

                     1st Year   2nd Year   3rd Year+
Pre-test lowest        23.1%      15.4%      30.8%
Pre-test highest       80.8%      80.8%      80.8%
Pre-test mean          51.8%      56.4%      59.3%
Post-test lowest       15.4%      26.9%      23.1%

*Pre-test: n=124; post-test: n=126
2016: INDIVIDUAL STUDENTS IMPROVED?
107 students wrote both the pre-test and post-test (68% of enrolled students)
 Mean pre-test score = 53.95%
 Mean post-test score = 58.16%
 Mean difference = +4.21%
 Margin of error = ±2.9%
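For readers who want to reproduce this kind of figure, here is a minimal sketch (with made-up paired scores; the study’s raw data are not reproduced here) of computing a mean pre/post difference and an approximate 95% margin of error:

```python
import math

# Minimal sketch with made-up paired scores (percent correct). Computes the
# mean pre/post difference and an approximate 95% margin of error using the
# normal critical value.

pre  = [46.2, 53.8, 61.5, 50.0, 57.7, 42.3, 65.4, 53.8]
post = [53.8, 57.7, 61.5, 57.7, 61.5, 50.0, 69.2, 50.0]

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_diff = sum(diffs) / n
sd = math.sqrt(sum((d - mean_diff) ** 2 for d in diffs) / (n - 1))
margin = 1.96 * sd / math.sqrt(n)  # z-based; use a t critical value for small n

print(f"mean difference = {mean_diff:+.2f}%, margin of error = \u00b1{margin:.2f}%")
```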
2016: STUDENTS IMPROVED IN ALL COURSES? (n=107)

                                            LBED 1000   LBSC 0520   LBSC 2000
Difference between pre-test and
post-test mean scores                          5.8%        4.9%        0.004%
2016: STUDENTS IMPROVED IN ALL YEARS? (n=107)

                                            1st Year   2nd Year   3rd Year+
Difference between pre-test and
post-test mean scores                         6.30%      4.26%      -5.28%
2016: POST-TEST SELF-ASSESSMENT (n=126)

[Chart: students’ self-assessed performance on the post-test]
2016: LESSONS LEARNED
BYOT
• Mean time to complete the post-test was ~12 to 15 min. (LBSC courses), suggesting the 26-question BYOT is not overly demanding
• Greater likelihood of statistically significant results with larger classes
• Mean scores were all well below the Proficiency level (70% or better), but 31 students reached Proficiency and 3 reached the Mastery level (85% or better) on the post-test
INCENTIVES
• Bonus marks were a large incentive, but in-class time to write the tests was even more effective
• Were upper-level students more pragmatic in their participation efforts?
CONCLUSIONS
BYOT ADVANTAGES
• You determine which questions are included and the overall test length
• Permits a singular focus on your own students’ test results
• Permits tracking of individual students’ scores
• Affords a wide range of statistical analyses
COHORT TEST ADVANTAGES
• Easier to prepare for (no need to select questions)
• Useful for institutions committed to large-scale, longitudinal testing
• No data analysis needed (just interpretation)
• Slightly less expensive than the individual scores/BYOT versions
SOURCES
Project SAILS website: https://www.projectsails.org/
• International Cohort Assessment: https://www.projectsails.org/International
• Build Your Own Test: https://www.projectsails.org/BYOT
Cowan, S., Graham, R., & Eva, N. (2016). How information literate are they? A SAILS study of (mostly) first-year students at the U of L. Light on Teaching, 2016-17, 17-20. Retrieved from http://bit.ly/2dlOTi6

Questions?

Editor's Notes

1. The University of Lethbridge is a mid-sized research university (about 8,000 students) in southern Alberta, with a growing graduate program and a focus on liberal education. Founded in 1967 – our 50th anniversary this year.
2. The international cohort test officially became available for use in June 2014.
3. Students were mainly first-year, enrolled in first-year courses. Writing students watched online modules and had one in-class session with a librarian; LBED students watched the modules and had four labs with a librarian (the rest of the course is taught by Arts and Science faculty). The incentive for students, apart from knowing they were contributing to research, was a chance to win one of two $100 gift certificates from the U of L bookstore. The Liberal Education students were also given a 3% bonus for completing both tests. This worked especially well, as we saw a very good participation rate in this class: 61 of 87 students completed the pre-test, and 61 of 84 completed the post-test. The draw alone did not seem to be sufficient incentive, as out of 10 participating sections of Writing 1000 (potentially 250 students), only 26 students completed the pre-test and 22 the post-test.