The Collegiate Learning Assessment (CLA+):
An institutional case study of using standardised
testing to measure students’ generic skills.
Professor Stuart Brand and Jamie Morris
SEDA Spring Conference 2018
Institutional Context – Birmingham City
University
• Post-1992 university
• ~24,000 students
• 8 campuses, consolidating to 2-3
• Vocational approach to learning
• Multicultural student population
• Commuter students (69%)
International Context - The RAND Report
(2015)
http://www.hefce.ac.uk/pubs/rereports/year/2015/learninggain/
• Commissioned by the Department for Business, Innovation and Skills (BIS),
the Higher Education Funding Council for England (HEFCE) and the
Higher Education Academy (HEA)
• Research carried out by RAND Europe
What is ‘Learning Gain’?
• Broadly, it is an attempt to measure the improvement in knowledge, skills,
work-readiness and personal development made by students during
their time spent in higher education.
• In our case - what level of skills do students enter and leave with when they
undertake their course of study?
Discussion Point
• What does Learning Gain mean for you in your specific subject? What do
you expect your graduates to have when they exit?
• In light of the subject-level TEF, how might you demonstrate the impact
your programme has on your students?
HEFCE and ‘Learning Gain’
●HEFCE released a call for new learning gain projects to be piloted in selected
universities throughout England
●Decisions released formally at the end of July 2015
●13 collaborative projects funded
●Birmingham City bid with three partners: Coventry, Liverpool John Moores
and Staffordshire
Project Overview - Revised Proposal
• Original Proposal – longitudinal study, 1,000 students with 4 testing points.
Revised to:
• 2 longitudinal studies from 2015-18 and 2016-19
• Following an additional ~1,000 students
• 2 cross-sectional studies in Semester 2 of the 2016-17 and 2017-18 academic years
• Testing a mixture of L4 (1st year), L5 (2nd year) and L6 (3rd year) students
• 4 testing points reduced to 3:
• Autumn of year 1
• Spring of year 2
• Spring of year 3
Methodologies
• Standardised tests - These measure the skills students acquire, which may be
generic or specialised (for example, a skill specific to a discipline).
• Self-reporting surveys - Such surveys ask students to report the extent to which
they think they have gained knowledge and developed skills.
• Other qualitative methods - These encourage students to reflect on their
learning, acquired skills and skills gaps.
• Mixed methods - This draws on a range of tools and indicators to track
improvement in performance (for example, through a combination of grades,
student learning data and student surveys).
Emergent approaches
• Cognitive gain – what students think and know
• Soft skills development – affective measures of attitudes and how
students feel
• Employability and careers readiness – mainly behavioural measures of
activities students have undertaken in preparation for the world of work
The Council for Aid to Education (CAE)
• In 2002, CAE, a not-for-profit based in NYC, launched a national effort to
assess the quality of undergraduate education
• CLA+ (Collegiate Learning Assessment) is a web-administered, value-added
assessment programme
• The aim of CLA+ is to provide an objective assessment of the
critical-thinking skills a student possesses as they enter and exit college.
The Collegiate Learning Assessment
(CLA+)
• The Collegiate Learning Assessment (CLA+) is a performance-based test that
measures critical thinking, problem solving, analytic reasoning, and writing
skills.
• The CLA+ uses a Performance Task (60 minutes) and 25 Selected-Response
Questions (30 minutes).
• Can take up to 2 hours to complete, including tutorials and a demographic survey.
• Performance Task responses are scored (1-6) by staff leads in each institution -
requires scorer training.
Why CLA+? (CLA post 2011)
• Generates an individual student report based on a Mastery Level (1-5)
• Mastery Levels are formed via a combination of scores for the Performance Task
and Selected-Response Questions (see the illustrative sketch below).
• Provides a rich source of comparative data from the U.S. and other countries
using the CLA+, e.g. Italy, Germany, Chile and Peru.
• Key point - Emphasis on highlighting areas of development for the student.
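Aside: a minimal sketch of how a combined Performance Task and Selected-Response
score might be banded into a Mastery Level. CAE's actual CLA+ scoring model and cut
points are not given in this presentation, so the equal weighting and band boundaries
below are hypothetical placeholders, not the real algorithm.

# Illustrative only: weighting and cut points are assumed, not CAE's published method.
def mastery_level(pt_score: float, sr_score: float) -> int:
    """Map a Performance Task / Selected-Response score pair to a 1-5 band."""
    combined = (pt_score + sr_score) / 2          # assumed equal weighting
    cut_points = [900, 1050, 1200, 1350]          # assumed band boundaries
    return 1 + sum(combined >= c for c in cut_points)

print(mastery_level(1150, 1020))  # -> 3 under these assumed cut points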
How have we embedded the CLA+? –
Optional vs Compulsory
• 1st and 2nd years of testing - financial incentives offered
• Attendance remained low across the first two years
• Successful engagement with the Birmingham City Business School
• Embedded into Professional Studies Module as formative assessment.
• Importance of engaging programme and modular level staff
Findings from Optional Testing – Focus
Groups
• Structure – issues with the lack of a spell check and with viewing task content.
• Most students believed that the test was not subject-independent.
• The point in the academic year at which the test was taken was an issue
• Students said they would take the CLA+ test if employers and other HEIs valued the
scores from it.
• Students who had taken the test twice noted enhancement of other skills, e.g. time
management
Focus Group Findings (optional testing) –
cont…
• Cross-sectional study Spring 2017 (n=78)
• Students also reported specific disciplines may be advantaged in the test
[Chart: Average Mastery Level by discipline - Computing, Graphics, Jewellery, Marketing (Spring 2017 cross-sectional study)]
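A minimal sketch of how per-discipline averages like those in the chart above could be
computed from a results extract; the DataFrame columns and values here are assumptions
for illustration, not the actual CAE export format.

import pandas as pd

# Hypothetical CLA+ results extract; column names and values are illustrative.
results = pd.DataFrame({
    "discipline": ["Computing", "Graphics", "Jewellery", "Marketing", "Marketing"],
    "mastery_level": [3, 2, 2, 3, 1],
})

# Average Mastery Level per discipline, as plotted above.
print(results.groupby("discipline")["mastery_level"].mean().round(2))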
Autumn (Fall) 2017 Testing – Compulsory
Testing
● 201 L4 students tested (81% response rate) in a week
● Previous issues with recruitment no longer encountered
● Focus on one participating School - Birmingham City Business School
● Testing embedded into a core module
● Tied directly to summative assessment – E-Portfolio
MKT4020 - Professional Development
● Developed via the University’s Transforming the Curriculum (TtC) process
● CLA+ embedded into Week 2 of the Module, focusing on Critical Thinking:
○ LO1 - Undertake an audit of skills and capabilities, reflecting on the outcomes and
engage with a programme of professional development.
○ LO3 - Evidence skills in the context of professional development and evaluate impact.
● Other sessions include:
○ Building competency with contemporary IT software e.g. Microsoft Office
○ Enhancing presentation skills
○ Forming a professional portfolio and profile via e-portfolio and Social Media
MKT4020 - Professional Development
● Reports distributed in Week 7 (employability week) as part of a bespoke activity
● Tied into the extra-curricular awards framework – Graduate+
● Majority of the reports collected!
● CLA+ individual student report submitted as evidence for summative
assessment
● Electronic copies also available to send to students who could not attend.
Results - Fall 2017
Average Scores vs Effort
[Chart: Average Performance Task and Selected-Response scores by self-reported effort level (Effort 1-5), with a linear trendline for each series]
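A rough sketch of the kind of analysis behind this chart: group scores by self-reported
effort level and fit a linear trendline to each series. The records, column names and
values below are hypothetical, not the project's data.

import numpy as np
import pandas as pd

# Hypothetical records: self-reported effort (1-5) with PT and SR scores.
df = pd.DataFrame({
    "effort":   [1, 2, 2, 3, 3, 4, 4, 5, 5],
    "pt_score": [880, 950, 1010, 1060, 1100, 1150, 1180, 1230, 1260],
    "sr_score": [840, 900, 940, 990, 1020, 1080, 1110, 1150, 1190],
})

# Average score at each effort level (the plotted series).
by_effort = df.groupby("effort")[["pt_score", "sr_score"]].mean()
print(by_effort)

# Linear trendlines, mirroring the chart's "Linear (...)" series.
for col in ["pt_score", "sr_score"]:
    slope, intercept = np.polyfit(by_effort.index, by_effort[col], deg=1)
    print(f"{col}: score ~= {slope:.1f} * effort + {intercept:.1f}")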
Fall – Mastery Levels by Ethnicity
[Chart: Average Mastery Levels Across Ethnicity (Fall 2017)]
White 3.09 | Black Caribbean 2.10 | Black African 2.24 | Black Other 0.00 |
Mixed Ethnicity 3.00 | Indian 2.32 | Asian Pakistani 2.38 | Asian Bangladeshi 2.14 |
Asian Chinese 1.60 | Prefer not to say 0.50 | Not Applicable 2.23
Native vs 2nd Language
[Chart: Average Performance Task & Selected-Response scores by language background (Fall 2017);
native speakers scored higher than second-language speakers on both measures
(values shown: 1232.64, 1093.54, 1057.53, 961.94)]
Summary of Testing Numbers 2015/2018 -
Birmingham City University
• Institution Total:
• 2015/16 academic year - 220
• 2016/17 academic year - 299
• 2017/18 academic year - 259
• Birmingham City University total - 778 students
• Overall Total (Across 4 partners):
• 2015-2018 longitudinal study – 845 students tested
• 2016-2019 longitudinal study – 693 students tested
• 2017 (Spring) cross-sectional study – 201 students tested
• 2017 (Autumn) cross-sectional study – 248 students tested
• 2018 (Spring) cross-sectional study – TBC (awaiting CLA+ test window to close)
CLA+ as a diagnostic tool – Future
Developments
• Interest from numerous staff at BCU in utilising the test as part of a
foundation degree.
• Could also provide comparative data if L4 students are tested.
• CLA+ will continue to be used in Birmingham City Business School
programmes
• CAE to provide anonymised, aggregated analysis of student performance
across the 4 partners
Thank you for listening!
Any questions?
