Assessing the assessment: A methodology to
facilitate university-wide reporting
Prepared by:
Terra Schehr
Assistant Vice President for Institutional Research & Effectiveness
Loyola University Maryland
For:
AIR Forum, New Orleans
June 2012
2
Overview
 Institutional context
 Assessment context
 Process for assessing the assessment
 Next steps
Institutional Context
 Private, Catholic Jesuit University
 Three schools (arts & sciences, business, education)
 Undergraduate Enrollment ~3,800
 35 degree programs
 Graduate Enrollment ~2,200
 Graduate degree programs in 7 areas
 Mainly master’s degrees
 Two doctoral degrees
 Many certificate programs
3
4
Student Learning Assessment Committee
 Committee of the Academic Senate
 Five faculty
 One student
 Assistant Vice President for Institutional Research &
Effectiveness
 Charge: The Student Learning Assessment Committee will
gather information and review processes relating to the
assessment of student learning at the departmental and
institutional levels. In this role, it will provide feedback to the
university community on the on-going process of student
learning assessment.
Where We Are Headed: The Standard Display
of the Results
Loyola University Status of Assessment Based on 2009-10 Reviews

                        Non-Existent   Minimally Developed   Well Developed   Highly Developed
Learning Outcomes       14%            6%                    20%              59%
Assessment Mechanism    16%            8%                    25%              55%
Assessment Results      23%            8%                    17%              52%
Use of Results          31%            8%                    16%              45%
Source: SLAC (2010, May). SLAC Annual Report to the Academic Senate 2009-2010. Loyola University Maryland.
- Percentages are based on a total of 64 degree programs.
- Some degree programs submitted separate assessment reports for different concentrations/delivery methods (e.g., fine arts submitted separate reports for music and theater).
- Three undergraduate and four graduate programs did not submit reports; they are included in the percentages for "non-existent."
- Certificate programs are not included.
5
ASSESSMENT CONTEXT
Why Assess?
7
Response options: 1 = No Importance, 2 = Minor Importance, 3 = Moderate Importance, 4 = High Importance
National Institute for Learning Outcomes Assessment (NILOA) 2009 Survey of Chief Academic Officers (n = 2,809)
Kuh, G. & Ikenberry, S. (2009, October). More than you think, less than we need:
Learning outcomes assessment in American higher education. NILOA.
Available: http://learningoutcomesassessment.org
Uses of Assessment Data
8
National Institute for Learning Outcomes Assessment (NILOA) 2009 Survey of Chief Academic Officers (n = 2,809)
The top 8 of the 22 possible uses of assessment data are shown here.
Kuh, G. & Ikenberry, S. (2009, October). More than you think, less than we need:
Learning outcomes assessment in American higher education. NILOA.
Available: http://learningoutcomesassessment.org
Accreditations
9
* Accreditation Board for Engineering and Technology
* American Association of Pastoral Counselors
* American Chemical Society
* American Psychological Association
* American Speech-Language-Hearing Association
* Association to Advance Collegiate Schools of Business
* Computer Science Accreditation Commission
* Council for Accreditation of Counseling and Related Educational Programs
* National Council for Accreditation of Teacher Education
10
Accreditation and the Effective Institution
Middle States:
“An effective institution is one in which growth,
development, and change are the result of a thoughtful
and rational process of self-examination and planning,
and one in which such a process is an inherent part of
ongoing activities.”
Source: MSCHE, Characteristics of Excellence in Higher Education (2006), p. 4.
11
Improving Student Learning
 What are we trying to do?
 How well are we doing?
 How can we improve?
12
Assessment Cycle
[Cycle diagram: Learning Outcomes, Assessment Method, and Criteria for Success (the Assessment Plan) -> Assessment Results -> Use of Results, paired with the three questions: What are we trying to do? How well are we doing? How can we improve?]
13
Learning Outcomes
 What should students know?
 What should students be able to do?
 What should students value?
14
Levels of Learning Outcomes
 Institution
 CORE/Gen Ed
 Major
15
Types of Learning Outcomes
 Skills & Values
 Skills, Values, & Knowledge
 Knowledge, Skills, & Values
16
One Size Does Not Fit All
 Assessment method must fit the outcome
 Assessment method must fit the pedagogical culture of
the discipline
 Assessment method must fit the culture of the
institution
17
Some Common Assessment Methods
Source of data (examples)                                   Means of assessment (examples)
Self-report (e.g., surveys)                                 % reporting outcome attainment
Embedded course work (e.g., final paper/project)            Scoring guide/rubric
Outside evaluation (e.g., internship supervisor’s report)   % reporting outcome attainment
External measure (e.g., major field test)                   Score on test
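
To make the right-hand column concrete, here is a minimal sketch, not from the presentation, of how two of these means of assessment might be tallied; the function names, response labels, and thresholds are all hypothetical.

```python
# Hypothetical sketch of two common "means of assessment"; names and thresholds are illustrative.

def pct_reporting_attainment(responses, attained=("Agree", "Strongly agree")):
    """Self-report (e.g., surveys): % of respondents reporting outcome attainment."""
    return 100 * sum(1 for r in responses if r in attained) / len(responses)

def pct_meeting_rubric(scores, criterion=3):
    """Embedded course work: % of final papers/projects at or above a rubric criterion."""
    return 100 * sum(1 for s in scores if s >= criterion) / len(scores)

print(pct_reporting_attainment(["Agree", "Disagree", "Strongly agree", "Agree"]))  # 75.0
print(pct_meeting_rubric([4, 3, 2, 3, 1], criterion=3))  # 60.0
```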
Methods Used at the Program Level
18
National Institute for Learning Outcomes Assessment (NILOA) 2009 Survey of Chief Academic Officers (n = 2,809)
Kuh, G. & Ikenberry, S. (2009, October). More than you think, less than we need:
Learning outcomes assessment in American higher education. NILOA.
Available: http://learningoutcomesassessment.org
Educational Effectiveness Table
19
ASSESSING THE ASSESSMENT
Assessing the Assessment – Review Process
 At the end of each academic year, departments submit
assessment reports for each of their degree programs
 Each assessment report is reviewed by at least two
Committee members using a common rubric
 Rubric was developed by the Committee
 The Assistant Vice President for Institutional Research &
Effectiveness is a reviewer on all reports
 If the two reviewers differ by more than two points on the 4-
point rating scale, the report is reviewed by a third Committee
member (a hypothetical sketch of this rule follows below)
21
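
A minimal sketch, assuming hypothetical data structures, of the third-reviewer rule described above: each report receives two independent ratings per rubric attribute on the 1-4 scale, and a gap of more than two points on any attribute triggers a third review.

```python
# Hypothetical sketch of the review rule described above; not from the presentation.
# Each report gets two independent ratings per attribute on the 1-4 rubric scale.

ATTRIBUTES = ["Learning Outcomes", "Assessment Mechanism",
              "Assessment Results", "Use of Results"]

def needs_third_review(rating_a, rating_b, max_gap=2):
    """True if any attribute's two ratings differ by more than max_gap points."""
    return any(abs(rating_a[a] - rating_b[a]) > max_gap for a in ATTRIBUTES)

reviewer_1 = {"Learning Outcomes": 4, "Assessment Mechanism": 3,
              "Assessment Results": 1, "Use of Results": 1}
reviewer_2 = {"Learning Outcomes": 4, "Assessment Mechanism": 4,
              "Assessment Results": 4, "Use of Results": 2}

print(needs_third_review(reviewer_1, reviewer_2))  # True: Assessment Results differs by 3 points
```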
Degree Program Learning Assessment Activity Rubric
22
Learning Outcomes
  Non-Existent (1): No SLOs are articulated, or aims are articulated as program aims.
  Minimally Developed (2): SLOs are included but are not clearly stated in terms of what students will learn.
  Well Developed (3): SLOs clearly identify what majors will learn by completing the degree program.
  Highly Developed (4): SLOs clearly identify what majors will learn by completing the degree program and have been incorporated into or linked to various courses.

Assessment Mechanism
  Non-Existent (1): No assessment mechanism is articulated.
  Minimally Developed (2): An assessment mechanism has been identified for at least one SLO.
  Well Developed (3): Assessment mechanisms include direct evidence of student learning.
  Highly Developed (4): Assessment mechanisms include direct evidence and have been articulated for all SLOs.

Assessment Results
  Non-Existent (1): There is no evidence that assessment data have been gathered.
  Minimally Developed (2): Assessment data have been gathered.
  Well Developed (3): Assessment data have been gathered and there is a systematic process in place for data collection.
  Highly Developed (4): Assessment data have been gathered and there is a systematic process in place for data collection and analysis.

Use of Results
  Non-Existent (1): There is no evidence that assessment data have been analyzed at the unit (degree program) level.
  Minimally Developed (2): Assessment data have been analyzed at the unit level.
  Well Developed (3): Assessment data have been analyzed at the unit level and there is a process in place for faculty to use assessment results.
  Highly Developed (4): Assessment data have been analyzed at the unit level, there is a process in place for faculty to use assessment results, and there is at least one example of attempts to improve the program based on assessment results.
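
One way to support consistent scoring is to encode the rubric as data. The sketch below is an assumption for illustration, not Loyola's actual tooling; it stores the four attributes and level labels so that a recorded rating can be validated and labeled.

```python
# Hypothetical encoding of the rubric as data (illustration only, not Loyola's tooling).
LEVELS = {1: "Non-Existent", 2: "Minimally Developed",
          3: "Well Developed", 4: "Highly Developed"}
ATTRIBUTES = ("Learning Outcomes", "Assessment Mechanism",
              "Assessment Results", "Use of Results")

def label_rating(attribute, rating):
    """Validate a single rating and return its rubric label."""
    if attribute not in ATTRIBUTES:
        raise ValueError(f"Unknown rubric attribute: {attribute}")
    if rating not in LEVELS:
        raise ValueError("Ratings must be on the 1-4 rubric scale")
    return f"{attribute}: {LEVELS[rating]} ({rating})"

print(label_rating("Use of Results", 2))  # Use of Results: Minimally Developed (2)
```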
Assessing the Assessment – Reporting Process
 The average of the reviewers’ ratings is used as the final rating for each attribute (a hypothetical sketch of this calculation follows this slide)
 There is no “overall” rating given to an assessment report
 Departments receive a feedback letter indicating how their assessment report(s) were rated
 Final ratings for each degree program were shared in the Committee’s report to the Academic Senate
 Ratings were also aggregated by college/school and for the University as a whole
23
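
A sketch, under the same hypothetical setup, of how the final attribute ratings might be computed: the reviewers' ratings are averaged per attribute, and no overall report score is produced.

```python
# Hypothetical sketch: the final rating for each attribute is the mean of the
# reviewers' ratings; deliberately, no overall report score is computed.

def final_ratings(reviews):
    """reviews: one dict per reviewer, mapping attribute -> rating on the 1-4 scale."""
    attributes = reviews[0].keys()
    return {attr: sum(r[attr] for r in reviews) / len(reviews) for attr in attributes}

reviews = [
    {"Learning Outcomes": 4, "Assessment Mechanism": 3, "Assessment Results": 2, "Use of Results": 1},
    {"Learning Outcomes": 3, "Assessment Mechanism": 3, "Assessment Results": 2, "Use of Results": 2},
]
print(final_ratings(reviews))
# {'Learning Outcomes': 3.5, 'Assessment Mechanism': 3.0, 'Assessment Results': 2.0, 'Use of Results': 1.5}
```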
24
Departmental Learning Assessment Activity Rubric
Annual Assessment Report 2008-09
Department Name: Sociology        Degree Program: BA

Learning Aims ____
  Non-Existent (1): No aims are articulated, or aims are articulated as program aims.
  Minimally Developed (2): Aims are included but are not clearly stated in terms of what students will learn.
  Well Developed (3): Learning aims clearly identify what majors will learn by completing the degree program.
  Highly Developed (4): Learning aims clearly identify what majors will learn by completing the degree program and have been incorporated into or linked to various courses.

Assessment Mechanism ____
  Non-Existent (1): No assessment mechanism is articulated.
  Minimally Developed (2): An assessment mechanism has been identified for at least one learning aim.
  Well Developed (3): Assessment mechanisms include direct evidence of student learning.
  Highly Developed (4): Assessment mechanisms include direct evidence and have been articulated for all learning aims.

Assessment Results ____
  Non-Existent (1): There is no evidence that assessment data have been gathered.
  Minimally Developed (2): Assessment data have been gathered.
  Well Developed (3): Assessment data have been gathered and there is a systematic process in place for data collection.
  Highly Developed (4): Assessment data have been gathered and there is a systematic process in place for data collection and analysis.

Use of Results ____
  Non-Existent (1): There is no evidence that assessment data have been analyzed at the unit (degree program) level.
  Minimally Developed (2): Assessment data have been analyzed at the unit level.
  Well Developed (3): Assessment data have been analyzed at the unit level and there is a process in place for faculty to use assessment results.
  Highly Developed (4): Assessment data have been analyzed at the unit level, there is a process in place for faculty to use assessment results, and there is at least one example of attempts to improve the program based on assessment results.
Departmental Feedback Letters
 The first-year feedback letter was a simple graphic indicating where the program fell on the rubric
 The second-year feedback letter included the rubric ratings and a narrative summary of the reviewers’ thoughts about the strengths and weaknesses of the documented assessment
Assessing the Assessment – Detailed Report
25
Department/Program    Degree   Learning Aims   Assessment Mechanism   Assessment Results   Use of Results
LCAS
Program Name          BA       High            Well                   Non-Existent         Non-Existent
Program Name          BS       Minimal         High                   Non-Existent         Non-Existent
Program Name          BS       High            Well                   Well                 Minimal
Program Name          BA       No Report       No Report              No Report            No Report
Program Name          BA       Minimal         Well                   Well                 Well
Program Name          BA       High            Non-Existent           Non-Existent         Non-Existent
Program Name          BA       No Report       No Report              No Report            No Report
Program Name          BA       High            High                   Well                 Well
Program Name          BS       High            High                   Well                 Well
Program Name          MA       Non-Existent    Minimal                Well                 Non-Existent
Program Name          MS       Minimal         Minimal                Well                 Well
Program Name          BA       High            High                   High                 Well
Program Name          BS       High            High                   High                 High
Assessing the Assessment – Aggregate Report
26
Loyola University Status of Assessment Based on 2009-10 Reviews

                        Non-Existent   Minimally Developed   Well Developed   Highly Developed
Learning Aims           14%            6%                    20%              59%
Assessment Mechanism    16%            8%                    25%              55%
Assessment Results      23%            8%                    17%              52%
Use of Results          31%            8%                    16%              45%
Source: SLAC (2010, May). SLAC Annual Report to the Academic Senate 2009-2010. Loyola University Maryland.
- Percentages are based on a total of 64 degree programs.
- Some degree programs submitted separate assessment reports for different concentrations/delivery methods (e.g., fine arts submitted separate reports for music and theater).
- Three undergraduate and four graduate programs did not submit reports; they are included in the percentages for "non-existent."
- Certificate programs are not included.
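
The aggregate table can be reproduced mechanically from the per-program ratings: each program contributes one category per attribute, programs that submitted no report are counted as Non-Existent, and percentages are taken over all degree programs. A hypothetical sketch, with illustrative data only:

```python
# Hypothetical sketch of the aggregation behind the table above (illustrative data only).
# Programs with no report are counted as "Non-Existent"; percentages are taken
# over the full set of degree programs (64 in the 2009-10 reviews).
from collections import Counter

CATEGORIES = ["Non-Existent", "Minimally Developed", "Well Developed", "Highly Developed"]

def aggregate(attribute_ratings):
    """attribute_ratings: one category label (or 'No Report') per degree program."""
    counts = Counter("Non-Existent" if r == "No Report" else r for r in attribute_ratings)
    total = len(attribute_ratings)
    return {cat: round(100 * counts[cat] / total) for cat in CATEGORIES}

# Tiny example with four programs, not the real 64-program data:
use_of_results = ["Highly Developed", "No Report", "Well Developed", "Minimally Developed"]
print(aggregate(use_of_results))
# {'Non-Existent': 25, 'Minimally Developed': 25, 'Well Developed': 25, 'Highly Developed': 25}
```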
NEXT STEPS
Challenges and Plans
 Creating a shared understanding of University
standards for assessment
 SLAC is using what was learned from two years of assessing the assessment as context for the development of “Principles and Practices of Assessment at Loyola University Maryland.”
 Creating a shared understanding that documentation
of assessment on the part of academic programs is
necessary
 The SLAC report to the Senate with the results of the reviews has been very helpful here
28
Challenges and Plans
 Varied nature of the assessment reports submitted (format, length, and content)
 Encouraging departments to use a common assessment report
template
 Managing the volume of (figurative) paper
 Investigate technological solutions to streamline the file
management process
 Time intensive for the Committee
 The level of detail in the feedback to departments is being scaled back
 Committee is considering a review cycle so that all programs
are not reviewed annually
29
Challenges and Plans
 The “Use of Results” attribute on the rubric was not
interpreted consistently
 Revisions will be made to the rubric for clarity
 Not commenting on the quality of the assessment
 ??
 Sharing the details of departmental assessment
activities across the University
 Investigate technological solutions for a repository of examples
and best practices within departments at Loyola
30
ADVANTAGES
Advantages
 You don’t need to be an assessment expert to review
assessment reports with the rubric
 You don’t need to have specialized knowledge of the discipline being reviewed
 We now know, can substantiate, and communicate the
status of engagement with student learning
assessment within the academic departments
 Data were reported in the most recent institutional self-study for
regional accreditation
32
Prepared by:
Terra Schehr
Assistant Vice President for
Institutional Research and Effectiveness
tschehr@loyola.edu
www.loyola.edu/ir
