Assessing the assessment: A methodology to
facilitate university-wide reporting
Prepared by:
Terra Schehr
Assistant Vice President for Institutional Research & Effectiveness
Loyola University Maryland
For:
AIR Forum, New Orleans
June 2012
Overview
 Institutional context
 Assessment context
 Process for assessing the assessment
 Next steps
Institutional Context
 Private, Catholic Jesuit University
 Three schools (arts & sciences, business, education)
 Undergraduate Enrollment ~3,800
 35 degree programs
 Graduate Enrollment ~2,200
 Graduate degree programs in 7 areas
 Mainly master’s degrees
 Two doctoral degrees
 Many certificate programs
Student Learning Assessment Committee
 Committee of the Academic Senate
 Five faculty
 One student
 Assistant Vice President for Institutional Research &
Effectiveness
 Charge: The Student Learning Assessment Committee will
gather information and review processes relating to the
assessment of student learning at the departmental and
institutional levels. In this role, it will provide feedback to the
university community on the on-going process of student
learning assessment.
Where We Are Headed: The Standard Display
of the Results
Loyola University Status of Assessment Based on 2009-10 Reviews
(percentage of degree programs at each level)
Learning Outcomes: Non-Existent 14%, Minimally Developed 6%, Well Developed 20%, Highly Developed 59%
Assessment Mechanism: Non-Existent 16%, Minimally Developed 8%, Well Developed 25%, Highly Developed 55%
Assessment Results: Non-Existent 23%, Minimally Developed 8%, Well Developed 17%, Highly Developed 52%
Use of Results: Non-Existent 31%, Minimally Developed 8%, Well Developed 16%, Highly Developed 45%
Source: SLAC (2010, May). SLAC Annual Report to the Academic Senate 2009-2010. Loyola University Maryland.
- Percentages are based on a total of 64 degree programs.
- Some degree programs submitted separate assessment reports for different concentrations/delivery methods (e.g., fine arts submitted separate reports for music and theater).
- Three undergraduate and four graduate programs did not submit reports; they are included in the percentages for "Non-Existent."
- Certificate programs are not included.
ASSESSMENT CONTEXT
Why Assess?
Response options:
1=No Importance
2=Minor Importance
3=Moderate Importance
4=High Importance
National Institute for
Learning Outcomes
Assessment (NILOA)
2009 Survey of Chief
Academic Officers
(n=2,809)
Kuh, G. & Ikenberry, S. (2009, October). More than you think, less than we need:
Learning outcomes assessment in American higher education. NILOA.
Available: http://learningoutcomesassessment.org
Uses of Assessment Data
National Institute for
Learning Outcomes
Assessment (NILOA)
2009 Survey of Chief
Academic Officers
(n=2,809)
The top 8 out of 22
possible uses of
assessment data are
shown here.
Kuh, G. & Ikenberry, S. (2009, October). More than you think, less than we need:
Learning outcomes assessment in American higher education. NILOA.
Available: http://learningoutcomesassessment.org
Accreditations
* Accreditation Board for Engineering and Technology
* American Association of Pastoral Counselors
* American Chemical Society
* American Psychological Association
* American Speech-Language-Hearing Association
* Association to Advance Collegiate Schools of Business
* Computer Science Accreditation Commission
* Council for Accreditation of Counseling and Related Educational Programs
* National Council for Accreditation of Teacher Education
Accreditation and the Effective Institution
Middle States:
“An effective institution is one in which growth,
development, and change are the result of a thoughtful
and rational process of self-examination and planning,
and one in which such a process is an inherent part of
ongoing activities.”
Source: MSCHE, Characteristics of Excellence in Higher Education (2006), p. 4
Improving Student Learning
 What are we trying to do?
 How well are we doing?
 How can we improve?
Assessment Cycle
[Diagram: a cycle connecting the three questions (What are we trying to do? How well are we doing? How can we improve?) to the elements of assessment: the assessment plan (learning outcomes, assessment method, criteria for success), assessment results, and use of results.]
Learning Outcomes
 What students should know
 What students should be able to do
 What students should value
Levels of Learning Outcomes
Institution
CORE/Gen Ed
Major
Types of Learning Outcomes
Skills & Values
Skills, Values, & Knowledge
Knowledge, Skills, & Values
One Size Does Not Fit All
 Assessment method must fit the outcome
 Assessment method must fit the pedagogical culture of
the discipline
 Assessment method must fit the culture of the
institution
Some Common Assessment Methods
Source of data (examples) and the corresponding means of assessment (examples):
Self-report (e.g., surveys): % reporting outcome attainment
Embedded course work (e.g., final paper/project): scoring guide/rubric
Outside evaluation (e.g., internship supervisor's report): % reporting outcome attainment
External measure (e.g., major field test): score on test
Methods Used at the Program Level
National Institute for
Learning Outcomes
Assessment (NILOA)
2009 Survey of Chief
Academic Officers
(n=2,809)
Kuh, G. & Ikenberry, S. (2009, October). More than you think, less than we need:
Learning outcomes assessment in American higher education. NILOA.
Available: http://learningoutcomesassessment.org
Educational Effectiveness Table
ASSESSING THE ASSESSMENT
Assessing the Assessment – Review Process
 At the end of each academic year, departments submit
assessment reports for each of their degree programs
 Each assessment report is reviewed by at least two
Committee members using a common rubric
 Rubric was developed by the Committee
 The Assistant Vice President for Institutional Research &
Effectiveness is a reviewer on all reports
 If the two reviewers differ by more than two points on the 4-point rating scale, the report is reviewed by a third Committee member (see the sketch after this list)
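The bullets above amount to a small scoring rule, so a minimal Python sketch may help make it concrete. This is an illustration only, not Loyola's actual system: the attribute names come from the rubric, but the data structures, function names, and example ratings are hypothetical, and the deck does not specify exactly how the third-review trigger is applied when ratings differ attribute by attribute (here it is interpreted as "any attribute differs by more than two points").

```python
# Minimal sketch (assumed names and data) of the review logic described above:
# each report is rated 1-4 on four rubric attributes by at least two Committee
# members, and a third member reviews the report when the first two reviewers
# are far apart on any attribute.

ATTRIBUTES = ["Learning Outcomes", "Assessment Mechanism",
              "Assessment Results", "Use of Results"]


def third_review_needed(review_a: dict, review_b: dict) -> bool:
    """True if any attribute rating differs by more than two points."""
    return any(abs(review_a[a] - review_b[a]) > 2 for a in ATTRIBUTES)


def final_ratings(reviews: list) -> dict:
    """Final rating for each attribute is the average of the reviewers' ratings."""
    return {a: sum(r[a] for r in reviews) / len(reviews) for a in ATTRIBUTES}


# Hypothetical ratings for one degree program's report.
reviewer_1 = {"Learning Outcomes": 4, "Assessment Mechanism": 3,
              "Assessment Results": 1, "Use of Results": 1}
reviewer_2 = {"Learning Outcomes": 3, "Assessment Mechanism": 3,
              "Assessment Results": 4, "Use of Results": 2}

reviews = [reviewer_1, reviewer_2]
if third_review_needed(reviewer_1, reviewer_2):
    # Hypothetical third reviewer, brought in because of the 3-point gap above.
    reviews.append({"Learning Outcomes": 4, "Assessment Mechanism": 3,
                    "Assessment Results": 2, "Use of Results": 1})

print(final_ratings(reviews))
```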
Degree Program Learning Assessment Activity Rubric

Learning Outcomes
(1) Non-Existent: No SLOs are articulated, or aims are articulated only as program aims.
(2) Minimally Developed: SLOs are included but are not clearly stated in terms of what students will learn.
(3) Well Developed: SLOs clearly identify what majors will learn by completing the degree program.
(4) Highly Developed: SLOs clearly identify what majors will learn by completing the degree program and have been incorporated into or linked to various courses.

Assessment Mechanism
(1) Non-Existent: No assessment mechanism is articulated.
(2) Minimally Developed: An assessment mechanism has been identified for at least one SLO.
(3) Well Developed: Assessment mechanisms include direct evidence of student learning.
(4) Highly Developed: Assessment mechanisms include direct evidence and have been articulated for all SLOs.

Assessment Results
(1) Non-Existent: There is no evidence that assessment data have been gathered.
(2) Minimally Developed: Assessment data have been gathered.
(3) Well Developed: Assessment data have been gathered and there is a systematic process in place for data collection.
(4) Highly Developed: Assessment data have been gathered and there is a systematic process in place for data collection and analysis.

Use of Results
(1) Non-Existent: There is no evidence that assessment data have been analyzed at the unit (degree program) level.
(2) Minimally Developed: Assessment data have been analyzed at the unit level.
(3) Well Developed: Assessment data have been analyzed at the unit level and there is a process in place for faculty to use assessment results.
(4) Highly Developed: Assessment data have been analyzed at the unit level, there is a process in place for faculty to use assessment results, and there is at least one example of attempts to improve the program based on assessment results.
Assessing the Assessment – Reporting Process
 The average of the reviewers' ratings is used as the final rating for each attribute
 There is no “overall” rating given to an assessment report
 Departments receive a feedback letter indicating how
their assessment report(s) were rated
 Final ratings for each degree program were shared in
the Committee’s report to the Academic Senate
 Ratings were also aggregated by college/school and for the University as a whole (see the sketch after this list)
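For illustration, here is a minimal sketch of how averaged attribute ratings could be rolled up into the category percentages shown in the aggregate report later in the deck. The cut points that map an averaged rating back onto the four rubric labels are an assumption, since the deck does not state them; counting programs with no report as "Non-Existent" follows the table's footnote.

```python
from collections import Counter

LABELS = ["Non-Existent", "Minimally Developed", "Well Developed", "Highly Developed"]


def label_for(avg_rating):
    """Map an averaged 1-4 rating (or None for 'no report') onto a rubric label.

    Rounding to the nearest whole rating is an assumed convention.
    """
    if avg_rating is None:
        return "Non-Existent"  # programs that did not submit a report
    return LABELS[min(3, max(0, round(avg_rating) - 1))]


def distribution(ratings):
    """Percentage of programs at each rubric level for one attribute."""
    counts = Counter(label_for(r) for r in ratings)
    return {label: f"{100 * counts[label] / len(ratings):.0f}%" for label in LABELS}


# Hypothetical averaged "Use of Results" ratings for a handful of programs.
use_of_results = [4.0, 3.5, 1.0, None, 2.0, 3.0, 4.0, 1.5]
print(distribution(use_of_results))
```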
Departmental Learning Assessment Activity Rubric
Annual Assessment Report 2008-09
Department Name: Sociology    Degree Program: BA

Learning Aims ____
(1) Non-Existent: No aims are articulated, or aims are articulated only as program aims.
(2) Minimally Developed: Aims are included but are not clearly stated in terms of what students will learn.
(3) Well Developed: Learning aims clearly identify what majors will learn by completing the degree program.
(4) Highly Developed: Learning aims clearly identify what majors will learn by completing the degree program and have been incorporated into or linked to various courses.

Assessment Mechanism ____
(1) Non-Existent: No assessment mechanism is articulated.
(2) Minimally Developed: An assessment mechanism has been identified for at least one learning aim.
(3) Well Developed: Assessment mechanisms include direct evidence of student learning.
(4) Highly Developed: Assessment mechanisms include direct evidence and have been articulated for all learning aims.

Assessment Results ____
(1) Non-Existent: There is no evidence that assessment data have been gathered.
(2) Minimally Developed: Assessment data have been gathered.
(3) Well Developed: Assessment data have been gathered and there is a systematic process in place for data collection.
(4) Highly Developed: Assessment data have been gathered and there is a systematic process in place for data collection and analysis.

Use of Results ____
(1) Non-Existent: There is no evidence that assessment data have been analyzed at the unit (degree program) level.
(2) Minimally Developed: Assessment data have been analyzed at the unit level.
(3) Well Developed: Assessment data have been analyzed at the unit level and there is a process in place for faculty to use assessment results.
(4) Highly Developed: Assessment data have been analyzed at the unit level, there is a process in place for faculty to use assessment results, and there is at least one example of attempts to improve the program based on assessment results.
Departmental Feedback Letters
 The feedback letter in the first year was a simple graphic indicating where the program fell on the rubric
 The feedback letter in the second year included the rubric ratings and a narrative summary of the reviewers' thoughts about the strengths and weaknesses of the documented assessment
Assessing the Assessment – Detailed Report
Department/Program | Degree | Learning Aims | Assessment Mechanism | Assessment Results | Use of Results
LCAS
Program Name BA High Well Non-Existent Non-Existent
Program Name BS Minimal High Non-Existent Non-Existent
Program Name BS High Well Well Minimal
Program Name BA No Report No Report No Report No Report
Program Name BA Minimal Well Well Well
Program Name BA High Non-Existent Non-Existent Non-Existent
Program Name BA No Report No Report No Report No Report
Program Name BA High High Well Well
Program Name BS High High Well Well
Program Name MA Non-Existent Minimal Well Non-Existent
Program Name MS Minimal Minimal Well Well
Program Name BA High High High Well
Program Name BS High High High High
Assessing the Assessment – Aggregate Report
Loyola University Status of Assessment Based on 2009-10 Reviews
(percentage of degree programs at each level)
Learning Aims: Non-Existent 14%, Minimally Developed 6%, Well Developed 20%, Highly Developed 59%
Assessment Mechanism: Non-Existent 16%, Minimally Developed 8%, Well Developed 25%, Highly Developed 55%
Assessment Results: Non-Existent 23%, Minimally Developed 8%, Well Developed 17%, Highly Developed 52%
Use of Results: Non-Existent 31%, Minimally Developed 8%, Well Developed 16%, Highly Developed 45%
Source: SLAC (2010, May). SLAC Annual Report to the Academic Senate 2009-2010. Loyola University Maryland.
- Percentages are based on a total of 64 degree programs.
- Some degree programs submitted separate assessment reports for different concentrations/delivery methods (e.g., fine arts submitted separate reports for music and theater).
- Three undergraduate and four graduate programs did not submit reports; they are included in the percentages for "Non-Existent."
- Certificate programs are not included.
NEXT STEPS
Challenges and Plans
 Creating a shared understanding of University standards for assessment
 SLAC is using what was learned from two years of assessing the assessment as context for the development of "Principles and Practices of Assessment at Loyola University Maryland."
 Creating a shared understanding that documentation of assessment on the part of academic programs is necessary
 The SLAC report to the Senate with the results of the reviews has been very helpful here
Challenges and Plans
 Varied nature of the assessment reports submitted (format, length, and content)
 Encouraging departments to use a common assessment report template
 Managing the volume of (figurative) paper
 Investigate technological solutions to streamline the file management process
 Time intensive for the Committee
 The level of detail in the feedback to departments is being scaled back
 The Committee is considering a review cycle so that all programs are not reviewed annually
Challenges and Plans
 The “Use of Results” attribute on the rubric was not
interpreted consistently
 The rubric will be revised for clarity
 Not commenting on the quality of the assessment
 ??
 Sharing the details of departmental assessment
activities across the University
 Investigate technological solutions for a repository of examples
and best practices within departments at Loyola
ADVANTAGES
Advantages
 You don’t need to be an assessment expert to review
assessment reports with the rubric
 You don't need to have specialized knowledge of the discipline being reviewed
 We now know, can substantiate, and can communicate the status of engagement with student learning assessment within the academic departments
 Data were reported in the most recent institutional self-study for
regional accreditation
Prepared by:
Terra Schehr
Assistant Vice President for
Institutional Research and Effectiveness
tschehr@loyola.edu
www.loyola.edu/ir