Reflection to Improve Assessment: Unveiling the “Heart” of Faculty Engagement
Mr. Rajeeb Das, Senior Program Evaluator and Assessment Specialist
Dr. Timothy Brophy, Director of Institutional Assessment
University of Florida, Gainesville, Florida
Session Structure
Rajeeb Das
Senior Assessment and Evaluation Specialist
Institutional Planning and Research
University of Florida
Timothy S. Brophy
Director of Institutional Assessment
Office of the Provost
University of Florida
Institutional Context
Our Research Methods and Findings
Improvements and Modifications Based on our Results
YOUR INSTITUTIONAL CONTEXT
The University of Florida Context
About 54,000 students total
Over 5,000 faculty
Research 1, AAU member
Educational Units:
• 16 colleges
• 496 academic programs
Administrative Units:
• 10 Vice Presidential units
• 4 Senior Vice Presidential units
• the Libraries
• the Graduate School
• the Florida Museum of Natural History
Over $700 million in annual research funding
ASSESSMENT ON YOUR CAMPUS
Communication is Key!
We use a distributed leadership model
Each of our 16 colleges, 4 Senior Vice Presidential units, 10 Vice Presidential units, the Graduate School, the Libraries, and the Florida Museum of Natural History has an appointed SACSCOC Coordinator.
These individuals meet as a group when needed, usually twice a year.
We communicate with them; they communicate with their faculty and administration.
UF Assessment System
Faculty and staff
Unit SACSCOC Coordinator
System Inputs – Automated Approval Tracking System
• Assessment and Institutional Effectiveness Plans
• Annual data reports
System Outputs
• Academic Assessment Committee actions (approve, comment, conditionally approve, recycle, deny, or table)
• Feedback and guidance
For more information, see:
Brophy, T. S. (2017). Case study: The University of Florida Assessment System. In T. Cumming & M. D. Miller (Eds.), Enhancing assessment in higher education: Putting psychometrics to work (pp. 186-204). Sterling, VA: Stylus.
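The committee's action vocabulary and plan records lend themselves to a small data model. The following is a minimal illustrative sketch only, not the actual Approval Tracking System; every class, field, and value name here is hypothetical:

```python
# Illustrative sketch only -- not UF's actual tracking system.
# Models the committee actions named on this slide; all names hypothetical.
from dataclasses import dataclass, field
from enum import Enum


class CommitteeAction(Enum):
    """Actions the Academic Assessment Committee can take on a plan."""
    APPROVE = "approve"
    COMMENT = "comment"
    CONDITIONALLY_APPROVE = "conditionally approve"
    RECYCLE = "recycle"
    DENY = "deny"
    TABLE = "table"


@dataclass
class AssessmentPlan:
    """One program's assessment / institutional effectiveness plan."""
    program: str
    coordinator: str  # the unit's SACSCOC Coordinator
    actions: list = field(default_factory=list)  # (action, feedback) history

    def record(self, action: CommitteeAction, feedback: str = "") -> None:
        """Log a committee action plus any feedback or guidance."""
        self.actions.append((action, feedback))


# Hypothetical usage:
plan = AssessmentPlan("B.S. Biology", "Jane Doe")
plan.record(CommitteeAction.CONDITIONALLY_APPROVE, "Clarify the measure for outcome 2.")
```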
The Need for the Research
Patterns emerged from the annual data reports
• Inconsistencies across programs
• Lack of understanding of institutional processes
Anecdotal evidence from the SACSCOC Coordinators
• Individual reports from coordinators about activity in their colleges and units
Academic Assessment Committee interest
Phase 1: Survey of the College SACSCOC Coordinators (Das & Gater, 2015)
1. Started with a validated survey
• Welsh, J. F., & Metcalf, J. (2003). Cultivating faculty support for institutional effectiveness activities: Benchmarking best practices. Assessment & Evaluation in Higher Education, 28(1), 33-45. doi:10.1080/02602930301682
2. Modified it to delete items and add our own
3. Obtained IRB approval (IRB2 2014-U-0242)
4. Created an online survey and sent the link via e-mail
5. Response rate = 11 out of 17 = 65%
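The results slides that follow report simple percentage distributions over a four-point agreement scale. As a minimal tabulation sketch with made-up responses (the actual survey data are not reproduced here):

```python
# Hypothetical tabulation sketch -- sample responses are made up.
from collections import Counter

SCALE = ["Strongly Disagree", "Somewhat Disagree", "Somewhat Agree", "Strongly Agree"]

def distribution(responses):
    """Return {scale point: percent of responses}, rounded to one decimal."""
    counts = Counter(responses)
    n = len(responses)
    return {point: round(100 * counts[point] / n, 1) for point in SCALE}

# Made-up responses for illustration:
sample = ["Strongly Agree"] * 15 + ["Somewhat Agree"] * 5 + ["Somewhat Disagree"]
print(distribution(sample))
# {'Strongly Disagree': 0.0, 'Somewhat Disagree': 4.8,
#  'Somewhat Agree': 23.8, 'Strongly Agree': 71.4}
```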
Survey Questions & Results
• Program goals and/or student learning outcomes will be an important component of SACSCOC accreditation well into the future.
• Efforts to evaluate the effectiveness of my college’s degree program goals and/or student learning outcomes are worthwhile.
[Chart: response distributions on a four-point scale (Strongly Disagree, Somewhat Disagree, Somewhat Agree, Strongly Agree); values shown: 4.8%, 23.8%, 10.0%, 71.4%, 90.0%]
Survey Questions & Results
• Analyzing program goals and/or student learning outcomes is not an important component of my job responsibilities.
• Program goals and/or student learning outcomes are primarily the responsibility of university staff outside my college.
[Chart: response distributions on a four-point scale (Strongly Disagree, Somewhat Disagree, Somewhat Agree, Strongly Agree); values shown: 52.4%, 90.5%, 4.8%, 42.9%, 9.5%]
Survey Questions & Results
• Program goals and/or student learning outcomes in my college are, or would be, strengthened by active participation by faculty members.
• Developing program goals and/or student learning outcomes is a fad that will likely be replaced by another area of emphasis.
[Chart: response distributions on a four-point scale (Strongly Disagree, Somewhat Disagree, Somewhat Agree, Strongly Agree); values shown: 38.1%, 52.4%, 9.5%, 57.1%, 42.9%]
Survey Questions & Results
• Documenting program goals and/or student learning outcomes should be an integral element of any college or department-specific SACSCOC accreditation criteria.
• Resources dedicated to program goal and/or student learning outcome activities are investments in the long-term effectiveness of my college.
[Chart: response distributions on a four-point scale (Strongly Disagree, Somewhat Disagree, Somewhat Agree, Strongly Agree); values shown: 4.8%, 4.8%, 14.3%, 9.5%, 19.0%, 23.8%, 61.9%, 61.9%]
Phase 1: Modifications
• Results led to change
• The Office of Institutional Assessment modified its annual professional development
– Emphasized that we support PD initiatives in the field (one-on-one, small groups, telephone support)
– Customized professional development to the extent possible
Phase 2: Stakeholder Interviews (Das & Gater, 2015)
Structured interviews with college representatives – the stakeholders responsible for assessment data
• What duties comprise your role/job?
• Describe the annual data collection process.
• How do you handle training?
• How do you motivate faculty? New faculty?
• What do you do when data is not entered, or entered inadequately?
• Any tips/advice?
Phase 2: Stakeholder Interview Results
Evidence: Centralized data entry produces the highest-quality results.
Result: We recommend this as a best practice.
Evidence: Training styles varied with the size of the department or college: a regular agenda item in faculty meetings, one-on-one training, ad hoc meetings, etc.
Result: The data support continuing to offer training for compliance with CS 3.3.1 in multiple formats: web, video, PDF, phone.
Evidence: Some units did not use data collected for CS 3.3.1 for anything other than CS 3.3.1.
Result: We shared ways other faculty used results (e.g., as a template for other accrediting bodies, in program improvement processes).
Phase 3: Faculty Focus Groups
The Das & Gater findings revealed a need to better understand faculty engagement with assessment.
The Academic Assessment Committee developed these research questions:
• How are UF faculty engaged in academic assessment processes at the University of Florida?
• In what ways could this information lead to the modification and improvement of institutional assessment processes?
Participants
16 focus groups (one in each college), N = 146
Tenure and tenure-track faculty; all were invited to join groups
Specific days and times were set; Doodle polls were sent out, and focus groups were established based on availability and interest
Delimitation: limited to faculty who were available at the scheduled times in each college
Question Categories
• Faculty Engagement with Assessment
• Instructor Assessments
• Perceived Value of Assessments
• Assessment at the Department/Program/Major level
• Closing question – What haven’t we asked you today that you would like to talk about?
Results
34 sets of data – field notes and recordings
Loaded into NVivo 11
Coded first for response categories
Eight response categories and three themes emerged
Response Categories
Data Category – Description
Assessment methods and processes – Various assessment types used for the assessment of student learning
Challenges – Issues that impede assessment or make it challenging
Concerns – Areas that cause concern or are barriers to assessment they would like to do
Context – Factors that influence assessment that faculty cannot control
Data gaps – Information that faculty would like to collect but cannot or do not
Needs – What faculty would like to have to facilitate their assessment processes
Use of Results – The ways that faculty use the results of their assessments
Value – What faculty value about assessment
Theme 1: Assessment is Valued
Faculty quotes
“If you don’t determine they’re learning, why are we here?”
“There is value in seeing students succeed, and assessment provides information that is used to re-examine student knowledge.”
Findings
The value of assessment is often directly associated with standards of the field
Use of results is consistent, but purposes and methods vary
• Most prevalent: assessment data used to modify instruction to advance student learning
• Open dialogue is a primary assessment methodology for those who teach/mentor in one-to-one teaching situations
Faculty want to learn from their peers – sharing assessment methods and processes is valuable
Theme 2: Influential Conditions
Faculty quotes
“The type of assessment I use depends on the size of the class.”
“Our disciplinary accreditor requires a set of national exams that all of our students must take. Why can’t we use these as outcome measures?”
Findings
Two conditions that impact assessment were common across the colleges: class size and disciplinary accreditation
Class Size
Primary driver for assessment methodology: the number of students in the class
Large classes constrain assessment choices to large-scale measures (such as exams scored electronically)
There is a tension between what faculty want to do to assess their students and what they feel they must do because of class size
Disciplinary Accreditation
Disciplinary accreditors often require student learning measures, and some prescribe student learning outcomes
Some disciplinary accreditors have established assessment standards
Frustrations:
• Aligning disciplinary accreditation requirements with SACSCOC requirements
• Appropriate use of required third-party exams
Theme 3: Misconceptions about SACSCOC Reporting
Findings
We didn’t ask, but…
Accreditation reporting was raised in nearly every college
Three misconceptions emerged:
• All student learning must be quantified
• Academic assessment is limited to specific categories and types
• The data “disappears”
Misconception 1: All student learning must be quantified
“Our faculty are very engaged in gathering anecdotal evidence, but push back with quantification of student learning information.”
“Subjective data cannot be quantified.”
• Likely arises from a UF requirement to provide a rubric
• We ask for summary data; for some, this has been conflated with quantification of student learning data
Misconception 2: Assessment is limited to specific categories and types
“The criteria for SACSCOC are limited; I feel like my hands are tied.”
• Florida regulations require certain categories of student learning outcomes
• The UF Graduate School also has established outcome categories
• However, additional categories are permitted
Misconception 3: The data “disappears”
Faculty related concerns about not knowing what happens to the data they report
Data reporting is done in our accreditation module from a third-party provider
Access is limited to those who have a “role” in the software program
This is a legitimate concern
CULTIVATING ENGAGEMENT: ACTIONS TAKEN BASED ON OUR ANALYSIS
Modifications based on our analysis
Finding: Value – share assessment work across colleges
Recommendation: Continue the UF Assessment Conference, and develop an online mechanism for faculty to share their assessment work with others.
Finding: Influential conditions – (1) class size, (2) disciplinary accreditation
Recommendations:
1. Develop faculty workshops in conjunction with the Office of Faculty Development and Teaching Excellence on using Canvas assessment tools to facilitate data collection for multiple assessment methods.
2. Work with specific disciplines to maximize the use of student learning data collected for disciplinary accreditors in regional accreditation reports.
Modifications based on our analysis
Finding: Misconceptions – (1) quantification of student learning data; (2) limitations on assessment outcomes and measures; (3) faculty are not clear on how the data they report is used at the institutional level, nor do they have ready access to it
Recommendations:
1. Develop online tools to clarify what can be reported.
2. Develop online tools to clarify assessment types.
3. Develop an accessible view of student learning data reports, perhaps through visualization.
Current Visualization Projects (in progress)
General Education
The Quality Enhancement Plan
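For illustration only (these are not the UF projects themselves), an accessible view of summary Likert-style report data could start from a stacked-bar sketch like the following, using matplotlib with hypothetical item names and numbers:

```python
# Illustrative sketch only -- hypothetical items and numbers, not UF data.
import matplotlib.pyplot as plt

SCALE = ["Strongly Disagree", "Somewhat Disagree", "Somewhat Agree", "Strongly Agree"]
# Percent of respondents per scale point, one row per survey item (made up).
items = {
    "Evaluation efforts are worthwhile": [0.0, 4.8, 23.8, 71.4],
    "Outcomes will matter into the future": [0.0, 0.0, 10.0, 90.0],
}

fig, ax = plt.subplots(figsize=(8, 2.5))
for row, (item, pcts) in enumerate(items.items()):
    left = 0.0
    for point, pct in zip(SCALE, pcts):
        # Stack each scale point's share as one horizontal segment.
        ax.barh(row, pct, left=left, label=point if row == 0 else None)
        left += pct
ax.set_yticks(range(len(items)))
ax.set_yticklabels(list(items.keys()))
ax.set_xlabel("Percent of respondents")
ax.legend(loc="lower right", fontsize="small")
fig.tight_layout()
plt.show()
```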
Continuing Research
Non-tenure track faculty, N = 1,860
Will take place in February–March 2018
UF Assessment Conferences
Next internal conference is March 30, 2018
Theme: Assessment@UF: Enhancing Program Excellence through Assessment
• Plenary with our Peers: Engaging with Assessment in the Research University (panelists from UT Austin, UNC Chapel Hill, and the University of Virginia)
April 3-5, 2019: Expanded conference – stay tuned for information!
Questions and Discussion
Rajeeb Das, Institutional Planning and Research
rdas@ufl.edu
352-392-0456
Timothy S. Brophy, Institutional Assessment
tbrophy@aa.ufl.edu
352-273-4476