Exams Evaluate Students:
Who’s Evaluating Exams?
Data-Informed Exam Design
G. Alex Ambrose, Professor of the Practice, Kaneb Center for Teaching and Learning
Kael Kanczuzewski, Academic Technology Professional, Learning Platforms
Xiaojing Duan, Learning Platform/Analytics Engineer, Learning Platforms
Kelley M. H. Young, Assistant Teaching Professor, Dept of Chemistry and Biochemistry
J. Daniel Gezelter, Professor and Director of Undergraduate Studies, Dept of Chemistry and Biochemistry
University of Notre Dame
2019 Midwest Scholarship of Teaching and Learning Conference
How to Cite this Presentation:
Ambrose, G. Alex, Duan, Xiaojing, Kanczuzewski, Kael,
Young, Kelley M., & Gezelter, J. Daniel (2019) “Exams
Evaluate Students: Who’s Evaluating Exams? Data-
Informed Exam Design” 2019 Midwest Scholarship of
Teaching and Learning Conference, Indiana University-
South Bend.
- Learning Platforms
- Enterprise Architecture
- Platform Services
- InfoSec
- Business Intelligence
- Project Management
Collaborators
Research Context, Challenge, Goal, & Questions
Exam Data & Tools
Exam Item Analysis, Analytics, & Dashboard
Course & Instructor Implications
Questions & Discussion
Research Context, Challenge, & Goals
Exams are:
1. a tool to assess mastery,
2. an incentive to motivate students to study, and
3. the cause of retention issues for underserved and underrepresented
students in STEM majors.
Challenge: Can we make exams do the first two tasks more effectively while
fixing the retention issue?
Goals: Transform the exam design, delivery, grading, analysis, and redesign
process to make it more efficient, effective, error-free, easy to use, and
enjoyable.
Research Questions
RQ1: How do we evaluate an assessment technology tool?
RQ2: What are the best item analysis methods and easiest visualizations to
support students and instructors?
RQ3: What are the course, student & instructor impacts and implications for
continuous improvement changes?
RQ1: How do we evaluate an assessment technology tool?
Ed Tech Evaluation: SAMR + 5 E’s
The 5 E’s of Usability: Error Tolerant, Effective, Easy to Learn, Efficient, Engaging
References:
https://www.wqusability.com/
https://www.schoology.com/blog/samr-model-practical-guide-edtech-integration
Data Ownership… or at least full access
https://www.jisc.ac.uk/learning-analytics
The Gradescope Pilot
Gradescope enables instructors to grade
paper-based exams and assignments online.
Paper exams are scanned; Gradescope’s AI
interprets responses and groups similar
answers to speed up grading, and rubrics
help ensure fair, consistent grading.
Working closely with Gradescope, we have
access to data exports, including item-level
question data.
Gradescope Instructor Results and SAMR
SAMR levels: Substitution, Augmentation, Modification, Redefinition
[Survey result charts: N=14; N=946; N=946]
RQ2: What are the best item analysis methods and easiest visualizations to
support students and instructors?
Gradescope Current Distractor Performance
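Distractor performance looks at how students distribute across an item’s answer options; a popular wrong answer can flag a confusing stem or a common misconception. A minimal sketch with made-up responses (the data and function name are illustrative, not Gradescope’s export format):

```python
from collections import Counter

def choice_distribution(choices):
    """Percent of students selecting each answer option for one item."""
    counts = Counter(choices)
    total = len(choices)
    return {opt: round(100 * counts[opt] / total, 1) for opt in sorted(counts)}

# Hypothetical responses for one multiple-choice item (correct answer: 'B').
responses = ['B', 'B', 'A', 'C', 'B', 'D', 'B', 'A', 'B', 'B']
print(choice_distribution(responses))
# {'A': 20.0, 'B': 60.0, 'C': 10.0, 'D': 10.0}
```

Here 'A' draws 20% of students, so it is doing real work as a distractor, while 'C' and 'D' attract almost no one.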
Item Difficulty Index
● Definition: a measure of how many students exhibited mastery of one topic.
● Formula: the percentage of the total group that got the item correct.
Reference: https://www.uwosh.edu/testing/faculty-information/test-scoring/score-report-interpretation/item-analysis-1/item-difficulty
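The formula above, percent of the total group answering correctly, can be computed directly from an item-by-student score matrix. A minimal sketch with a hypothetical matrix (not the Gradescope export schema):

```python
def item_difficulty(responses):
    """Percent of students who answered each item correctly.
    `responses` has one row per student, one column per item (1 = correct)."""
    n_students = len(responses)
    n_items = len(responses[0])
    return [100 * sum(row[j] for row in responses) / n_students
            for j in range(n_items)]

# Hypothetical score matrix for 4 students x 3 items.
responses = [[1, 1, 0],
             [1, 0, 0],
             [1, 1, 1],
             [0, 1, 0]]
print(item_difficulty(responses))  # [75.0, 75.0, 25.0]
```

Note the convention: a higher index means an *easier* item, since it is simply the percent correct.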
Item Discrimination Index
● Definition: a measure of an item’s effectiveness in differentiating students who
mastered the topic from those who did not.
● Formula:
○ (Upper Group Percent Correct) - (Lower Group Percent Correct)
○ Upper Group = Top 27% of exam score
○ Lower Group = Lowest 27% of exam score
● Index scale:
○ 40%-100%: Excellent Predictive Value
○ 25%-39%: Good Predictive Value
○ 0%-24%: Possibly No Predictive Value
Reference: https://www.uwosh.edu/testing/faculty-information/test-scoring/score-report-interpretation/item-analysis-1/item-i
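The upper-minus-lower formula can be sketched as follows; the data is hypothetical, and `frac=0.27` encodes the 27% grouping convention above:

```python
def discrimination_index(item_correct, total_scores, frac=0.27):
    """(Upper-group % correct) minus (lower-group % correct), where the
    upper/lower groups are the top and bottom 27% by total exam score."""
    order = sorted(range(len(total_scores)),
                   key=lambda i: total_scores[i], reverse=True)
    k = max(1, round(frac * len(order)))          # group size
    upper, lower = order[:k], order[-k:]
    pct = lambda group: 100 * sum(item_correct[i] for i in group) / len(group)
    return pct(upper) - pct(lower)

# Hypothetical item: per-student correctness (1 = correct), aligned with
# each student's total exam score.
item_correct = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]
totals       = [95, 90, 85, 80, 75, 70, 65, 60, 55, 50]
print(discrimination_index(item_correct, totals))  # 100.0
```

An index of 100.0 lands in the “Excellent Predictive Value” band: every top-27% student got the item right and every bottom-27% student got it wrong.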
Connecting Exam to HW Analytics
RQ3: What are the course, student & instructor impacts and implications for
continuous improvement changes?
Course & Instructor Implications
Exam design is slightly modified –
item answer spaces are delineated,
and an initial rubric is put in place.
Exam processing requires
significant investment in labor.
Course & Instructor Implications
Exam scanning adds roughly
2-3 hours of time per 1,000
exams.
Exam grading is much smoother,
and improvements are immediately
apparent.
Course & Instructor Implications
Exam feedback can be more
informative and personalized.
● Applied rubric items
● Personal item feedback
Exam data is easily accessible.
● Overall exam statistics
● Item-by-item statistics
● Grades synchronize with LMS
Course & Instructor Implications
Regrade requests drop dramatically.
● Previous benchmark: 40 regrade
requests per 1,000 exams
(4% of exams)
After moving to Gradescope:
● Exam item regrade requests: 64
● Total exam items graded: 85,078
● 0.075% of items regraded
Monolithic Exams
The data analytics and economies of
scale are only possible with monolithic
exams.
Multiple versions and randomized
answers are still works in progress at
Gradescope.
Many large courses at ND don’t
currently use monolithic exams.
Course & Instructor Implications
Test Item Library
We want test questions that efficiently
gauge mastery of material.
We want to eliminate item bias,
particularly linguistic and cultural
biases.
Do free response items test mastery of
material that multiple choice items
don’t capture?
27
Early Warning System
Can we catch struggling students early
in the semester?
Do homework attempts signify
problems with mastery?
Do particular homework items correlate
with particular exam items?
Which homework items don’t provide
mastery on exams?
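The homework-to-exam correlation question above can be probed pairwise. A minimal sketch with hypothetical per-student item scores (a real early-warning system would need enough students, and thresholds, for the correlation to be meaningful):

```python
def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-student correctness (1 = correct) on one homework item
# and one exam item covering the same topic.
hw_item   = [1, 0, 1, 1, 0]
exam_item = [1, 0, 1, 0, 0]
print(round(pearson(hw_item, exam_item), 2))  # 0.67
```

Homework items whose scores correlate strongly with later exam items are candidate early-warning signals; items with near-zero correlation may not be building the mastery the exam tests.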
Summary
RQ1: How do we evaluate an assessment technology tool? (SAMR, 5 E’s)
RQ2: What are the best item analysis methods and easiest visualizations to
support students and instructors? (Distractor Performance, Item
Difficulty & Discrimination)
RQ3: What are the course, student & instructor impacts and implications for
continuous improvement changes? (Scanning, Grading & Regrading, Feedback,
Data & Analytics, Revisiting & Revising the Test Item Library, & Early Warning)
Future Work?
● Early Warning: Cross-reference student learning activity, homework
analytics, and exam item analysis so instructors can intervene early to
improve student performance, the course, and assessment design.
● Question Bank: Over time, build a more inclusive question bank in
Gradescope (avoiding overly long items and unintentional bias) and
compare exam items year over year.
● Deeper Analysis: Overlay filters based on demographics, SES, ESL, and
HS preparation.
● Scale to Other STEM Courses: Calculus, Organic Chemistry, and Physics.
References
Ambrose, G. A., Abbott, K., & Lanski, A. (2017). Under the hood of a next generation digital learning environment in progress. Educause Review.
Gugiu, M. R., & Gugiu, P. C. (2013). Utilizing item analysis to improve the evaluation of student performance. Journal of Political Science Education, 9(3), 345-361.
Kern, B., et al. (2015). The role of SoTL in the academy: Upon the 25th anniversary of Boyer’s Scholarship Reconsidered. Journal of the Scholarship of Teaching and Learning, 15(3), 1-14.
Miller, P., & Duan, X. (2018). NGDLE learning analytics: Gaining a 360-degree view of learning. Educause Review.
Nielsen, J. (1993). Usability engineering (1st ed.). Morgan Kaufmann.
Nieveen, N., & van den Akker, J. (1999). Exploring the potential of a computer tool for instructional developers. Educational Technology Research and Development, 47(3), 77-98.
Puentedura, R. R. (2014). SAMR and TPCK: A hands-on approach to classroom practice. Hippasus. Retrieved from http://www.hippasus.com/rrpweblog/archives/2012/09/03/BuildingUponSAMR.pdf
Siri, A., & Freddano, M. (2011). The use of item analysis for the improvement of objective examinations. Procedia-Social and Behavioral Sciences, 29, 188-197.
Syed, M., Anggara, T., Duan, X., Lanski, A., Chawla, N., & Ambrose, G. A. (2018). Learning analytics modular kit: A closed loop success story in boosting students. Proceedings of the International Conference on Learning Analytics & Knowledge.
More Information, Connect, Collaborate?
Visit our Lab Blog at sites.nd.edu/real

More Related Content

What's hot

LAK21 Data Driven Redesign of Tutoring Systems (Yun Huang)
LAK21 Data Driven Redesign of Tutoring Systems (Yun Huang)LAK21 Data Driven Redesign of Tutoring Systems (Yun Huang)
LAK21 Data Driven Redesign of Tutoring Systems (Yun Huang)
Yun Huang
 
EDUCA Leveraging Analytics FINAL
EDUCA Leveraging Analytics FINALEDUCA Leveraging Analytics FINAL
EDUCA Leveraging Analytics FINAL
Ellen Wagner
 
Glfes summer institute2013_raleigh_final
Glfes summer institute2013_raleigh_finalGlfes summer institute2013_raleigh_final
Glfes summer institute2013_raleigh_final
Tricia Townsend
 
Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13
Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13
Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13
Sheryl Abshire
 
Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13
Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13
Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13
Sheryl Abshire
 
Inquiry research action plan
Inquiry research action planInquiry research action plan
Inquiry research action plan
oliviaherak
 
Making the Argument for Learning Science in Informal Environments - Math in z...
Making the Argument for Learning Science in Informal Environments - Math in z...Making the Argument for Learning Science in Informal Environments - Math in z...
Making the Argument for Learning Science in Informal Environments - Math in z...
K L
 

What's hot (20)

Projctppt (1)
Projctppt (1)Projctppt (1)
Projctppt (1)
 
Designing Learning Analytics for Humans with Humans
Designing Learning Analytics for Humans with HumansDesigning Learning Analytics for Humans with Humans
Designing Learning Analytics for Humans with Humans
 
LAK21 Data Driven Redesign of Tutoring Systems (Yun Huang)
LAK21 Data Driven Redesign of Tutoring Systems (Yun Huang)LAK21 Data Driven Redesign of Tutoring Systems (Yun Huang)
LAK21 Data Driven Redesign of Tutoring Systems (Yun Huang)
 
Educational Data Mining in Program Evaluation: Lessons Learned
Educational Data Mining in Program Evaluation: Lessons LearnedEducational Data Mining in Program Evaluation: Lessons Learned
Educational Data Mining in Program Evaluation: Lessons Learned
 
Dit elearning summer school 2015 analytics
Dit elearning summer school 2015   analyticsDit elearning summer school 2015   analytics
Dit elearning summer school 2015 analytics
 
Introduction to Learning Analytics
Introduction to Learning AnalyticsIntroduction to Learning Analytics
Introduction to Learning Analytics
 
EDUCA Leveraging Analytics FINAL
EDUCA Leveraging Analytics FINALEDUCA Leveraging Analytics FINAL
EDUCA Leveraging Analytics FINAL
 
Learning Analytics @ The Open University
Learning Analytics @ The Open UniversityLearning Analytics @ The Open University
Learning Analytics @ The Open University
 
ISSOTL Presentation
ISSOTL PresentationISSOTL Presentation
ISSOTL Presentation
 
Developing Self-regulated Learning in High-school Students: The Role of Learn...
Developing Self-regulated Learning in High-school Students: The Role of Learn...Developing Self-regulated Learning in High-school Students: The Role of Learn...
Developing Self-regulated Learning in High-school Students: The Role of Learn...
 
Glfes summer institute2013_raleigh_final
Glfes summer institute2013_raleigh_finalGlfes summer institute2013_raleigh_final
Glfes summer institute2013_raleigh_final
 
Riding the tiger: dealing with complexity in the implementation of institutio...
Riding the tiger: dealing with complexity in the implementation of institutio...Riding the tiger: dealing with complexity in the implementation of institutio...
Riding the tiger: dealing with complexity in the implementation of institutio...
 
Using predictive indicators of student success at scale – implementation succ...
Using predictive indicators of student success at scale – implementation succ...Using predictive indicators of student success at scale – implementation succ...
Using predictive indicators of student success at scale – implementation succ...
 
Blackboard Learning Analytics Research Update
Blackboard Learning Analytics Research UpdateBlackboard Learning Analytics Research Update
Blackboard Learning Analytics Research Update
 
From theory to practice blending the math classroom and creating a data cultu...
From theory to practice blending the math classroom and creating a data cultu...From theory to practice blending the math classroom and creating a data cultu...
From theory to practice blending the math classroom and creating a data cultu...
 
Learning Management Systems Evaluation based on Neutrosophic sets
Learning Management Systems Evaluation based on Neutrosophic setsLearning Management Systems Evaluation based on Neutrosophic sets
Learning Management Systems Evaluation based on Neutrosophic sets
 
Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13
Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13
Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13
 
Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13
Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13
Lamar resconfdevelopmentimplementationuseeportfoliospk 12 schools-3-22-13
 
Inquiry research action plan
Inquiry research action planInquiry research action plan
Inquiry research action plan
 
Making the Argument for Learning Science in Informal Environments - Math in z...
Making the Argument for Learning Science in Informal Environments - Math in z...Making the Argument for Learning Science in Informal Environments - Math in z...
Making the Argument for Learning Science in Informal Environments - Math in z...
 

Similar to Exams evaluate students. Who’s evaluating exams? Data-Informed Exam Design

Increasing Course Revision Efficacy
Increasing Course Revision EfficacyIncreasing Course Revision Efficacy
Increasing Course Revision Efficacy
Steven McGahan
 
Peerwise and students’ contribution experiences from the field
Peerwise and students’ contribution experiences from the fieldPeerwise and students’ contribution experiences from the field
Peerwise and students’ contribution experiences from the field
Lenandlar Singh
 
What can e assessment do for teaching & learning
What can e assessment do for teaching & learningWhat can e assessment do for teaching & learning
What can e assessment do for teaching & learning
Kenji Lamb
 

Similar to Exams evaluate students. Who’s evaluating exams? Data-Informed Exam Design (20)

Improving student learning through programme assessment
Improving student learning through programme assessmentImproving student learning through programme assessment
Improving student learning through programme assessment
 
Increasing Course Revision Efficacy
Increasing Course Revision EfficacyIncreasing Course Revision Efficacy
Increasing Course Revision Efficacy
 
CrICET: Learning Studios
CrICET: Learning StudiosCrICET: Learning Studios
CrICET: Learning Studios
 
Investigating learning strategies in a dispositional learning analytics conte...
Investigating learning strategies in a dispositional learning analytics conte...Investigating learning strategies in a dispositional learning analytics conte...
Investigating learning strategies in a dispositional learning analytics conte...
 
Quality matters in Blended Course Design and Development
Quality matters in Blended Course Design and DevelopmentQuality matters in Blended Course Design and Development
Quality matters in Blended Course Design and Development
 
Learner Analytics and the “Big Data” Promise for Course & Program Assessment
Learner Analytics and the “Big Data” Promise for Course & Program AssessmentLearner Analytics and the “Big Data” Promise for Course & Program Assessment
Learner Analytics and the “Big Data” Promise for Course & Program Assessment
 
Professional development in MCQ writing
Professional development in MCQ writingProfessional development in MCQ writing
Professional development in MCQ writing
 
Relating Instructional Materials Use to Student Achievement Using Validated M...
Relating Instructional Materials Use to Student Achievement Using Validated M...Relating Instructional Materials Use to Student Achievement Using Validated M...
Relating Instructional Materials Use to Student Achievement Using Validated M...
 
Keynote: 7th eSTEeM Annual Conference Critical discussion of Student Evaluati...
Keynote: 7th eSTEeM Annual Conference Critical discussion of Student Evaluati...Keynote: 7th eSTEeM Annual Conference Critical discussion of Student Evaluati...
Keynote: 7th eSTEeM Annual Conference Critical discussion of Student Evaluati...
 
Understanding Student Learning Using Learning Management Systems and Basic An...
Understanding Student Learning Using Learning Management Systems and Basic An...Understanding Student Learning Using Learning Management Systems and Basic An...
Understanding Student Learning Using Learning Management Systems and Basic An...
 
Bb Education on Tour | Blackboard Learning Analytics | Chris Eske, Platform S...
Bb Education on Tour | Blackboard Learning Analytics | Chris Eske, Platform S...Bb Education on Tour | Blackboard Learning Analytics | Chris Eske, Platform S...
Bb Education on Tour | Blackboard Learning Analytics | Chris Eske, Platform S...
 
Moodle moot Dublin May 2015
Moodle moot Dublin May 2015Moodle moot Dublin May 2015
Moodle moot Dublin May 2015
 
Peerwise and students’ contribution experiences from the field
Peerwise and students’ contribution experiences from the fieldPeerwise and students’ contribution experiences from the field
Peerwise and students’ contribution experiences from the field
 
IRJET- Academic Performance Analysis System
IRJET- Academic Performance Analysis SystemIRJET- Academic Performance Analysis System
IRJET- Academic Performance Analysis System
 
Oral Defense presentation
Oral Defense presentationOral Defense presentation
Oral Defense presentation
 
Assessment Futures: The Role for e-Assessment?
Assessment Futures: The Role for e-Assessment?Assessment Futures: The Role for e-Assessment?
Assessment Futures: The Role for e-Assessment?
 
Prospect for learning analytics to achieve adaptive learning model
Prospect for learning analytics to achieve  adaptive learning modelProspect for learning analytics to achieve  adaptive learning model
Prospect for learning analytics to achieve adaptive learning model
 
Retiring Exam Questions? How to Use These Items in Formative Assessments
Retiring Exam Questions? How to Use These Items in Formative AssessmentsRetiring Exam Questions? How to Use These Items in Formative Assessments
Retiring Exam Questions? How to Use These Items in Formative Assessments
 
What can e assessment do for teaching & learning
What can e assessment do for teaching & learningWhat can e assessment do for teaching & learning
What can e assessment do for teaching & learning
 
Birmingham Assessment and Feedback Symposium
Birmingham Assessment and Feedback Symposium Birmingham Assessment and Feedback Symposium
Birmingham Assessment and Feedback Symposium
 

More from G. Alex Ambrose

Examining the Role of Digital Badges in a University's Massive Open Online Co...
Examining the Role of Digital Badges in a University's Massive Open Online Co...Examining the Role of Digital Badges in a University's Massive Open Online Co...
Examining the Role of Digital Badges in a University's Massive Open Online Co...
G. Alex Ambrose
 
Googlios 21st Century Skills & Learning in Digital Age-pdf
Googlios 21st Century Skills & Learning in Digital Age-pdfGooglios 21st Century Skills & Learning in Digital Age-pdf
Googlios 21st Century Skills & Learning in Digital Age-pdf
G. Alex Ambrose
 

More from G. Alex Ambrose (8)

Liberate Learning through Next Generation Assessment -AACU 2018 Closing Plenary
Liberate Learning through Next Generation Assessment -AACU 2018 Closing PlenaryLiberate Learning through Next Generation Assessment -AACU 2018 Closing Plenary
Liberate Learning through Next Generation Assessment -AACU 2018 Closing Plenary
 
Incorporating ePortfolios into Advising Practice
Incorporating ePortfolios  into Advising PracticeIncorporating ePortfolios  into Advising Practice
Incorporating ePortfolios into Advising Practice
 
Examining the Role of Digital Badges in a University's Massive Open Online Co...
Examining the Role of Digital Badges in a University's Massive Open Online Co...Examining the Role of Digital Badges in a University's Massive Open Online Co...
Examining the Role of Digital Badges in a University's Massive Open Online Co...
 
Flipped Finals: Assessment As Learning via Culminating ePortfolios
Flipped Finals: Assessment As Learning via Culminating ePortfoliosFlipped Finals: Assessment As Learning via Culminating ePortfolios
Flipped Finals: Assessment As Learning via Culminating ePortfolios
 
Googlios 21st Century Skills & Learning in Digital Age-pdf
Googlios 21st Century Skills & Learning in Digital Age-pdfGooglios 21st Century Skills & Learning in Digital Age-pdf
Googlios 21st Century Skills & Learning in Digital Age-pdf
 
Googlios: Next Generation E-Portfolios at the University of Notre Dame
Googlios: Next Generation E-Portfolios at the University of Notre DameGooglios: Next Generation E-Portfolios at the University of Notre Dame
Googlios: Next Generation E-Portfolios at the University of Notre Dame
 
Digital Storytelling 101
Digital Storytelling 101Digital Storytelling 101
Digital Storytelling 101
 
Visual Definitions of Web 2.0
Visual Definitions of Web 2.0Visual Definitions of Web 2.0
Visual Definitions of Web 2.0
 

Recently uploaded

Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in DelhiRussian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
kauryashika82
 
Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptx
negromaestrong
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
QucHHunhnh
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
ciinovamais
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
QucHHunhnh
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
PECB
 

Recently uploaded (20)

Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy Consulting
 
How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104
 
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in DelhiRussian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
Russian Escort Service in Delhi 11k Hotel Foreigner Russian Call Girls in Delhi
 
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
 
Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptx
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
 
Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...
Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...
Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
Micro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdfMicro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdf
 
General Principles of Intellectual Property: Concepts of Intellectual Proper...
General Principles of Intellectual Property: Concepts of Intellectual  Proper...General Principles of Intellectual Property: Concepts of Intellectual  Proper...
General Principles of Intellectual Property: Concepts of Intellectual Proper...
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdf
 
Unit-V; Pricing (Pharma Marketing Management).pptx
Unit-V; Pricing (Pharma Marketing Management).pptxUnit-V; Pricing (Pharma Marketing Management).pptx
Unit-V; Pricing (Pharma Marketing Management).pptx
 
Asian American Pacific Islander Month DDSD 2024.pptx
Asian American Pacific Islander Month DDSD 2024.pptxAsian American Pacific Islander Month DDSD 2024.pptx
Asian American Pacific Islander Month DDSD 2024.pptx
 
Key note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfKey note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdf
 
Food Chain and Food Web (Ecosystem) EVS, B. Pharmacy 1st Year, Sem-II
Food Chain and Food Web (Ecosystem) EVS, B. Pharmacy 1st Year, Sem-IIFood Chain and Food Web (Ecosystem) EVS, B. Pharmacy 1st Year, Sem-II
Food Chain and Food Web (Ecosystem) EVS, B. Pharmacy 1st Year, Sem-II
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
 
Energy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural Resources
Energy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural ResourcesEnergy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural Resources
Energy Resources. ( B. Pharmacy, 1st Year, Sem-II) Natural Resources
 

Exams evaluate students. Who’s evaluating exams? Data-Informed Exam Design

  • 1. Exams Evaluate Students: Who’s Evaluating Exams? Data-Informed Exam Design G. Alex Ambrose, Professor of the Practice, Kaneb Center for Teaching and Learning Kael Kanczuzewski, Academic Technology Professional, Learning Platforms Xiaojing Duan, Learning Platform/Analytics Engineer, Learning Platforms Kelley M. H Young, Assistant Teaching Professor, Dept of Chemistry and Biochemistry J. Daniel Gezelter, Professor and Director of Undergraduate Studies, Dept of Chemistry and Biochemistry University of Notre Dame 2019 Midwest Scholarship of Teaching and Learning Conference 1
  • 2. How to Cite this Presentation: Ambrose, G. Alex, Duan, Xiaojing, Kanczuzewski, Kael, Young, Kelley M., & Gezelter, J. Daniel (2019) “Exams Evaluate Students: Who’s Evaluating Exams? Data- Informed Exam Design” 2019 Midwest Scholarship of Teaching and Learning Conference, Indiana University- South Bend.
  • 3. - Learning Platforms - Enterprise Architecture - Platform Services - InfoSec - Business Intelligence - Project Management Collaborators
  • 4. Research Context, Challenge, Goal, & Questions Exam Data & Tools Exam Item Analysis, Analytics, & Dashboard Course & Instructor Implications Questions & Discussion 4
  • 5. Research Context, Challenge, & Goals Exams are: 1. a tool to assess mastery, 2. an incentive to motivate students to study, and 3. the cause of retention issues for underserved and underrepresented students in STEM majors. Challenge: Can we make exams do the first two tasks more effectively while fixing the retention issue? Goals: Transform the exam design, delivery, grading, analysis, and redesign process to make it more efficient, effective, error-free, easy to use, and enjoyable. 5
  • 6. Research Questions RQ1: How do we evaluate an assessment technology tool? RQ2: What are the best item analysis methods and easiest visualizations to support students and instructors? RQ3: What are the course, student & instructor impacts and implications for continuous improvement changes? 6
  • 7. Research Context, Challenge, Goal, & Questions Exam Data & Tools Exam Item Analysis, Analytics, & Dashboard Course & Instructor Implications Questions & Discussion 7 RQ1: How do we evaluate an assessment technology tool?
  • 8. Ed Tech Evaluation: SAMR + 5 E’s 8 https://www.wqusability.com/ https://www.schoology.com/blog/samr-model-practical-guide-edtech-integration Error Tolerant Effective Easy to Learn Efficient Engaging 5 E’s of Usability
  • 9. Data Ownership.. Or at least full access 9 https://www.jisc.ac.uk/learning-analytics
  • 10. The Gradescope Pilot Gradescope enables instructors to grade paper-based exams or assignments online. Paper exams are scanned. Gradescope AI interprets responses and groups similar answers to speed up grading. Rubrics help ensure fair and consistent grading. Working closely with Gradescope, we have access to export data including item-level question data. 10
  • 11.
  • 12. 12 Gradescope Instructor Results and SAMR Substitution Augmentation Modification Redefinition N=14
  • 13. N=946
  • 14. N=946
  • 15. Research Context, Challenge, Goal, & Questions Exam Data & Tools Exam Item Analysis, Analytics, & Dashboard Course & Instructor Implications Questions & Discussion 15 RQ2: What are the best item analysis methods and easiest visualizations to support students and instructors?
  • 17. Item Difficulty Index ● Definition: the proportion of the total group that answered the item correctly (a higher value indicates an easier item). ● Formula: (Number of Students Answering Correctly) ÷ (Total Number of Students), expressed as a percentage. Reference: https://www.uwosh.edu/testing/faculty-information/test-scoring/score-report-interpretation/item-analysis-1/item-difficulty 17
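The percent-correct formula above can be sketched in a few lines of Python. This is a minimal illustration; the function name is ours, not part of Gradescope's tooling:

```python
def item_difficulty(responses):
    """Item difficulty index: fraction of the group answering correctly.

    responses: list of 0/1 scores for a single exam item (1 = correct).
    Returns a value between 0.0 (no one correct) and 1.0 (everyone correct).
    """
    return sum(responses) / len(responses)

# Example: 7 of 10 students answered the item correctly.
print(item_difficulty([1, 1, 1, 0, 1, 0, 1, 1, 0, 1]))  # 0.7
```

Multiply by 100 to express the index as a percentage, as on the slide.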
  • 18. 18
  • 19. Item Discrimination Index ● Definition: a measure of an item’s effectiveness in differentiating students who mastered the topic from those who did not. ● Formula: ○ (Upper Group Percent Correct) - (Lower Group Percent Correct) ○ Upper Group = Top 27% of exam scores ○ Lower Group = Lowest 27% of exam scores ● Index scale: ○ 40% - 100%: Excellent Predictive Value ○ 25% - 39%: Good Predictive Value ○ 0% - 24%: Possibly No Predictive Value Reference: https://www.uwosh.edu/testing/faculty-information/test-scoring/score-report-interpretation/item-analysis-1/item-i 19
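The upper-minus-lower formula above can be sketched as follows. This is an illustrative implementation under our own assumptions (function name, tie handling, and tail rounding are ours, not Gradescope's):

```python
def item_discrimination(item_scores, exam_scores, tail=0.27):
    """Upper-lower discrimination index for a single exam item.

    item_scores: 0/1 correctness on the item, one entry per student.
    exam_scores: total exam score, one entry per student (used for ranking).
    tail: fraction of students in each comparison group (27% by convention).
    """
    # Sort each student's item result by their total exam score, ascending.
    ranked = [item for _, item in sorted(zip(exam_scores, item_scores))]
    n = max(1, round(tail * len(ranked)))  # students in each tail group
    lower = sum(ranked[:n]) / n            # percent correct, lowest 27%
    upper = sum(ranked[-n:]) / n           # percent correct, top 27%
    return upper - lower
```

An index near 1.0 means top scorers got the item right while low scorers missed it; a value near 0 suggests the item does not separate the two groups at all.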
  • 20. 20
  • 21. Connecting Exam to HW Analytics 21
  • 22. Research Context, Challenge, Goal, & Questions Exam Data & Tools Exam Item Analysis, Analytics, & Dashboard Course & Instructor Implications Questions & Discussion 22 RQ3: What are the course, student & instructor impacts and implications for continuous improvement changes?
  • 23. Course & Instructor Implications Exam design is slightly modified – item answer spaces are delineated, and an initial rubric is put in place. 23 Exam processing requires significant investment in labor.
  • 24. Course & Instructor Implications 24 Exam scanning requires roughly 2-3 hours additional time for 1000 exams. Exam grading is much smoother, and improvements are immediately apparent.
  • 25. Course & Instructor Implications Exam feedback can be more informative and personalized. ● Applied rubric items ● Personal item feedback 25 Exam data is easily accessible. ● Overall exam statistics ● Item-by-item statistics ● Grades synchronize with LMS
  • 26. Course & Instructor Implications Regrade requests drop dramatically ● Previous benchmark of 40 requests for 1000 exams (4% regrades) After moving to Gradescope: ● Exam item regrade requests: 64 ● Total exam items graded: 85,078 ● 0.075% regrades 26 Monolithic Exams The data analytics and economies of scale are only possible with monolithic exams. Multiple versions and randomized answers are still works in progress at Gradescope. Many large courses at ND don’t currently use monolithic exams.
  • 27. Course & Instructor Implications Test Item Library We want test questions that efficiently gauge mastery of material. We want to eliminate item bias, particularly linguistic and cultural biases. Do free response items test mastery of material that multiple choice items don’t capture? 27 Early Warning System Can we catch struggling students early in the semester? Do homework attempts signify problems with mastery? Do particular homework items correlate with particular exam items? Which homework items don’t provide mastery on exams?
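One way to probe the homework-to-exam questions above is to correlate per-student scores on a homework item with scores on a related exam item. A minimal standard-library sketch; the function name and the sample data are illustrative, not drawn from the course dataset:

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(xs), mean(ys)
    cov = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
    return cov / (pstdev(xs) * pstdev(ys))

# Illustrative data: homework item scores vs. exam item correctness
# for the same six students.
hw   = [0.2, 0.5, 0.6, 0.8, 0.9, 1.0]
exam = [0,   0,   1,   1,   1,   1]
print(round(pearson(hw, exam), 2))  # ≈ 0.83
```

A strongly correlated pair suggests the homework item builds mastery the exam item tests; a near-zero correlation flags homework items that may not be supporting exam performance.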
  • 28. Summary RQ1: How do we evaluate an assessment technology tool? (SAMR, 5 E’s) RQ2: What are the best item analysis methods and easiest visualizations to support students and instructors? (Distractor Performance, Item Difficulty & Discrimination) RQ3: What are the course, student & instructor impacts and implications for continuous improvement changes? (Scanning, Re+Grading, Feedback, Data & Analytics, Revisit & Revise Test Item Library, & Early Warning) 28
  • 29. Future Work? ● Early Warning: Cross-reference student learning activity, homework analytics, and exam item analysis so instructors can intervene early to improve student performance, course design, and assessment design. ● Question Bank: Over time, build a more inclusive question bank in Gradescope (items that are not overly long and are free of unintentional bias) and compare exam items year over year. ● Deeper Analysis: Overlay filters based on demographics, SES, ESL, and HS preparation. ● Scale to Other STEM Courses: Calculus, Organic Chemistry, and Physics. 29
  • 30. References Ambrose, G. A., Abbott, K., & Lanski, A. (2017). "Under the Hood of a Next Generation Digital Learning Environment in Progress." Educause Review. Gugiu, M. R., & Gugiu, P. C. (2013). Utilizing item analysis to improve the evaluation of student performance. Journal of Political Science Education, 9(3), 345-361. Kern, B., et al. (2015). The role of SoTL in the academy: Upon the 25th anniversary of Boyer’s Scholarship Reconsidered. Journal of the Scholarship of Teaching and Learning, 15(3), 1-14. Miller, P., & Duan, X. (2018). "NGDLE Learning Analytics: Gaining a 360-Degree View of Learning." Educause Review. Nielsen, J. (1993). Usability Engineering (1st ed.). Morgan Kaufmann. Nieveen, N., & van den Akker, J. (1999). Exploring the potential of a computer tool for instructional developers. Educational Technology Research and Development, 47(3), 77-98. Puentedura, R. R. (2014). SAMR and TPCK: A hands-on approach to classroom practice. Hippasus. Retrieved from http://www.hippasus.com/rrpweblog/archives/2012/09/03/BuildingUponSAMR.pdf Siri, A., & Freddano, M. (2011). The use of item analysis for the improvement of objective examinations. Procedia-Social and Behavioral Sciences, 29, 188-197. Syed, M., Anggara, T., Duan, X., Lanski, A., Chawla, N., & Ambrose, G. A. (2018). Learning Analytics Modular Kit: A Closed Loop Success Story in Boosting Students. Proceedings of the International Conference on Learning Analytics & Knowledge. 30
  • 31. Research Problem, Goal, Questions, and Context Exam Data & Tools Exam Item Analysis, Analytics, & Dashboard Course & Instructor Implications Questions & Discussion 31
  • 32. 32 More Information, Connect, Collaborate? Visit our Lab Blog at sites.nd.edu/real