A Learning Objective-Based Model
to Predict Students’ Success
in a First-Year Engineering Course
Farshid Marbouti
Committee: Dr. Diefes-Dux, Dr. Madhavan
Dr. Main, Dr. Ohland
January 2016
INTRODUCTION
Retention & Graduation Rate
Introduction Methods Model Development Conclusions
Solution
Analyze students’ performance data
(Huang & Fang, 2012; White, 2012)
Predict students’ success in a course and identify
at-risk students
(Jin, Imbrie, Lin, & Chen, 2011; Olani, 2009)
Use the prediction model as an early warning system
Inform both the instructor and the students of their
performance
(Arnold & Pistilli, 2012b; Essa & Hanan, 2012; Macfadyen & Dawson, 2010)
Shortcomings of Early Warning Systems
Use the same model for all courses
Generic model decreases accuracy
Not useful for non-traditional courses
Do not reap the benefits of standards-based grading
More data points
Higher reliability in grading
Depend on online access data
May not be suitable for face-to-face courses
Some courses may not use the school's CMS
Instructors do not have access to these data
Research Questions (1/3)
Of six different predictive modeling methods, as
well as a seventh hybrid or Ensemble method,
which is the most successful at identifying at-risk
students, based on specified in-semester student
performance data? Why is this method the most
successful? Why are the other methods less
successful?
Research Questions (2/3)
To what extent can the models created by
predictive methods for identifying at-risk students
in a course be improved through the selection of
in-semester student performance data (e.g., quiz,
homework learning objectives, midterm exam)?
What does the selection reveal?
Research Questions (3/3)
What are the relationships, if any, between
students’ success and achievement of different
learning objectives in a course? What are the
implications for the resulting prediction models
and what are the pedagogical implications?
METHODS
Settings & Data
50% training
25% Verify1
25% Verify2
Final Test
ENGR 132
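The 50% / 25% / 25% split above can be sketched in a few lines of Python. Only the proportions come from the slide; the function name and the record format are illustrative (Spring 2014 serves as the separate final test set).

```python
import random

def split_semester(records, seed=0):
    """Randomly split one semester's student records into
    50% training, 25% verify1, and 25% verify2 subsets."""
    rows = list(records)
    random.Random(seed).shuffle(rows)  # seeded for reproducibility
    n = len(rows)
    train = rows[: n // 2]
    verify1 = rows[n // 2 : (3 * n) // 4]
    verify2 = rows[(3 * n) // 4 :]
    return train, verify1, verify2

train, verify1, verify2 = split_semester(range(100))
print(len(train), len(verify1), len(verify2))  # 50 25 25
```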
Prediction Modeling Methods
Input
Students’ learning objective scores (HW 1-5, 33 LOs)
Students’ grades on course assessments
Quiz: Weeks 1-5, 10 grades
Written Exam 1 (Midterm Exam): 1 grade
Output
At-risk (positive)
Successful (negative)
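A minimal sketch of how one student's inputs and output could be encoded from the assessments listed above. The dictionary field names and the at-risk criterion used here (a final grade of D or F) are assumptions for illustration; the slide only specifies the inputs and the binary output.

```python
def encode_student(student, at_risk_grades=("D", "F")):
    """Build a feature vector from learning-objective scores (HW 1-5, 33 LOs),
    quiz grades (weeks 1-5), and the first written exam, with a binary label:
    1 = at-risk (positive class), 0 = successful (negative class)."""
    features = (
        list(student["lo_scores"])      # 33 learning-objective scores
        + list(student["quiz_grades"])  # 10 quiz grades
        + [student["exam1"]]            # midterm exam grade
    )
    label = 1 if student["final_grade"] in at_risk_grades else 0
    return features, label

example = {"lo_scores": [3] * 33, "quiz_grades": [8] * 10,
           "exam1": 75, "final_grade": "D"}
features, label = encode_student(example)
print(len(features), label)  # 44 1
```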
Evaluating the Models
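Because at-risk is the positive class and only a small fraction of students fail, plain accuracy is a misleading score: a model that flags no one can still look accurate. A generic confusion-matrix sketch (not the study's exact evaluation code) makes the point:

```python
def evaluate(y_true, y_pred):
    """Confusion-matrix metrics with at-risk = 1 as the positive class."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(1 for t, p in pairs if t == 1 and p == 1)
    tn = sum(1 for t, p in pairs if t == 0 and p == 0)
    fp = sum(1 for t, p in pairs if t == 0 and p == 1)
    fn = sum(1 for t, p in pairs if t == 1 and p == 0)
    return {
        "accuracy": (tp + tn) / len(pairs),
        "recall": tp / (tp + fn) if tp + fn else 0.0,     # at-risk students caught
        "precision": tp / (tp + fp) if tp + fp else 0.0,  # flagged students truly at-risk
    }

# A model that never flags anyone looks accurate but catches no at-risk students.
y_true = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
scores = evaluate(y_true, [0] * 10)
print(scores["accuracy"], scores["recall"])  # 0.8 0.0
```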
Research Flowchart
Data: Spring 2013 (train/verify), Spring 2014 (test)
Modeling methods: Logistic Regression, Multilayer Perceptron, Support Vector Machine, K-Nearest Neighbor, Decision Tree, Naive Bayes Classifier
Feature selection methods: Correlations, Explained Variance, Gini Gain
Stage 1 - Model Development: train and verify the 6 models; error analysis; create an ensemble model → top 2 modeling methods
Stage 2 - Feature Selection: train and verify the top 2 models with different numbers of variables → top variables
Stage 3 - Final Test: test the top 2 models; select the optimal number of variables
Model Robustness: randomly cluster the training data; train the top 2 models with different numbers of clusters; verify the models
Assessment Types: train the top 2 models with subsets of the data; verify the models
MODEL DEVELOPMENT
Training and Verifying the Models
Ensemble Model
KNN, SVM, NBC
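The ensemble combined KNN, SVM, and NBC. One simple way to combine three binary classifiers is majority voting, sketched below; the voting rule is an assumption for illustration, since the slide does not spell out how the members were combined.

```python
def majority_vote(*model_predictions):
    """Combine binary predictions from several models (e.g., KNN, SVM, NBC):
    flag a student at-risk (1) if more than half the models do."""
    n_models = len(model_predictions)
    return [1 if sum(votes) * 2 > n_models else 0
            for votes in zip(*model_predictions)]

knn = [1, 0, 1, 0]
svm = [1, 1, 0, 0]
nbc = [0, 1, 1, 0]
print(majority_vote(knn, svm, nbc))  # [1, 1, 1, 0]
```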
R.Q. 1 – Best Prediction Model(s)
NBC & Ensemble
Low % of students fail → small sample size for the at-risk class
Bias: Inaccurate assumptions in the learning algorithm
Variance: Sensitivity of the model to small changes
Models with high bias/low variance perform better
Predictive Power of Assessments
Feature Selection - Final Test
R.Q. 2 – Data Selection
Models with only two variables
Simple models have high bias/low variance
Correlations with Success
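The correlation analysis behind this slide can be sketched as ranking assessments by the absolute value of their Pearson correlation with a 1/0 success indicator. The column names and data below are fabricated for illustration:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def rank_by_correlation(columns, success):
    """Rank assessment columns by |r| with the success indicator (1 = successful)."""
    return sorted(columns,
                  key=lambda name: abs(pearson(columns[name], success)),
                  reverse=True)

success = [1, 1, 1, 0, 0, 0]
columns = {
    "quiz_week5": [9, 8, 9, 4, 3, 5],  # tracks success closely
    "hw1_lo2":    [5, 6, 4, 5, 6, 4],  # unrelated to success
}
print(rank_by_correlation(columns, success))  # ['quiz_week5', 'hw1_lo2']
```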
R.Q. 3 – Learning Objectives
Identify potential threshold learning objectives
Week 5 was important for students’ success
Topic: User-defined functions in MATLAB
The topic is important for the rest of the semester
The first difficult topic that can differentiate students
Students start to take the course seriously from week 5
CONCLUSIONS
Recommendations
Minimum class size: 120 students with ~10% at-risk students
All models have error: Communicate results with this consideration
Use at least two semesters' data to train and test the models
No drastic change in course structure from one semester to another
Be mindful of false negative (type II) errors
The process can reveal relations about assessments and success
Model choice by class size and % of at-risk students:
Small class (low # of students): know the students
Large class, low % at-risk: SVM, NBC
Large class, high % at-risk: KNN, DT, MLP
Limitations
Model errors
Pedagogical decisions (e.g., only homework was graded with standards-based grading)
Quality of performance data
Mid-size classes (40-120 students)
Future Work
How to use the models?
Predict students' performance during the semester
Investigate what leads students to success in a course
Thank You…
- My advisers Dr. Diefes-Dux & Dr. Madhavan
- My committee members: Dr. Ohland and Dr. Main
- My wife
- Friends who were part of my journey
Questions?
Research Flowchart
Data: Spring 2013, Spring 2014 → Data Cleaning → Train/Verify/Test Datasets
  Randomly divide the Spring 2013 data into 50% train and 25% / 25% verify sets; Spring 2014 is the test set.
Modeling methods: Log Reg, MLP, SVM, KNN, DT, NBC
Feature selection methods: Correlations, Explained Variance, Gini Gain
Stage 1 - Model Development:
  Feature Selection: select variables
  Train: train the 6 models
  Verify Models: verify the 6 models on verify1 data
  Error Analysis: compare the models, analyze the errors
  Create Ensemble Model: train and verify the ensemble model; select top 2 of 7 models
Stage 2 - Feature Selection:
  Train: train the top 2 models with different numbers of variables
  Verify Models: verify the top 2 models on verify2 data
  Variable Selection: select the optimal number of predictors
Stage 3 - Final Test:
  Test: test the top 2 models on the test dataset
Model Robustness:
  Cluster Data: randomly cluster the training data
  Train the top 2 models with different numbers of clusters; verify the models
Assessment Types:
  Train the top 2 models with subsets of the data; verify the models
Misidentifications
Model Robustness
Feature Selection
LO Correlations