Analysis of Lafayette College Course Evaluations
By Bruce Keller and JJ Wanda
10/20/2014
Introduction
The creation, implementation, and interpretation of course evaluations consume a
considerable amount of resources, so the validity of these instruments should be assessed to determine
how beneficial they truly are and how heavily they should weigh on policy decisions.
Our study aims to determine how well items on course evaluations at Lafayette College
predict the self-reported overall value of the course. Students tend to be relatively unbiased in their
ratings, weighting course content consistently across different professors and classes
(Broder and Dorfman, 1994); therefore, self-reported information from Lafayette course evaluations
should be a valid measure of course characteristics, such as enthusiasm of the instructor and
course organization, for the purposes of estimating our parameters.
In addition, our study uses as its dependent variable a self-report item on the course
evaluations asking students what they felt the value of the course was. One might expect
students to rate a less rigorous course more favorably, and vice versa, based on their enjoyment of
the course rather than its value. Fortunately, this seems not to be the case. The same study
found that approximately 80% of the explained variance in teacher quality is related to enjoyment of
the learning process, and that over 90% of the explained variance in course value is due to the
quantity and quality of material learned in the course. This research indicates that students’
enjoyment of a course is paramount in fostering learning, and that they rate courses more
favorably based on how much they learned.
Our study uses the most widely supported factors, such as amount of course
content, introduces a new factor that has received little coverage, the availability of extra
help (Broder and Dorfman, 1994), and omits factors that have shown little support, such as
teaching experience (Harris and Sass, 2003).
Theoretical Analysis
How do different class characteristics influence overall course ratings? To address this question, we estimate the following model:

Yi = β0 + β1CCi + β2TMi + β3Ei + β4EHi + εi    (expected signs: β1, β2, β3, β4 > 0)

Dependent variable
Yi = Mean Score for Overall Value of the ith section.

Independent variables
CCi = Mean Score for Course Content for the ith section.
TMi = Mean Score for Effectiveness of Teaching Material for the ith section.
Ei = Mean Score for the Instructor's Use of Examples for the ith section.
EHi = Mean Score for Availability of Extra Help for the ith section.
Description of Data
We collected the data from the Lafayette College course evaluation website1. We wrote
a program to parse the data from the website into a file that could be opened with Excel; a minimal
sketch of this approach appears after the table below. The website hosts about 15 years of course
evaluations. Our criteria for choosing a specific semester were the number of questions on the
evaluations and the number of courses. Using these criteria, we selected the Fall 2005 semester
because it had the largest number of sections with a format that included more items. Selecting all
courses gave us a sample of 468 sections and 26 questions. We treated sections as separate courses
even when they were taught by the same instructor. The evaluations use a 5-point Likert scale, with
1 being very poor and 5 being excellent. For each section, we used the mean of the individual
evaluations for that class. Using the 468 sections in our sample, we calculated the mean and
standard deviation of each variable (Table 1).
Table 1: Descriptive Statistics

Variable   Mean   Std. Dev.
Y          3.73   0.56
CC         4.01   0.68
TM         3.80   0.63
E          3.82   0.65
EH         3.79   0.69
1 https://fac-eval.lafayette.edu
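Our scraper itself is not reproduced in this report; the snippet below is a minimal sketch of the approach in Python. The URL pattern, the page layout (a two-column table of question and mean score per section), and the section identifiers are illustrative assumptions, not the actual structure of fac-eval.lafayette.edu.

```python
# Minimal sketch of the evaluation scraper (page layout and URLs are hypothetical).
# Assumes every section page contains a two-column table of (question, mean score)
# rows and that all sections share the same question set.
import csv
import requests
from bs4 import BeautifulSoup

BASE_URL = "https://fac-eval.lafayette.edu"      # evaluation site (see footnote 1)
SECTION_IDS = ["FA2005-EXAMPLE-101-01"]          # placeholder section identifiers

def parse_section(section_id):
    """Return {question_text: mean_score} for one course section."""
    html = requests.get(f"{BASE_URL}/sections/{section_id}", timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    scores = {}
    for row in soup.find_all("tr"):
        cells = [c.get_text(strip=True) for c in row.find_all("td")]
        if len(cells) == 2:                      # keep only (question, mean) rows
            question, mean = cells
            scores[question] = float(mean)
    return scores

with open("fall2005_evaluations.csv", "w", newline="") as f:
    writer = None
    for sid in SECTION_IDS:
        scores = parse_section(sid)
        if writer is None:                       # write the header row once
            writer = csv.DictWriter(f, fieldnames=["section"] + list(scores))
            writer.writeheader()
        writer.writerow({"section": sid, **scores})
```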
Regression Estimates

Table 2: Regression Estimates

Variable             Model 1    Model 2
CC                   0.2626     0.3037
                     (.0415)    (.0410)
TM                   0.1398     0.2179
                     (.0392)    (.0347)
E                    0.2541     0.2649
                     (.0379)    (.0384)
EH                   0.1383     --
                     (.0340)
Constant             0.65       0.6713
                     (.0881)    (.0894)
Adjusted R-squared   0.7347     0.7257
The table above reports the estimated coefficients, their standard errors (in parentheses), and the
adjusted R² for each model. We estimated two models, one with extra help and one without. In our
review of the literature, no studies included a variable similar to extra help. When we removed it,
the adjusted R² was slightly lower and the remaining coefficients did not change in significance.
Based on these findings, we selected Model 1 as the preferred model. To confirm that a linear
specification is appropriate, we used scatter plots to check whether the data showed any trend other
than a linear one.
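For concreteness, the following is a sketch of how the two models could be estimated from the parsed data in Python. The pandas and statsmodels calls are standard, but the CSV column names are assumptions about how the parsing step labeled the evaluation items.

```python
# Estimate Model 1 (with EH) and Model 2 (without EH) by ordinary least squares.
# Column names below are illustrative; they must match the output of the parsing step.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("fall2005_evaluations.csv")       # one row per section (n = 468)
df = df.rename(columns={                           # hypothetical item labels
    "Overall value of the course": "Y",
    "Course content": "CC",
    "Effectiveness of teaching material": "TM",
    "Instructor's use of examples": "E",
    "Availability of extra help": "EH",
})

print(df[["Y", "CC", "TM", "E", "EH"]].describe()) # means and std. devs. (Table 1)

model1 = smf.ols("Y ~ CC + TM + E + EH", data=df).fit()  # Model 1
model2 = smf.ols("Y ~ CC + TM + E", data=df).fit()       # Model 2
print(model1.summary())                            # coefficients, std. errors, adj. R²
print(model2.rsquared_adj)
```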
The graph supports a high R², since it shows a positive correlation between each
exogenous variable and the endogenous overall course value score. Using this linear
model, we can determine whether we can reject the null hypotheses below.
CC - H0: β1 ≤ 0; Ha: β1 > 0
TM - H0: β2 ≤ 0; Ha: β2 > 0
E - H0: β3 ≤ 0; Ha: β3 > 0
EH - H0: β4 ≤ 0; Ha: β4 > 0
Previous research found positive coefficients for course content (CC), teaching material
(TM), and use of examples (E), or equivalent variables. Extra help (EH) has not been well tested,
and we predicted it would have a positive coefficient. We rejected the null hypothesis for each
variable with 99.9% confidence. We predicted each coefficient to be positive, and our
results match that prediction.
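As an illustration, the one-sided test for CC in Model 1 can be computed directly from the values in Table 2, assuming (as the reported significance levels imply) that the parenthesized values are standard errors:

```python
# One-sided t-test for H0: beta1 <= 0 vs. Ha: beta1 > 0, using Model 1's CC estimate.
# Coefficient and standard error are taken from Table 2; df = 468 sections - 5 parameters.
from scipy import stats

beta_hat, se, dof = 0.2626, 0.0415, 468 - 5
t_stat = beta_hat / se                      # about 6.3
p_one_sided = stats.t.sf(t_stat, dof)       # upper-tail probability
print(t_stat, p_one_sided)                  # p is far below 0.001, so reject H0
```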
Empirical Results
Our results match our expectations. All factors incorporated into our model are
significant at the 0.01 level. Course content (CC), teaching material (TM), and use of examples
(E) have large coefficients that are consistent with previous research (Broder and Dorfman, 1994).
Extra help (EH), although overlooked by other research, has a substantial coefficient of
approximately 0.14. Overall, our model explains approximately 74% of the variance in
overall course value.
Conclusion
Our model has substantial explanatory power (adjusted R² of approximately 74%), more than other
studies tended to find (61%), despite using almost all of the same substantive variables (each
significant, with a coefficient greater than 0.01). This difference may reflect variation between
colleges. Future research could examine how these supported factors vary across institutions'
course evaluations.
Works Cited
Broder, J., & Dorfman, J. (1994). Determinants of Teaching Quality:
What's Important to Students. Research in Higher Education, 235-248.
Harris, D., & Sass, T. (2003). Teacher training, teacher quality and
student achievement. Journal of Public Economics, 798-812.