Is It the Same? G. Yamazaki

Is using pencil-and-paper better or worse than using a classroom response system for quiz administration?

Transcript

  • 1. Is it the same? Using paper & pencil versus a classroom response system for quiz administration. T. Gayle Yamazaki, Gary Packard, Randall Gibb, Edie Edmondson, Douglas Lindsay, Joseph Sanders, Heidi Schwenn, Scott Walchli, Steve Jones, Lorne Gibson, Kathleen O'Donnell, and Andrew Katayama. United States Air Force Academy, CO
  • 2. Abstract

Although perceptions of taking a quiz via paper-and-pencil versus via a classroom response system (CRS) may vary substantially, differences in performance on such quizzes may be less substantial than perceived. In this experiment, we gathered data to investigate whether such perceptions about quiz-taking methods are accurate. We were also interested in whether the time of day (morning vs. afternoon quizzes) had any effect on performance. To evaluate the differences between quiz-taking methods and time-of-day factors, we used sections to which students had been randomly assigned by the registrar's office. A total of 404 college freshmen enrolled in an introductory psychology class took part in this study. Results indicate no difference in student quiz performance between the traditional paper-and-pencil method and the CRS, and no differences between morning and afternoon sections. In conclusion, there appear to be no differences attributable to quiz-taking method or time of day.
  • 3. Over the past decade, classroom response systems (CRS) have become increasingly popular in educational settings (Bjorn et al., 2011; Hoekstra, 2008; MacArthur & Jones, 2008; Zhu, 2007) as well as in medical training settings (Thomas, Monturo, & Conroy, 2011). Not only are CRS being used for demonstrating concepts (Shaffer & Collura, 2009), they are also being used for student assessment of course content (Mezeske & Mezeske, 2007; Yourstone, Kraye, & Albaum, 2008) and for facilitating critical thinking in class (Mollborn & Hoekstra, 2010). The use of the CRS as an assessment device has prompted some faculty and students to be concerned that there may be advantages to taking a multiple-choice quiz using paper-and-pencil administration as compared to using a CRS (Epstein, Klinkenberg, & Wiley, 2001). One of the perceived advantages of paper-and-pencil administration is that students can refer back to previously answered questions and change their answers to improve their overall score. While some studies support this notion for basic knowledge and comprehension items on multiple-choice tests (see Geiger, 1997, for a review), the same was not found for more conceptually based or higher-order items. Further, other studies have found that improved performance may be attributed to two mediating factors, 1) metacognitive factors (e.g., signal detection and discrimination) and 2) timed responses (the method used in the present study), more than to changing answers (Hanna, 2010; Higham & Gerrard, 2005). On the other hand, some researchers contend that this perception can be mediated by allowing students to change their responses on the clicker device within the prescribed time limit set by the instructor (Caldwell, 2007; Crouch, Fagen, Callan, & Mazur, 2004). From a historical perspective, Mueller and Wasser (1977) report that changing responses on objective tests generally lowered students' scores.
  • 4. The purpose of our study was to examine whether there is a difference in average student quiz scores when comparing paper-and-pencil administration with CRS administration of course quizzes. We expected no significant differences between administration methods or times of day, and no interaction between the two variables.

Method

This was a quasi-experimental study in which all students were assigned by the registrar's office to one of the 25 sections of Introductory Psychology. Apart from constraints imposed by student extracurricular activities, validation of courses, and placement examinations, the registrar's office places students into sections on a random basis; in other words, students are not allowed to choose instructors or sections. All participants (N = 404) were college freshmen (age range: 17 to 23 years; female = 61, male = 343; ethnic heritage: European-American = 301, Hispanic/Latino(a) = 25, African-American = 17, Asian/Pacific Islander = 37, Native American, not specified = 20). The 25 sections were divided into the two administration groups using the following criteria: a) morning versus afternoon course offerings, b) instructor preference for quiz administration method, and c) balance between the types of quiz administration (Table 1 presents the number of sections per condition). Eleven quizzes were administered throughout the semester. Each quiz consisted of ten multiple-choice questions with four answer choices; each question was worth two points, for a total of 20 points per quiz. All students were given the same quiz questions; only the administration method varied between the two conditions.
  • 5. The first four quizzes were accomplished using the CRS to help tease out any priming factor related to instructor bias (see Thomas, Monturo, & Conroy, 2011). To help ensure that all faculty and students were comfortable using the CRS, the experimental phase of the study was conducted only on the remaining seven quiz administrations. If a student was absent from class during the administration of a quiz, his/her score was not used in the data analysis.

Paper-and-pencil quiz administration: Each student was given a single sheet of paper with ten multiple-choice questions printed in standard 12-point font. Students were given approximately ten minutes to accomplish the quiz. Students for whom English was a second language were given 20 minutes to complete the quiz if necessary.

Classroom response system quiz administration: Using PowerPoint© slides and IClicker© software, each multiple-choice question was presented separately. Students were given approximately one minute to respond to each question, for a total of ten minutes per quiz (the same time as in the paper-and-pencil condition). In classrooms with a student for whom English was a second language, two minutes per question was used, for a total of 20 minutes. If all students responded to a question before the allotted time elapsed, the instructor would query the students to ensure everyone had sufficient time to respond, and then the next question would be presented.
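To make the pacing rule concrete, here is a minimal Python sketch of the procedure just described: each question stays open for up to 60 seconds, and the instructor advances early once every student has responded. This is not the authors' classroom software; the poll_responses() helper, the class size, and the simulated response times are hypothetical stand-ins for a clicker base station.

import time

QUESTION_WINDOW = 60   # seconds per question (120 in ESL sections)
CLASS_SIZE = 20        # hypothetical section size

def poll_responses(elapsed):
    # Stand-in for querying the clicker base station: for this simulation
    # we pretend the whole class has answered after five seconds.
    return CLASS_SIZE if elapsed >= 5 else int(elapsed * 2)

def run_question(number):
    start = time.monotonic()
    while (elapsed := time.monotonic() - start) < QUESTION_WINDOW:
        if poll_responses(elapsed) >= CLASS_SIZE:
            break  # everyone answered; instructor confirms and advances early
        time.sleep(0.5)
    print(f"Question {number} closed after {time.monotonic() - start:.1f} s")

for q in range(1, 11):  # ten questions, ten minutes maximum per quiz
    run_question(q)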
  • 6. Results

Upon completion of the semester, quiz scores were acquired from the IClicker© software program and, for paper-and-pencil administered quizzes, from the institutional database system. An independent samples t-test was used to compare the means on the first four quizzes for the CRS and paper-and-pencil administration groups, to assess whether there were any preexisting differences between the groups (baseline measures). No significant differences were found, t(259) = -1.64, p = .102; Levene's test indicated that the assumption of equal variances was met.

An independent samples t-test was used to compare the means for the first four quizzes for the CRS and the paper-and-pencil administration groups to assess whether there was a difference between mean scores. Table 2 presents the means and standard deviations for the groups. We found no significant difference between administration methods, t(314) = 0.045, p = .964. An independent samples t-test was also used to compare the means for quizzes 5-11 between the quiz administration types (CRS vs. paper-and-pencil). Again, we found no significant difference between administration methods, t(332) = 1.05, p = .292. Table 3 presents the t-test results for the group comparisons.

An independent samples t-test was used to compare the means for the first four quizzes by time of day (morning vs. afternoon sections). Table 4 presents the t-test results. We found no significant difference between times of day, t(314) = -1.87, p = .062. An independent samples t-test was used to compare the means for quizzes 5-11 between times of day (morning vs. afternoon). Again, we found no significant difference by time of day, t(332) = -1.19, p = .231.

A 2 x 2 factorial analysis of variance (ANOVA) was conducted to investigate whether there was an interaction between quiz administration group (CRS vs. paper-and-pencil) and quiz time of day (morning vs. afternoon). The ANOVA found no significant interaction, MSE = 94.869, F(1, 330) = .116, p = .733. Further, a partial eta squared of .002 suggests that administration type and time of day had a very small interactive effect on the outcome. Table 5 presents the interaction results.
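As a rough illustration of the t-test procedure reported above, the following Python sketch runs Levene's test for equality of variances and then an independent-samples t-test with scipy. This is not the authors' analysis code; the file quiz_scores.csv and its column names (method, quiz_total_1_4) are hypothetical.

import pandas as pd
from scipy import stats

df = pd.read_csv("quiz_scores.csv")  # hypothetical data file

# Split quiz totals by administration method (hypothetical column values).
paper = df.loc[df["method"] == "paper", "quiz_total_1_4"]
clicker = df.loc[df["method"] == "clicker", "quiz_total_1_4"]

# Levene's test: a non-significant result supports the equal-variance t-test.
lev_stat, lev_p = stats.levene(paper, clicker)

# Independent-samples t-test, choosing the variant Levene's test supports.
t_stat, p_val = stats.ttest_ind(paper, clicker, equal_var=lev_p > .05)
print(f"Levene p = {lev_p:.3f}; t = {t_stat:.3f}, p = {p_val:.3f}")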
  • 7. Conclusion

Although some of our faculty and students believed there was a substantial advantage to taking quizzes via the paper-and-pencil administration method, the findings of this study suggest that students score equally well using either method of quiz administration. In an ever-changing technological environment, it is essential that instructors have some understanding of the impact the introduction of technology may have on student performance. The findings of this study suggest that the administration method used to deliver a quiz (paper-and-pencil or CRS) did not affect overall average student quiz scores across a semester. This suggests that, in cases in which course-wide consistency is an important factor in course delivery, presentation, and administration, the method by which quizzes are administered can be left to instructor discretion. Instructors who choose to more fully incorporate the advantages of a CRS throughout their course will not adversely affect student performance on quizzes across an academic term; instructors who prefer the more traditional paper-and-pencil method can use it as well.

Since no differences were detected between the average scores of students who were able to change answers (paper-and-pencil) and students who were not allowed to change answers (CRS), it may be that students change just as many answers from incorrect to correct as from correct to incorrect. It therefore appears that there may be a misperception among faculty and students that the ability to change answers during a quiz leads to improved scores. This should be examined further in future studies.
  • 8. Since we allowed instructors to use their own preferred method of quiz delivery, it is unclear what impact, if any, instructor preference might have on student performance. Although not a focus of this study, student attitudes toward the CRS closely aligned with known instructor feelings toward the system. Instructors were explicitly asked not to discuss or indicate to students their own attitudes about the CRS, and they felt they had appropriately withheld their attitudes and opinions from their students. This observation suggests that further study should investigate whether instructor attitudes, particularly negative views, might adversely affect student performance. This line of research would provide needed insight to departments and institutions that are examining the additional use of technology throughout their course offerings.

There were several lessons learned during the administration of this study. First, students stated that they would use a later question to help answer earlier questions on the quiz. If quiz questions are carefully developed so that the answer to one question is not embedded within another, this objection to the CRS is negated. Second, to the surprise of some of our faculty, we found that students were very adept at determining the attitude of the instructor with respect to use of the CRS for quiz administration: students of faculty who held unfavorable opinions of the CRS expressed more negative opinions about CRS use for quizzing. In response, we allowed instructors to select which administration method they preferred. Finally, having a non-graded practice quiz using the CRS, as well as various concept demonstrations using the CRS, increased student comfort and confidence in the system.
  • 9. References

Bjorn, H. K., Wolter, M. A., Lundeberg, H. K., & Herreid, C. F. (2011). Students' perceptions of using personal response systems ("clickers") with cases in science. Journal of College Science Teaching, 40(4), 14-19.

Caldwell, J. E. (2007). Clickers in the large classroom. CBE-Life Sciences Education, 6, 9-20.

Crouch, C. H., Fagen, A. P., Callan, J. P., & Mazur, E. (2004). Classroom demonstrations: Learning tools or entertainment? American Journal of Physics, 72(6), 835-838.

Duncan, D. (2006). Clickers: A new teaching aid with exceptional promise. The Astronomy Education Review, 5(1), 5-19.

Epstein, J., Klinkenberg, W. D., & Wiley, D. (2001). Insuring sample equivalence across internet and paper-and-pencil assessments. Computers in Human Behavior, 17, 339-346.

Geiger, M. A. (1997). An examination of the relationship between answer changing, testwiseness, and exam performance. Journal of Experimental Education, 66(1), 49-60.

Hanna, G. S. (2010). To change answers or not to change answers: That is the question. The Clearing House: A Journal of Educational Strategies, Issues, & Ideas, 62(9), 414-416.

Higham, P. A., & Gerrard, C. (2005). Not all errors are created equal: Metacognition and changing answers on multiple-choice tests. Canadian Journal of Experimental Psychology, 59(1), 28-34.

Hoekstra, A. (2008). Vibrant student voices: Exploring effects of the use of clickers in college courses. Learning, Media, and Technology, 33(4), 329-341.

MacArthur, J. R., & Jones, L. L. (2008). A review of literature reports of clickers applicable to college chemistry classrooms. Chemistry Education Research and Practice, 9, 187-195.

  • 10. Mezeske, R. J., & Mezeske, B. A. (Eds.). (2007). Beyond tests and quizzes: Creative assessments in the college classroom. San Francisco: Jossey-Bass.

Mollborn, S. A., & Hoekstra, A. (2010). A meeting of minds: Using clickers for critical thinking and discussion in large sociology classes. Teaching Sociology, 38(1), 18-27.

Mueller, D. J., & Wasser, V. (1977). Implications of changing answers on objective test items. Journal of Educational Measurement, 14, 9-13.

Preszler, R. W., Dawe, A., Shuster, C. B., & Shuster, M. (2007). Assessment of the effects of student response systems on student learning and attitudes over a broad range of biology courses. CBE-Life Sciences Education, 6, 29-41.

Shaffer, D. M., & Collura, M. J. (2009). Evaluating the effectiveness of a personal response system in the classroom. Teaching of Psychology, 36, 273-277.

Thomas, C. M., Monturo, C., & Conroy, K. M. (2011). Experiences of faculty and students using an audience response system in the classroom. Computers, Informatics & Nursing, 29(7), 396-400.

Yourstone, S. A., Kraye, H. S., & Albaum, G. (2008). Classroom questioning with immediate electronic response: Do clickers improve learning? Decision Sciences Journal of Innovative Education, 6(1), 75-88.

Zhu, E. (2007). Teaching with clickers. Center for Research on Learning and Teaching Occasional Papers Series, 22, 1-7.
  • 11. Table 1
Number of sections assigned to each condition

              Paper-and-Pencil   CRS
Morning              9            6
Afternoon            4            5
  • 12. Table 2
Quiz means and standard deviations for each condition

Quiz administration method
                                   N      M          SD         SEM
Quiz Total: 1-4    Paper          227   119.5551   11.02131    .73151
                   I-Clicker       89   119.4944   10.18145   1.07923
Quiz Total: 6-11   Paper          227   101.7930    9.82485    .65210
                   I-Clicker      107   100.5888    9.54330    .92259

Time of day
                                   N      M          SD         SEM
Quiz Total: 1-4    Morning        247   118.9393   11.10118    .70635
                   Afternoon       69   121.6812    9.27757   1.11689
Quiz Total: 6-11   Morning        247   101.0283   10.01397    .63717
                   Afternoon       87   102.4828    8.87230    .95121
  • 13. Table 3
Independent samples t-test results: Quiz administration method

                                            Levene's F   Sig.     t       df       Sig. (2-tailed)
Quizzes 1-4    Equal variances assumed        1.153      .284    .045     314           .964
               Equal variances not assumed                       .047     173.198       .963
Quizzes 6-11   Equal variances assumed         .047      .828   1.055     332           .292
               Equal variances not assumed                      1.066     213.389       .288
  • 14. Table 4
Independent samples t-test results: Time of day

                                            Levene's F   Sig.     t       df       Sig. (2-tailed)
Quizzes 1-4    Equal variances assumed        3.424      .065   -1.876    314           .062
               Equal variances not assumed                      -2.075    127.630       .040
Quizzes 6-11   Equal variances assumed        1.930      .166   -1.199    332           .231
               Equal variances not assumed                      -1.270    168.624       .206
  • 15. Table 5
ANOVA table: Quiz administration method x time of day interaction

Tests of between-subjects effects
Source            Type III Sum of Squares   df   Mean Square        F        Sig.   Partial Eta Squared
Corrected Model          267.964 (a)          3        89.321        .942    .421    .008
Intercept            2406478.679              1   2406478.679   25366.424    .000    .987
AMPM                     162.238              1       162.238       1.710    .192    .005
Group                     70.509              1        70.509        .743    .389    .002
AMPM * Group              11.022              1        11.022        .116    .733    .002
Error                  31306.658            330        94.869
Total                3466236.000            334
Corrected Total        31574.623            333

a. R Squared = .008 (Adjusted R Squared = -.001)
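For readers who want to see how a table like Table 5 could be produced, here is a minimal Python sketch using statsmodels with Type III sums of squares. This is not the authors' analysis code; the file quiz_scores.csv and the column names quiz_total, group, and ampm are hypothetical.

import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("quiz_scores.csv")  # hypothetical data file

# Sum-to-zero contrasts so that Type III sums of squares are meaningful.
model = ols("quiz_total ~ C(group, Sum) * C(ampm, Sum)", data=df).fit()
aov = sm.stats.anova_lm(model, typ=3)  # Type III SS, as in Table 5
print(aov)

# Partial eta squared for the interaction: SS_effect / (SS_effect + SS_error).
ss_int = aov.loc["C(group, Sum):C(ampm, Sum)", "sum_sq"]
ss_err = aov.loc["Residual", "sum_sq"]
print("partial eta squared =", ss_int / (ss_int + ss_err))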
