Speaker notes:
  • Stacey: This isn’t very surprising because my sample size was really small.
  • Stacey: I observed this even on the days when I taught the strategies and explicitly told students that I wanted to see them using the strategies.
  • Brennan: Overall significance was likely not found because of the academic abilities of my students, who already used some of the intervention strategies to certain extents.

    1. PHS CARP
       Stacey, Cameron, Matt, Cat, Brennan
    2. Introduction
       Population: Patuxent High School in Lusby, MD (Calvert County)
       Subjects: Biology, Social Studies, English
       Need: According to Principal Highsmith and the SIP, students have difficulty taking tests because they don’t understand the language or intent of questions.
       Purpose: To determine whether explicitly teaching test-taking skills will improve students’ test-taking skills.
    3. Research Questions
       Did our strategies help students become better test takers?
       Do students use the strategies that we suggest?
    4. Strategy, Rationale, and Justification
       1. Key Words: circle the important words that show what the question is asking.
       2. Rephrase Test Questions: the student puts the question in his/her own words.
       These strategies may improve test-taking ability across disciplines, which may in turn raise performance on standardized tests.
       Peer-reviewed research supports the specific test-taking strategies we chose (Chittooran & Miles, 2001).
    5. Methods
       Administer a pre-test of test-taking strategies
       Explicitly teach the strategies (one per week)
       Continue using the strategies throughout the week(s)
       Observe classes and collect student work to look for strategy use
       Administer the post-test
       Run t-tests to check for significance
    6. Data Collection Plan
    7. Pre-Post Assessment
       Used a Likert-scale questionnaire and recoded the survey results so that a “5” was always positive
       Sample question:
       1. How often do you circle or underline parts of a test question to help you answer that question?
          1 – Never
          2 – Not often
          3 – Sometimes
          4 – Very often
          5 – Always
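The recoding step above, mapping negatively worded items so that a “5” is always positive, can be sketched as a simple score flip. This is an illustrative example, not the group’s actual scoring script; the function name and sample data are hypothetical.

```python
def reverse_code(response, scale_max=5):
    """Flip a Likert item so that a high score is always 'positive'.

    On a 1..scale_max scale, 1 becomes scale_max, 2 becomes
    scale_max - 1, and so on; the midpoint is unchanged.
    """
    return scale_max + 1 - response


# Hypothetical responses to a negatively worded item
raw = [1, 2, 5, 4]
coded = [reverse_code(r) for r in raw]
print(coded)  # [5, 4, 1, 2]
```

After recoding, every item points the same direction, so the per-student totals compared in the t-tests are directly interpretable.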
    8. Pre-Post Analysis
       Used an unpaired t-test to check for significant differences between pre- and post-test responses
       Compared the two sets of results for each class using a t-test (alpha level 0.05)
       Compiled data across all surveyed classes and compared pre- and post-test results using a t-test (alpha level 0.05)
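For reference, the unpaired (two-sample) t statistic used in this kind of analysis can be computed as below. This is a generic pooled-variance sketch, not the group’s actual analysis code; in practice the statistic would be compared against the t-distribution critical value for the stated alpha level of 0.05 to get the p-values reported on the following slides.

```python
from statistics import mean, variance


def unpaired_t(sample_a, sample_b):
    """Student's two-sample (unpaired) t statistic with pooled variance.

    Returns (t, degrees_of_freedom); the p-value comes from looking
    up |t| in the t-distribution with that many degrees of freedom.
    """
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    # Pooled variance across both groups
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    t = (mean(sample_a) - mean(sample_b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5
    return t, na + nb - 2


# Hypothetical pre/post totals for a small class
pre = [16, 18, 15, 17, 19]
post = [18, 19, 17, 20, 18]
t, df = unpaired_t(pre, post)
print(t, df)
```

A library routine such as `scipy.stats.ttest_ind` performs the same computation and returns the p-value directly.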
    9. Class Data – Ms. Meyer
       1st period standard Biology
       n = 15
       Pre-test average = 18.5
       Post-test average = 17.8
       p > 0.05, so the treatment effect was not significant
       This means that implementing implicit and explicit instruction in our two test-taking strategies did not produce a significant effect.
    10. Observations/Student Work
        I observed very few students using the strategies during class work.
        Results were slightly better on students’ quizzes and when I observed students while testing, which was encouraging.
        Ex. Organelle vs. Process
    11. Class Data – Mr. Stone
        Data from 4 periods of Academic English
        n = 76
        Pre-test average = 16.50
        Post-test average = 17.42
        p = 0.034
        Since p < 0.05, implementing implicit and explicit instruction in our two test-taking strategies did produce a significant effect.
    12. Observations/Student Work
        I observed a moderate number of students using the strategies during daily class work.
        However, during exams, when students were reminded of the strategies and encouraged to use them, I observed a greater number of students applying them.
    13. Class Data – Mr. Davis
        Data from 3 periods of Honors/Pre-AP World History
        n = 72
        Pre-test average = 17.6
        Post-test average = 18.92
        p = 0.14529
        p > 0.05, so the treatment did not produce a significant result.
        Note: a significant effect was found in my 6th period (p = 0.00405)
    14. Observations/Student Work
        I observed a moderate number of students using the strategies during class work. However, some of these students were using the strategies before explicit instruction.
        I was unable to observe student behavior on tests and quizzes, as none were given during the project’s 4-week timeline.
    15. Class Data – Ms. Holland
        4th and 7th period AP Literature and 5th/6th double-period Standard English
        n = 37
        Pre-test average = 17.702
        Post-test average = 16.783
        p > 0.05, so the findings were not significant
        Implementing two explicit test-taking strategies did not produce significant results.
    16. Observations/Student Work
        Some students used strategy #1, but it is unclear whether they were using it prior to our intervention.
        Students were reluctant to even attempt strategy #2.
        I observed some students in the English 10 Standard class use strategy #1 while taking Benchmark #1, a county-wide multiple-choice test.
        I observed some AP Literature students using strategy #1 on the practice AP tests I distributed, but they took only vocabulary tests during the 3-week period of our intervention.
    17. Class Data – Mr. Leischer
        Data from 3 periods of Academic English
        n = 55
        Pre-test average = 16.62
        Post-test average = 16.56
        p = 0.47
        Since p > 0.05, implementing implicit and explicit instruction in our two test-taking strategies did not produce a significant effect.
    18. Observations/Student Work
        I observed a small number of students using the strategies we taught. However, most of these students were already using similar strategies before instruction.
        Observations took place during a district-level benchmark exam and HSA-style question practice sessions.
    19. Conclusions
        We found that the overall effect of our treatment was not significant
        n = 256; p = 0.0581
    20. Conclusions
        It is unclear whether our test-taking strategies improved student performance.
        Evidence is mixed about whether students use the strategies that we taught.
        Because of the way we collected our data, and because of the differences in n between classes, we were unable to do the question-by-question analysis needed to tease apart the data.
    21. Implications for Research
        Repeat with more time and a greater n
        Is there a difference between subjects?
        Introduce the strategies in an earlier grade
        Compare the test scores of students who were taught/used the strategies with those of a control group
