McLeod 2007 MN-SDC PowerPoint

2007 Minnesota Staff Development Council Annual Forum. May 16, 2007. Dr. Scott McLeod, CASTLE, www.scottmcleod.net.

Transcript

1. USING DATA TO MAKE DECISIONS: Results from the Minnesota Statewide DDDM Readiness Study. Dr. Scott McLeod and Dr. Karen Seashore, University of Minnesota
2. Get this presentation
   • See the RESOURCES section of your handout!
3. 9 essential elements of data-driven PLCs
   • Good baseline data
   • Measurable instructional goals
   • Frequent formative assessments
   • Professional learning communities rooted in student information
   • Making instructional changes
   • Data safety
   • Data transparency
   • Technology
   • Alignment for results
4. Respondents
5. Respondents
   • Teachers (n = 3,135 / 11,120?) (28%?)
   • Principals (n = 791 / 1,770) (45%)
   • Superintendents (n = 202 / 351) (58%)
   • District technology coordinators (n = 139 / 351) (40%)
   • 4,267 Minnesota educators
   • Awesome!
6. Respondents by gender, race / ethnicity (96% White)
7. Respondents by urbanicity
8. Respondents by level
9. Respondents by AYP status
10. Assessment Intensity
11. 9 essential elements of data-driven PLCs (framework slide repeated from slide 3)
12. I receive state assessment results each year [teachers]
13. I receive state assessment results each year [teachers]
14. I receive other yearly assessment results each year [teachers]
15. I receive other yearly assessment results each year [teachers]
16. Teachers collaborate to create and use common periodic assessments for student progress monitoring [teachers]
17. Teachers collaborate to create and use common periodic assessments for student progress monitoring [teachers]
18. Teachers use other (not teacher-created) periodic assessments for student progress monitoring [teachers]
19. Teachers use other (not teacher-created) periodic assessments for student progress monitoring [teachers]
20. Summary (let’s recap)
   • Lots of teachers are NOT intersecting with yearly data
   • Some differences between secondary subject areas
   • Clear, consistent downward gradient from elementary to secondary
21. Beliefs About Types of Assessments
22. 9 essential elements of data-driven PLCs (framework slide repeated)
23. Assessments are aligned with state curriculum standards
24. Assessment results are easy to understand and interpret
25. Assessment results are detailed enough to adequately inform teachers’ instruction
26. Assessment results are timely enough to adequately inform teachers’ instruction
27. Summary (let’s recap)
   • Weak agreement that assessments are aligned with standards
   • Non-state assessments are
     - easier to understand
     - more detailed
     - much more timely
28. Other Components of the Core
29. 9 essential elements of data-driven PLCs (framework slide repeated)
30. Measurable instructional goals
31. Measurable instructional goals
32. Measurable instructional goals
33. Teacher teams (PLCs) that meet regularly
34. Teacher teams (PLCs) that meet regularly
35. Teacher teams (PLCs) that meet regularly
36. Making instructional changes
37. Making instructional changes
38. Making instructional changes
39. Summary (let’s recap)
   • Administrators less positive about teacher behavior
   • Teachers feel collaboration time is inadequate
   • Clear, consistent downward gradient from
     - elementary to secondary
     - AYP to No AYP
40. Supporting Conditions
41. 9 essential elements of data-driven PLCs (framework slide repeated)
42. Data access and transparency
43. Data access and transparency
44. Data access and transparency
45. Data safety
46. Data safety
47. Data safety
48. Technology
49. Technology
50. Technology
51. Alignment for results
52. Alignment for results
53. Alignment for results
54. Summary
   • Teachers less positive about supporting conditions
   • Clear, consistent downward gradient from
     - elementary to secondary
     - AYP to No AYP
55. Other Factors
56. Leadership and support
57. Leadership and support
58. Leadership and support
59. Professional development
60. Professional development
61. Professional development
62. Beliefs
63. Beliefs
64. Beliefs
65. Summary (let’s recap)
   • Teachers less positive about
     - administrator support
     - staff development
   • Teachers more likely to believe achievement is out of their control
   • Clear, consistent downward gradient from
     - elementary to secondary
     - AYP to No AYP
66. A Few Last Things
67. Teachers most likely to agree that…
   • They have the knowledge and skills to improve student learning
   • They can significantly affect students’ achievement levels by trying different teaching methods
   • If they constantly analyze what they do and adjust to get better, they will improve
   • District goals were focused on student learning
   • They feel some personal responsibility when school improvement goals are not met
68. Teachers most likely to disagree that…
   • They are given adequate time for collaborative planning
   • State assessments are timely enough to adequately inform instruction
   • They have significant input into data management and analysis practices
   • State assessments are detailed enough to adequately inform instruction
   • They have received adequate training to effectively interpret and act upon yearly state assessment results
69. Miscellaneous comments
70. Our success as educators should be determined primarily by our impact upon student learning
71. Our success or failure in teaching students is primarily due to factors beyond our control rather than to our own efforts and ability
72. Overall summary of descriptive statistics (let’s recap)
   • State test data aren’t very useful
   • Teachers feel less positively about school and district DDDM activity than do administrators
   • Significant percentages of teachers are not intersecting with DDDM
   • Clear, consistent differences between
     - elementary and secondary
     - AYP and No AYP
73. 9 essential elements of data-driven PLCs (framework slide repeated)
74. Next steps = more sophisticated statistics
   • Factor analysis
     - Example: P34 (goals) + P41 (transparency) + P43 (technology) + P47 (prof devt) + P51 + P53 + P54 + P55 (alignment) = ADMIN BEHAVIOR (see the sketch below)
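Purely as illustration, here is a minimal Python sketch of that factor-analysis step: several Likert-scale items collapsed into one latent factor score. The item names, the simulated responses, and the single-factor model are assumptions for the example, not the study’s actual data or code.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Hypothetical item names mirroring the slide's labels (P34, P41, ...).
items = ["p34_goals", "p41_transparency", "p43_technology", "p47_prof_devt",
         "p51", "p52", "p54", "p55_alignment"]

# Simulated 1-5 Likert responses standing in for the survey data.
rng = np.random.default_rng(0)
responses = pd.DataFrame(rng.integers(1, 6, size=(500, len(items))),
                         columns=items)

# Standardize each item, then extract a single latent factor; the factor
# score plays the role of the slide's composite "ADMIN BEHAVIOR" variable.
z = (responses - responses.mean()) / responses.std()
fa = FactorAnalysis(n_components=1, random_state=0)
admin_behavior = fa.fit_transform(z)

print(fa.components_.round(2))     # loadings of each item on the factor
print(admin_behavior[:5].ravel())  # factor scores for the first 5 respondents
```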
75. Next steps = more sophisticated statistics
   • Regression, SEM, maybe HLM
     - Dependent variables: DDDM study results (including factors); MDE attendance / mobility; MDE enrollment; MDE languages; MDE licensed staff; NCES Common Core of Data
     - Independent variables: DDDM study results (including factors); MDE achievement (AYP status, MCAs); MDE dropouts / graduation
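And a hedged sketch of the baseline regression this slide points toward, with a DDDM factor score as the dependent variable and achievement and context measures as predictors, mirroring the variable roles listed above. All column names and data are simulated placeholders; the study’s own SEM or HLM models would be considerably more elaborate.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated district-level records; every column is a placeholder.
rng = np.random.default_rng(1)
n = 300
districts = pd.DataFrame({
    "admin_behavior": rng.normal(0, 1, n),       # DDDM factor score (dependent)
    "made_ayp":       rng.integers(0, 2, n),     # MDE achievement: 1 = made AYP
    "grad_rate":      rng.uniform(0.6, 1.0, n),  # MDE graduation proxy
    "enrollment":     rng.integers(100, 20000, n),
})

# Ordinary least squares as the baseline; SEM would add latent variables,
# and HLM would nest schools within districts (e.g., via smf.mixedlm).
fit = smf.ols("admin_behavior ~ made_ayp + grad_rate + np.log(enrollment)",
              data=districts).fit()
print(fit.summary())
```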
76. Wrap-up
   • Questions?
   • Reactions?
   • Implications for action?
