DAI: Instructional Data Analysis & Data Teams (Karen Brooks)

  1. DAI Instructional Data Analysis & Data Teams. Karen Brooks, Coordinator, Instructional Data Analysis, Technology & Assessment. May 6th, 2014. kbrooks@ulsterboces.org http://www.karenbrooks.wikispaces.com http://www.slideshare.net/kbrooks
  2. Focus for Today • Instructional Data Analysis & Data Teams: – What are they? – How do they function? – How do they relate to me?
  3. Break Down for the Day • 9:00 AM Begin • 10:45-11:00 Break • 12:00-12:30 Lunch • 1:45-2:00 Break • 3:00 PM Conclude
  4. Grounding/Ice Breaker • How is data like these fruits? » Grapes » Kiwi » Banana » Strawberry
  5. Beginning and End
  6. Data-Driven Teams, Data-Driven Schools. Three Pillars of NYS Education • NYS Common Core Learning Standards • http://www.engageny.org/resource/new-york-state-p-12-common-core-learning-standard • Data Driven Instruction • http://www.engageny.org/data-driven-instruction • APPR • http://www.capregboces.org/APPR/FAQs.cfm
  7. So What is a Data Team?
  8. So What is a Data Team? • A data team can generally be defined as a group of educators collaboratively using data to identify and understand opportunities for improvement, then working together to make changes that get measurable results. Using protocols for collaborative inquiry, the group follows a process in which members prepare, implement, and reflect on data-informed actionable goals. • This simple definition can be applied broadly at many levels within a district. • At the classroom level, teachers use data to identify student learning problems and work together to plan instructional changes that will yield improvements in learning. • At the school level, principals and school improvement teams use data to identify goals to drive improvements in the ways teachers collaborate and learn, thereby improving results for all students. • Within a district office, many departments and leaders use data to make decisions regarding the management and efficiency of their particular responsibilities. Cheat Sheet: http://www.sde.ct.gov/sde/lib/sde/pdf/curriculum/cali/dddm_desktop_reference_guide.pdf
  9. Instructional Data Team Standards Rubric • http://www.sde.ct.gov/sde/lib/sde/pdf/curriculum/cali/standards_for_instructional_data_teams.pdf Sections: • Membership • Structure • Process • Gap Analysis • Instructional Action Planning. Always look at process, not people.
  10. Simply Put
  11. Data Team Cycle
  12. Data Team Video https://www.youtube.com/watch?v=04-fDELnReE https://www.pinterest.com/emoeggert/plc-videos/
  13. Victoria Bernhardt Multiple Measures http://nces.ed.gov/pubs2007/curriculum/pdf/multiple_measures.pdf • Demographics • Perceptions • Student Learning • School Improvement
  14. Multiple Measures
  15. Taking Multiple Measures Further. What data would we need in each of these areas? • Demographics • Perceptions • Student Learning • School Improvement. Next Steps: • What is the source for this data? • Who has access?
  16. District Data Team Samples • http://www.middletownschools.org/page.cfm?p=7538 - Website • http://www.sde.ct.gov/sde/lib/sde/pdf/curriculum/cali/3bdtdatateammeetingsteps.pdf - Collecting, Charting and Discussing • https://docs.google.com/a/ulsterboces.org/viewer?url=http://www.sde.ct.gov/sde/lib/sde/word_docs/cali/data_teams/guidelines_for_data_walls.doc - Data Walls Guidelines
  17. Perceptions Are a Different Way of Looking at Things
  18. Things are not always what they first appear. Need to keep focused.
  19. Data Collected Needs to Be Consistent in Order to Have Validity • School Data: a Comedy – Dirty Data – Example: PBIS • https://www.youtube.com/watch?v=XBv95uMFudE
  20. The 5 Essential Questions Data Teams Ask • Where are we now? • Where do we want to be? • How did we get to where we are now? • How are we going to get to where we want to be? • Is what we are doing making a difference?
  21. Ideal Data Team Planning
  22. So Where Does the Data Team Start?
  23. Some Start with a Data Team Survey. This focuses the team. It also helps to focus the essential or guiding questions that keep the team on track. Simply put, what is the problem of focus?
  24. Definition of an Essential Question: "A guiding question is the fundamental query that directs the search for understanding." Guiding questions help provide focus.
  25. What Do Guiding Questions Look Like? CHARACTERISTICS: • Good guiding questions are open-ended yet focus inquiry on a specific topic. • Guiding questions are non-judgmental, but answering them requires high-level cognitive work. • Good guiding questions contain emotive force and intellectual bite. • Guiding questions are succinct. They contain few words but demand a lot. ADVICE FOR DEVELOPING GOOD GUIDING QUESTIONS: • Determine the theme or concept you want to explore. • Brainstorm a list of questions you believe might cause you to think about the topic but that don't dictate conclusions or limit possible directions of investigation. Wait to evaluate and refine the list until you have several possibilities. • The question must allow for multiple avenues and perspectives. • Consider the six queries that newspapers answer: who, what, when, where, how, and why.
  26. SAMPLE GUIDING QUESTIONS for DATA STUDY: • How do student outcomes differ by demographics, programs, and schools? • To what extent have specific programs, interventions, and services improved outcomes? • What is the longitudinal progress of a specific cohort of students? • What are the characteristics of students who achieve proficiency and of those who do not? • Where are we making the most progress in closing achievement gaps? • How do absence and mobility affect assessment results? • How do student grades correlate with state assessment results and other measures? RESOURCES: Traver, R. (March 1998). What is a good guiding question? Educational Leadership, pp. 70-73. Ronka, D., Lachat, M., et al. (December 2008/January 2009). Data: Now what? Educational Leadership, pp. 18-24.
  27. The Data Coordinator Is Always a Member of the Data Team. What Impact Does This Team Have? Build Awareness: • Build a vision for data use that is grounded in positive student outcomes • Articulate the vision for district-wide systemic data use clearly and repeatedly with all stakeholders to paint an evident image of how the future will be better if all engage in this work • Develop and communicate a sense of positive urgency • Share the structure and function of the District Data Team with school-level teams. Understand Concerns: • Talk openly with staff at all levels in the district about stress they may experience as change is implemented • Actively listen: solicit and act upon the concerns of staff members to facilitate the change process • Acknowledge losses that people may feel as they shift established habits and approach their work in new ways
  28. Continued. Model the Process: • Lead by example, not by edict • Publicly demonstrate how the District Data Team is moving toward the vision • Present the district-level data overview with school-level participants and other district stakeholders • Design district-level action plans using the Data-Driven Inquiry and Action Cycle. Manage the Process: • Conduct and maintain a data inventory that includes school-level data • Coordinate the upload of local data to the Data Warehouse • Maintain an up-to-date data dissemination schedule • Disseminate relevant data sets and displays for school-based action
  29. Culture of Data Use
  30. Can You Pull Data for Me?
  31. Data Team Schedule
  32. Let’s Look at Some Data: » Frequency Distribution by Subgroup » Frequency Distribution » CC Strands Analysis » P-Value Comparison » Sample Grade 6 Glossary of Assessment Terms
  33. Frequency Distribution by Subgroup
  34. Frequency Distribution
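A frequency distribution like the views named above simply counts how many students land at each performance level, both overall and broken out by subgroup. A minimal Python sketch with hypothetical data (the subgroup labels and performance levels below are illustrative, not taken from the slides):

```python
from collections import Counter

# Hypothetical (subgroup, performance_level) pairs for eight students
records = [
    ("Gen Ed", 3), ("Gen Ed", 2), ("SWD", 1), ("Gen Ed", 4),
    ("SWD", 2), ("ELL", 2), ("ELL", 1), ("Gen Ed", 3),
]

# Overall frequency distribution: count of students at each level
overall = Counter(level for _, level in records)

# Frequency distribution by subgroup: one Counter per subgroup
by_subgroup = {}
for subgroup, level in records:
    by_subgroup.setdefault(subgroup, Counter())[level] += 1

print(dict(overall))          # students per performance level
print(by_subgroup["Gen Ed"])  # the same breakdown for one subgroup
```

Real report views would draw these counts from an assessment data warehouse rather than an in-memory list, but the counting logic is the same.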
  35. Common Core Strand Analysis
  36. P-Value Comparison
  37. P-Value • This common view shows the p-value for each question. For multiple-choice items, the p-value is the proportion of students responding correctly. For constructed-response items, the p-value is the mean raw score divided by the maximum number of score points for the item. • District performance on individual questions can be compared to regional levels to determine how similarly students performed on a particular question. • The larger the sample size, the more accurate the results.
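The two p-value definitions above (proportion correct for multiple choice; mean raw score over maximum score points for constructed response) can be sketched directly. The response data here is hypothetical:

```python
# Multiple-choice item: p-value = proportion of students answering correctly
mc_responses = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # 1 = correct, 0 = incorrect
mc_p_value = sum(mc_responses) / len(mc_responses)  # 7 of 10 correct -> 0.7

# Constructed-response item: p-value = mean raw score / max score points
cr_scores = [2, 3, 1, 4, 2, 3]  # raw scores on a 4-point rubric item
cr_max_points = 4
cr_p_value = (sum(cr_scores) / len(cr_scores)) / cr_max_points  # 2.5 / 4 -> 0.625

print(mc_p_value, cr_p_value)
```

Either way the result lands on the same 0.0-1.0 scale, which is what makes district-to-region comparisons on a single chart possible.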
  38. P-Value Basics • Also referred to as item difficulty. • The range is from 0% to 100%, more typically written as a proportion from 0.0 to 1.00. • The higher the value, the easier the item. • Calculation: divide the number of students who got an item correct by the total number of students who answered it. • Ideal value: slightly higher than midway between chance (1.00 divided by the number of choices) and a perfect score (1.00) for the item. For example, on a four-alternative multiple-choice item, the random guessing level is 1.00/4 = 0.25; therefore, the optimal difficulty level is 0.25 + (1.00 - 0.25)/2 ≈ 0.62. On a true-false question, the guessing level is 1.00/2 = 0.50 and, therefore, the optimal difficulty level is 0.50 + (1.00 - 0.50)/2 = 0.75. • P-values above 0.90 indicate very easy items and should be carefully reviewed based on the instructor’s purpose. For example, if the instructor is using easy “warm-up” questions or aiming for student mastery, then some items with p-values above 0.90 may be warranted. In contrast, if an instructor is mainly interested in differences among students, these items may not be worth testing. • P-values below 0.20 indicate very difficult items and should be reviewed for possible confusing language, removed from subsequent exams, and/or identified as an area for re-instruction. If almost all of the students get the item wrong, there is either a problem with the item or students were not able to learn the concept. However, if an instructor is trying to determine the top percentage of students that learned a certain concept, this highly difficult item may be necessary. https://www.utexas.edu/academic/ctl/assessment/iar/students/report/itemanalysis.php
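The optimal-difficulty arithmetic and the 0.90/0.20 review thresholds above can be expressed as two small helpers. This is a sketch only; the function names are illustrative, not from the slides:

```python
def optimal_difficulty(num_choices):
    """Midpoint between the chance level (1.00 / num_choices) and a perfect score (1.00)."""
    chance = 1.00 / num_choices
    return chance + (1.00 - chance) / 2

def review_flag(p_value):
    """Apply the slide's review thresholds: above 0.90 very easy, below 0.20 very difficult."""
    if p_value > 0.90:
        return "very easy: review against purpose"
    if p_value < 0.20:
        return "very difficult: review wording / reteach"
    return "ok"

# Four-alternative multiple choice: chance 0.25, optimal 0.625 (the slide rounds to 0.62)
print(optimal_difficulty(4))
# True/false: chance 0.50, optimal 0.75
print(optimal_difficulty(2))
```

Flags like these are a screening aid, not a verdict; as the slide notes, an "easy" warm-up item or a deliberately hard mastery item may be exactly what the instructor intends.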
  39. 2013 Grade 6 Math: District to Region
  40. 2013 Grade 6 Math: Region to County
  41. Always Consider
  42. Triangulation
  43. Assessments https://www.youtube.com/watch?v=0fn_vAhu_Lw
  44. Balanced Assessments • Same exams given • Common rubrics • Assessments given during the same time period • Results are looked at through data conversations with building leaders. • Conversations may also be grade-level or subject-level wide. They may also be building-wide and make their way to the District Data Team.
  45. The Ripple Effect
  46. Organize for Collaborative Work
  47. Process • Fishbone/Hypothesis • 5 Whys
  48. Next Steps: Getting Invested in the Data Team Work
  49. The Break Down
  50. Data Team Key Considerations
  51. Relating It Back to Our Work
  52. Know Your Most Important Variable = Your Students https://www.youtube.com/watch?v=cL9Wu2kWwSY
  53. Beginning and End
  54. Ulster BOCES Data Boot Camp: August 19th, 20th, 21st, 2014
  55. Thank You. Karen Brooks • kbrooks@ulsterboces.org • http://www.karenbrooks.wikispaces.com • http://www.slideshare.net/kbrooks