From Data to Information: Perspectives on Policy and Practice

Presents facets of data-driven decision making within the context of a large urban district.


  1. From Data to Information: Perspectives on Policy and Practice
     Deborah Lindsey, Milwaukee Public Schools
     Sara Kraemer, UW-Madison
     Jeff Watson, UW-Madison
  2. Overview of the Milwaukee Public Schools Data Warehouse Project
     Deborah Lindsey, Director of Research and Assessment
  3. MPS-WCER Partnership
     - Began work approximately 10 years ago
     - Early focus:
       - Standards
       - Assessment
       - Accountability
     - Current focus:
       - Value-added and program evaluation/research
       - Data quality
       - System development
  4. Components of Current Work
     - Embedded researcher
     - Large-scale program evaluation/research projects (e.g., high school, charter school, supplemental services, and literacy evaluations)
     - Value-added analyses:
       - VA for district report card (7 years)
       - VA professional development project
     - Integrated Resource Information System (IRIS)
  5. MPS Data Warehouse
     - 1996: Original DW work begun
     - Conceptual design and data model by the UW-Milwaukee School of Business Administration
     - 1997: Data on demographics, enrollment, testing, and school details were loaded
     - 1999: Summary data on attendance was added
     - 2001: Program assignments, incidents, and credit detail were added
     - 2003: Audit by IBM
  6. MPS Data Warehouse
     - Major IBM findings:
       - Lack of meaningful leadership outside of IT
       - Lack of direct input from users
       - Lack of sufficiently developed functionality
       - Lack of documentation (both technical and for users)
     - Changes, fall of 2003:
       - Business sponsor identified
       - Task list of requested changes initiated
  7. MPS Data Warehouse
     - 2003-04: Conducted needs assessment with focus groups of central services staff, teachers, and school administrators
     - 2004-05: Engaged in formal strategic planning to identify the future direction of the DW
     - October 2005: Wrote RFP for a redesigned DW with these requirements:
       - Past, present, and future views of data
       - Granular detail
       - Readily relates different data items to each other
       - Permits multiple filters and drill-downs
       - Metadata
       - Oracle foundation
       - Inclusion of MPS IT staff in development
  8. DW Redesign Process
     - February 2006: Contract signed with Versifit Technologies
     - October 2006: New table structure in place in the "new" data warehouse
     - Fall 2006: Consultant hired to assist with report writing
     - Fall 2007: Official roll-out of administrator dashboard and reports
     - Fall 2007: Training conducted by Research and Assessment staff
  9. DW Updates
     - Ongoing: new reports are added
     - Usage reports present information on hits by school, person, and report
     - Number of users has increased:
       - March 2008: 400
       - June 2008: 1,104
       - January 2009: 1,296
     - Dashboard revisions to align with strategic plan metrics
  10. Data Warehouse Applications
      - Course selections (e.g., READ 180)
      - Assessment planning for English language learners (ELLs) and students with disabilities
      - Progress monitoring (benchmark assessments and suspensions in particular)
      - Data retreats/school improvement planning
  11. New DW Directions
      - Support data-informed decisions
      - Better identify what works, and at what cost
      - Add professional development data
      - Add financial data
      - Add staff data
      - Redesign principal dashboard to:
        - Better support SIP
        - Align with the district strategic plan
  12. Case Study Analysis of Milwaukee Public Schools' School Improvement Planning Process: Preliminary Findings
      Sara Kraemer and Lisa Geraghty, Value-Added Research Center, University of Wisconsin-Madison
      This work was funded by the U.S. Department of Education's Institute of Education Sciences and the Joyce Foundation.
  13. Research Study
      - Identify and describe the School Improvement Planning (SIP) practices in MPS
      - Identify the factors that contribute to and hinder effective SIP processes and practices
        - Emphasis on value-added (VA)
  14. Sample Framework: Value-Added Quadrant Analysis
      Schools sampled by quadrant on the value-added vs. attainment grid:
      - Quadrant 1 (high VA / high attainment): 2 schools: K-8 (Mosaic)
      - Quadrant 2 (high VA / low attainment): 2 schools: K-8, high school
      - Quadrant 3 (low VA / high attainment): 2 schools: K-5
      - Quadrant 4 (low VA / low attainment): 2 schools: K-8 (SIFI), middle school charter
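The quadrant assignment above can be sketched as a simple classifier. This is a minimal illustration only, not the district's actual methodology; the function name and cutoff values are assumptions made for the example.

```python
def assign_quadrant(value_added: float, attainment: float,
                    va_cutoff: float = 0.0, attainment_cutoff: float = 0.5) -> int:
    """Place a school into one of the four VA x attainment quadrants.

    Quadrant numbering follows the slide:
      1 = high VA / high attainment    2 = high VA / low attainment
      3 = low VA / high attainment     4 = low VA / low attainment
    The default cutoffs are illustrative assumptions, not district policy.
    """
    high_va = value_added >= va_cutoff
    high_attainment = attainment >= attainment_cutoff
    if high_va and high_attainment:
        return 1
    if high_va:
        return 2
    if high_attainment:
        return 3
    return 4
```

For example, a school with positive value-added but below-cutoff attainment lands in Quadrant 2, the "high growth, low achievement" group the study sampled from.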
  15. Study Design and Analysis
      - Semi-structured interviews with principals (8 completed)
      - Focus groups with learning teams (3 completed: 2 in Quadrant 1, 1 in Quadrant 3)
        - Focus groups sometimes included non-participant observation with some questions
      - Thematic content analysis of data
  16. Data Use (Quadrant 1)
      - Itemized groupings of WKCE content areas
        - Identify targeted content areas
      - Benchmarking
        - Targeted intervention for students
      - Value-added
        - Validation; comparisons between grades
        - Need classroom-level VA
  17. Data Use (Quadrant 2)
      - Strong focus on item analysis of the WKCE
      - Benchmark data
        - Weekly lesson progress charts
        - Monthly CABS
        - ThinkLink (need VA ThinkLink)
        - Learning walks
      - Attendance, suspension, and safety data
      - "Culture of data"
  18. Data Use (Quadrant 3)
      - Item-level analysis of the WKCE
      - ThinkLink benchmark very important
      - Need assistance with diagnosis
      - Example: increase scale score growth
        - Use scale score in reading, item-level analysis of the WKCE, and probes/benchmarks (ThinkLink)
  19. Data Use (Quadrant 4)
      - Focus on student behavior
      - "General" approach to data
      - Unaware of school performance ranking
      - Starting to use a problem-solving team
  20. Summary
      - Targeted support for various levels of performance
      - Schools view and engage data and school improvement differently across performance levels
      - Further investigation needed to develop targeted support
  21. Integrated Resource Information System (IRIS)
      Jeffery Watson, Value-Added Research Center, UW-Madison
  22. IRIS
      - 4-year U.S. Department of Education IES grant (Policy, Finance, and System: Goal 5)
      - Started March 1, 2008
      - Collaboration between the Value-Added Research Center (VARC) and Milwaukee Public Schools (MPS)
  23. Goals
      - What works?
        - Connect resources to outcomes
        - Track resource expenditures down to the school, classroom, and student
        - Value-added analysis
        - Connect data to operational decisions and improvement planning
  24. Methods
      - Resource allocation framework
      - Assess schools' resource allocations
      - Assess data quality
      - Develop new data systems when needed
      - Identify overlapping uses
  25. IRIS Resource Expenditure Framework (level by resource type)
      - School level:
        - Financial: FTE core teachers
        - Program: length of instructional day
        - Human resource: induction program
        - Social/organizational: school-wide climate, professional community
      - Classroom level:
        - Financial: instructional materials, actual teacher salary
        - Program: instructional practices
        - Human resource: teacher characteristics
        - Social/organizational: classroom demographics, climate
      - Student level:
        - Financial: cost of educational strategies
        - Program: courses taken
        - Human resource and social/organizational: student-level demographics and information
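One way to make the framework grid above operational is to encode it as a lookup table keyed by (level, resource type). This is only an illustrative sketch; the mapping name, key spellings, and helper function are assumptions, not part of the IRIS system itself.

```python
# Illustrative encoding of the IRIS resource expenditure framework:
# each (level, resource type) cell maps to its example resource items.
IRIS_FRAMEWORK = {
    ("school", "financial"): ["FTE core teachers"],
    ("school", "program"): ["length of instructional day"],
    ("school", "human resource"): ["induction program"],
    ("school", "social/organizational"): ["school-wide climate", "professional community"],
    ("classroom", "financial"): ["instructional materials", "actual teacher salary"],
    ("classroom", "program"): ["instructional practices"],
    ("classroom", "human resource"): ["teacher characteristics"],
    ("classroom", "social/organizational"): ["classroom demographics", "climate"],
    ("student", "financial"): ["cost of educational strategies"],
    ("student", "program"): ["courses taken"],
    ("student", "human resource"): ["student-level demographics and information"],
}

def resources_at(level: str, resource_type: str) -> list[str]:
    """Return the example resource items for one cell of the framework grid."""
    return IRIS_FRAMEWORK.get((level.lower(), resource_type.lower()), [])
```

A lookup such as `resources_at("classroom", "program")` then returns the instructional-practice items for that cell.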
  26. Technical Challenges
      - Data systems integration
      - Data quality
      - Data dictionary
      - Documentation
      - Design requirement specification
  27. Social Challenges
      - Documenting operational decisions
      - System staffing levels and proficiencies
        - Level of support for DW development
        - Building-level data systems staff
      - Silos
      - Complex workflow in schools
      - Access to staff
  28. IRIS Use Cases
      Table 4. Examples of use cases organized by user group and type of IRIS indicator
      - Professional Development (PD):
        - District Administration: ensure school PD is aligned with district initiatives
        - Finance and Budget: track PD resource expenditures at the district and school level
        - Research and Assessment: evaluate the impact of a specific PD initiative on student achievement
        - School Administration: identify staff PD needs with respect to the school improvement plan
      - School Programming:
        - District Administration: ensure that all schools have programming aligned with the school improvement plan
        - Finance and Budget: track resource expenditures of school programming
        - Research and Assessment: identify effective school programs
        - School Administration: align school programming with student needs
      - Student Programs:
        - District Administration: ensure all students receive equitable programming
        - Finance and Budget: calculate per-pupil expenditures related to programming
        - Research and Assessment: evaluate student participation in programs over time
        - School Administration: assess individual student needs based on past programming and student achievement
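The Finance and Budget use case of calculating per-pupil expenditures can be sketched in a few lines. This is a minimal sketch under assumed inputs (simple dictionaries keyed by school and program), not the IRIS implementation.

```python
def per_pupil_expenditure(expenditures: dict, enrollment: dict) -> dict:
    """Compute per-pupil expenditure by (school, program).

    expenditures: {(school, program): total dollars spent}
    enrollment:   {(school, program): number of enrolled students}
    Returns {(school, program): dollars per pupil}; entries with zero or
    missing enrollment are skipped rather than dividing by zero.
    """
    result = {}
    for key, dollars in expenditures.items():
        students = enrollment.get(key, 0)
        if students > 0:
            result[key] = dollars / students
    return result
```

For instance, $5,000 spent on a READ 180 section serving 25 students works out to $200 per pupil; a program with no recorded enrollment simply drops out of the result.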
  29. Existing MPS Data Systems
      Table 2. Existing MPS data systems (system: core function; level of development; in data warehouse?)
      - eSIS: managing student enrollment, scheduling, grades, attendance, teacher course assignments, and discipline; high; yes
      - PeopleSoft: managing employee data; moderate (expanded by 2008); partially
      - Enroll: managing teacher enrollment into district-sponsored professional development; moderate (piloted in 2006); no
      - IFAS: managing finance, accounting, and budgetary data; moderate (2006); no
      - APlus: managing student enrollment into supplemental services; high (MPS currently redesigning); no
      - IPS: collecting data on instructional practices; high (developed in 2006); no
      - SRUS: collecting data on school resource utilization; low (to be developed by 2008); no
      - C&P Survey: collecting data on school climate and stakeholder perceptions; high; no
      - SSIMS: managing special education data; high; partially
  30. Tracking Professional Development
      - Currently two systems:
        - District-sponsored PD
        - Site-based PD
      - Goal: track, analyze, and report on the types, amounts, and cost of PD that teachers are receiving
      - Based on six traits of effective PD, including:
        - Active participation
        - More than 20 hours
        - Collective participation within a school
  31. IRIS Lines of Work: Year 1
      - Gap analysis and user needs analysis
      - Conceptual design of IRIS
      - Data element definition, specification, and data mapping
      - Design and improvement of data links and source systems; ensure data quality
      - Interface design, including testing and validation
      - Documentation of IRIS
      - Begin design of training materials
  32. Early IRIS Successes
      - Helped focus attention on data quality
        - Student-teacher linkages
        - Improved user access to make corrections
      - System to track site-based PD
      - Revised resource allocation framework
      - Instructional Practice Survey
      - Increased focus on resource allocations within an improvement framework
  33. Questions
      - Jeffery Watson: [email_address], 608.263.0436
      - Sara Kraemer: [email_address], 608.265.5624
      - Deborah Lindsey: [email_address], 414.475.8751