DAI
Instructional Data Analysis
&
Data Teams
Karen Brooks
Coordinator Instructional Data Analysis, Technology & Assessment
May 6, 2014
kbrooks@ulsterboces.org
http://www.karenbrooks.wikispaces.com
http://www.slideshare.net/kbrooks
Focus for Today
• Instructional Data Analysis &
Data Teams:
– What are they?
– How do they function?
– How do they relate to me?
Break Down for Day
• 9 AM Begin
• 10:45 -11:00 Break
• 12:00 – 12:30 Lunch
• 1:45 – 2:00 Break
• 3:00PM Conclude
Grounding/Ice Breaker
• How is data like these fruits?
» Grapes
» Kiwi
» Banana
» Strawberry
Beginning
and End
Data Driven Teams
Data Driven Schools
Three Pillars of NYS Education
• NYS Common Core Learning Standards
• http://www.engageny.org/resource/new-york-state-p-12-common-core-learning-standard
• Data Driven Instruction
• http://www.engageny.org/data-driven-instruction
• APPR
• http://www.capregboces.org/APPR/FAQs.cfm
So What is a Data Team?
So What is a Data Team?
• A data team can generally be defined
as a group of educators
collaboratively using data to identify
and understand opportunities for
improvement, then working together
to make changes that get
measurable results. Using protocols
for collaborative inquiry, the group
follows a process in which members
prepare, implement, and reflect on
data-informed actionable goals.
• This simple definition can be applied
broadly at many levels within a
district.
• At the classroom level, teachers use
data to identify student learning
problems and work together to plan
instructional changes that will yield
improvements in learning.
• At the school level, principals and
school improvement teams use data
to identify goals to drive
improvements in the ways teachers
collaborate and learn, thereby
improving results for all students.
• Within a district office, many
departments and leaders use data to
make decisions regarding the
management and efficiency of their
particular responsibilities.
Cheat Sheet:
http://www.sde.ct.gov/sde/lib/sde/pdf/curriculum/cali/dddm_desktop_reference_guide.pdf
Instructional Data Team Standards
Rubric
• http://www.sde.ct.gov/sde/lib/sde/pdf/curriculum/cali/standards_for_instructional_data_teams.pdf
Sections
• Membership
• Structure
• Process
• Gap Analysis
• Instructional
Action Planning
Always looking at
process, not people.
Simply Put
Data Team Cycle
Data Team Video
https://www.youtube.com/watch?v=04-fDELnReE
https://www.pinterest.com/emoeggert/plc-videos/
Victoria Bernhardt
Multiple Measures
http://nces.ed.gov/pubs2007/curriculum/pdf/multiple_measures.pdf
• Demographics
• Perceptions
• Student Learning
• School
Improvement
Multiple Measures
Taking Multiple Measures Further
What data would we need in
each of these areas?
• Demographics
• Perceptions
• Student Learning
• School Improvement
Next Steps
• What is the source for this
Data?
• Who has the access?
District Data Team Samples
• http://www.middletownschools.org/page.cfm?p=7538 - Website
• http://www.sde.ct.gov/sde/lib/sde/pdf/curriculum/cali/3bdtdatateammeetingsteps.pdf - Collecting, Charting and Discussing
• https://docs.google.com/a/ulsterboces.org/viewer?url=http://www.sde.ct.gov/sde/lib/sde/word_docs/cali/data_teams/guidelines_for_data_walls.doc - Data Walls Guidelines
Perceptions Are a Different Way of Looking at Things
Things are not always what they first appear to be.
We need to stay focused.
Data Collected Needs to Be Consistent in Order to Have Validity
• School Data, a Comedy – Dirty Data – Example: PBIS
• https://www.youtube.com/watch?v=XBv95uMFudE
Data Teams
5 Essential Questions They Ask
• Where are we now?
• Where do we want to be?
• How did we get to where we are now?
• How are we going to get to where we want to
be?
• Is what we are doing making a difference?
Ideal Data
Team
Planning
So Where
Does the
Data Team
Start
Some Start with a
Data Team Survey.
This focuses the
team.
It also helps to
focus the essential
or guiding questions
that keep the team
on track.
Simply put, what is
the problem of
focus?
Definition of an Essential Question
"A guiding question is the fundamental query
that directs the search for understanding."
Guiding questions help provide focus.
What do Guiding Questions Look Like?
CHARACTERISTICS:
• Good guiding questions are open-ended yet
focus inquiry on a specific topic.
• Guiding questions are non-judgmental, but
answering them requires high-level cognitive
work.
• Good guiding questions contain emotive
force and intellectual bite.
• Guiding questions are succinct. They contain
few words but demand a lot.
ADVICE FOR DEVELOPING GOOD
GUIDING QUESTIONS:
• Determine the theme or concept you want to
explore.
• Brainstorm a list of questions you believe might
cause you to think about the topic but that
don't dictate conclusions or limit possible
directions of investigation. Wait to evaluate and
refine the list until you have several possibilities.
• The question must allow for multiple avenues
and perspectives.
• Consider the six queries that newspapers
answer: who, what, when, where, how, and why.
Sample Guiding Questions
SAMPLE GUIDING QUESTIONS for DATA STUDY:
• How do student outcomes differ by demographics,
programs, and schools?
• To what extent have specific programs,
interventions, and services improved outcomes?
• What is the longitudinal progress of a specific
cohort of students?
• What are the characteristics of students who
achieve proficiency and of those who do not?
• Where are we making the most progress in closing
achievement gaps?
• How do absence and mobility affect assessment
results?
• How do student grades correlate with state
assessment results and other measures?
RESOURCES:
Traver, R. (1998, March). What is a
good guiding question?
Educational Leadership, pp. 70–73.
Ronka, D., Lachat, M., et al.
(2008, December/2009, January).
Data: Now what? Educational
Leadership, pp. 18–24.
The Data Coordinator is always a Member of the Data Team
What Impact does this Team Have?
Build Awareness
• Build a vision for data use that is
grounded in positive student
outcomes
• Articulate the vision for district-
wide systemic data use clearly
and repeatedly with all
stakeholders to paint an evident
image of how the future will be
better if all engage in this work
• Develop and communicate a
sense of positive urgency
• Share the structure and function
of the District Data Team with
school-level teams
Understand Concerns
• Talk openly with staff at all levels
in the district about stress they
may experience as change is
implemented
• Actively listen: solicit and act
upon the concerns of staff
members to facilitate the change
process
• Acknowledge losses that people
may feel as they shift established
habits and approach their work in
new ways
Continued
Model the Process
• Lead by example, not by
edict
• Publicly demonstrate how
the District Data Team is
moving toward the vision
• Present the district-level
data overview with school-
level participants and other
district stakeholders
• Design district-level action
plans using the Data-Driven
Inquiry and Action Cycle
Manage the Process
• Conduct and maintain a
data inventory that includes
school-level data
• Coordinate the upload of
local data to the Data
Warehouse
• Maintain an up-to-date data
dissemination schedule
• Disseminate relevant data
sets and displays for school-
based action
Culture of Data Use
Can You Pull Data for Me?
Data Team Schedule
Let’s Look at Some Data:
» Frequency Distribution by Subgroup
» Frequency Distribution
» CC Strands Analysis
» P-Value Comparison
» Sample Grade 6
Glossary of Assessment Terms
Frequency
Distribution by
Sub Group
Frequency
Distribution
Common Core
Strand Analysis
P-Value
Comparison
P-Value
• This common view shows
the p-value for each
question. For multiple
choice items, p-value is
the proportion of
students responding
correctly. For constructed
response items, p-value is
the mean raw score
divided by the maximum
number of score points
for an item.
• District performance on
individual questions can
be compared to regional
levels to determine how
similar students
performed on a particular
question.
• The larger the sample size
the more accurate the
results.
P-Value Basics
• Item difficulty is also referred to as the p-value.
• The range is from 0% to 100%, or more
typically written as a proportion of 0.0
to 1.00.
• The higher the value, the easier the
item.
• Calculation: Divide the number of
students who got an item correct by the
total number of students who
answered it.
• Ideal value: Slightly higher than midway
between chance (1.00 divided by the
number of choices) and a perfect score
(1.00) for the item. For example, on a
four-alternative, multiple-choice item,
the random guessing level is 1.00/4 =
0.25; therefore, the optimal difficulty
level is .25 + (1.00 - .25) / 2 = 0.62. On a
true-false question, the guessing level is
(1.00/2 = .50) and, therefore, the
optimal difficulty level is
.50 + (1.00 - .50)/2 = .75
• P-values above 0.90 are very easy items
and should be carefully reviewed based
on the instructor’s purpose. For example,
if the instructor is using easy “warm-up”
questions or aiming for student mastery,
then some items with p-values above .90
may be warranted. In contrast, if an
instructor is mainly interested in
differences among students, these items
may not be worth testing.
• P-values below 0.20 are very difficult
items and should be reviewed for possible
confusing language, removed from
subsequent exams, and/or identified as an
area for re-instruction. If almost all of the
students get the item wrong, there is
either a problem with the item or
students were not able to learn the
concept. However, if an instructor is trying
to determine the top percentage of
students that learned a certain concept,
this highly difficult item may be necessary.
https://www.utexas.edu/academic/ctl/assessment/iar/students/report/itemanalysis.php
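The p-value and ideal-difficulty calculations above can be sketched in a few lines of Python. This is a minimal illustration; the counts and scores below are made up for the example, not district data:

```python
def p_value_mc(correct_count, total_count):
    """p-value for a multiple-choice item: proportion answering correctly."""
    return correct_count / total_count

def p_value_cr(raw_scores, max_points):
    """p-value for a constructed-response item: mean raw score / max points."""
    return sum(raw_scores) / len(raw_scores) / max_points

def optimal_difficulty(num_choices):
    """Midway between the chance level and a perfect score (1.00)."""
    chance = 1.00 / num_choices
    return chance + (1.00 - chance) / 2

# 18 of 24 students answered a multiple-choice item correctly
print(p_value_mc(18, 24))            # 0.75

# Constructed-response item worth 4 points, scored for four students
print(p_value_cr([3, 4, 2, 4], 4))   # 0.8125

# Four-alternative multiple-choice item: .25 + (1.00 - .25)/2
print(optimal_difficulty(4))         # 0.625

# True-false item: .50 + (1.00 - .50)/2
print(optimal_difficulty(2))         # 0.75
```

Items printing above 0.90 or below 0.20 would be flagged for review, per the guidance on the slide.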
2013 Grade 6 Math
District to Region
2013 Grade 6 Math
Region to County
Always Consider
Triangulation
Assessments
https://www.youtube.com/watch?v=0fn_vAhu_Lw
Balanced Assessments
• Same exams given
• Common Rubrics
• Assessments given during same time period
• Results are looked at through Data
conversations with building leaders.
• Conversations may also be grade-level or
subject-level wide. They may also be building-
wide and make their way to the District Data
Team.
The Ripple
Effect
Organize for
Collaborative
Work
Process
• Fishbone/Hypothesis
• 5 Whys
Next Steps
Getting Invested in the Data Team Work
The Break Down
Data Team Key Considerations
Relating it Back to Our Work
Know Your Most Important Variable = Your Students
https://www.youtube.com/watch?v=cL9Wu2kWwSY
Beginning
and End
Ulster BOCES Data Boot Camp
August 19th, 20th, and 21st, 2014
Thank you
Karen Brooks
• kbrooks@ulsterboces.org
• http://www.karenbrooks.wikispaces.com
• http://www.slideshare.net/kbrooks
