Instructional Data Analysis
Coordinator Instructional Data Analysis, Technology & Assessment
Focus for Today
• Instructional Data Analysis & Data Teams
– What are they?
– How do they function?
– How do they relate to me?
Breakdown for the Day
• 9:00 AM Begin
• 10:45–11:00 Break
• 12:00–12:30 Lunch
• 1:45–2:00 Break
• 3:00 PM Conclude
• How is data like these fruits?
Data Driven Teams
Data Driven Schools
Three Pillars of NYS Education
• NYS Common Core Learning Standards
• Data Driven Instruction
• Teacher and Leader Effectiveness (APPR)
So What is a Data Team?
• A data team can generally be defined
as a group of educators
collaboratively using data to identify
and understand opportunities for
improvement, then working together
to make changes that get
measurable results. Using protocols
for collaborative inquiry, the group
follows a process in which members
prepare, implement, and reflect on
data-informed actionable goals.
• This simple definition can be applied
broadly at many levels within a school district.
• At the classroom level, teachers use
data to identify student learning
problems and work together to plan
instructional changes that will yield
improvements in learning.
• At the school level, principals and
school improvement teams use data
to identify goals to drive
improvements in the ways teachers
collaborate and learn, thereby
improving results for all students.
• Within a district office, many
departments and leaders use data to
make decisions regarding the
management and efficiency of their operations.
Instructional Data Team Standards
• Gap Analysis
Always looking at process, not people.
Taking Multiple Measures Further
What data would we need in
each of these areas?
• Student Learning
• School Improvement
• What is the source for this data?
• Who has access to it?
District Data Team Samples
Collecting and Charting Data: Data Walls
Perceptions Are a Different Way of
Looking At Things
Things are not always what they first appear.
Need to keep focused
Data Collected Needs to be Consistent
in order to have Validity
• School Data a Comedy – Dirty Data – Example
5 Essential Questions Data Teams Ask
• Where are we now?
• Where do we want to be?
• How did we get to where we are now?
• How are we going to get to where we want to be?
• Is what we are doing making a difference?
Some Start with a
Data Team Survey.
This focuses the team's work.
It also helps to focus the essential
or guiding questions
that keep the team on track.
Simply put, what is
the problem of practice?
Definition of an Essential Question
"A guiding question is the fundamental query
that directs the search for understanding."
Guiding questions help provide focus.
What do Guiding Questions Look Like?
• Good guiding questions are open-ended yet
focus inquiry on a specific topic.
• Guiding questions are non-judgmental, but
answering them requires high-level cognitive work.
• Good guiding questions contain emotive
force and intellectual bite.
• Guiding questions are succinct. They contain
few words but demand a lot.
ADVICE FOR DEVELOPING GOOD GUIDING QUESTIONS
• Determine the theme or concept you want to explore.
• Brainstorm a list of questions you believe might
cause your team to think about the topic but that
don't dictate conclusions or limit possible
directions of investigation. Wait to evaluate and
refine the list until you have several possibilities.
• The question must allow for multiple avenues of investigation.
• Consider the six queries that newspapers
answer: who, what, when, where, how, and why.
Sample Guiding Questions
SAMPLE GUIDING QUESTIONS for DATA STUDY:
• How do student outcomes differ by demographics,
programs, and schools?
• To what extent have specific programs,
interventions, and services improved outcomes?
• What is the longitudinal progress of a specific
cohort of students?
• What are the characteristics of students who
achieve proficiency and of those who do not?
• Where are we making the most progress in closing
achievement gaps?
• How do absence and mobility affect assessment results?
• How do student grades correlate with state
assessment results and other measures?
Traver, R. (1998, March). What is a good guiding question?
Educational Leadership, pp. 70–73.
Ronka, D., Lachat, M., et al. (2008, December/2009, January).
Data: Now what? Educational Leadership, pp. 18–24.
The Data Coordinator is always a Member of the Data Team
What Impact does this Team Have?
• Build a vision for data use that is
grounded in positive student outcomes
• Articulate the vision for district-wide
systemic data use clearly
and repeatedly with all
stakeholders to paint a clear
picture of how the future will be
better if all engage in this work
• Develop and communicate a
sense of positive urgency
• Share the structure and function
of the District Data Team with all staff
• Talk openly with staff at all levels
in the district about stress they
may experience as change is implemented
• Actively listen: solicit and act
upon the concerns of staff
members to facilitate the change process
• Acknowledge losses that people
may feel as they shift established
habits and approach their work in new ways
Model the Process
• Lead by example, not by mandate
• Publicly demonstrate how
the District Data Team is
moving toward the vision
• Present the district-level
data overview with school-level
participants and other stakeholders
• Design district-level action
plans using the Data-Driven
Inquiry and Action Cycle
Manage the Process
• Conduct and maintain a
data inventory that includes
school-level data
– Coordinate the upload of
local data to the Data Warehouse
– Maintain an up-to-date data dictionary
– Disseminate relevant data
sets and displays for school-level use
• This common view shows
the p-value for each
question. For multiple
choice items, the p-value is
the proportion of students
who answered correctly.
For constructed response
items, the p-value is
the mean raw score
divided by the maximum
number of score points
for an item.
• District performance on
individual questions can
be compared to regional
levels to determine how
students performed on a
particular question.
• The larger the sample size,
the more accurate the comparison.
• Also referred to as the p-value.
• The range is from 0% to 100%, or more
typically written as a proportion from 0.0 to 1.0.
• The higher the value, the easier the item.
• Calculation: Divide the number of
students who got an item correct by the
total number of students who attempted it.
• Ideal value: Slightly higher than midway
between chance (1.00 divided by the
number of choices) and a perfect score
(1.00) for the item. For example, on a
four-alternative, multiple-choice item,
the random guessing level is 1.00/4 =
0.25; therefore, the optimal difficulty
level is 0.25 + (1.00 - 0.25)/2 = 0.625. On a
true-false question, the guessing level is
1.00/2 = 0.50 and, therefore, the
optimal difficulty level is
0.50 + (1.00 - 0.50)/2 = 0.75.
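The optimal-difficulty rule above reduces to a one-line function (a sketch; the function name is mine):

```python
def optimal_difficulty(num_choices):
    """Midpoint between the random-guessing level and a perfect
    score, per the rule above: chance + (1.0 - chance) / 2."""
    chance = 1.0 / num_choices
    return chance + (1.0 - chance) / 2

print(optimal_difficulty(4))  # 0.625 (four-alternative multiple choice)
print(optimal_difficulty(2))  # 0.75  (true-false)
```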
• P-values above 0.90 indicate very easy items
and should be carefully reviewed based
on the instructor's purpose. For example,
if the instructor is using easy "warm-up"
questions or aiming for student mastery,
then some items with p-values above 0.90
may be warranted. In contrast, if an
instructor is mainly interested in
differences among students, these items
may not be worth testing.
• P-values below 0.20 indicate very difficult
items and should be reviewed for possibly
confusing language, removed from
subsequent exams, and/or identified as an
area for re-instruction. If almost all of the
students get the item wrong, there is
either a problem with the item or
students were not able to learn the
concept. However, if an instructor is trying
to identify the top percentage of
students who learned a certain concept,
this highly difficult item may be necessary.
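The two review thresholds above could be applied to a set of item p-values like this (the function name, cutoffs as defaults, and the sample data are illustrative):

```python
def flag_items(p_values, easy_cutoff=0.90, hard_cutoff=0.20):
    """Flag items whose p-values fall outside the review
    thresholds described above."""
    flags = {}
    for item, p in p_values.items():
        if p > easy_cutoff:
            flags[item] = "very easy - review against instructional purpose"
        elif p < hard_cutoff:
            flags[item] = "very difficult - check wording or re-teach"
    return flags

# Illustrative item p-values for one assessment
sample = {"Q1": 0.95, "Q2": 0.62, "Q3": 0.15, "Q4": 0.48}
print(flag_items(sample))  # flags Q1 (very easy) and Q3 (very difficult)
```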
• Same exams given
• Common rubrics
• Assessments given during the same time period
• Results are looked at through data
conversations with building leaders.
• Conversations may also be grade-level or
subject-level wide. They may also be building-wide
and make their way to the District Data Team.