Multiple initiatives, multiple challenges: The promise and pitfalls of implementing data

Lea Hubbard a,*, Amanda Datnow b, Laura Pruyn a
a University of San Diego, United States
b University of California, San Diego, United States
* Corresponding author. E-mail address: lhubbard@sandiego.edu (L. Hubbard).

Studies in Educational Evaluation 42 (2014) 54–62
http://dx.doi.org/10.1016/j.stueduc.2013.10.003
Received 30 March 2013; received in revised form 13 August 2013; accepted 9 October 2013; available online 1 November 2013.
© 2013 Elsevier Ltd. All rights reserved.

Abstract

Data driven decision making has become a popular reform effort across the globe. New issues are arising with respect to data use as educators move toward teaching students 21st century skills, as the implementation of Common Core standards begins in the US, and as other efforts are undertaken to make learning more student centered. This article reports findings from a year-long case study of a US elementary school that placed data use at the core of its platform for school reform. The goal of the study was to determine how teachers implemented data use in concert with other reform initiatives. Interviews with educators, as well as observations of teacher team meetings, revealed that data-informed instructional planning occurred primarily in language arts and math, and not in other subjects. The requirements to implement multiple initiatives created many tensions that decreased teachers' ability and motivation to use data. How and when teachers used data was the result of a broader set of policies and structures at the federal, district, and school levels, as well as the capacity of the teachers and principal at the school. Implications for research and practice are discussed.

Keywords: Data use; Project based learning; School reform
Introduction
Increasingly, government agencies across the globe are
attempting to motivate educators to use data as a vehicle for
educational improvement (Earl & Fullan, 2003; McPhee & Patrik,
2009). An emphasis on data use has escalated in the Netherlands,
US, Canada, South Africa, New Zealand, and other countries
(Schildkamp & Lai, 2012). In the US, data-driven decision-making
(DDDM) was a major feature of the American Recovery and
Reinvestment Act of 2009 and of the controversial Race to the Top
competition. At all levels of the system, educators are attempting
to respond to these policy demands.
Moving data into useable knowledge to change practice,
however, has significantly challenged principals and teachers.
Prior research has shown that at the school level, principals play a
critical role in motivating teachers to use data and in providing
supports that facilitate data use (Earl & Katz, 2006; Ikemoto &
Marsh, 2007; Levin & Datnow, 2012; Mandinach, Honey, & Light,
2006). Yet some principals lack the knowledge to guide teachers in
data use, and many teachers have not had sufficient training in
how to understand and use data to inform their instructional
decisions. As such, data literacy among educators remains a
persistent concern.
Moreover, in the US, new issues are arising with respect to data
use as educators move toward teaching students 21st century
skills, implementing the new Common Core standards, and
undertaking other efforts to make learning more student-centered.
These initiatives will involve activities that engage students in
critical thinking, generating new knowledge, and learning through
project-based work – all skills which are not easily measured by
traditional assessments. As a result, what counts as ‘‘data’’ will
become increasingly wide-ranging (Levin, Datnow, & Carrier,
2012). Thus, getting teachers together to discuss evidence of
student learning and the development of new forms of assessment
would appear to be a critically important component of this shift.
We are now not only asking teachers to use data to inform decision
making, but also to use more complex forms of data and to
implement new instructional strategies.
Other challenges arise from the fact that efforts to implement
data-driven decision-making sometimes do not account for the
culture and structure existing within a school. Like other reforms,
data use is layered on top of already established routines and relationships, some of which run counter to evidence-informed practice. Spillane (2012) suggests that organizational routines are put in place, often with scripts to guide discussions about data and to transform teaching and learning. However, the ‘‘performative aspect of organizational routines,’’ that is, how a routine works
in daily practice and is enabled and/or constrained by ‘‘institu-
tional, historical and cultural situations’’ (p. 125) is usually not
addressed. We know little about these dynamics.
As a recent analysis of a collective body of research on DDDM
points out, we are faced with a ‘‘blunt understanding of data use’’
(Moss, 2012). The remedy, according to Coburn and Turner (2012),
is to conduct investigations into the practice of using data. These
authors urge that such investigations include a closer focus on the
micro interactions of those involved, as well as on the degree to
which participants are embedded in a context that is influenced by
macro-level policies and structures within the educational system.
We undertook such an investigation.
This article reports findings from a year-long case study of the
actions of a principal and teachers at Orchid Heights (a pseudonym, as are all names in this article, to protect anonymity), a US
elementary school that has placed data use at the core of its
platform for school reform. The primary research questions that
guided this study were:
• How do teachers implement data use in tandem with other reform initiatives?
• What are the actions taken and the challenges faced by educators in moving data into useable knowledge to inform instruction?
Using a sociocultural perspective, we focused on teachers’ actions
and beliefs, as well as on the institutional context in which they
worked, in order to understand how the educators at Orchid Heights
constructed data use. We found that the formally scheduled grade-
level team meetings, which were designed to allow for discussions of
data and lesson plans, did not always produce the intended results.
District benchmark data were used primarily to build students’
language arts and math skills and not for the purposes of planning
social studies and science lessons. This is not surprising since
students were assessed in these areas and not others. This meant,
however, that data-informed instructional decision making was limited to language arts and math rather than extending across the curriculum as intended. Moreover, requirements to
implement multiple other educational initiatives at the school
created tensions that undercut teachers’ ability and motivation to
more fully integrate data use into their daily practice. How and when
teachers used data was determined by the interaction of multiple
factors, including a broad set of policies and structures in place at the
federal, district and school levels, as well as the capacity of the
teachers and principal.
The remainder of the article is structured as follows: we begin
with a review of the relevant literature and then turn to an
explanation of the methodology and description of the district and
school setting. After a detailed report and discussion of the
findings, we present conclusions and consider their implications.
Review of the literature
Prior research on (1) data use and the role of teacher
collaboration and (2) the challenges of balancing multiple reform
demands and building capacity provided the framework for our
investigation. Taken together, studies in these two areas helped
expand our awareness of the kinds of issues principals and
teachers at Orchid Heights faced in their efforts to use data to
inform instructional decision-making.
Data use and the role of teacher collaboration
Broadly speaking, data-driven decision-making is the process
by which administrators and teachers collect and analyze data to
guide educational decisions (Ikemoto & Marsh, 2007). While each
locale may take a different approach to data use, the underlying
belief is that carefully analyzing evidence about student learning,
such as using standardized test score data and/or student work,
will allow teachers to target instruction toward students’
individual needs (Mandinach & Honey, 2008). The theory is that
by working together, teachers will be able to assist each other in
making sense of the data, engaging in joint action planning, and
sharing instructional strategies.
Overall, it is clear from prior research that evidence of student
learning needs to be actively used to improve instruction in
schools. Research on high-performing districts reveals that such
districts integrate the examination of data and evidence-informed
decision making into daily school and district processes (Foley & Sigler, 2009; Leithwood, 2008). Many districts have invested in
management information systems, benchmark assessments, and
professional development to build expertise and capacity at the
school level (Datnow, Park, & Wohlstetter, 2007; Hamilton et al., 2009; Supovitz & Taylor, 2003). Some districts have also contracted
with external agencies and consultants to assist in their capacity-
building efforts district-wide (Marsh et al., 2005).
Providing structured time for collaboration is one of the ways
that many districts and schools attempt to support teachers’ use of
data (Honig & Venkateswaran, 2012; Mandinach & Honey, 2008; Means, Padilla, & Gallagher, 2012). In fact, a majority of high data
use districts provide structured time for collaboration (Marsh,
2012; Means et al., 2010). Opportunities for cross-school interac-
tion are a key ingredient of support for data use (Marsh, 2012).
The presence of a leader who promotes a culture of inquiry
within teacher work groups can aid in making conversations about
data more productive (Horn & Little, 2010; Young, 2006). This is in
part because the knowledge within and among teacher groups can
vary widely, leading to uneven results. For example, teacher teams
with limited expertise can misinterpret or misuse data, or work
together to perpetuate poor classroom practice (see review by
Daly, 2012). On the other hand, groups with a great deal of
collective expertise can be much more generative of learning (Horn & Little, 2010).
Even with the scaffolds of support that many districts and
schools now provide, the process of engaging in DDDM has proven
to be quite complex. Data from assessments may show patterns of
student achievement, but they do not tell teachers what to do
differently in the classroom (Dowd, 2005; Supovitz, 2009).
Moreover, some argue that the data from large-scale assessments
may be useful for school and system planning, but they are less
useful at the teacher or student level (Rogosa, 2005; Supovitz,
2009). The use of assessment data can be powerful at the teacher
level, but a great deal depends on the level of inquiry that occurs
around the data.
Multiple reforms and capacity building for change
As we noted above, data-driven decision making is often
implemented as one of numerous reform initiatives in a school or
district. This is not surprising, as many educators and scholars see
data use as part of a larger process of continuous improvement.
Thus, schools may be implementing various reforms (e.g.,
implementing small learning communities, adopting a new math
program) and using data to track their progress toward the goals of
these initiatives.
Reform efforts can be planned in ways that are mutually
supportive and cohere around a common goal. Prior research
suggests, however, that schools sometimes face challenges
balancing multiple reform demands. This is especially the case
when reforms do not cohere and result in conflicting directions of
change. Almost fifteen years ago, school change expert Michael
Fullan (1999) noted that the biggest problem facing schools was
fragmentation and overload. Even with the move toward district
coherence in the past decade, many schools still struggle with
fragmentation. Hatch (2002) explains that while it is possible to
coordinate multiple reforms in ways that would support coherence
and capacity to improve student learning, doing so is not simple.
He argues that this is in part because schools lack capacity:
In many ways, the push to make improvement programs
available to more and more schools is fueled by the hope that
these programs can help many schools develop the capacity to
change. But ironically, the implementation of these improve-
ment programs is difficult precisely because schools lack the
capacity to change (p. x).
Although Hatch’s findings were in reference to comprehensive
school reform models, the same conclusions about capacity hold
true in the current reform era. In a more recent article, Madda,
Halverson, and Gomez (2007) note that many district initiatives
conflict with each other or with existing practices in schools. They
note that districts need to consider how new initiatives are likely to
fare in the actual contexts of use and to build this knowledge into
their designs of instructional coherence.
When teachers do not have the capacity and are not provided
with the necessary guidance to integrate reforms in a meaningful
way, they are likely to attend to certain aspects of reform while
ignoring others. The sense-making teachers engage in around
reform happens both individually and collectively (Coburn, 2001).
Coburn (2001) finds that patterns of interaction among teachers
influence how teachers adopt, adapt, or disregard reform
initiatives, thus mediating the influence of these reforms on
classroom practice. Teachers’ own prior knowledge of and
experiences with reform also mediate how they respond to new
initiatives (Spillane, Reiser, & Reimer, 2002). Teachers tend to focus
on the aspects of new reforms that are familiar to them, leaving
aside the aspects that are difficult to understand or implement
(Spillane et al., 2002). For these reasons, leaders play a key role in
helping teachers find coherence among reforms and in assisting
them in learning how to integrate reform measures into their
current practices. Capacity building efforts are critical, and districts
can play an important part in this area (Cawelti & Protheroe, 2007).
In fact, high-performing districts are characterized by a heavy
investment in capacity-building among leaders and teachers,
particularly around instructional improvement (Leithwood, 2008).
In sum, the literature identifies factors that facilitate or impede
data use, especially in concert with other reform initiatives.
Teacher collaboration can play an important role in getting data
use into practice, but a great deal depends on the capacity of the
teachers and leaders – as we will see in the case of Orchid Heights.
Methodology
This study was conducted between October 2011 and May
2012. We adopted a case study approach because this methodolo-
gy is an ideal strategy for exploring situations in which the
intervention being examined (here, data-driven decision-making)
has no single set of outcomes (Merriam, 1998; Rallis & Rossman,
2001; Stake, 1995; Yin, 2009). Case studies provide opportunities
to understand phenomena in their real-life contexts. In this
instance, a case study approach allowed us to investigate the
perspectives of Orchid Heights educators involved in DDDM, the
everyday practices, behaviors, and ideologies that constructed
instructional decision-making at the school, and the challenges the
teachers faced in trying to make use of data.
All teachers at Orchid Heights were expected to use DDDM and
also to work on implementing multiple other reform initiatives.
Each teacher was given time in the instructional day to collaborate
with grade-level colleagues. We focused our investigation on the
regularly scheduled grade-level meetings in which teachers met to
plan their lessons and data use. Meetings were held weekly and
each lasted about 1½ hours. At the recommendation of the
principal, we observed the fourth-grade team, which consisted of
two full-time teachers, and the first-grade team, which consisted of
two full-time and two part-time faculty. (The part-time faculty
were not required to attend all of the meetings. They were present,
however, for the majority of the meetings we observed.) The
principal judged these two grade-level teams as ahead of other
Orchid Heights faculty in terms of designing and implementing
reform initiatives at the school.
We conducted interviews once with each of the four full-time
faculty in the first and fourth grades, twice with the principal (once
at the beginning of the school year and once at the end), and once
with the district superintendent. The same semi-structured
interview guide was used for all teachers. One researcher conducted the principal and superintendent interviews. We used
a semi-structured interview guide for all interviews because doing
so allowed us to include questions shaped by our review of the
literature on DDDM, without completely sacrificing the benefit of
flexibility that a more conversational interview approach permits
(Charmaz, 2006). Our questions were designed primarily to elicit
information that would help us understand participants’ actions as
well as the context in which they took place. So, for example, we
sought to find out how teachers defined data and which factors
seemed to support, and which to challenge, data use across the
curriculum. Except for one teacher interview, two members of our
research team conducted each teacher interview. All tapes were
listened to by at least two members of the research team, and all
were transcribed verbatim. Two members of the research team
coded each transcript to ensure consistency in analysis.
Fourteen grade-level meetings (nine fourth grade and five first
grade) were observed over the course of the academic year and in
most cases, there were two researchers present. We used an
ethnographic approach in our observations. We audio recorded the
entirety of each meeting, paying attention not only to what
teachers said about their practice but also to the interaction among
teachers and how they described the social context in which they
worked. Informed by the concepts in the literature review, we
transcribed, coded and analyzed this data as well in order to
identify themes.
We used sociocultural constructivist methodology to interpret
the data from this study. This approach recognizes that it is
essential to ‘‘rely as much as possible on the participants’ views of
the situation being studied’’ (Creswell, 2009:8) and to place an
emphasis on the phenomenon being studied through the analysis
of the social contexts in which the data are collected (Charmaz,
2006). We were able to take advantage of the serendipitous nature
of qualitative research by letting our respondents take us in
directions that we had not predicted, while also in some cases
building upon and in others questioning the knowledge we had
gathered from previous research. For example, respondents
pointed us to the impact of the multiple initiatives on DDDM,
an issue whose significance we had not anticipated. Theorizing in
the interpretive tradition, as suggested by Charmaz (2006), we
used what we observed in meetings and gathered in interviews to
‘‘delve into the implicit meanings and processes’’ (p. 146)
associated with the implementation of DDDM at Orchid Heights.
The district and school context
Orchid Heights Elementary is located in a K-6th grade school
district that serves a predominately Caucasian population (80%).
Student performance measures on the most recent state assess-
ment test place this district among the highest achieving school
districts in the state. We chose to investigate this district because
the reform-minded superintendent and assistant superintendent
have pushed aggressively for multiple reform initiatives, including
data use, over the last three years in an effort to not only maintain
the district’s legacy of distinction (all of the district’s schools have
been recognized as distinguished schools), but also to further
improve the nine district elementary schools. We chose Orchid
Heights for this study because the superintendent viewed the
principal as a strong leader who fully supported district initiatives,
and thus potentially put Orchid Heights ahead of many other
district schools in implementing reform measures.
Generally, the district’s initiatives have revolved around efforts
to align teaching and learning with data. Data use was considered
essential not only because district administrators viewed it as a
savvy strategy to improve student learning, but also because
federal No Child Left Behind (NCLB) legislation demanded that
schools use data to inform their instruction. Beginning in 2007–
2008, under NCLB, schools were required to track data related to
school demographics, assessments, accountability, and teacher
quality. Moreover, under NCLB’s accountability provisions, schools
must make ‘‘adequate yearly progress’’ for all student groups,
including English Language Learners.²
Beginning several years ago the district began administering
benchmark assessments in language arts and math. Data from
these assessments were made available to teachers soon after each
assessment, with the assumption that they would guide data
driven decision making. In addition, the district required schools to
put student-centered learning in place as evidenced by the
implementation of Project Based Learning (PBL). The recent push
by the federal government for the adoption of Common Core
Standards (CCS) is likely to have motivated the district superin-
tendent to use PBL as a way to support teachers in their efforts to
meet those new standards. CCS clearly specifies what is expected of
students at each grade level. PBL and CCS are considered by some to
mesh well because PBL emphasizes critical thinking skills and
supports interdisciplinary instruction (Markham, 2012), key
dimensions of the CCS. The district believed that PBL, with its
thematic approach, offered teachers the opportunity to embed
English language arts and math instruction into other subject
areas. The district also encouraged each school under its auspices
to select other initiatives that would support student achievement.
We describe the initiatives chosen by Orchid Heights below.
To support teachers in the implementation of reform initiatives,
the district asked that all teachers be given the time to meet weekly
in grade-level teams to plan their instruction. Teachers were
expected to work collaboratively, to engage in team-level planning
for PBL units, and to facilitate the work of integrating discussions of
data into PBL at their school.
Orchid Heights
Orchid Heights serves a more diverse population of students
than other schools in the district. The school demographics reveal
that approximately 54 percent of the students are Hispanic/Latino,
43 percent are White/European American/Other, 2 percent are Asian American/Pacific Islander, and 1 percent are African
American. Moreover, roughly 44 percent of Orchid Heights
students qualify for a free or reduced-price meal subsidy. The
most recent test data indicate that the school’s scores are well
above the state average. Despite this success, the principal and
teachers recognize that all students are not achieving, and this is
particularly true of their English Language Learner (ELL)
population. For example, state test results from the 2010–11
school year reveal that only 39 percent of the school’s ELL students
scored at proficient or advanced level on the state test in English
Language Arts, as compared with 87 percent of the non-Hispanic
(Caucasian) student population (School Accountability Report
Card, 2011:5).
To comply with the district’s request that schools select
additional initiatives that would help further improve their
schools, Orchid Heights chose the International Baccalaureate
Program (IB).³
According to the principal, teachers wanted a higher
profile for their school and they hoped the IB program would help
them attain that status. They also wanted to be able to make use of
student assessments based on data from sources other than test
scores. The IB curriculum, with its focus on units of study and the
research process, allows teachers to construct a variety of
assessments. The IB program is consistent with the school-wide
goal of helping students develop the knowledge and skills
necessary to become global citizens who are prepared for the
future. The principal explained that she intended to integrate the
district mandated PBL initiative with the IB program to help the
school achieve IB certification.
Orchid Heights also chose to adopt the Guided Language
Acquisition Design Program (GLAD),⁴ an initiative that offers strategies to support ELL students. Several grade levels also use iPads with their students and, most recently, a new physical
education initiative has been introduced for all grade levels. These
school-specific initiatives were implemented simultaneously with
Orchid Heights’ already in-place reform emphasis on data use,
developing students’ basic skills, and teaching to the current state
standards, as well as the new Common Core standards. In adopting
these multiple initiatives, the principal expected that teachers
would integrate the initiatives and not consider each one an ‘‘add
on.’’ Teachers would ideally embed data use in all of the reforms,
and doing so would, hopefully, improve test scores.
In general, Orchid Heights teachers felt that they were part of a
culture that supported reform. One teacher described her colleagues
as ‘‘very collaborative. . .with a mindset that change is good. We need
to embrace it. If everyone works together, we can overcome
whatever the perspective is out there in the community [about this
school].’’ One of the first-grade teachers explained that teachers
were anxious to counter the public perception of the school,
‘‘because our school’s always been. . .they used to call it ‘[Orchid]
Hole’. . .it has always been the lower performing and the under-
performing. . . the. . .the stepchild of the district kind of thing.’’
Despite their receptivity to change, some teachers admit they are
‘‘growing weary’’ and feel they cannot ‘‘wrap their heads around one
more thing.’’ According to a fourth-grade teacher, ‘‘it feels like we are
building the plane while flying it.’’ Responsibility for implementing
multiple initiatives simultaneously creates multiple tensions for
teachers and precludes some important opportunities to use data.
We discuss these tensions in the next section.
² See ‘‘Improving Data Quality for Title I Standards, Assessments, and Accountability Reporting: Guidelines for States, LEAs, and Schools (Non-Regulatory Guidance),’’ U.S. Department of Education, Office of Elementary and Secondary Education, April 2006, pp. 6–7, found on the U.S. Department of Education website, http://www2.ed.gov/policy/elsec/guid/standardsassessment/nclbdataguidance.pdf. Accessed on 11/23/12.
³ The International Baccalaureate is a non-profit educational foundation. Its four programs for students aged 3 to 19 ‘‘help develop the intellectual, personal, emotional and social skills to live, learn and work in a rapidly globalizing world’’ (http://www.ibo.org/general/who.cfm. Accessed 11/24/12).
⁴ Project GLAD is a professional development model to support literacy and language acquisition instruction. Originally developed in the Fountain Valley School District in California, GLAD is now supported by independent trainers. For more information, see http://www.projectglad.com. Accessed 8/10/13.

Findings

In attempting to understand how teachers implemented data use, we found that the first- and fourth-grade teachers we studied at Orchid Heights Elementary School readily embraced data-driven decision-making in the areas of English language arts and math,
using district benchmark data and student work to guide their
classroom practice in these two subjects. This data was made
available to them through a web-based data management system,
which teachers could access individually. To help support
instruction, the principal also constructed a detailed report of
student data for each teacher.
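For readers less familiar with such systems, the following sketch illustrates, in purely hypothetical terms, one way benchmark results exported from a web-based data system might be summarized teacher by teacher. The records, column names, and proficiency cutoff are our own illustrative assumptions and do not come from the district studied here.

```python
# Hypothetical sketch only: summarizing benchmark results per teacher, roughly
# the kind of per-teacher report described above. The records, column names,
# and proficiency cutoff are illustrative assumptions, not the district's data.
import pandas as pd

# Each row represents one student's score on one benchmark assessment.
benchmarks = pd.DataFrame({
    "teacher": ["T1", "T1", "T1", "T2", "T2", "T2"],
    "student": ["s1", "s2", "s3", "s4", "s5", "s6"],
    "subject": ["ELA", "ELA", "math", "ELA", "math", "math"],
    "score":   [72,    58,    90,     65,    80,    49],
})

PROFICIENCY_CUTOFF = 70  # assumed score needed to count as proficient

# One row per teacher and subject: mean score, percent proficient, group size.
report = (
    benchmarks
    .groupby(["teacher", "subject"])["score"]
    .agg(mean_score="mean",
         pct_proficient=lambda s: 100 * (s >= PROFICIENCY_CUTOFF).mean(),
         n_students="count")
    .reset_index()
)
print(report)
```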
During interviews and in the grade-level team meetings we
observed, the teachers indicated verbally and through their body
language (e.g., affirming through facial expressions and/or sharing
knowing glances with their colleagues) that they were familiar and
comfortable with using test score data to identify their students’
areas of strength and weakness and confident in their ability to
orient their own teaching strategies to address the needs the data
helped them identify. Teachers also knew how to efficiently access
the test score data they wanted. This data was not used, however,
in the areas of social studies and science. In those subjects,
although teachers collected various forms of data and verbalized a
willingness to apply assessment results to classroom practice, they
lacked training in how to use the benchmark data in English
language arts and math to inform instruction for social studies and
science. This is perhaps not surprising given the fact that English
language arts and math were the focus of these assessments, and
their results were reported in a way that lent itself more to
skill building in core subject areas rather than to planning project
based lessons. We found that despite strong leadership and
support from their principal, by the end of the academic year,
teachers had made limited progress integrating data into their
planning for instruction across the curriculum.
Below, we examine these results in greater detail. We consider
the consequences of our findings, including how the compartmen-
talized use of data affects student learning, in the conclusion.
Challenges teachers face in moving data into useable knowledge for
guiding instruction
Understanding teachers’ data use in tandem with other reform
initiatives revealed that Orchid Heights teachers faced a staggering
set of challenges. Like all other teachers in their district, they were
expected to comply with a large number of instructional
requirements in a short period of time, identify student needs,
meet specific grade-level standards, and establish classroom goals
for the year, all guided by data they had prior experience using.
They also were expected to systematically monitor students’ skill
development in English language arts, using the district-adopted
textbook series and its ‘‘theme tests.’’ In addition to meeting these
district-wide expectations, they had site-specific responsibilities.
Because Orchid Heights had chosen to adopt an IB program, the
teachers had to successfully implement six units of IB curriculum
in a one-year period – and they would need to continue to do so
annually in order to maintain IB certification. The IB program
stipulates that there is to be little, if any, overlap between the units,
and teachers must conduct ongoing planning and assessment of
the units. The need to meet such stipulations in order to achieve
and maintain IB certification placed significant demands on Orchid
Heights teachers.
The district-wide adoption of Project Based Learning exacted its
own, different demands at the school. The district provided a
district-wide professional development (PD) event at the begin-
ning of the year to help teachers become acquainted with PBL and
learn how to implement PBL units in their classrooms. The training
had some benefits but teachers admitted ongoing struggles with
implementing PBL as they sought to learn how to ‘‘do projects,
motivate students to learn more, work in collaborative groups and
prepare them for beyond elementary school,’’ as one teacher
explained. We found that over the course of the academic year, the
teachers, who were already feeling stressed over requirements to
use data to develop students’ basic skills, improve standardized
test scores, and implement IB and PBL, made very little progress in
integrating initiatives and engaging in the use of test score data
across the curriculum.
Difficulties integrating data use across the curriculum
For Orchid Heights teachers, looking at benchmark data and
responding to what that data told them about their students’ needs
occurred systematically in two subjects, English language arts
(ELA) and math – the two subjects for which students’ proficiency
is formally and routinely assessed. Areas revealed by test score
data as requiring more skill building were addressed during the
separate and bounded instructional time for these two subjects. In
math, knowing from test scores that students needed more
support with rounding numbers, for example, teachers tested and
retested them on their ‘‘rounding ability,’’ creating new lessons to
reemphasize the concept and doing ‘‘drills over and over again’’. To
help improve their multiplication skills, students used a computer
program to self-test their level of understanding. One teacher
recalled: ‘‘I was down to four students who [hadn’t] received ten
points yet – but I kept revisiting old goals so I could put their name
up on the board to keep all students motivated.’’
The teachers worked ceaselessly to build their students’ basic
skills in English language arts and math, but such instruction
remained mostly in its own instructional silo, held apart from other
subject areas. Although it is understandable that language arts and
math would receive the majority of instructional attention given that these subjects are the major focus of high-stakes state accountability tests, this compartmentalization limited instructional
change since benchmark assessment data were not integrated in
planning across multiple subject areas. We found one important
exception to this pattern. In the latter part of the academic year, the
first-grade teacher team discussed a plan to incorporate ELA and
ELD (English Language Development) into their IB unit on ‘‘How the
World Works.’’ While their main focus was to develop – within the
IB – a PBL unit on weather and natural disasters as a way of
teaching students about matter, and to discuss the water cycle as
part of their science unit, they also intended to ‘‘increase the trans-
disciplinary aspect of the unit.’’ As one of the teachers put it, ‘‘We
want to use weather and matter to teach idioms such as, ‘March
always comes in like a lamb and goes out like a lion,’ and ‘It’s
raining cats and dogs.’’’ The team had recognized the importance of
integrating students’ English language needs into their IB/PBL unit
– clearly an important first step toward using data effectively to
support the acquisition of knowledge. However, in practice, the
teachers apparently found the task too complex. That kind of data
use, they admitted, remained ‘‘not yet fully integrated with the
reform initiatives.’’
Conflicts meeting the needs of ALL students
Absent an understanding of how to integrate English language
arts and math benchmark data into subject area instruction,
specifically into the PBL units designed to teach social studies and
science, teachers struggled to meet the needs of all students. For
the most part, PBL planning led to lessons in which all students
were treated as though they would benefit from the same
instruction. One of the fourth-grade teachers recognized this as
a problem, noting that the current PBL design did not address the
needs of her under-achieving students. These students were falling
behind during the PBL instructional time because she and her
colleague had not differentiated the instruction to provide
individualized support for those who needed it.
This teacher considered improving students’ access to knowl-
edge an essential goal of any initiative and she particularly worried
about her English language learner (ELL) population. She ques-
tioned her colleagues frequently about how to provide instruction
within the PBL framework for students who have been identified as
struggling to learn English, much less other subjects. She described
Francisco, an English language learner who was at a significant
disadvantage when the class started a PBL unit on fossil fuels. For
him, fossil fuels were an entirely new and very abstract concept.
She believed the vocabulary was too complex for him. The PBL unit,
which required that students immediately jump into the work,
presumed an in-depth understanding of the concept and the
vocabulary and did not provide sufficient scaffolds for English
language learners. She believed that since Francisco had no
previous opportunity to acquire this understanding, he could not
contribute in a meaningful way to his collaborative group. The
lesson had not been designed in a way that differentiated for
students’ varied language skills.
The benefits of contextualizing basic skill development within
PBL have been recognized by others (Buck Institute, 2012). But the
Orchid Heights teachers, who were not provided training in how to
integrate data on students’ language arts and math achievement
with science and social studies instruction during PBL, perceived
the struggles of students like Francisco as inevitable:
It is nearly impossible for those kids who are reading far below
their level to meet the grade-level standard. The reality is that
there are some kids for whom that goal is simply not attainable,
at least not this year. How much do we really want to test and
retest them?
Fourth grade teachers did not feel as though students at other
achievement levels benefited fully from the first PBL unit they
designed either. Teachers pointed to the collaborative group
structure of the PBL as the source of the problem. One teacher,
describing a group in her class, noted that ‘‘the kids ignored the
student who came prepared in advance and had all information
and research done.’’ As this teacher explained, ‘‘Collaboration is an
issue at this age – teaching them this concept of collaboration and
managing it has been a challenge.’’
The observational data they had gathered by watching students
in the course of the PBL lessons led teachers to conclude that
students struggled with collaboration. This conclusion was indeed
important and would help guide future planning. However, it was
also clear the teachers, despite their best intentions, had not yet
learned how to integrate their analysis into their PBL planning in a
way that would lead them to differentiate instruction, arrange the
groups somewhat differently, and make curricular adjustments to
improve learning opportunities for all students.
How data use is shaped by the presence of multiple initiatives
One tension that hindered the integration of data use across
subject areas was related to what teachers perceived to be the
incompatibility of multiple initiatives. During the initial months of
implementation of the new initiatives, teachers were preoccupied
with trying to understand whether and to what extent IB and PBL
shared a coherent logic. IB Unit planners (forms used to plan units)
used language that was obtuse, academic, and not user-friendly, according to the teachers. Teachers felt that the IB unit planners did
not align with the PBL unit planners. Both IB and PBL were ‘‘jargon
heavy’’; moreover, they each used different language, which
confused teachers. By mid-year, teachers were still questioning the
IB-PBL fit.
Planning units for both IB and PBL took double the time and this
created another tension. One fourth grade teacher noted that it
took a few months to do one IB unit well. To accomplish the
development, implementation, and assessment of six IB units in a
period of only nine months was viewed as ‘‘unrealistic and
unreasonable.’’ Moreover, IB units were not permitted to last more
than six weeks, which meant that two units were not allowed to be
implemented simultaneously – obviously presenting additional
challenges.
Teachers expressed concern over another misfit between
initiatives. Data generated from benchmark test results had set
them on a path of improving students’ basic skills, but the
requirements associated with IB and/or PBL instruction seemed to
call for a different approach. As one teacher said,
This is where I am struggling – what we were doing [before PBL,
namely re-teaching basic skills], makes sense to me, but PBL
seems like a different realm. I’m on board with PBL but they
[teaching basic skills and PBL] seem like two diametrically
opposed things.
Another teacher commented, ‘‘now, with the requirement to
implement PBL and IB, it’s kind of about pulling the two together –
skills and projects; choice, collaboration, motivation, and stan-
dards. It’s hard – and management is difficult.’’ The perceived
dissonance that plagued teachers is captured in this question,
posed by a fourth grade teacher: ‘‘How do you teach basic skills and
teach kids that are below grade level, using PBL?’’
When we asked first-grade teachers whether data regarding
students’ achievement in language arts and math was integrated
with PBL, one answered, ‘‘Not yet. I have to be honest, not with us.
Science and social studies standards, yes. But they’re not directly
tested until what, fifth grade?’’ One of the teachers at this grade
level confessed that when they planned their PBL units, ‘‘I don’t
think we looked at it [data] at all for PBL.’’ Similarly, a fourth-grade
teacher explained that data that identified students’ language arts
and math needs were not considered when they organized their
PBL instruction.
It is important to note, however, that the teachers did collect
data during and after PBL units. Teachers administered a variety of
tests to assess students’ content knowledge in social studies and
science. They used rubrics, videotaped student presentations, and
asked students to reflect on their content learning using formative
and summative assessments. (The IB program mandates summa-
tive assessment.) Ideas for PBL assessment often were borrowed
from the Buck Institute.⁵
However, teachers did not use the district
benchmark results – data that offered them information on
students’ language and math strengths and needs – to guide them
in constructing their PBL units, as the principal had intended.
Continued divisions between content areas
Teachers dedicated about 30 percent of their time to PBL and
about 70 percent to the development of their students’ basic
English language and math skills because, according to one
teacher, ‘‘Students have to have skills and knowledge before they
can take [the knowledge] and use it in a project. So there should
always be a certain amount of frontloading/direct instruction.’’
Interestingly, over time, although they continued to struggle to
understand how the data they gathered on students’ academic
needs in language and math could be integrated within their PBL
units, teachers we interviewed began talking about IB and PBL
interchangeably. In preparing for the third PBL unit, and likely due to
diligent efforts to reconcile the two initiatives, one teacher
commented, ‘‘I now see IB and PBL as the same thing because IB
incorporates PBL. . .IB is just a specific program that uses PBL. . .it
embraces the PBL model. . .so, as far as I’m concerned, there is no
difference between the two.’’ This dramatic shift in perception
occurred alongside the more entrenched perspective that the
teaching and learning of basic skills and PBL/IB would remain separate efforts, and only the former would be systematically guided by data. This presumably limited the power of data-informed decision making at the school.

⁵ The Buck Institute provides training and support for schools implementing PBL. The Buck Institute for Education (BIE) has created free materials – ‘‘FreeBIEs’’ – such as planning forms, student handouts, rubrics, and articles for educators to download and use to design, assess, and manage projects. The teachers claimed these resources were incredibly valuable.
Data use aligned with issues related to teacher accountability
and exacerbated the divisions between subject areas. Language
arts and math are subjects tested with the state’s high-stakes
standardized test annually beginning in second grade; science is
not assessed until fifth grade and social studies is not assessed in
elementary school at all. Thus, the first- and fourth-grade teachers
felt freer to teach these subjects within the PBL structure. One
teacher used a Venn diagram and described the separation this
way:
Now, as far as the [English Language Arts] Standards go, at this
point, it is still textbook/basic skill development . . . for ELLs –
out here [outside PBL]. At this point we’re in transition. You
have math out here. And within this IB/PBL circle you
have. . .you might have some social studies, and you have some
science that is incorporated into the PBL. Math will never be
totally integrated. [teacher’s emphasis]
Teachers felt that the current instructional arrangement would
most likely continue although, as they spent more and more time
on social studies and science and less time on the ELA component,
they worried that soon ‘‘we will have to figure out how to
incorporate the ELA skills into those areas and make it more
seamless.’’ Despite their persistent effort and motivation, teachers
admitted that they were not yet adept at integrating the
curriculum and using data generated from ELA and math to
support social studies and science instruction.
Discussion: accounting for the challenges and the successes
How do we account for the instructional divisions and
difficulties in the use of data found at Orchid Heights? Teachers
pointed to the effects of several important tensions. One was a lack
of resources. Apart from the state-adopted textbook, teachers had
few primary resources from which to draw ideas and content for
the PBL units. Some content is available online, but it takes time to
find and even more time to figure out how to incorporate it into
lesson plans. Moreover, the textbooks they had did not ‘‘match up’’
with the goals of their ‘‘IB and PBL units’’. One teacher noted:
If we had more ELA materials that went with the IB units or PBL
units, then the units could be in a more literature form – that
would lend itself well. It’s just a transition. At least our school is
transitioning away from the ELA text and moving on towards
novel sets and integrated IB units, but we’re not there yet.
The school librarian was very helpful, but the teachers noted
that she had not yet been trained in the IB curriculum. And budget
constraints were expected to limit what new materials the school
would be able to purchase. Orchid Heights’ teachers had to ‘‘wing
it’’; they were not given a budget to fund their reform efforts.
Despite some positive developments, such as receiving grants and
assistance from college students, teachers found it difficult to
provide all students with access and equity within the PBL
structure.
Capacity issues
Individual capacity varied across teachers as well. One of the
fourth-grade teachers was in a Master’s degree program specializ-
ing in PBL. This training gave her additional insight into how best to
develop, implement, and assess PBL units. She had access to
monthly project peer review that provided her with feedback from
others engaged in PBL. As a result, she was much more
knowledgeable than her colleagues and able to add a great deal
of support to her fourth grade team. Most other Orchid Heights
teachers were struggling to plan PBL lessons in their grade-level
teams. They valued the autonomy the principal gave them, since it
left them free to make their own decisions and set their own
timeline for PBL implementation, but they lacked a deep
understanding of PBL and how to integrate ELA and math data
across the curriculum. They wanted an opportunity to connect
with other teachers who also were implementing PBL projects.
They felt they would be able to build their own capacity to do the
work if they were able to see examples of units their peers had
developed, learn how these teachers created their curriculum, gain
a sense of what their peers wrestled with, and examine others’
successes and challenges with incorporating reading comprehen-
sion, language arts, and skill development.
While teacher capacity affected the implementation of PBL and
data use, it is important to note that teachers were constrained in
their use of data simply because standardized data was not
available for social studies and science. Indeed, these areas were
not assessed on the district benchmarks, just as they were not
assessed in most districts, nor were they assessed at the state level
with the exception of science in one elementary grade. Teachers
were clearly challenged by the absence of data.
The importance of leadership
The Orchid Heights principal was highly sensitive to the value of
data and did what she could to support its use among her teachers.
She gave them time to work in grade-level groups to plan their units
of instruction. She analyzed benchmark data, then ‘‘chunked the
data,’’ as she explained, sorting students by demographics,
achievement level, and economic status to maximize the planning
for instruction for each child. Students who were identified as
language learners, receiving free or reduced-price lunch, and below
basic skill levels were flagged to receive immediate attention
because they were seen as being in triple jeopardy of falling behind.
The principal collaborated with teachers to identify the students at each grade level who were performing below basic and helped them interpret those students' instructional needs. She ensured that low-performing students were placed in groups where they received focused attention for 40 minutes a day, four days a week.
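The logic of this ‘‘chunking’’ can be pictured with a brief, hypothetical sketch; the field names, category labels, and sample records below are our own assumptions rather than data from the study. Students meeting all three risk criteria are flagged for the focused small-group time.

```python
# Hypothetical sketch only: flag students who are English learners, receive a
# free/reduced-price lunch, AND score below basic, mirroring the "triple
# jeopardy" grouping described above. Field names and values are assumptions.
import pandas as pd

students = pd.DataFrame({
    "student":  ["A", "B", "C", "D"],
    "ell":      [True, False, True, True],         # English language learner
    "frl":      [True, True,  False, True],        # free/reduced-price lunch
    "ela_band": ["below_basic", "basic", "proficient", "below_basic"],
})

# A student is flagged only when all three risk indicators are present.
students["triple_jeopardy"] = (
    students["ell"] & students["frl"] & (students["ela_band"] == "below_basic")
)

# These students would be placed in the 40-minute, four-day-a-week groups.
intervention_group = students.loc[students["triple_jeopardy"], "student"].tolist()
print(intervention_group)  # ['A', 'D']
```

In practice the principal did this sorting with the district's own reporting tools rather than with code; the sketch is meant only to make the combination of indicators concrete.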
While the principal supported teachers in their implementation of IB and PBL and in their work with ‘‘high risk’’ students, that attention, too, was compartmentalized and limited to basic skills that were divorced from the broader curriculum. Although the principal's actions were clearly laudable, one concern is that by focusing on ELA and math data, like her teachers, she gained no systematic knowledge of the students' grasp of social studies and science.
The principal acknowledged that success with data use had thus far been limited, but based on her routine observations of
classroom practice, she felt that the faculty had made progress over
the course of the year. Indeed the entire staff was using data to
inform instruction in language arts and math, everyone was
implementing IB, and the majority of the teachers were
implementing PBL. She attributed these successes to the way
she rolled out the initiatives, which she described as slow. She
exerted some but not a lot of pressure on teachers to achieve
during this first year of implementation. She deliberately tracked
changes in benchmark data in order to show the teachers the
extent to which their efforts were paying off in achievement gains.
She was able to show through enrollment figures and feedback
from parents how the adoption of these multiple initiatives had
gained the school greater visibility in the community and earned
kudos from parents who were glad to be part of such a
‘‘progressive’’ school. The principal felt her strategic efforts had
resulted in some ‘‘shift’’ in teacher attitudes and practice and that
the school was moving forward.
She acknowledged, however, that there was still a long way to
go. For the most part, she explained that teachers ‘‘went through
the motions’’ and were ‘‘lacking a passion for PBL.’’ They wanted,
instead, to adhere to practices from ‘‘back in the day,’’ which meant
teaching subjects in isolation, independent of each other. By the
end of the year, despite the principal’s efforts to provide a structure
and a culture of data use and her efforts to promote progressive
instructional student-centered strategies, integration across sub-
jects had not occurred. The PBL structure, in principle, supports the
integration of all subjects. Data use within the PBL units, however,
focused on student work; while this was clearly helpful, there were, in general, missed opportunities to extend and deepen instruction in the area of students' language and math needs.
The principal was able to show the faculty that much work still
needed to be done and that full implementation of the initiatives
was essential, particularly because of the persistent underachieve-
ment of their ELL students. Her ability to collect and analyze
pertinent data and then share that with her teachers built an even
more powerful case for the need to accelerate the use of data across
the curriculum.
The curricular and pedagogical divisions at Orchid Heights
were striking enough to cause the principal to worry about
whether she would be able to ‘‘marry the two.’’ While she was able
to point out the need for that union, she was not able to be explicit
about how to accomplish the ‘‘marriage.’’ She admitted to
struggling with how to help the faculty integrate the various
initiatives in a way that put the use of benchmark data and student
needs at the center of their reform efforts, and how to create a
coherent course of study that would capitalize on this valuable
data. These questions continued to challenge this data-minded
school leader as she made plans for the next academic year.
Conclusion and implications
This district believed that if it were to improve educational
outcomes for all students, data had to inform instruction. To a large
extent, Orchid Heights educators agreed. The principal and
teachers worked together to provide language arts and math
instruction and used district benchmark data to inform their
practice for these subjects. When teaching social studies and
science (using IB and PBL), although clearly using student work to
inform their instruction, they did not find the assessment data in
English language arts and math to be helpful in their planning. The
compartmentalization of data use and the failure to integrate these data across the curriculum were due to the impact of ‘‘institutional, historical
and cultural factors’’ (Spillane, 2012:125). Building on the work of
Spillane (2012), but going further to provide a micro-analysis of
teachers’ actual experiences with data, we found that institution-
ally, the principal and teachers had constructed teaching practices
and organized the school day in a way that kept English language
arts and math data in an instructional silo. Historically and
culturally, Orchid Heights’ teachers were driven to teach in a way
they had always taught – with the belief that subjects were best
taught in isolation. Teachers’ own prior knowledge and experi-
ences had influenced their relationships with data use (Spillane
et al., 2002). They were accustomed to teaching one subject,
assessing student work, using the data to inform the teaching of
that subject and moving on to repeat the cycle with a different
subject. Integrating content areas in an interdisciplinary way and
understanding the usefulness of using data across content areas
was not only unfamiliar but uncomfortable terrain. Predispositions
and prior teaching experiences constrained opportunities for
change.
The larger context in which teachers worked mattered as well.
Since ELA and math were the subjects teachers were held most
accountable for by the federal, state, district, and school principal,
they felt compelled to focus on basic skill development and
benchmark data for those subjects. In the subjects of social studies
and science, for which there was less or no state accountability at
the elementary level, teachers felt freer to use the IB and PBL units
and also less able (and willing) to use ELA and math benchmark
assessment data to inform instruction. We are not arguing that
using benchmark data are always the most helpful data, especially
in guiding project based instruction across the curriculum. We are
suggesting, however, that compartmentalizing specific data driven
decision making has consequences for teaching and learning.
This investigation into data use not only helps us to understand the specific circumstances faced by the principal and teachers at Orchid Heights but also contributes to efforts to answer larger
questions about organizational learning and how institutional
structures and culture influence daily practices. By focusing on the
meaning that teachers gave to data use and to the federal, state,
district and school context in which they were situated, we have
been able to deconstruct what Fullan (1999) has so aptly pointed out: that one of the biggest problems facing schools is fragmentation and overload. These educators, like many in the US and internationally, are implementing multiple initiatives without knowing how to integrate them; they lack the knowledge and a strong rationale for doing so. As a result, the curriculum is fragmented and often incoherent. Teachers are overwhelmed and
forced to rely on what they know. Often they lack the capacity (both human and instructional resources) to use all kinds of data effectively across the curriculum, as well as opportunities to build their skills. Nor do they have the capacity to manage numerous reform demands, which are designed to be mutually supportive but which inadvertently pull them in conflicting directions.
It seems likely that as Orchid Heights teachers move forward
with their work they will need additional support to effect change.
The Orchid Heights principal, like many principals across the US, is constrained, however, in her capacity to provide the guidance
necessary to do the work. She had limited opportunities to enhance
her own professional development. As prior research has
suggested, districts can play an important role in capacity building
(Cawelti & Protheroe, 2007), but when districts are financially strapped, as they are in the current economy, principals and
teachers are left without the support they need.
We also learn from this study that even a generous allocation of
time is insufficient to move data use to center stage. Teachers (and
school leaders) need knowledge and resources that can help them
to engage with data and to know how to use data to shape a
coherent educational plan in the context of a school that is
implementing multiple initiatives. As districts add ever more reform initiatives to teachers’ plates to satisfy the world of high-stakes accountability, they must recognize the importance of providing teachers and school leaders with the requisite knowledge and skills to integrate those initiatives in order to effect change.
Author note
An earlier draft of this paper was presented at the International
Congress of School Effectiveness and School Improvement, Chile,
January 2013. We are grateful to the participants in this study and
to the University of San Diego for their generous support of this
research.
References
Buck Institute website. PBL Planning Resources for Teachers. Retrieved from http://
www.bie.org. Accessed: 18.11.12.
Cawelti, G., & Protheroe, G. (2007). The school board and central office in school improvement. In H. Walberg (Ed.), Handbook on restructuring and substantial school improvement (pp. 37–52).
Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative
analysis. Thousand Oaks, CA: Sage Publications.
Coburn, C. E. (2001). Collective sensemaking about reading: How teachers mediate
reading policy in their professional communities. Educational Evaluation and Policy
Analysis, 23(2), 145–170.
Coburn, C. E., & Turner, E. O. (2012). The practice of data use: An introduction. American
Journal of Education, 118(2), 99–110.
Creswell, J. (2009). Research design (3rd ed.). Thousand Oaks, CA: Sage Publications.
Daly, A. J. (2012). Data, dyads, and dynamics: Exploring data use and social networks in
educational improvement. Teachers College Record, 114(11), 1–21.
Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-performing school systems use data to improve instruction for elementary students. Center on Educational Governance, Rossier School of Education, University of Southern California.
Dowd, A. (2005). Data don’t drive: Building a practitioner-driven culture of inquiry to
assess community college performance. Indianapolis, IN: Lumina Foundation for
Education.
Earl, L., & Fullan, M. (2003). Using data in leadership for learning. Cambridge Journal of Education, 33(3), 383–394.
Earl, L., & Katz, S. (2006). Leading schools in a data rich world. Thousand Oaks, CA: Corwin Press.
Foley, E., & Sigler, D. (Winter, 2009). Getting smarter: A framework for districts. Voices in Urban Education, 22, 5–12.
Fullan, M. (1999). Change forces: The sequel. London, England: Falmer Press.
Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J. (2009).
Using student achievement data to support instructional decision making (NCEE 2009-
4067). Washington, DC: National Center for Education Evaluation and Regional
Assistance Institute of Education Sciences US Department of Education.
Hatch, T. (2002). What happens when improvement initiatives collide? Phi Delta
Kappan, 8(8), 626.
Honig, M. I., & Venkateswaran, N. (2012). School–central office relationships in evi-
dence use: Understanding evidence use as a systems problem. American Journal of
Education, 118(2), 199–222.
Horn, I. S., & Little, J. W. (2010). Attending to problems of practice: Routines and
resources for professional learning in teachers’ workplace interactions. American
Educational Research Journal, 47(1), 181–217.
Ikemoto, G. S., & Marsh, J. A. (2007). Cutting through the ‘‘data driven’’ mantra:
Different conceptions of data-driven decision making. In P. A. Moss (Ed.), Evidence
and decision making (National Society for the Study of Education Yearbook, Vol. 106,
Issue 1, pp. 105–131). Chicago: National Society for the Study of Education.
International Baccalaureate Diploma Programme website. Retrieved from http://
www.ibo.org/general/who.cfm. Accessed: 24.11.12.
Leithwood, K. (2008). Characteristics of high performing school districts: A review of
empirical evidence. Calgary, AB: College of Alberta School Superintendents.
Levin, B., Datnow, A., & Carrier, N. (2012). Changing school district practices. Boston, MA: Jobs for the Future. http://www.studentsatthecenter.org/papers/changing-school-district-practices.
Levin, J., & Datnow, A. (2012). The principal as agent of mediated educational reform:
Dynamic models of case studies of data driven decision making. School Effectiveness
and School Improvement, 23(2), 179–201.
Madda, C., Halverson, R. M., & Gomez, L. (2007). Exploring coherence as an organiza-
tional resource for carrying out reform initiatives. Teachers College Record, 109(8),
1957–1979.
Mandinach, E. B., & Honey, M. (Eds.). (2008). Data driven school improvement: Linking
data and learning. New York: Teachers College Press.
Mandinach, E. B., Honey, M., & Light, D. (2006). A theoretical framework for data driven
decision making. Paper presented at the annual meeting of the American Educational
Researchers Association.
Markham, T. (2012). Top ten tools for PBL. Retrieved from www.thommarkham.com.
Accessed: 25.11.12.
Marsh, J. (2012). Interventions promoting educators’ use of data: Research insights and
gaps. Teachers College Record, 114(11), 1–48.
Marsh, J., Kerr, K. A., Ikemoto, G. S., Darilek, H., Suttorp, M., Zimmer, R., et al. (2005). The
role of districts in fostering instructional improvement: Factors affecting data use.
RAND Education.
McPhee, A., & Patrik, F. (2009). ‘‘The pupils will suffer if we don’t work’’: Teacher
professionalism and reactions to policy change in Scotland. Scottish Educational
Review, 41(1), 86–96.
Means, B., Padilla, C., & Gallagher, L. (2010). Use of education data at the local level: From
accountability to instructional improvement. Washington, DC: US Department of
Education Office of Planning Evaluation and Policy Development.
Merriam, S. B. (1998). Qualitative research and case study applications in education. San
Francisco: Jossey-Bass Publishers.
Moss, P. A. (February, 2012). Exploring the macro-micro dynamic in data use practice.
American Journal of Education, 118(2), 223–232 http://dx.doi.org/10.1086/663274.
Rallis, S. F., & Rossman, G. B. (2001). Communicating quality and qualities: The role of the evaluator as critical friend. In A. P. Benson, D. M. Hinn, & C. Lloyd (Eds.), Visions
of quality: How evaluators define, understand, and represent program quality (pp.
107–120). Oxford, UK: JAI Press.
Rogosa, D. (2005). Statistical misunderstandings of the properties of school scores and school accountability. In J. L. Herman & E. H. Haertel (Eds.), Uses and misuses of data for educational accountability and improvement. 104th yearbook of the National Society for the Study of Education (pp. 147–174). Malden, MA: Blackwell Publishing.
Schildkamp, K., & Lai, M. K. (2012). Introduction. In K. Schildkamp, M. K. Lai, & L. Earl (Eds.), Data-based decision making in education: Challenges and opportunities (pp. 1–
9). Dordrecht, Netherlands: Springer.
School Accountability Report Card (SARC), 2011. Data Almanac.
Spillane, J. P. (2012). Data in practice: Conceptualizing the data-based decision-making
phenomenon. American Journal of Education, 118(2), 113–141.
Spillane, J. P., Reiser, B. J.,  Reimer, T. (2002). Policy implementation and cognition:
Reframing and refocusing implementation research. Review of Educational Re-
search, 72(3), 387–431.
Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage Publications.
Supovitz, J. A. (2009). Can high stakes testing leverage educational improvement?
Prospects from the last decade of testing and accountability reform. Journal of
Educational Change, 10(2–3), 211–227.
Supovitz, J., & Taylor, B. S. (2003). The impacts of standards-based reform in Duval County,
Florida 1999–2002. Philadelphia, PA: Consortium for Policy Research in Education.
U.S. Department of Education, Office of Elementary and Secondary Education. (April,
2006). Improving Data Quality for Title I Standards, Assessments, and Accountability
Reporting: Guidelines for States, LEAs, and Schools (Non-Regulatory Guidance). (Pro-
duced by DTI Associates, A Haverstick Company, under U.S. DOE Contract No. ED-
01-CO-0066/0009), pp. 6–7. Retrieved from U.S. Department of Education website,
http://www2.ed.gov/policy/elsec/guid/standardsassessment/nclbdataguidance.pdf. Accessed: 23.11.12.
Yin, R. K. (2009). Case study research: Design and methods (4th ed.). Applied Social
Research Methods Series (Vol. 5). Thousand Oaks, CA: Sage Publications.
Young, V. M. (2006). Teachers’ use of data: Loose coupling, agenda setting, and team
norms. American Journal of Education, 112(4), 521–548.
More Related Content

What's hot

Course Evaluation Poster
Course Evaluation PosterCourse Evaluation Poster
Course Evaluation PosterBridget Hanley
 
Heliyon d-21-02382 r2 (1)
Heliyon d-21-02382 r2 (1)Heliyon d-21-02382 r2 (1)
Heliyon d-21-02382 r2 (1)Pedro Prestes
 
Jumping Hurdles to Technology Integration
Jumping Hurdles to Technology IntegrationJumping Hurdles to Technology Integration
Jumping Hurdles to Technology IntegrationLisa Durff
 
Using e instruction’s® cps™ to support effective instruction
Using e instruction’s® cps™ to support effective instructionUsing e instruction’s® cps™ to support effective instruction
Using e instruction’s® cps™ to support effective instructionCCS Presentation Systems Inc.
 
National Implications: The Impact of Teacher Graduate Degrees on Student Math...
National Implications: The Impact of Teacher Graduate Degrees on Student Math...National Implications: The Impact of Teacher Graduate Degrees on Student Math...
National Implications: The Impact of Teacher Graduate Degrees on Student Math...William Kritsonis
 
Lamar research institute_collaboration_3-22-2013_final
Lamar research institute_collaboration_3-22-2013_finalLamar research institute_collaboration_3-22-2013_final
Lamar research institute_collaboration_3-22-2013_finalLamar University
 
AERA 2010 - Teaching and Learning with Technology - IT as a Value Added Comp...
AERA 2010 - Teaching and Learning with Technology -  IT as a Value Added Comp...AERA 2010 - Teaching and Learning with Technology -  IT as a Value Added Comp...
AERA 2010 - Teaching and Learning with Technology - IT as a Value Added Comp...Martin Sandler
 
Research proposal
Research proposal Research proposal
Research proposal Sarah Richer
 
Effectiveness of computer assisted stad cooperative learning strategy on phys...
Effectiveness of computer assisted stad cooperative learning strategy on phys...Effectiveness of computer assisted stad cooperative learning strategy on phys...
Effectiveness of computer assisted stad cooperative learning strategy on phys...Gambari Amosa Isiaka
 
Predictors of Success: Student Achievement in Schools
Predictors of Success: Student Achievement in SchoolsPredictors of Success: Student Achievement in Schools
Predictors of Success: Student Achievement in SchoolsSchool Improvement Network
 
Bahan inovasi pembelajaran mat
Bahan inovasi pembelajaran matBahan inovasi pembelajaran mat
Bahan inovasi pembelajaran matSugiatno Sakidin
 
Data Informed Decisions
Data Informed DecisionsData Informed Decisions
Data Informed Decisionsmblake1
 
The student experience of a collaborative e-learning university module. Miche...
The student experience of a collaborative e-learning university module. Miche...The student experience of a collaborative e-learning university module. Miche...
The student experience of a collaborative e-learning university module. Miche...eraser Juan José Calderón
 
Pan Canadian Research Agenda 2008
Pan Canadian Research Agenda 2008Pan Canadian Research Agenda 2008
Pan Canadian Research Agenda 2008BCcampus
 
PhD proposal presentation
PhD proposal presentationPhD proposal presentation
PhD proposal presentationMichael Rowe
 

What's hot (19)

Course Evaluation Poster
Course Evaluation PosterCourse Evaluation Poster
Course Evaluation Poster
 
Heliyon d-21-02382 r2 (1)
Heliyon d-21-02382 r2 (1)Heliyon d-21-02382 r2 (1)
Heliyon d-21-02382 r2 (1)
 
Jumping Hurdles to Technology Integration
Jumping Hurdles to Technology IntegrationJumping Hurdles to Technology Integration
Jumping Hurdles to Technology Integration
 
Using e instruction’s® cps™ to support effective instruction
Using e instruction’s® cps™ to support effective instructionUsing e instruction’s® cps™ to support effective instruction
Using e instruction’s® cps™ to support effective instruction
 
National Implications: The Impact of Teacher Graduate Degrees on Student Math...
National Implications: The Impact of Teacher Graduate Degrees on Student Math...National Implications: The Impact of Teacher Graduate Degrees on Student Math...
National Implications: The Impact of Teacher Graduate Degrees on Student Math...
 
Lamar research institute_collaboration_3-22-2013_final
Lamar research institute_collaboration_3-22-2013_finalLamar research institute_collaboration_3-22-2013_final
Lamar research institute_collaboration_3-22-2013_final
 
AERA 2010 - Teaching and Learning with Technology - IT as a Value Added Comp...
AERA 2010 - Teaching and Learning with Technology -  IT as a Value Added Comp...AERA 2010 - Teaching and Learning with Technology -  IT as a Value Added Comp...
AERA 2010 - Teaching and Learning with Technology - IT as a Value Added Comp...
 
Final chapter2 edit
Final chapter2 editFinal chapter2 edit
Final chapter2 edit
 
Research proposal
Research proposal Research proposal
Research proposal
 
Effectiveness of computer assisted stad cooperative learning strategy on phys...
Effectiveness of computer assisted stad cooperative learning strategy on phys...Effectiveness of computer assisted stad cooperative learning strategy on phys...
Effectiveness of computer assisted stad cooperative learning strategy on phys...
 
Predictors of Success: Student Achievement in Schools
Predictors of Success: Student Achievement in SchoolsPredictors of Success: Student Achievement in Schools
Predictors of Success: Student Achievement in Schools
 
AERA paperfinal
AERA paperfinalAERA paperfinal
AERA paperfinal
 
Bahan inovasi pembelajaran mat
Bahan inovasi pembelajaran matBahan inovasi pembelajaran mat
Bahan inovasi pembelajaran mat
 
Data Informed Decisions
Data Informed DecisionsData Informed Decisions
Data Informed Decisions
 
Research On Research
Research On ResearchResearch On Research
Research On Research
 
The student experience of a collaborative e-learning university module. Miche...
The student experience of a collaborative e-learning university module. Miche...The student experience of a collaborative e-learning university module. Miche...
The student experience of a collaborative e-learning university module. Miche...
 
IRTpdpaper
IRTpdpaperIRTpdpaper
IRTpdpaper
 
Pan Canadian Research Agenda 2008
Pan Canadian Research Agenda 2008Pan Canadian Research Agenda 2008
Pan Canadian Research Agenda 2008
 
PhD proposal presentation
PhD proposal presentationPhD proposal presentation
PhD proposal presentation
 

Viewers also liked

La culpa es del programador versión 1.0
La culpa es del programador  versión 1.0La culpa es del programador  versión 1.0
La culpa es del programador versión 1.0Duban Garces
 
apiculture et exercice illegal de la pharmacie
apiculture et exercice illegal de la pharmacieapiculture et exercice illegal de la pharmacie
apiculture et exercice illegal de la pharmacieHENRI & PARTNERS
 
Принцип правовой определенности
Принцип правовой определенностиПринцип правовой определенности
Принцип правовой определенностиjannyly
 
I grupos interactivos
I grupos interactivosI grupos interactivos
I grupos interactivosValladolid
 
Lembar kerja siswa 2 manusia purba
Lembar kerja siswa 2 manusia purbaLembar kerja siswa 2 manusia purba
Lembar kerja siswa 2 manusia purbaRizky Aji
 
Muhammad cendekiawan
Muhammad cendekiawanMuhammad cendekiawan
Muhammad cendekiawanCendekiawan16
 
Esboço de sistema de governança racional e democrática de um país
Esboço de sistema de governança racional e democrática de um paísEsboço de sistema de governança racional e democrática de um país
Esboço de sistema de governança racional e democrática de um paísFernando Alcoforado
 
Getting Social Media RIGHT - #NSBAConf
Getting Social Media RIGHT - #NSBAConfGetting Social Media RIGHT - #NSBAConf
Getting Social Media RIGHT - #NSBAConfAngela Maiers
 
Ip college second cut off 2014
Ip college second cut off 2014Ip college second cut off 2014
Ip college second cut off 2014CareerCo
 
Vertigo films presentation
Vertigo films presentationVertigo films presentation
Vertigo films presentationKsenia Tamarova
 
Lembar kerja siswa 6 penduduk asli indonesia
Lembar kerja siswa 6 penduduk asli indonesiaLembar kerja siswa 6 penduduk asli indonesia
Lembar kerja siswa 6 penduduk asli indonesiaRizky Aji
 

Viewers also liked (20)

Introduccion
IntroduccionIntroduccion
Introduccion
 
La culpa es del programador versión 1.0
La culpa es del programador  versión 1.0La culpa es del programador  versión 1.0
La culpa es del programador versión 1.0
 
 
apiculture et exercice illegal de la pharmacie
apiculture et exercice illegal de la pharmacieapiculture et exercice illegal de la pharmacie
apiculture et exercice illegal de la pharmacie
 
Принцип правовой определенности
Принцип правовой определенностиПринцип правовой определенности
Принцип правовой определенности
 
I grupos interactivos
I grupos interactivosI grupos interactivos
I grupos interactivos
 
Rahimulla CV
Rahimulla CVRahimulla CV
Rahimulla CV
 
Lembar kerja siswa 2 manusia purba
Lembar kerja siswa 2 manusia purbaLembar kerja siswa 2 manusia purba
Lembar kerja siswa 2 manusia purba
 
Awave.
Awave. Awave.
Awave.
 
Muhammad cendekiawan
Muhammad cendekiawanMuhammad cendekiawan
Muhammad cendekiawan
 
Warp films
Warp filmsWarp films
Warp films
 
Esboço de sistema de governança racional e democrática de um país
Esboço de sistema de governança racional e democrática de um paísEsboço de sistema de governança racional e democrática de um país
Esboço de sistema de governança racional e democrática de um país
 
Herramientas web 2
Herramientas web 2Herramientas web 2
Herramientas web 2
 
Getting Social Media RIGHT - #NSBAConf
Getting Social Media RIGHT - #NSBAConfGetting Social Media RIGHT - #NSBAConf
Getting Social Media RIGHT - #NSBAConf
 
Presentacion
PresentacionPresentacion
Presentacion
 
Ip college second cut off 2014
Ip college second cut off 2014Ip college second cut off 2014
Ip college second cut off 2014
 
Vertigo films
Vertigo filmsVertigo films
Vertigo films
 
Vertigo films presentation
Vertigo films presentationVertigo films presentation
Vertigo films presentation
 
El boxeo
El boxeoEl boxeo
El boxeo
 
Lembar kerja siswa 6 penduduk asli indonesia
Lembar kerja siswa 6 penduduk asli indonesiaLembar kerja siswa 6 penduduk asli indonesia
Lembar kerja siswa 6 penduduk asli indonesia
 

Similar to Implementing Data-Driven Initiatives in Schools: Promise and Pitfalls

Running Header PROJECT BASED LEARNING PROJECT BASED LEARNING .docx
Running Header PROJECT BASED LEARNING PROJECT BASED LEARNING   .docxRunning Header PROJECT BASED LEARNING PROJECT BASED LEARNING   .docx
Running Header PROJECT BASED LEARNING PROJECT BASED LEARNING .docxagnesdcarey33086
 
An evaluation of_the_conditions,_processes,_and_consequences_of_laptop_comput...
An evaluation of_the_conditions,_processes,_and_consequences_of_laptop_comput...An evaluation of_the_conditions,_processes,_and_consequences_of_laptop_comput...
An evaluation of_the_conditions,_processes,_and_consequences_of_laptop_comput...Cathy Cavanaugh
 
2013, year 4 leaders eval report
2013, year 4 leaders eval report2013, year 4 leaders eval report
2013, year 4 leaders eval reportLouise Smyth
 
Leaders evaluation report year 2 2011
Leaders evaluation report year 2 2011Leaders evaluation report year 2 2011
Leaders evaluation report year 2 2011Louise Smyth
 
Belinda's common core research paper
Belinda's common core research paperBelinda's common core research paper
Belinda's common core research paperBelinda35
 
One One Qual
One One QualOne One Qual
One One Qualhargraves
 
One One Qual
One One QualOne One Qual
One One Qualhargraves
 
Contextual Influences on the Implementation of a Schoolwide .docx
Contextual Influences on the Implementation of a Schoolwide .docxContextual Influences on the Implementation of a Schoolwide .docx
Contextual Influences on the Implementation of a Schoolwide .docxmelvinjrobinson2199
 
Sample Abstract Name__________.docx
                                 Sample Abstract Name__________.docx                                 Sample Abstract Name__________.docx
Sample Abstract Name__________.docxhallettfaustina
 
Illustration ThinkstockiStockTeachers know all the terms.docx
Illustration ThinkstockiStockTeachers know all the terms.docxIllustration ThinkstockiStockTeachers know all the terms.docx
Illustration ThinkstockiStockTeachers know all the terms.docxsleeperharwell
 
School districts are in the process of adopting theResponse .docx
School districts are in the process of adopting theResponse .docxSchool districts are in the process of adopting theResponse .docx
School districts are in the process of adopting theResponse .docxanhlodge
 
Administrator Work In Leveraging Technologies For Students With Disabilities ...
Administrator Work In Leveraging Technologies For Students With Disabilities ...Administrator Work In Leveraging Technologies For Students With Disabilities ...
Administrator Work In Leveraging Technologies For Students With Disabilities ...Nathan Mathis
 
Learner and Instructional Factors Influencing Learning Outcomes within a Blen...
Learner and Instructional Factors Influencing Learning Outcomes within a Blen...Learner and Instructional Factors Influencing Learning Outcomes within a Blen...
Learner and Instructional Factors Influencing Learning Outcomes within a Blen...Zalina Zamri
 
21st Century Pedagogy: Transformational Approach
21st Century Pedagogy: Transformational Approach21st Century Pedagogy: Transformational Approach
21st Century Pedagogy: Transformational Approachijtsrd
 
Year 3 leaders eval report year 3 2012
Year 3 leaders eval report  year 3 2012Year 3 leaders eval report  year 3 2012
Year 3 leaders eval report year 3 2012Louise Smyth
 
Teacher opinions about the use of Value-Added models
Teacher opinions about the use of Value-Added models Teacher opinions about the use of Value-Added models
Teacher opinions about the use of Value-Added models llee18
 
Response To Intervention
Response To InterventionResponse To Intervention
Response To InterventionPaul Schumann
 
Wsu Ppt Building District Data Capacity
Wsu Ppt Building District Data CapacityWsu Ppt Building District Data Capacity
Wsu Ppt Building District Data CapacityGlenn E. Malone, EdD
 
Stne seminar phase 2 180308
Stne seminar phase 2 180308Stne seminar phase 2 180308
Stne seminar phase 2 180308Scottish TNE
 

Similar to Implementing Data-Driven Initiatives in Schools: Promise and Pitfalls (20)

Running Header PROJECT BASED LEARNING PROJECT BASED LEARNING .docx
Running Header PROJECT BASED LEARNING PROJECT BASED LEARNING   .docxRunning Header PROJECT BASED LEARNING PROJECT BASED LEARNING   .docx
Running Header PROJECT BASED LEARNING PROJECT BASED LEARNING .docx
 
An evaluation of_the_conditions,_processes,_and_consequences_of_laptop_comput...
An evaluation of_the_conditions,_processes,_and_consequences_of_laptop_comput...An evaluation of_the_conditions,_processes,_and_consequences_of_laptop_comput...
An evaluation of_the_conditions,_processes,_and_consequences_of_laptop_comput...
 
2013, year 4 leaders eval report
2013, year 4 leaders eval report2013, year 4 leaders eval report
2013, year 4 leaders eval report
 
Leaders evaluation report year 2 2011
Leaders evaluation report year 2 2011Leaders evaluation report year 2 2011
Leaders evaluation report year 2 2011
 
Belinda's common core research paper
Belinda's common core research paperBelinda's common core research paper
Belinda's common core research paper
 
One One Qual
One One QualOne One Qual
One One Qual
 
One One Qual
One One QualOne One Qual
One One Qual
 
Contextual Influences on the Implementation of a Schoolwide .docx
Contextual Influences on the Implementation of a Schoolwide .docxContextual Influences on the Implementation of a Schoolwide .docx
Contextual Influences on the Implementation of a Schoolwide .docx
 
Sample Abstract Name__________.docx
                                 Sample Abstract Name__________.docx                                 Sample Abstract Name__________.docx
Sample Abstract Name__________.docx
 
Illustration ThinkstockiStockTeachers know all the terms.docx
Illustration ThinkstockiStockTeachers know all the terms.docxIllustration ThinkstockiStockTeachers know all the terms.docx
Illustration ThinkstockiStockTeachers know all the terms.docx
 
School districts are in the process of adopting theResponse .docx
School districts are in the process of adopting theResponse .docxSchool districts are in the process of adopting theResponse .docx
School districts are in the process of adopting theResponse .docx
 
Administrator Work In Leveraging Technologies For Students With Disabilities ...
Administrator Work In Leveraging Technologies For Students With Disabilities ...Administrator Work In Leveraging Technologies For Students With Disabilities ...
Administrator Work In Leveraging Technologies For Students With Disabilities ...
 
Learner and Instructional Factors Influencing Learning Outcomes within a Blen...
Learner and Instructional Factors Influencing Learning Outcomes within a Blen...Learner and Instructional Factors Influencing Learning Outcomes within a Blen...
Learner and Instructional Factors Influencing Learning Outcomes within a Blen...
 
21st Century Pedagogy: Transformational Approach
21st Century Pedagogy: Transformational Approach21st Century Pedagogy: Transformational Approach
21st Century Pedagogy: Transformational Approach
 
Year 3 leaders eval report year 3 2012
Year 3 leaders eval report  year 3 2012Year 3 leaders eval report  year 3 2012
Year 3 leaders eval report year 3 2012
 
Teacher opinions about the use of Value-Added models
Teacher opinions about the use of Value-Added models Teacher opinions about the use of Value-Added models
Teacher opinions about the use of Value-Added models
 
Response To Intervention
Response To InterventionResponse To Intervention
Response To Intervention
 
Eeefetiveness
EeefetivenessEeefetiveness
Eeefetiveness
 
Wsu Ppt Building District Data Capacity
Wsu Ppt Building District Data CapacityWsu Ppt Building District Data Capacity
Wsu Ppt Building District Data Capacity
 
Stne seminar phase 2 180308
Stne seminar phase 2 180308Stne seminar phase 2 180308
Stne seminar phase 2 180308
 

Implementing Data-Driven Initiatives in Schools: Promise and Pitfalls

  • 1. Multiple initiatives, multiple challenges: The promise and pitfalls of implementing data Lea Hubbard a, *, Amanda Datnow b , Laura Pruyn a a University of San Diego, United States b University of California, San Diego, United States Introduction Increasingly, government agencies across the globe are attempting to motivate educators to use data as a vehicle for educational improvement (Earl & Fullan, 2003; McPhee & Patrik, 2009). An emphasis on data use has escalated in the Netherlands, US, Canada, South Africa, New Zealand, and other countries (Schildkamp & Lai, 2012). In the US, data-driven decision-making (DDDM) was a major feature of the American Recovery and Reinvestment Act of 2009 and of the controversial Race to the Top competition. At all levels of the system, educators are attempting to respond to these policy demands. Moving data into useable knowledge to change practice, however, has significantly challenged principals and teachers. Prior research has shown that at the school level, principals play a critical role in motivating teachers to use data and in providing supports that facilitate data use (Earl & Katz, 2006; Ikemoto & Marsh, 2007; Levin & Datnow, 2012; Mandinach, Honey, & Light, 2006). Yet some principals lack the knowledge to guide teachers in data use, and many teachers have not had sufficient training in how to understand and use data to inform their instructional decisions. As such, data literacy among educators remains a persistent concern. Moreover, in the US, new issues are arising with respect to data use as educators move toward teaching students 21st century skills, implementing the new Common Core standards, and undertaking other efforts to make learning more student-centered. These initiatives will involve activities that engage students in critical thinking, generating new knowledge, and learning through project-based work – all skills which are not easily measured by traditional assessments. As a result, what counts as ‘‘data’’ will become increasingly wide-ranging (Levin, Datnow, & Carrier, 2012). Thus, getting teachers together to discuss evidence of student learning and the development of new forms of assessment would appear to be a critically important component of this shift. We are now not only asking teachers to use data to inform decision making, but also to use more complex forms of data and to implement new instructional strategies. Other challenges arise from the fact that efforts to implement data-driven decision-making sometimes do not account for the culture and structure existing within a school. Like other reforms, data use is layered on top of already established routines and relationships, and some run counter to evidence-informed practice. Spillane (2012) suggests that organizational routines are put in place, often with scripts to guide discussions about data and to transform teaching and learning. However, the ‘‘performa- tive aspect of organizational routines,’’ that is, how a routine works Studies in Educational Evaluation 42 (2014) 54–62 A R T I C L E I N F O Article history: Received 30 March 2013 Received in revised form 13 August 2013 Accepted 9 October 2013 Available online 1 November 2013 Keywords: Data use Project based learning School reform A B S T R A C T Data driven decision making has become a popular reform effort across the globe. 
New issues are arising with respect to data use as educators move toward teaching students 21st century skills, as the implementation of Common Core standards begins in the US, and as other efforts are undertaken to make learning more student centered. This article reports findings from a year-long case study of a US elementary school that placed data use at the core of its platform for school reform. The goal of the study was to determine how teachers implemented data use in concert with other reform initiatives. Interviews with educators, as well as observations of teacher team meetings, revealed that data- informed instructional planning occurred primarily in language arts and math, and not in other subjects. The requirements to implement multiple initiatives created many tensions that decreased teachers’ ability and motivation to use data. How and when teachers used data was the result of a broader set of policies and structures at the federal, district, and school levels, as well as the capacity of the teachers and principal at the school. Implications for research and practice are discussed. ß 2013 Elsevier Ltd. All rights reserved. * Corresponding author. Tel.: +1 760 943 0412. E-mail address: lhubbard@sandiego.edu (L. Hubbard). Contents lists available at ScienceDirect Studies in Educational Evaluation journal homepage: www.elsevier.com/stueduc 0191-491X/$ – see front matter ß 2013 Elsevier Ltd. All rights reserved. http://dx.doi.org/10.1016/j.stueduc.2013.10.003
  • 2. in daily practice and is enabled and/or constrained by ‘‘institu- tional, historical and cultural situations’’ (p. 125) is usually not addressed. We know little about these dynamics. As a recent analysis of a collective body of research on DDDM points out, we are faced with a ‘‘blunt understanding of data use’’ (Moss, 2012). The remedy, according to Coburn and Turner (2012), is to conduct investigations into the practice of using data. These authors urge that such investigations include a closer focus on the micro interactions of those involved, as well as on the degree to which participants are embedded in a context that is influenced by macro-level policies and structures within the educational system. We undertook such an investigation. This article reports findings from a year-long case study of the actions of a principal and teachers at Orchid Heights,1 a US elementary school that has placed data use at the core of its platform for school reform. The primary research questions that guided this study were: How do teachers implement data use in tandem with other reform initiatives? What are the actions taken and the challenges faced by educators in moving data into useable knowledge to inform instruction? Using a sociocultural perspective, we focused on teachers’ actions and beliefs, as well as on the institutional context in which they worked, in order to understand how the educators at Orchid Heights constructed data use. We found that the formally scheduled grade- level team meetings, which were designed to allow for discussions of data and lesson plans, did not always produce the intended results. District benchmark data were used primarily to build students’ language arts and math skills and not for the purposes of planning social studies and science lessons. This is not surprising since students were assessed in these areas and not others. This meant, however, that the use of this data to inform instructional decision making was limited to language arts and math, rather than used across the curriculum as intended. Moreover, requirements to implement multiple other educational initiatives at the school created tensions that undercut teachers’ ability and motivation to more fully integrate data use into their daily practice. How and when teachers used data was determined by the interaction of multiple factors, including a broad set of policies and structures in place at the federal, district and school levels, as well as the capacity of the teachers and principal. The remainder of the article is structured as follows: we begin with a review of the relevant literature and then turn to an explanation of the methodology and description of the district and school setting. After a detailed report and discussion of the findings, we present conclusions and consider their implications. Review of the literature Prior research on (1) data use and the role of teacher collaboration and (2) the challenges of balancing multiple reform demands and building capacity provided the framework for our investigation. Taken together, studies in these two areas helped expand our awareness of the kinds of issues principals and teachers at Orchid Heights faced in their efforts to use data to inform instructional decision-making. Data use and the role of teacher collaboration Broadly speaking, data-driven decision-making is the process by which administrators and teachers collect and analyze data to guide educational decisions (Ikemoto Marsh, 2007). 
While each locale may take a different approach to data use, the underlying belief is that carefully analyzing evidence about student learning, such as using standardized test score data and/or student work, will allow teachers to target instruction toward students’ individual needs (Mandinach Honey, 2008). The theory is that by working together, teachers will be able to assist each other in making sense of the data, engaging in joint action planning, and sharing instructional strategies. Overall, it is clear from prior research that evidence of student learning needs to be actively used to improve instruction in schools. Research on high-performing districts reveals that such districts integrate the examination of data and evidence-informed decision making into daily school and district processes (Foley Sigler, 2009; Leithwood, 2008). Many districts have invested in management information systems, benchmark assessments, and professional development to build expertise and capacity at the school level (Datnow, Park, Wohlstetter 2007; Hamilton et al., 2009; Supovitz Taylor, 2003). Some districts have also contracted with external agencies and consultants to assist in their capacity- building efforts district-wide (Marsh et al., 2005). Providing structured time for collaboration is one of the ways that many districts and schools attempt to support teachers’ use of data (Honig Venkateswaran, 2012; Mandinach Honey, 2008; Means, Padilla, Gallagher, 2012). In fact, a majority of high data use districts provide structured time for collaboration (Marsh, 2012; Means et al., 2010). Opportunities for cross-school interac- tion are a key ingredient of support for data use (Marsh, 2012). The presence of a leader who promotes a culture of inquiry within teacher work groups can aid in making conversations about data more productive (Horn Little, 2010; Young, 2006). This is in part because the knowledge within and among teacher groups can vary widely, leading to uneven results. For example, teacher teams with limited expertise can misinterpret or misuse data, or work together to perpetuate poor classroom practice (see review by Daly, 2012). On the other hand, groups with a great deal of collective expertise can be much more generative of learning (Horn Little, 2010). Even with the scaffolds of support that many districts and schools now provide, the process of engaging in DDDM has proven to be quite complex. Data from assessments may show patterns of student achievement, but they do not tell teachers what to do differently in the classroom (Dowd, 2005; Supovitz, 2009). Moreover, some argue that the data from large-scale assessments may be useful for school and system planning, but they are less useful at the teacher or student level (Rogosa, 2005; Supovitz, 2009). The use of assessment data can be powerful at the teacher level, but a great deal depends on the level of inquiry that occurs around the data. Multiple reforms and capacity building for change As we noted above, data-driven decision making is often implemented as one of numerous reform initiatives in a school or district. This is not surprising, as many educators and scholars see data use as part of a larger process of continuous improvement. Thus, schools may be implementing various reforms (e.g., implementing small learning communities, adopting a new math program) and using data to track their progress toward the goals of these initiatives. Reform efforts can be planned in ways that are mutually supportive and cohere around a common goal. 
Prior research suggests, however, that schools sometimes face challenges balancing multiple reform demands. This is especially the case when reforms do not cohere and result in conflicting directions of change. Almost fifteen years ago, school change expert Michael Fullan (1999) noted that the biggest problem facing schools was fragmentation and overload. Even with the move toward district coherence in the past decade, many schools still struggle with1 Pseudonyms are used throughout to protect anonymity. L. Hubbard et al. / Studies in Educational Evaluation 42 (2014) 54–62 55
  • 3. fragmentation. Hatch (2002) explains that while it is possible to coordinate multiple reforms in ways that would support coherence and capacity to improve student learning, doing so is not simple. He argues that this is in part because schools lack capacity: In many ways, the push to make improvement programs available to more and more schools is fueled by the hope that these programs can help many schools develop the capacity to change. But ironically, the implementation of these improve- ment programs is difficult precisely because schools lack the capacity to change (p. x). Although Hatch’s findings were in reference to comprehensive school reform models, the same conclusions about capacity hold true in the current reform era. In a more recent article, Madda, Halverson, and Gomez (2007) note that many district initiatives conflict with each other or with existing practices in schools. They note that districts need to consider how new initiatives are likely to fare in the actual contexts of use and to build this knowledge into their designs of instructional coherence. When teachers do not have the capacity and are not provided with the necessary guidance to integrate reforms in a meaningful way, they are likely to attend to certain aspects of reform while ignoring others. The sense-making teachers engage in around reform happens both individually and collectively (Coburn, 2001). Coburn (2001) finds that patterns of interaction among teachers influence how teachers adopt, adapt, or disregard reform initiatives, thus mediating the influence of these reforms on classroom practice. Teachers’ own prior knowledge of and experiences with reform also mediate how they respond to new initiatives (Spillane, Reiser, Reimer, 2002). Teachers tend to focus on the aspects of new reforms that are familiar to them, leaving aside the aspects that are difficult to understand or implement (Spillane et al., 2002). For these reasons, leaders play a key role in helping teachers find coherence among reforms and in assisting them in learning how to integrate reforms measures into their current practices. Capacity building efforts are critical, and districts can play an important part in this area (Cawelti Protheroe, 2007). In fact, high-performing districts are characterized by a heavy investment in capacity-building among leaders and teachers, particularly around instructional improvement (Leithwood, 2008). In sum, the literature identifies factors that facilitate or impede data use, especially in concert with other reform initiatives. Teacher collaboration can play an important role in getting data use into practice, but a great deal depends on the capacity of the teachers and leaders – as we will see in the case of Orchid Heights. Methodology This study was conducted between October 2011 and May 2012. We adopted a case study approach because this methodolo- gy is an ideal strategy for exploring situations in which the intervention being examined (here, data-driven decision-making) has no single set of outcomes (Merriam, 1998; Rallis Rossman, 2001; Stake, 1995; Yin, 2009). Case studies provide opportunities to understand phenomena in their real-life contexts. In this instance, a case study approach allowed us to investigate the perspectives of Orchid Heights educators involved in DDDM, the everyday practices, behaviors, and ideologies that constructed instructional decision-making at the school, and the challenges the teachers faced in trying to make use of data. 
All teachers at Orchid Heights were expected to use DDDM and also to work on implementing multiple other reform initiatives. Each teacher was given time in the instructional day to collaborate with grade-level colleagues. We focused our investigation on the regularly scheduled grade-level meetings in which teachers met to plan their lessons and data use. Meetings were held weekly and each lasted about 1½ hours. At the recommendation of the principal, we observed the fourth-grade team, which consisted of two full-time teachers, and the first-grade team, which consisted of two full-time and two part-time faculty. (The part-time faculty were not required to attend all of the meetings. They were present, however, for the majority of the meetings we observed.) The principal judged these two grade-level teams as ahead of other Orchid Heights faculty in terms of designing and implementing reform initiatives at the school. We conducted interviews once with each of the four full-time faculty in the first and fourth grades, twice with the principal (once at the beginning of the school year and once at the end), and once with the district superintendent. The same semi-structured interview guide was used for all teachers. One researcher conducted the principal and superintendent interview. We used a semi-structured interview guide for all interviews because doing so allowed us to include questions shaped by our review of the literature on DDDM, without completely sacrificing the benefit of flexibility that a more conversational interview approach permits (Charmaz, 2006). Our questions were designed primarily to elicit information that would help us understand participants’ actions as well as the context in which they took place. So, for example, we sought to find out how teachers defined data and which factors seemed to support and which challenge data use across the curriculum. Except for one teacher interview, two members of our research team conducted each teacher interview. All tapes were listened to by at least two members of the research team, and all were transcribed verbatim. Two members of the research team coded each transcript to ensure consistency in analysis. Fourteen grade-level meetings (nine fourth grade and five first grade) were observed over the course of the academic year and in most cases, there were two researchers present. We used an ethnographic approach in our observations. We audio recorded the entirety of each meeting, paying attention not only to what teachers said about their practice but also to the interaction among teachers and how they described the social context in which they worked. Informed by the concepts in the literature review, we transcribed, coded and analyzed this data as well in order to identify themes. We used sociocultural constructivist methodology to interpret the data from this study. This approach recognizes that it is essential to ‘‘rely as much as possible on the participants’ views of the situation being studied’’ (Creswell, 2009:8) and to place an emphasis on the phenomenon being studied through the analysis of the social contexts in which the data are collected (Charmaz, 2006). We were able to take advantage of the serendipitous nature of qualitative research by letting our respondents take us in directions that we had not predicted, while also in some cases building upon and in others questioning the knowledge we had gathered from previous research. 
For example, respondents pointed us to the impact of the multiple initiatives on DDDM, an issue whose significance we had not anticipated. Theorizing in the interpretive tradition, as suggested by Charmaz (2006), we used what we observed in meetings and gathered in interviews to ‘‘delve into the implicit meanings and processes’’ (p. 146) associated with the implementation of DDDM at Orchid Heights. The district and school context Orchid Heights Elementary is located in a K-6th grade school district that serves a predominately Caucasian population (80%). Student performance measures on the most recent state assess- ment test place this district among the highest achieving school districts in the state. We chose to investigate this district because the reform-minded superintendent and assistant superintendent have pushed aggressively for multiple reform initiatives, including L. Hubbard et al. / Studies in Educational Evaluation 42 (2014) 54–6256
  • 4. data use, over the last three years in an effort to not only maintain the district’s legacy of distinction (all of the district’s schools have been recognized as distinguished schools), but also to further improve the nine district elementary schools. We chose Orchid Heights for this study because the superintendent viewed the principal as a strong leader who fully supported district initiatives, and thus potentially put Orchid Heights ahead of many other district schools in implementing reform measures. Generally, the district’s initiatives have revolved around efforts to align teaching and learning with data. Data use was considered essential not only because district administrators viewed it as a savvy strategy to improve student learning, but also because federal No Child Left Behind (NCLB) legislation demanded that schools use data to inform their instruction. Beginning in 2007– 2008, under NCLB, schools are required to track data related to school demographics, assessments, accountability, and teacher quality. Moreover, under NCLB’s accountability provisions, schools must make ‘‘adequate yearly progress’’ for all student groups, including English Language Learners.2 Beginning several years ago the district began administering benchmark assessments in language arts and math. Data from these assessments were made available to teachers soon after each assessment, with the assumption that they would guide data driven decision making. In addition, the district required schools to put student-centered learning in place as evidenced by the implementation of Project Based Learning (PBL). The recent push by the federal government for the adoption of Common Core Standards (CCS) is likely to have motivated the district superin- tendent to use PBL as a way to support teachers in their efforts to meet those new standards. CCS clearly specifies what is expected of students at each grade level. PBL and CCS are considered by some to mesh well because PBL emphasizes critical thinking skills and supports interdisciplinary instruction (Markham, 2012), key dimensions of the CCS. The district believed that PBL, with its thematic approach, offered teachers the opportunity to embed English language arts and math instruction into other subject areas. The district also encouraged each school under its auspices to select other initiatives that would support student achievement. We describe the initiatives chosen by Orchid Heights below. To support teachers in the implementation of reform initiatives, the district asked that all teachers be given the time to meet weekly in grade-level teams to plan their instruction. Teachers were expected to work collaboratively, to engage in team-level planning for PBL units, and to facilitate the work of integrating discussions of data into PBL at their school. Orchid Heights Orchid Heights serves a more diverse population of students than other schools in the district. The school demographics reveal that approximately 54 percent of the students are Hispanic/Latino, 43 percent are White/European/American/Other, 2 percent are Asian American/Orchid Islander, and 1 percent are African American. Moreover, roughly 44 percent of Orchid Heights students qualify for a free or reduced-price meal subsidy. The most recent test data indicate that the school’s scores are well above the state average. Despite this success, the principal and teachers recognize that all students are not achieving, and this is particularly true of their English Language Learner (ELL) population. 
For example, state test results from the 2010–11 school year reveal that only 39 percent of the school’s ELL students scored at proficient or advanced level on the state test in English Language Arts, as compared with 87 percent of the non-Hispanic (Caucasian) student population (School Accountability Report Card, 2011:5). To comply with the district’s request that schools select additional initiatives that would help further improve their schools, Orchid Heights chose the International Baccalaureate Program (IB).3 According to the principal, teachers wanted a higher profile for their school and they hoped the IB program would help them attain that status. They also wanted to be able to make use of student assessments based on data from sources other than test scores. The IB curriculum, with its focus on units of study and the research process, allows teachers to construct a variety of assessments. The IB program is consistent with the school-wide goal of helping students develop the knowledge and skills necessary to become global citizens who are prepared for the future. The principal explained that she intended to integrate the district mandated PBL initiative with the IB program to help the school achieve IB certification. Orchid Heights also chose to adopt the Guided Language Acquisition Design Program (GLAD),4 an initiative that offers strategies to support ELL students. Several grade levels also use IPads with their students and, most recently, a new physical education initiative has been introduced for all grade levels. These school-specific initiatives were implemented simultaneously with Orchid Heights’ already in-place reform emphasis on data use, developing students’ basic skills, and teaching to the current state standards, as well as the new Common Core standards. In adopting these multiple initiatives, the principal expected that teachers would integrate the initiatives and not consider each one an ‘‘add on.’’ Teachers would ideally embed data use in all of the reforms, and doing so would, hopefully, improve test scores. In general, Orchid Heights teachers felt that they were part of a culture that supported reform. One teacher described her colleagues as‘‘verycollaborative. . .withamindset thatchangeisgood.Weneed to embrace it. If everyone works together, we can overcome whatever the perspective is out there in the community [about this school].’’ One of the first-grade teachers explained that teachers were anxious to counter the public perception of the school, ‘‘because our school’s always been. . .they used to call it ‘[Orchid] Hole’. . .it has always been the lower performing and the under- performing. . . the. . .the stepchild of the district kind of thing.’’ Despite their receptivity to change, some teachers admit they are ‘‘growing weary’’ and feel they cannot ‘‘wrap their heads around one more thing.’’ According to a fourth-grade teacher, ‘‘it feels like we are building the plane while flying it.’’ Responsibility for implementing multiple initiatives simultaneously creates multiple tensions for teachers and precludes some important opportunities to use data. We discuss these tensions in the next section. 
2 See "Improving Data Quality for Title I Standards, Assessments, and Accountability Reporting: Guidelines for States, LEAs, and Schools (Non-Regulatory Guidance)," U.S. Department of Education, Office of Elementary and Secondary Education, April 2006, pp. 6–7, found on the U.S. Department of Education website, http://www2.ed.gov/policy/elsec/guid/standardsassessment/nclbdataguidance.pdf. Accessed on 11/23/12.
3 The International Baccalaureate is a non-profit educational foundation. Its four programs for students aged 3 to 19 "help develop the intellectual, personal, emotional and social skills to live, learn and work in a rapidly globalizing world" (http://www.ibo.org/general/who.cfm. Accessed 11/24/12).
4 Project GLAD is a professional development model to support literacy and language acquisition instruction. Originally developed in the Fountain Valley School District in California, GLAD is now supported by independent trainers. For more information, see http://www.projectglad.com. Accessed 8/10/13.

Findings

In attempting to understand how teachers implemented data use, we found that the first- and fourth-grade teachers we studied at Orchid Heights Elementary School readily embraced data-driven decision-making in the areas of English language arts and math, using district benchmark data and student work to guide their classroom practice in these two subjects. This data was made available to them through a web-based data management system, which teachers could access individually. To help support instruction, the principal also constructed a detailed report of student data for each teacher. During interviews and in the grade-level team meetings we observed, the teachers indicated verbally and through their body language (e.g., affirming through facial expressions and/or sharing knowing glances with their colleagues) that they were familiar and comfortable with using test score data to identify their students' areas of strength and weakness, and confident in their ability to orient their own teaching strategies to address the needs the data helped them identify. Teachers also knew how to efficiently access the test score data they wanted.

This data was not used, however, in the areas of social studies and science. In those subjects, although teachers collected various forms of data and verbalized a willingness to apply assessment results to classroom practice, they lacked training in how to use the benchmark data in English language arts and math to inform instruction for social studies and science. This is perhaps not surprising given that English language arts and math were the focus of these assessments, and their results were reported in a way that lent itself more to skill building in core subject areas than to planning project based lessons. We found that despite strong leadership and support from their principal, by the end of the academic year, teachers had made limited progress integrating data into their planning for instruction across the curriculum. Below, we examine these results in greater detail. We consider the consequences of our findings, including how the compartmentalized use of data affects student learning, in the conclusion.

Challenges teachers face in moving data into useable knowledge for guiding instruction

Understanding teachers' data use in tandem with other reform initiatives revealed that Orchid Heights teachers faced a staggering set of challenges. Like all other teachers in their district, they were expected to comply with a large number of instructional requirements in a short period of time, identify student needs, meet specific grade-level standards, and establish classroom goals for the year, all guided by data they had prior experience using. They also were expected to systematically monitor students' skill development in English language arts, using the district-adopted textbook series and its "theme tests." In addition to meeting these district-wide expectations, they had site-specific responsibilities. Because Orchid Heights had chosen to adopt an IB program, the teachers had to successfully implement six units of IB curriculum in a one-year period – and they would need to continue to do so annually in order to maintain IB certification. The IB program stipulates that there is to be little, if any, overlap between the units, and teachers must conduct ongoing planning and assessment of the units. The need to meet such stipulations in order to achieve and maintain IB certification placed significant demands on Orchid Heights teachers. The district-wide adoption of Project Based Learning exacted its own, different demands at the school.
The district provided a district-wide professional development (PD) event at the beginning of the year to help teachers become acquainted with PBL and learn how to implement PBL units in their classrooms. The training had some benefits, but teachers admitted ongoing struggles with implementing PBL as they sought to learn how to "do projects, motivate students to learn more, work in collaborative groups and prepare them for beyond elementary school," as one teacher explained. We found that over the course of the academic year, the teachers, who were already feeling stressed over requirements to use data to develop students' basic skills, improve standardized test scores, and implement IB and PBL, made very little progress in integrating initiatives and engaging in the use of test score data across the curriculum.

Difficulties integrating data use across the curriculum

For Orchid Heights teachers, looking at benchmark data and responding to what that data told them about their students' needs occurred systematically in two subjects, English language arts (ELA) and math – the two subjects for which students' proficiency is formally and routinely assessed. Areas revealed by test score data as requiring more skill building were addressed during the separate and bounded instructional time for these two subjects. In math, knowing from test scores that students needed more support with rounding numbers, for example, teachers tested and retested them on their "rounding ability," creating new lessons to reemphasize the concept and doing "drills over and over again." To help improve their multiplication skills, students used a computer program to self-test their level of understanding. One teacher recalled: "I was down to four students who [hadn't] received ten points yet – but I kept revisiting old goals so I could put their name up on the board to keep all students motivated." The teachers worked ceaselessly to build their students' basic skills in English language arts and math, but such instruction remained mostly in its own instructional silo, held apart from other subject areas. Although it is understandable that language arts and math would receive the majority of instructional attention given that these subjects are the major focus of high stakes state accountability tests, this compartmentalization limited instructional change because benchmark assessment data were not integrated in planning across multiple subject areas.

We found one important exception to this pattern. In the latter part of the academic year, the first-grade teacher team discussed a plan to incorporate ELA and ELD (English Language Development) into their IB unit on "How the World Works." While their main focus was to develop – within the IB – a PBL unit on weather and natural disasters as a way of teaching students about matter, and to discuss the water cycle as part of their science unit, they also intended to "increase the transdisciplinary aspect of the unit." As one of the teachers put it, "We want to use weather and matter to teach idioms such as, 'March always comes in like a lamb and goes out like a lion,' and 'It's raining cats and dogs.'" The team had recognized the importance of integrating students' English language needs into their IB/PBL unit – clearly an important first step toward using data effectively to support the acquisition of knowledge. However, in practice, the teachers apparently found the task too complex.
That kind of data use, they admitted, remained "not yet fully integrated with the reform initiatives."

Conflicts meeting the needs of ALL students

Absent an understanding of how to integrate English language arts and math benchmark data into subject area instruction, specifically into the PBL units designed to teach social studies and science, teachers struggled to meet the needs of all students. For the most part, PBL planning led to lessons in which all students were treated as though they would benefit from the same instruction. One of the fourth-grade teachers recognized this as a problem, noting that the current PBL design did not address the needs of her under-achieving students. These students were falling behind during the PBL instructional time because she and her colleague had not differentiated the instruction to provide individualized support for those who needed it. This teacher considered improving students' access to knowledge an essential goal of any initiative, and she particularly worried about her English language learner (ELL) population. She questioned her colleagues frequently about how to provide instruction within the PBL framework for students who have been identified as struggling to learn English, much less other subjects. She described Francisco, an English language learner who was at a significant disadvantage when the class started a PBL unit on fossil fuels. For him, fossil fuels were an entirely new and very abstract concept. She believed the vocabulary was too complex for him. The PBL unit, which required that students immediately jump into the work, presumed an in-depth understanding of the concept and the vocabulary and did not provide sufficient scaffolds for English language learners. She believed that since Francisco had no previous opportunity to acquire this understanding, he could not contribute in a meaningful way to his collaborative group. The lesson had not been designed in a way that differentiated for students' varied language skills.

The benefits of contextualizing basic skill development within PBL have been recognized by others (Buck Institute, 2012). But the Orchid Heights teachers, who were not provided training in how to integrate data on students' language arts and math achievement with science and social studies instruction during PBL, perceived the struggles of students like Francisco as inevitable:

It is nearly impossible for those kids who are reading far below their level to meet the grade-level standard. The reality is that there are some kids for whom that goal is simply not attainable, at least not this year. How much do we really want to test and retest them?

Fourth-grade teachers did not feel as though students at other achievement levels benefited fully from the first PBL unit they designed either. Teachers pointed to the collaborative group structure of the PBL as the source of the problem. One teacher, describing a group in her class, noted that "the kids ignored the student who came prepared in advance and had all information and research done." As this teacher explained, "Collaboration is an issue at this age – teaching them this concept of collaboration and managing it has been a challenge." The observational data they had gathered by watching students in the course of the PBL lessons led teachers to conclude that students struggled with collaboration. This conclusion was indeed important and would help guide future planning. However, it was also clear that the teachers, despite their best intentions, had not yet learned how to integrate their analysis into their PBL planning in a way that would lead them to differentiate instruction, arrange the groups somewhat differently, and make curricular adjustments to improve learning opportunities for all students.

How data use is shaped by the presence of multiple initiatives

One tension that hindered the integration of data use across subject areas was related to what teachers perceived to be the incompatibility of multiple initiatives. During the initial months of implementation of the new initiatives, teachers were preoccupied with trying to understand whether and to what extent IB and PBL shared a coherent logic. IB unit planners (forms used to plan units) used language that was obtuse, academic, and non-user-friendly, according to the teachers. Teachers felt that the IB unit planners did not align with the PBL unit planners. Both IB and PBL were "jargon heavy"; moreover, they each used different language, which confused teachers. By mid-year, teachers were still questioning the IB-PBL fit. Planning units for both IB and PBL took double the time, and this created another tension.
One fourth-grade teacher noted that it took a few months to do one IB unit well. Developing, implementing, and assessing six IB units in a period of only nine months was viewed as "unrealistic and unreasonable." Moreover, IB units were not permitted to last more than six weeks, and two units were not allowed to be implemented simultaneously – obviously presenting additional challenges. Teachers expressed concern over another misfit between initiatives. Data generated from benchmark test results had set them on a path of improving students' basic skills, but the requirements associated with IB and/or PBL instruction seemed to call for a different approach. As one teacher said,

This is where I am struggling – what we were doing [before PBL, namely re-teaching basic skills] makes sense to me, but PBL seems like a different realm. I'm on board with PBL but they [teaching basic skills and PBL] seem like two diametrically opposed things.

Another teacher commented, "Now, with the requirement to implement PBL and IB, it's kind of about pulling the two together – skills and projects; choice, collaboration, motivation, and standards. It's hard – and management is difficult." The perceived dissonance that plagued teachers is captured in this question, posed by a fourth-grade teacher: "How do you teach basic skills and teach kids that are below grade level, using PBL?" When we asked first-grade teachers whether data regarding students' achievement in language arts and math was integrated with PBL, one answered, "Not yet. I have to be honest, not with us. Science and social studies standards, yes. But they're not directly tested until what, fifth grade?" One of the teachers at this grade level confessed that when they planned their PBL units, "I don't think we looked at it [data] at all for PBL." Similarly, a fourth-grade teacher explained that data that identified students' language arts and math needs were not considered when they organized their PBL instruction.

It is important to note, however, that the teachers did collect data during and after PBL units. Teachers administered a variety of tests to assess students' content knowledge in social studies and science. They used rubrics, videotaped student presentations, and asked students to reflect on their content learning using formative and summative assessments. (The IB program mandates summative assessment.) Ideas for PBL assessment often were borrowed from the Buck Institute.5 However, teachers did not use the district benchmark results – data that offered them information on students' language and math strengths and needs – to guide them in constructing their PBL units, as the principal had intended.

Continued divisions between content areas

Teachers dedicated about 30 percent of their time to PBL and about 70 percent to the development of their students' basic English language and math skills because, according to one teacher, "Students have to have skills and knowledge before they can take [the knowledge] and use it in a project. So there should always be a certain amount of frontloading/direct instruction." Interestingly, over time, although they continued to struggle to understand how the data they gathered on students' academic needs in language and math could be integrated within their PBL units, teachers we interviewed began talking about IB and PBL interchangeably.
In preparing for the third PBL unit, and likely due to diligent efforts to reconcile the two initiatives, one teacher commented, "I now see IB and PBL as the same thing because IB incorporates PBL ... IB is just a specific program that uses PBL ... it embraces the PBL model ... so, as far as I'm concerned, there is no difference between the two." This dramatic shift in perception occurred alongside the more entrenched perspective that the teaching and learning of basic skills and PBL/IB would remain separate efforts, and only the former would be systematically guided by data. This presumably limited the power of data-informed decision making at the school.

Data use aligned with issues related to teacher accountability and exacerbated the divisions between subject areas. Language arts and math are subjects tested with the state's high-stakes standardized test annually beginning in second grade; science is not assessed until fifth grade and social studies is not assessed in elementary school at all. Thus, the first- and fourth-grade teachers felt freer to teach science and social studies within the PBL structure. One teacher used a Venn diagram and described the separation this way:

Now, as far as the [English Language Arts] Standards go, at this point, it is still textbook/basic skill development ... for ELLs – out here [outside PBL]. At this point we're in transition. You have math out here. And within this IB/PBL circle you have ... you might have some social studies, and you have some science that is incorporated into the PBL. Math will never be totally integrated. [teacher's emphasis]

Teachers felt that the current instructional arrangement would most likely continue although, as they spent more and more time on social studies and science and less time on the ELA component, they worried that soon "we will have to figure out how to incorporate the ELA skills into those areas and make it more seamless." Despite their persistent effort and motivation, teachers admitted that they were not yet adept at integrating the curriculum and using data generated from ELA and math to support social studies and science instruction.

5 The Buck Institute provides training and support for schools implementing PBL. The Buck Institute for Education (BIE) has created free materials – "FreeBIEs" – such as planning forms, student handouts, rubrics, and articles for educators to download and use to design, assess, and manage projects. The teachers claimed these resources were incredibly valuable.

Discussion: accounting for the challenges and the successes

How do we account for the instructional divisions and difficulties in the use of data found at Orchid Heights? Teachers pointed to the effects of several important tensions. One was a lack of resources. Apart from the state-adopted textbook, teachers had few primary resources from which to draw ideas and content for the PBL units. Some content is available online, but it takes time to find and even more time to figure out how to incorporate it into lesson plans. Moreover, the textbooks they had did not "match up" with the goals of their "IB and PBL units." One teacher noted:

If we had more ELA materials that went with the IB units or PBL units, then the units could be in a more literature form – that would lend itself well. It's just a transition. At least our school is transitioning away from the ELA text and moving on towards novel sets and integrated IB units, but we're not there yet.

The school librarian was very helpful, but the teachers noted that she had not yet been trained in the IB curriculum. And budget constraints were expected to limit what new materials the school would be able to purchase. Orchid Heights' teachers had to "wing it"; they were not given a budget to fund their reform efforts. Despite some positive developments, such as receiving grants and assistance from college students, teachers found it difficult to provide all students with access and equity within the PBL structure.

Capacity issues

Individual capacity varied across teachers as well. One of the fourth-grade teachers was in a Master's degree program specializing in PBL. This training gave her additional insight into how best to develop, implement, and assess PBL units.
She had access to monthly project peer reviews that provided her with feedback from others engaged in PBL. As a result, she was much more knowledgeable than her colleagues and able to add a great deal of support to her fourth-grade team. Most other Orchid Heights teachers were struggling to plan PBL lessons in their grade-level teams. They valued the autonomy the principal gave them, since it left them free to make their own decisions and set their own timeline for PBL implementation, but they lacked a deep understanding of PBL and of how to integrate ELA and math data across the curriculum. They wanted an opportunity to connect with other teachers who also were implementing PBL projects. They felt they would be able to build their own capacity to do the work if they were able to see examples of units their peers had developed, learn how these teachers created their curriculum, gain a sense of what their peers wrestled with, and examine others' successes and challenges with incorporating reading comprehension, language arts, and skill development.

While teacher capacity affected the implementation of PBL and data use, it is important to note that teachers were constrained in their use of data simply because standardized data was not available for social studies and science. Indeed, these areas were not assessed on the district benchmarks, just as they were not assessed in most districts, nor were they assessed at the state level, with the exception of science in one elementary grade. Teachers were clearly challenged by the absence of data.

The importance of leadership

The Orchid Heights principal was highly sensitive to the value of data and did what she could to support its use among her teachers. She gave them time to work in grade-level groups to plan their units of instruction. She analyzed benchmark data, then "chunked the data," as she explained, sorting students by demographics, achievement level, and economic status to maximize the planning for instruction for each child. Students who were identified as language learners, receiving free or reduced-price lunch, and performing below basic skill levels were flagged to receive immediate attention because they were seen as being in triple jeopardy of falling behind. The principal collaborated with teachers to identify the students at each grade level who were performing below basic and helped them interpret those students' instructional needs. She ensured that low performing students were placed in groups where they received focused attention for 40 minutes a day, four days a week. While the principal supported teachers in their implementation of IB and PBL and in their work with "high risk" students, that attention, too, was compartmentalized and limited to basic skills that were divorced from the broader curriculum. Although the principal's actions were clearly laudable, one concern is that by focusing on ELA and math data, like her teachers, she gained no systematic knowledge of the students' grasp of social studies and science.

The principal acknowledged that success with data use had thus far been limited, but based on her routine observations of classroom practice, she felt that the faculty had made progress over the course of the year. Indeed, the entire staff was using data to inform instruction in language arts and math, everyone was implementing IB, and the majority of the teachers were implementing PBL. She attributed these successes to the way she rolled out the initiatives, which she described as slow.
She exerted some, though not a great deal of, pressure on teachers to achieve during this first year of implementation. She deliberately tracked changes in benchmark data in order to show the teachers the extent to which their efforts were paying off in achievement gains. She was able to show, through enrollment figures and feedback from parents, how the adoption of these multiple initiatives had gained the school greater visibility in the community and earned kudos from parents who were glad to be part of such a "progressive" school. The principal felt her strategic efforts had resulted in some "shift" in teacher attitudes and practice and that the school was moving forward.
She acknowledged, however, that there was still a long way to go. She explained that, for the most part, teachers "went through the motions" and were "lacking a passion for PBL." They wanted, instead, to adhere to practices from "back in the day," which meant teaching subjects in isolation, independent of each other. By the end of the year, despite the principal's efforts to provide a structure and a culture of data use and to promote progressive, student-centered instructional strategies, integration across subjects had not occurred. The PBL structure, in principle, supports the integration of all subjects. Data use within the PBL units, however, focused on student work, and while this was clearly helpful, there were missed opportunities to extend and deepen instruction in the areas of students' language and math needs.

The principal was able to show the faculty that much work still needed to be done and that full implementation of the initiatives was essential, particularly because of the persistent underachievement of their ELL students. Her ability to collect and analyze pertinent data and then share that with her teachers built an even more powerful case for the need to accelerate the use of data across the curriculum. The curriculum and pedagogical divisions at Orchid Heights were striking enough to cause the principal to worry about whether she would be able to "marry the two." While she was able to point out the need for that union, she was not able to be explicit about how to accomplish the "marriage." She admitted to struggling with how to help the faculty integrate the various initiatives in a way that put the use of benchmark data and student needs at the center of their reform efforts, and how to create a coherent course of study that would capitalize on this valuable data. These questions continued to challenge this data-minded school leader as she made plans for the next academic year.

Conclusion and implications

The district believed that if it were to improve educational outcomes for all students, data had to inform instruction. To a large extent, Orchid Heights educators agreed. The principal and teachers worked together to provide language arts and math instruction and used district benchmark data to inform their practice for these subjects. When teaching social studies and science (using IB and PBL), although clearly using student work to inform their instruction, they did not find the assessment data in English language arts and math to be helpful in their planning. This compartmentalization, and the failure to integrate these data across the curriculum, was due to the impact of "institutional, historical and cultural factors" (Spillane, 2012:125). Building on the work of Spillane (2012), but going further to provide a micro-analysis of teachers' actual experiences with data, we found that institutionally, the principal and teachers had constructed teaching practices and organized the school day in a way that kept English language arts and math data in an instructional silo. Historically and culturally, Orchid Heights' teachers were driven to teach in the way they had always taught – with the belief that subjects were best taught in isolation. Teachers' own prior knowledge and experiences had influenced their relationships with data use (Spillane et al., 2002).
They were accustomed to teaching one subject, assessing student work, using the data to inform the teaching of that subject, and then moving on to repeat the cycle with a different subject. Integrating content areas in an interdisciplinary way and understanding the usefulness of using data across content areas were not only unfamiliar but also uncomfortable terrain. Predispositions and prior teaching experiences constrained opportunities for change. The larger context in which teachers worked mattered as well. Since ELA and math were the subjects for which teachers were held most accountable by federal, state, and district authorities and by the school principal, they felt compelled to focus on basic skill development and benchmark data for those subjects. In social studies and science, for which there was less or no state accountability at the elementary level, teachers felt freer to use the IB and PBL units and also less able (and willing) to use ELA and math benchmark assessment data to inform instruction. We are not arguing that benchmark data are always the most helpful data, especially for guiding project based instruction across the curriculum. We are suggesting, however, that compartmentalizing data driven decision making in this way has consequences for teaching and learning.

This investigation into data use not only helps us to understand the specific circumstances faced by the principal and teachers at Orchid Heights but also contributes to efforts to answer larger questions about organizational learning and how institutional structures and culture influence daily practices. By focusing on the meaning that teachers gave to data use and on the federal, state, district, and school context in which they were situated, we have been able to illustrate what Fullan (1999) has so aptly pointed out: that one of the biggest problems facing schools is fragmentation and overload. These educators, like many in the U.S. and internationally, are implementing multiple initiatives that they do not know how to integrate. They lack the knowledge and a strong rationale for doing so. As a result, the curriculum is fragmented and often incoherent. Teachers are overwhelmed and forced to rely on what they know. Often they do not have the capacity (both human and instructional resources) to most effectively use all kinds of data across the curriculum, and they lack opportunities to build their skills. They do not have the capacity to manage numerous reform demands, which are designed to be mutually supportive but which inadvertently pull them in conflicting directions.

It seems likely that as Orchid Heights teachers move forward with their work they will need additional support to effect change. The Orchid Heights principal, like many principals across the US, is constrained, however, in her capacity to provide the guidance necessary to do the work. She had limited opportunities to enhance her own professional development. As prior research has suggested, districts can play an important role in capacity building (Cawelti & Protheroe, 2007), but when districts are financially strapped, as they are in the current economy, principals and teachers are left without the support they need. We also learn from this study that even a generous allocation of time is insufficient to move data use to center stage.
Teachers (and school leaders) need knowledge and resources that can help them to engage with data and to know how to use data to shape a coherent educational plan in the context of a school that is implementing multiple initiatives. As districts add more and more reform initiatives to teachers' plates to satisfy the world of high-stakes accountability, they must recognize the importance of providing teachers and school leaders with the requisite knowledge and skills to integrate those initiatives in order to effect change.

Author note

An earlier draft of this paper was presented at the International Congress of School Effectiveness and School Improvement, Chile, January 2013. We are grateful to the participants in this study and to the University of San Diego for their generous support of this research.

References

Buck Institute website. PBL Planning Resources for Teachers. Retrieved from http://www.bie.org. Accessed: 18.11.12.
Cawelti, G., & Protheroe, G. (2007). The school board and central office in school improvement. In H. Walberg (Ed.), Handbook on restructuring and substantial school improvement (pp. 37–52).
Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative analysis. Thousand Oaks, CA: Sage Publications.
Coburn, C. E. (2001). Collective sensemaking about reading: How teachers mediate reading policy in their professional communities. Educational Evaluation and Policy Analysis, 23(2), 145–170.
Coburn, C. E., & Turner, E. O. (2012). The practice of data use: An introduction. American Journal of Education, 118(2), 99–110.
Creswell, J. (2009). Research design (3rd ed.). Thousand Oaks, CA: Sage Publications.
Daly, A. J. (2012). Data, dyads, and dynamics: Exploring data use and social networks in educational improvement. Teachers College Record, 114(11), 1–21.
Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-performing school systems use data to improve instruction for elementary students. Center on Educational Governance, Rossier School of Education, University of Southern California.
Dowd, A. (2005). Data don't drive: Building a practitioner-driven culture of inquiry to assess community college performance. Indianapolis, IN: Lumina Foundation for Education.
Earl, L., & Fullan, M. (2003). Using data in leadership for learning. Cambridge Journal of Education, 33(3), 383–394.
Earl, L., & Katz, S. (2006). Leading schools in a data rich world. Thousand Oaks, CA: Corwin Press.
Foley, E., & Sigler, D. (Winter, 2009). Getting smarter: A framework for districts. Voices in Urban Education, 22, 5–12.
Fullan, M. (1999). Change forces: The sequel. London, England: Falmer Press.
Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J. (2009). Using student achievement data to support instructional decision making (NCEE 2009-4067). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
Hatch, T. (2002). What happens when improvement initiatives collide? Phi Delta Kappan, 8(8), 626.
Honig, M. I., & Venkateswaran, N. (2012). School–central office relationships in evidence use: Understanding evidence use as a systems problem. American Journal of Education, 118(2), 199–222.
Horn, I. S., & Little, J. W. (2010). Attending to problems of practice: Routines and resources for professional learning in teachers' workplace interactions. American Educational Research Journal, 47(1), 181–217.
Ikemoto, G. S., & Marsh, J. A. (2007). Cutting through the "data driven" mantra: Different conceptions of data-driven decision making. In P. A. Moss (Ed.), Evidence and decision making (National Society for the Study of Education Yearbook, Vol. 106, Issue 1, pp. 105–131). Chicago: National Society for the Study of Education.
International Baccalaureate Diploma Programme website. Retrieved from http://www.ibo.org/general/who.cfm. Accessed: 24.11.12.
Leithwood, K. (2008). Characteristics of high performing school districts: A review of empirical evidence. Calgary, AB: College of Alberta School Superintendents.
Levin, B., Datnow, A., & Carrier, N. (2012). Changing school district practices. Boston, MA: Jobs for the Future. http://www.studentsatthecenter.org/papers/changing-school-district-practices.
Levin, J., & Datnow, A. (2012). The principal as agent of mediated educational reform: Dynamic models of case studies of data driven decision making. School Effectiveness and School Improvement, 23(2), 179–201.
Madda, C., Halverson, R. M., & Gomez, L. (2007). Exploring coherence as an organizational resource for carrying out reform initiatives. Teachers College Record, 109(8), 1957–1979.
Mandinach, E. B., & Honey, M. (Eds.). (2008). Data driven school improvement: Linking data and learning. New York: Teachers College Press.
Mandinach, E. B., Honey, M., & Light, D. (2006). A theoretical framework for data driven decision making. Paper presented at the annual meeting of the American Educational Research Association.
Markham, T. (2012). Top ten tools for PBL. Retrieved from www.thommarkham.com. Accessed: 25.11.12.
Marsh, J. (2012). Interventions promoting educators' use of data: Research insights and gaps. Teachers College Record, 114(11), 1–48.
Marsh, J., Kerr, K. A., Ikemoto, G. S., Darilek, H., Suttorp, M., Zimmer, R., et al. (2005). The role of districts in fostering instructional improvement: Factors affecting data use. RAND Education.
McPhee, A., & Patrik, F. (2009). "The pupils will suffer if we don't work": Teacher professionalism and reactions to policy change in Scotland. Scottish Educational Review, 41(1), 86–96.
Means, B., Padilla, C., & Gallagher, L. (2010). Use of education data at the local level: From accountability to instructional improvement. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation and Policy Development.
Merriam, S. B. (1998). Qualitative research and case study applications in education. San Francisco: Jossey-Bass Publishers.
Moss, P. A. (February, 2012). Exploring the macro-micro dynamic in data use practice. American Journal of Education, 118(2), 223–232. http://dx.doi.org/10.1086/663274.
Rallis, S. F., & Rossman, G. B. (2001). Communicating quality and qualities: The role of the evaluator as critical friend. In A. P. Benson, D. M. Hinn, & C. Lloyd (Eds.), Visions of quality: How evaluators define, understand, and represent program quality (pp. 107–120). Oxford, UK: JAI Press.
Rogosa, D. (2005). Statistical misunderstandings of the properties of school scores and school accountability. In J. L. Herman & E. H. Haertel (Eds.), Uses and misuses of data for educational accountability and improvement. 104th yearbook of the National Society for the Study of Education (pp. 147–174). Malden, MA: Blackwell Publishing.
Schildkamp, K., & Lai, M. K. (2012). Introduction. In K. Schildkamp, M. K. Lai, & L. Earl (Eds.), Data-based decision making in education: Challenges and opportunities (pp. 1–9). Dordrecht, Netherlands: Springer.
School Accountability Report Card (SARC), 2011. Data Almanac.
Spillane, J. P. (2012). Data in practice: Conceptualizing the data-based decision-making phenomenon. American Journal of Education, 118(2), 113–141.
Spillane, J. P., Reiser, B. J., & Reimer, T. (2002). Policy implementation and cognition: Reframing and refocusing implementation research. Review of Educational Research, 72(3), 387–431.
Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage Publications.
Supovitz, J. A. (2009). Can high stakes testing leverage educational improvement? Prospects from the last decade of testing and accountability reform. Journal of Educational Change, 10(2–3), 211–227.
Supovitz, J., & Taylor, B. S. (2003). The impacts of standards-based reform in Duval County, Florida, 1999–2002. Philadelphia, PA: Consortium for Policy Research in Education.
U.S. Department of Education, Office of Elementary and Secondary Education. (April, 2006). Improving Data Quality for Title I Standards, Assessments, and Accountability Reporting: Guidelines for States, LEAs, and Schools (Non-Regulatory Guidance). (Produced by DTI Associates, A Haverstick Company, under U.S. DOE Contract No. ED-01-CO-0066/0009), pp. 6–7. Retrieved from U.S. Department of Education website, http://www2.ed.gov/policy/elsec/guid/standardsassessment/nclbdataguidance.pdf. Accessed: 23.11.12.
Yin, R. K. (2009). Case study research: Design and methods (4th ed.). Applied Social Research Methods Series (Vol. 5). Thousand Oaks, CA: Sage Publications.
Young, V. M. (2006). Teachers' use of data: Loose coupling, agenda setting, and team norms. American Journal of Education, 112(4), 521–548.