INFORMATION TO USERS
This manuscript has been reproduced from the microfilm master. UMI
films the text directly from the original or copy submitted. Thus, some
thesis and dissertation copies are in typewriter face, while others may be
from any type of computer printer.
The quality of this reproduction is dependent upon the quality of the
copy submitted. Broken or indistinct print, colored or poor quality
illustrations and photographs, print bleedthrough, substandard margins,
and improper alignment can adversely affect reproduction.
In the unlikely event that the author did not send UMI a complete
manuscript and there are missing pages, these will be noted. Also, if
unauthorized copyright material had to be removed, a note will indicate
the deletion.
Oversize materials (e.g., maps, drawings, charts) are reproduced by
sectioning the original, beginning at the upper left-hand corner and
continuing from left to right in equal sections with small overlaps. Each
original is also photographed in one exposure and is included in reduced
form at the back of the book.
Photographs included in the original manuscript have been reproduced
xerographically in this copy. Higher quality 6” x 9” black and white
photographic prints are available for any photographs or illustrations
appearing in this copy for an additional charge. Contact UMI directly to
order.
UMI
A Bell & Howell Information Company
300 North Zeeb Road, Ann Arbor MI 48106-1346 USA
313/761-4700 800/521-0600
Reproduced with permission of the copyright owner. Further reproduction prohibited without permission.
NOTE TO USERS
The original manuscript received by UMI contains indistinct,
slanted, and/or light print. All efforts were made to acquire
the highest quality manuscript from the author or school.
Microfilmed as received.
This reproduction is the best copy available.
UMI
Using the Malcolm Baldrige National Quality Award Education Pilot Criteria for Self-
Assessment of School Districts
Presented in Partial Fulfillment of the Requirements for the
Degree of Doctor of Philosophy
with a
Major in Education
in the
College of Graduate Studies
University of Idaho
By
Sally Anderson
December, 1997
Major Professor: Dr. Cleve Taylor
UMI Number: 9827869
Copyright 1998 by
Anderson, Sally C.
All rights reserved.
UMI Microform 9827869
Copyright 1998, by UMI Company. All rights reserved.
This microform edition is protected against unauthorized
copying under Title 17, United States Code.
UMI
300 North Zeeb Road
Ann Arbor, MI 48103
Authorization to Submit Dissertation
This dissertation of Sally Anderson, submitted for the degree of Doctor of Philosophy with a
major in Education and titled, “Using the Malcolm Baldrige National Quality Award
Education Pilot Criteria for Self-Assessment of School Districts” has been reviewed in final
form, as indicated by the signatures and dates given below. Permission is now granted to
submit final copies to the College of Graduate Studies for approval.
Major Professor: Dr. Cleve Taylor    Date:
Committee Members:
Dr. Michael Tomlin    Date:
Dr. Penny Schweibert    Date:
Department Administrator:
Dean, College of Education:
Dr. Roger Reynoldson    Date:
Dr. Jerry T. (surname illegible in scan)    Date:
Dr. Dale Gentry    Date:
Final Approval and Acceptance by the College of Graduate Studies:
Jean'ne M. Shreeve    Date:
Abstract
The demand for improvements in education continues. Failed reform attempts,
educational fads, and poor planning designs have been cited as variables affecting the
approach to improvements in public schools. This study examines the literature of failed
reforms, current approaches to determining the performance of schools, and models of
improvement based on quality theory and practices.
This study investigated the perceptions of three types of educators—superintendents,
principals, and teachers—regarding the performance of their school district in seven
categories of organizational performance. The size of the district, based on student
enrollment, was used as the second independent variable to determine if there were any
significant differences in perceptions based on size of district. An instrument was developed
using the criteria in the Malcolm Baldrige National Quality Award, 1997 version; the
Education Pilot criteria; curriculum audit standards; and accreditation standards. The study
used a proportional stratified random sampling procedure by size of district and type of
educator. The findings were analyzed using a two-way analysis of variance for each of the
seven categories.
The study found the reliability of the instrument to range from a low of .74 for School
District Results to a high of .85 for Leadership. Significant differences in the perceptions of
the performance of the school districts in each of the seven categories were found
between superintendents and teachers, as well as between principals and teachers. No significant
differences were found between superintendents and principals, or in any category by the size
of district. The study discusses the implications of the findings for a framework for school
improvement.
Acknowledgements
The ideas, design, and completion of this project were the result of many dialogues
and the collective knowledge of many. I wish to express my sincere appreciation for the
support and guidance of my advisor, Dr. Cleve Taylor, and the mentoring of my committee,
Dr. Roger Reynoldson, Dr. Penny Schweibert, and Dr. Mike Tomlin.
I wish to thank Dr. Mike Friend of the Idaho School Administrators Association and
Jim Shackleford of the Idaho Teachers Association for their contributions of resources and
support for this study. My sincere appreciation goes to Dr. Carolyn Keeler, Dr. Del Siegle,
and Dr. Bill Parrett for their technical expertise and recommendations. I also wish to thank
Eleanor Fisk for her assistance with the laborious task of scanning the returned instruments
and Alice Gould, Stephanie Fox, Dawn Davis, and Chris Latter for their assistance in the
details and preparation of this document and the defense.
My most sincere appreciation goes to my husband, Mike, and our boys, A. J. and Jon,
for the countless sacrifices they made so that my goals could be accomplished.
Dedication
This effort is dedicated to the three men who have taught me the most important
lessons in my life. To the memory of my father, who gave me the thirst for new knowledge
and the potential to seek it; to my husband, whose love is the greatest gift of my life and
whose commitment, support, and patience are true models for all; and to my son, A. J., who
inspires me to grow and who will always be a continual source of pride and enlightenment.
Table of Contents
Page
Authorization to Submit Dissertation...................................................................................................... ii
Abstract.........................................................................................................................................................iii
Acknowledgements.................................................................................................................................... iv
Dedication.......................................................................................................................................................v
List of Tables.............................................................................................................................................viii
List of Figures..............................................................................................................................................ix
Chapter 1: Introduction............................................................................................................................. 1
Background of the Problem....................................................................................................... 1
The Effectiveness of School Reform..........................................................2
Statement of the Problem..............................................................................................................5
Significance of the Problem.........................................................................................................5
Traditional Methods for Determining School Performance...............................................6
Quality Models for Organizational Effectiveness..................................................................7
Research Questions........................................................................................................................ 8
Hypotheses.......................................................................................................................................9
Limitations...................................................................................................................................... 9
Delimitations................................................................................................................................. 10
Definitions......................................................................................................................................10
Summary.........................................................................................................................................13
Chapter 2: Literature Review...................................................................................................................14
Introduction....................................................................................................................................14
Education as a System................................................................................................................ 14
Organizational Effectiveness....................................................................................................17
Determining Effectiveness in School Systems.......................................................................21
Quality Theory................................................................................................................34
Continuous Improvement in Business.....................................................40
Summary.........................................................................................................................................59
Chapter 3: Methodology...........................................................................................................................60
Introduction....................................................................................................................................60
The Research Model....................................................................................................................60
Instrumentation............................................................................................................................. 61
Subjects and Settings.................................................................................................................. 63
Collection of Data........................................................................................................................ 64
Data Analysis................................................................................................................................65
Summary.........................................................................................................................................65
Chapter 4: Findings....................................................................................................................................66
Introduction....................................................................................................................................66
Rate of Return...............................................................................................................................67
Characteristics of Sample......................................................................................................... 68
Reliability of Performance Analysis for School Districts................................................71
Descriptive Analysis................................................................................................................... 72
Inferential Statistical Analysis...............................................................................................115
Analysis of “Do Not Know” Responses.............................................................................. 121
Usefulness of the Instrument as a Tool.................................................................................122
Summary......................................................................................................................................124
Chapter 5: Summary, Conclusions, and Recommendations.........................................................125
Summary......................................................................................................................................125
Conclusions.................................................................................................................................126
Recommendations.....................................................................................................................128
References................................................................................................................................................ 131
Appendix A: Instrument........................................................................................................................140
Appendix B: Panel of Experts Used in Content Validation.........................................................175
Appendix C: Letter to Panel of Experts.............................................................................................176
Appendix D: Matrix of Population Sample.......................................................................................177
Appendix E: Codes on the Instrument...............................................................................................178
Appendix F: Cover Letter and Directions.........................................................................................179
Appendix G: Postcard Reminder.........................................................................................................181
Appendix H: Letters of Support..........................................................................................................182
Appendix I: Districts by Enrollment Size.........................................................................................184
List of Tables
Page
Table 1. Factors of Organizational Effectiveness............................................................................22
Table 2. Meta-analysis Findings of School System Evaluation Components as
Reported in the Literature....................................................................................................23
Table 3. Curriculum Audit Findings of 67 School Districts
Between 1988 and 1994...................................................................................................... 29
Table 4. A Comparison Between Teaching Theories of Quality Experts...............................40
Table 5. Baldrige National Quality Award Criteria 1997............................................................ 47
Table 6. Validity of the MBNQA Model...........................................................................................48
Table 7. Accuracy of the MBNQA Weights................................................................................... 49
Table 8. Core Values/Concepts of MBNQA Education Pilot 1995............................................. 54
Table 9. 1995 MBNQA Educational Pilot Criteria......................................................................... 58
Table 10. Stratified Random Sample Matrix....................................................................................64
Table 11. Total Return Rates by Educator Position........................................................................67
Table 12. Frequencies and Percentages of Returns Received by Educator Position
and District Size.................................................................................................................... 68
Table 13. Percentage of Highest Degree and Time in Position by Size and Position...........70
Table 14. Rank Order of Combined Sample....................................................................................69
Table 15. Reliability of Instrument..................................................................................................... 71
Table 16. Means by District Size and Positions Combined......................................................... 72
Table 17. Means by District Size for Districts With More Than 5,000 Students Enrolled.. 73
Table 18. Means by District Size for Districts With 4,999 to 2,500 Students Enrolled......73
Table 19. Means by District Size for Districts With 2,499 to 1,000 Students Enrolled.........74
Table 20. Means by District Size for Districts With 999 to 500 Students Enrolled...............74
Table 21. Means by District Size for Districts With Less Than 499 Students Enrolled....... 75
Table 22. Means by Educator Position for Superintendents.........................................................75
Table 23. Means by Educator Position for Principals.................................................................. 76
Table 24. Means by Educator Position for Teachers......................................................................76
Table 25. Item Frequency and Percentage of Response by Position and Size of District
for Items in the Leadership Category............................................................................... 80
Table 26. Item Frequency and Percentage of Response by Position and Size of District
for Items in the Strategic Planning Category..................................................................87
Table 27. Item Frequency and Percentage of Response by Position and Size of District
for Items in the Student Focus and Satisfaction/Stakeholder Categories............... 90
Table 28. Item Frequency and Percentage of Response by Position and Size of District
for Items in the Information and Analysis Category.................................................... 94
Table 29. Item Frequency and Percentage of Response by Position and Size of District
for Items in the Human Resource Development Category.........................................97
Table 30. Item Frequency and Percentage of Response by Position and Size of District
for Items in the Educational Process Management Category.................................. 103
Table 31. Item Frequency and Percentage of Response by Position and Size of District
for Items in the School District Results Category.....................................................109
Table 32. Two-Way ANOVA Leadership Construct....................................................................116
Table 33. Two-Way ANOVA Strategic Planning Construct...................................................... 117
Table 34. Two-Way ANOVA Student/Stakeholder Construct..................................................118
Table 35. Two-Way ANOVA Information and Analysis Construct........................................119
Table 36. Two-Way ANOVA Human Resource/Management Construct.............................. 119
Table 37. Two-Way ANOVA Educational Process/Operational Management
Construct..................................................................................................................................120
Table 38. Two-Way ANOVA School District Results Construct.............................................121
Table 39. Chi Square for “Do Not Know” Responses..................................................................122
Table 40. Combined Percentage for Usefulness of Instrumentation........................................123
List of Figures
Page
Figure 1. An Educational System as an Open System.....................................................................16
Figure 2. A Quality Systems Model for Performance Improvement.......................................129
Chapter 1
Introduction
Background of the Problem
The performance of schools is currently determined by a multitude of indicators
based on political, traditional, and institutional influences. Public opinion of the
performance of the complete system is often based on one or more of these indicators
(Bracey, 1997; Bushweller, 1996; Elam, Lowell & Gallup, 1996). Although improvements
are occurring in many of the nation’s schools, results are still anecdotal, isolated, and far
from replicable (Fullan, 1993). Public criticism still abounds, and the perception of inferior
quality and poor performance remains (Bushweller, 1996; Hodgkinson, 1996; Houston,
1996; Huelskamp, 1993). The demands for greater accountability for publicly funded
institutions have not diminished. The lack of evidence of improved performance and effective
planning, and the increased spending of public funds without discernible measures of tangible
results, have led to the demand for more business-like strategies (DeMont, 1973; Gerstner,
1995; Kearns & Doyle, 1988).
School improvement and how to achieve it continue to inspire public, political, and
professional dialogue and debate. The approach to improving public schools is as varied as
the prophets and their doctrines. Little to no sustainable improvements, public hostility, and
disenfranchised teachers are left in the wake of such well-intentioned efforts (English & Hill,
1994). When teachers from the high-performing Willamette Primary School in Oregon were
asked why they thought so many schools were failing, they blamed the pursuit of “it” (Sagor,
1995). Solving the problems in education with a one-solution approach perpetuates the
notion that “it” will remedy the problem and things will be better once we find “it.” These
types of solutions are often at a visible, obvious level, denying the complexity of other
interdependent relationships and root causes (Bernhardt, 1994; Deming, 1994; Scholtes,
1995). Non-systemic interventions to improve organizations merely shift problems from one
part of the organization to another (Senge, 1990).
The Effectiveness of School Reform
Upon review of current literature, factors affecting the efficacy of reform strategies
appear to fall into three categories: (a) selection of reform initiatives, (b) implementation of
reform initiatives, and (c) the improvement or change strategy selected.
The 1960s saw a multitude of reform initiatives, influenced significantly by the civil
rights movement and by a national concern that American education was falling behind
foreign accomplishments (Fullan, 1993). Solutions were often superficial, quick-fix remedies
made impatiently as a result of various pressures facing the decision makers, or educational
fads (Fields, 1994; Fullan, 1992). The result was often—and still continues to be—an abundance
of disjointed, incomplete improvement initiatives (Fullan, 1997). The presumption that
developing innovations on a national scale would lead to widespread adoption appeared to be
flawed (Fullan, 1993).
Flawed implementation is another source of much discussion in the literature of
educational reform. Berman and McLaughlin (1977) conducted a comprehensive study of
federally funded programs. They found many examples of failed implementation, including
failure to take into account local nuances and capacity, the desire for additional funds for
political rather than educational reasons, and the presumption that innovations are
implemented one at a time, contrary to the reality of schools. Another perspective, offered by
Fullan and Miles (1992), is the misunderstanding of resistance. They argue that issues of
effective implementation and communication strategy, rather than personal attitudes and
resistance, are often at the heart of why reform fails (Fullan & Miles, 1992).
A third reason for failed reform reflected in the literature is related to the absence of a
specific change strategy. It has been found that each stakeholder of education often has a
different, and usually faulty, belief about how change occurs (Fullan, 1993; Fullan & Miles,
1992; Hargreaves, 1997). This results in confusion and conflict during both the design and
implementation phases. Denial of the complexity of problems and solutions is often
observed. Critics of past reform efforts advocate for three things: (a) greater recognition of
the complexity of the educational system; (b) deeper, second-order changes in the
organization; and (c) the need for reforms to be created, designed, and implemented by those
knowledgeable of the institution (Fields, 1994; Hargreaves, 1997; O’Neil, 1995; Sarason,
1990; Wagner, 1993). Reform efforts, particularly those driven through mandated practices
contingent on state or federal dollars, often result in symbols of improvement over substance
(Berman, 1977; Wagner, 1993).
During the past decade, most people involved in the reform of education have come
to advocate a systemic perspective (Fullan, 1992; Timpane & Reich, 1997). The resurgence
of interest in systems theory applications is currently resulting in heightened attention to
and recognition of the complexity of organizations, particularly educational institutions. A
central principle underlying systems thinking is that structure influences behavior (Deming,
1986; Patterson, Purkey & Parker, 1986; Senge, 1990). The structure used to
initiate, conduct, and evaluate an improvement process is therefore related to the potential
effectiveness of each specific solution deployed.
Sarason (1990) proposes two basic premises to influence the design and
implementation of solutions in school reform efforts. The first is the presence of a
conceptual framework that recognizes the relatedness of human behavior. The second is a
thorough understanding of the context of that improvement. Reformers to date have been
criticized for not having an implicit theory about how to achieve change, and they do not
always recognize the influence of the intractability of the system (Fullan & Miles, 1992;
Sarason, 1990). Ignoring these two factors can result in an approach that seeks the cure and
ignores the diagnosis. Focusing on doing the right thing, over doing it in the right way, can
result in using the means as the end (Bennis, 1976).
In K-12 educational systems, the development of school improvement plans often
becomes a substitute for results (Sergiovanni, 1992). A focus on the outcomes or results of
education has rarely been operationalized (Schmoker, 1996). Educators often resist
confronting the results and using them to make decisions for school improvement (Bernhardt,
1994; Schmoker, 1996). Schools are traditionally limited in their use of information and
have little need to depend on systematic feedback from a variety of their customers
(Bernhardt, 1994; Consortium on Productivity in the Schools, 1995; Schmoker, 1996). The
capacity of data and information to reveal strengths, weaknesses, successes, and failures is
threatening to educators, particularly in a political context (Schmoker, 1996). Schmoker
(1996) further states that schools are too poorly organized to see the connection between
effort and outcomes.
The theoretical base upon which improvements are determined and made in the total
organization, or any part of the total organization, is critically important in demonstrating
outcomes (Deming, 1991). The framework which follows from the theory results in the
models, methods, and tools (Skrtic, 1991). This study investigates a theory and the
development and use of a framework for a process of organizational analysis of its
performance.
Statement of the Problem
The process of improving school performance often lacks an organized strategy,
processes for decision-making, deployment of those decisions, and a mechanism for
evaluating results of those strategies (Bernhardt, 1994; Fullan & Miles, 1992). The manner
in which schools think about the school improvement process determines their ability to
deploy it successfully (Bernhardt, 1994). The current methods of determining organizational
performance in schools, identifying the areas of improvement, and implementing these
changes lack a conceptual framework which recognizes the relatedness of human behavior
(Sarason, 1990). The accreditation process, once intended to be a mechanism for self-study,
has become a political formality which focuses on the surface indicators with no mechanism
for deeper improvements leading to results (Portner, 1997). Education is lacking a useful,
comprehensive framework for systemic analysis of its performance and its approach to
improvement.
Significance of the Problem
Goodlad (1984) remarked that in order to survive, an institution must have the faith of
its clients in its usefulness and a measure of satisfaction with its performance. More than 10
years later, public education continues to be challenged by the many constituencies who have
similar criticisms (Bushweller, 1996; Hodgkinson, 1996; Houston, 1996; Huelskamp, 1993;
Gerstner, 1995; Kearns & Doyle, 1988). The use of measures of satisfaction from the customers
of education is limited. Subsequent approaches to improving performance that have the
potential of increasing satisfaction are diverse, scattered, and often politically motivated
(Consortium on Productivity in the Schools, 1995). This research explores the existing
approaches used to determine the performance of a school district. The arguments germane
to this research have been organized into three categories: (a) the investigation of the
traditional methods currently used to determine the performance of a school district, (b) the
concept of organizational effectiveness, and (c) the potential use of models emerging from
quality theory to analyze organizational effectiveness in school districts.
Traditional Methods For Determining School Performance
Determining the performance of education is an undertaking worthy of in-depth analysis of its own. It has not been established in the literature that the measures currently used and analyzed are the appropriate indicators of the performance of the educational system (Huelskamp, 1993). Traditional methods of assessing the successes and failures of public education most typically include multiple constituency models. These models are designed to meet standards or criteria set by various stakeholders for various purposes (Brassard, 1993).
Traditional models include financial, management, and curricular audit procedures; program evaluation studies; federal or state compliance reviews; and specific indicators of student performance. Accreditation is currently the most comprehensive practice that purports to determine the performance of a school (Portner, 1997). The accreditation process, once a status symbol for schools, is now viewed as a routine examination with little relevance to school improvement (Portner, 1997). It does not, however, look comprehensively at the entire school district, since schools are accredited as singular units. The performance of schools, and, therefore, school systems, is often inferred by the general
public based on the performance of student scores on annual measures as reported in the media (Rotberg, 1996). Specific units or functions of the school district are often reviewed as required by state and federal laws, through processes such as financial audits or compliance reviews under Title I or the Individuals with Disabilities Act. The focus of these processes is to determine compliance with regulations and the need for any corrective action. Non-compliance can in some cases mean financial penalties for the district. Curriculum audits offer a thorough process for assessing the organization, delivery, support, and results of the instructional process. Specific standards have been developed, and criteria are used to determine the degree of effectiveness. Professionally trained auditors, external to the district, conduct the process and prepare a final report. Peer reviews, such as management audits, also occur. They are often designed by administrators to analyze specific parameters of management.
Quality Models For Organizational Effectiveness
According to Field (1994), the National Education Association and the American Association of School Administrators suggest that few innovations or educational changes stimulated from outside of education will occur without educator commitment. Management is responsible for the design of and approach to improving the performance of the system (Crosby, 1984). The people who understand the processes and their outcomes well enough are the educational leaders within school organizations (Field, 1994; Sarason, 1990). In the absence of a foundational theory upon which to base practices and an organized approach to accomplishing improvements, the educational leader is vulnerable to public criticism and to charges of neglecting his or her duties. There is also a need to bring the practitioner into the creation and design of the practice (Deming, 1993; Glasser, 1992; Imai, 1986). There are ever-increasing examples of classroom teachers who are feeling helpless against a barrage of public criticism
and increasing, uncoordinated demands of federal, state, and local officials (Bracey, 1997; Fullan, 1997; Hargreaves, 1997).
Quality theory, practices, and tools are being used increasingly by educational and service organizations. Educators are applying quality principles by defining the needs and perceptions of internal and external customers, using information to make decisions, and designing results-oriented strategies for systemic improvement involving people in all parts of the organization. The application of the Malcolm Baldrige National Quality Award, originally designed for business, has been extended to educational institutions and offers a framework for analysis and recognition (National Institute of Standards and Technology, 1995).
Research Questions
The research questions posed in this study are:
1. How do educators perceive their own school district's performance based on an instrument designed using the Malcolm Baldrige National Quality Award Education Criteria?
2. Are there differences in these ratings based on type of educator or size of district?
3. Do educators find this instrument a useful tool to study these areas of a school district?
4. Do educators believe this instrument could be useful in determining school improvement needs?
Hypotheses
The study will test the following null hypotheses:
H01: There are no significant differences in the Leadership category of the Performance Analysis for School Districts by type or size.
H02: There are no significant differences in the Strategic Planning category of the Performance Analysis for School Districts by type or size.
H03: There are no significant differences in the Student Focus and Student and Stakeholder Satisfaction category of the Performance Analysis for School Districts by type or size.
H04: There are no significant differences in the Information and Analysis category of the Performance Analysis for School Districts by type or size.
H05: There are no significant differences in the Human Resource Development and Management category of the Performance Analysis for School Districts by type or size.
H06: There are no significant differences in the Educational and Operational Process Management category of the Performance Analysis for School Districts by type or size.
H07: There are no significant differences in the School District Results category of the Performance Analysis for School Districts by type or size.
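The study does not specify the statistical procedure for testing these hypotheses, but each asks whether mean category ratings differ across groups of respondents. As a minimal, hypothetical sketch (not drawn from the original analysis plan), a one-way analysis of variance on simulated Leadership-category ratings grouped by educator type might look like the following; the 1-7 rating scale, group sizes, and simulated data are assumptions for illustration only:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Hypothetical Leadership-category ratings (1-7 scale) for three educator types
ratings = {
    "superintendent": rng.integers(1, 8, size=30),
    "principal": rng.integers(1, 8, size=30),
    "teacher": rng.integers(1, 8, size=30),
}

# One-way ANOVA: H0 = no difference in mean Leadership rating across types
f_stat, p_value = f_oneway(*ratings.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
print("Reject H0" if p_value < 0.05 else "Fail to reject H0")
```

Differences by district size could be tested the same way, grouping the ratings by size category instead of educator type, and the same test could be repeated for each of the seven award categories.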
Limitations
The study is subject to the following limitations:
1. The study presumes truthful responses and that respondents will understand the items.
2. Responses to items are subject to the personal biases, motivations, perspectives, and experience of the respondents.
3. Responses are presumed to be independently made.
4. Respondents' prior knowledge of the theoretical constructs behind the instrument is unknown.
5. The design of the study is not experimental; therefore, no causal relationship can be inferred.
6. Responses will be collected through a mailed survey, which decreases the probability of 100% participation.
7. The study presumes that meaningful analyses can be made even when fewer than 100% of the surveys are returned.
Delimitations
1. The study is limited to superintendents, principals, and classroom teachers within Idaho, which affects generalizability of the findings to educators outside of Idaho.
2. The entire population will not be used. A proportional stratified random sample will be drawn from the population of interest. Therefore, the data realized are subject to the limitations of the sample.
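The proportional stratified sampling described above can be sketched as follows. The stratum names, population counts, and sample size `n` here are hypothetical illustrations, not the study's actual figures:

```python
import random

# Hypothetical population of educators, stratified by type
population = {
    "superintendent": [f"sup-{i}" for i in range(100)],
    "principal": [f"pri-{i}" for i in range(400)],
    "teacher": [f"tea-{i}" for i in range(1500)],
}

def proportional_stratified_sample(population, n, seed=42):
    """Draw n members total, allocating draws to each stratum
    in proportion to that stratum's share of the population."""
    rng = random.Random(seed)
    total = sum(len(members) for members in population.values())
    sample = []
    for members in population.values():
        k = round(n * len(members) / total)  # proportional allocation
        sample.extend(rng.sample(members, k))
    return sample

sample = proportional_stratified_sample(population, n=200)
# Strata of 100/400/1500 out of 2000 yield 10, 40, and 150 draws respectively
print(len(sample))  # 200
```

Proportional allocation preserves each group's share of the population in the sample, so inferences by educator type are not distorted by over- or under-representation of any stratum.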
Definitions
The following terms are used in the study or in the Performance Analysis for School Districts instrument:
1. Approach refers to the systems in place to improve quality and customer satisfaction (Brown, 1994).
2. Collaborative and participatory approach to management is defined as jointly working to identify problems and determine improvements with others in the organization who are knowledgeable, involved, and affected by any decisions made.
3. Communication processes refer to methods used to inform and seek opinions from others.
4. Comparative data or benchmarking refers to an improvement process in which an organization compares its performance against best-in-class organizations, determines how those organizations achieved their performance levels, and uses the information to improve its own performance (Shipley & Collins, 1997).
5. Conventional information refers to standardized and state test scores, enrollment, attendance, dropout, discipline, and operating budget data.
6. Deployment refers to the extent to which an approach has been implemented across an organization (Brown, 1994).
7. Educational programs and services refer to all programs and services provided to students and conducted by professional, certified personnel or by non-certified personnel supervised by certified personnel.
8. Educational support services refer to all programs and services which support educational programs, such as business operations, transportation, public relations, purchasing, clerical services, legal services, volunteers, food service, records, buildings, and grounds.
9. Expectations refer to clearly defined statements describing specific academic, behavioral, or social criteria used to measure achievement.
10. Data and information processes include the collection, management, and dissemination of data on enrollment, achievement, operations, and stakeholder satisfaction that are used in evaluation and planning processes.
11. Internal communication processes refer to communication with personnel and students within the school district.
12. External communication processes refer to communication with parents and community stakeholders.
13. Human resource area includes employee well-being, satisfaction, professional development, work system performance, and effectiveness.
14. Human resource indicators include employee well-being, labor relations, satisfaction, professional development, work system performance, and effectiveness.
15. Leadership refers to district-level senior administrators and the board of trustees.
16. Organizational effectiveness is a social construct referring to the quality of an organization that achieves the performance expected of it (Brassard, 1993).
17. Performance refers to the results produced by the school district as illustrated by multiple indicators.
18. Performance data includes data or information from all aspects of the organization, including student performance measures, enrollment, discipline, human resources, business operations, and community.
19. Results refer to data on the performance of the organization (Brown, 1994).
20. School district units refer to the specific schools, departments, or services of that school district.
21. Stakeholder refers to individuals or groups, both internal to the school (students, all personnel) and external (parents, community members, business), that are affected by the conditions and quality of education and the preparedness of graduates.
22. Student conduct indicators refer to measures of student behavior such as disciplinary infractions, suspensions, expulsions, arrests, etc.
23. Strategic development refers to the process by which members of an organization clarify the organization's purpose, develop the necessary procedures and operations to achieve it, and design a strategic plan.
24. Suppliers refer to those businesses or individuals with which the district contracts for specific services such as training, consulting, transportation, legal, etc.
25. Partnering processes refer to relationships between organizations, community agencies, and businesses through which services for students and stakeholders are designed, implemented, and provided.
26. System refers to a complex of elements in mutual interaction (Owens, 1970).
27. Total quality or continuous improvement refers to a system that elicits organization-wide participation in planning and improving processes in order to meet or exceed customer expectations.
28. Work systems are defined as the way in which jobs, work, and decision-making are designed at all levels within the organization.
Summary
There are few organized approaches to the assessment of performance in school districts that apply an integrated analysis of the subsystems. The lack of use of information to make strategic improvement decisions, and of a systems-based approach to assessing current effectiveness, contributes to unsuccessful reform initiatives in education. An initial step in any process of examination is to determine what now exists (Goodlad, 1984). This study seeks to determine the usefulness of an assessment process to acquire a baseline perception of the organization's performance as it exists today, using three constituencies of the organization.
Chapter 2
Literature Review
Introduction
Areas of literature previously cited in Chapter 1 are investigated in depth in this chapter. First, it is critical to understand the nature of the educational institution as a system. Defining the nature of educational subsystems and their relationship to each other in the larger system is fundamental to understanding how to study and improve its effectiveness. Second, organizational effectiveness theory and practice are discussed. Third, current approaches to determining school effectiveness and issues of accountability are reviewed. Fourth, quality theory is discussed as foundational to understanding the emerging applications in both business and education. Finally, the applications of the Malcolm Baldrige National Quality Award in business and education are described.
Education as a System
During the past decade there has been increased attention to systems thinking. The field of systems thinking includes cybernetics, chaos theory, and Gestalt theory, and is reflected in the works of Ludwig von Bertalanffy, Russell Ackoff, Gregory Bateson, and Eric Trist (Senge et al., 1994). Ludwig von Bertalanffy recognized the relationships among several important concepts current in biology in the 1930s (Levine & Fitzgerald, 1992). He named the integration of these ideas general systems theory, incorporating cybernetic concepts such as feedback. Miller (1993) describes the theory as a philosophy of science that studies natural phenomena of all sorts as heterogeneous wholes composed of multiple different but interrelated parts, rather than studying each part in isolation. Three types of systems are described in the literature of the biological sciences. Isolated systems are described by
Nicolis and Prigogine (1977) as those which do not exchange matter or energy with their
environment. Closed systems do exchange energy with their environment while open
systems exchange both energy and matter with their environment (Nicolis & Prigogine,
1977).
Systems thinking is a discipline for seeing wholes and the pattern of interrelationships among key components of a system (Eisenberg & Goodall, 1993; Owens, 1970; Senge, 1990; Senge et al., 1994). A system is a collection of parts that interact to function purposefully as a whole (Deming, 1986; Patterson, 1993; Senge, 1990). Interdependence is the primary quality of a system. It refers both to the completeness of the workings of a system in its environment and to the interrelationships of the individuals that fall within the system. These interdependent relationships between people give the organization its culture. Process, feedback, and contingency are also components of systems (Eisenberg & Goodall, 1993).
From a systems perspective, a school district, like other organizations, does not exist as an entity unto itself, yet it often behaves as one (Eisenberg & Goodall, 1993). School districts are both open and social systems (Hoy & Miskel, 1991; Owens, 1970). A social system is defined as an interactive, interrelated, and interdependent network of components and unique organizational properties that form an organized whole and function to serve common goals (MacLellan, 1994). An open system depends on the external environment for its continued existence, requiring resources from external inputs to the system (Consortium on Productivity in the Schools, 1995; Deming, 1986; Eisenberg & Goodall, 1993; Owens, 1970; Senge, 1990). Figure 1 presents a school district as an open system.
[Figure 1 shows inputs (people, money, support, resources) flowing into the education system, whose output (educated students who can continue to learn) serves its customers: higher education, taxpayers, and employers.]
Figure 1. An Educational System as an Open System.
In open systems, groups outside the system affect the system's survival and its ability to change. School districts take in political, financial, and human resources and use them to create a service. This service results in a product for the surrounding environment of the workplace, higher education, and the community. Open systems theory emphasizes the dynamic aspects of organization; that is, movement in one part leads in predictable fashion to movement in other parts. Open systems are in a constant state of flux because they are open to inputs from the environment (Katz & Kahn, 1978).
School districts have also been described as loosely coupled systems (Weick, 1976). Weick (1976) explains that the term is intended to convey the image that coupled events are responsive, but that each event also preserves its own identity and some evidence of its physical separateness. There is usually a lack of clarity, coordination, and articulation between and among subsystems within the larger system, despite their interdependence. Such systems often are organizations in which accountability and interdependence between subsystems are low and autonomy is high (Deer, 1976; Fullan, 1980). Subsystems are purposely not closely connected and do little to control each other's activities. They tend to
respond by shutting out environmental threats and increasing the sense of efficacy and autonomy of their members.
Theories of bureaucracy, from which schools continue to be organized, have paid little attention to the organization's dependency on internal and external environments (Deming, 1986; Eisenberg & Goodall, 1993). The task of teaching was viewed as clearly understood, routine, and predictable (Katz & Kahn, 1978; Owens, 1970). This mechanistic approach to organizing work in schools is considered to be efficient. This structure does not work when the environment is uncertain and, in fact, interferes with the organization's ability to be adaptive to its inputs (Consortium on Productivity in the Schools, 1995). The influence of systems theory results in an emphasis on inputs, processes, how the processes interact, information flow and feedback, management of relationships, and outputs (Deming, 1986; Eisenberg & Goodall, 1993).
Organizational Effectiveness
The body of knowledge of organizational theory and behavior, and the pursuit of a model to determine organizational effectiveness, is substantial, confusing, and often in conflict (Brassard, 1993; Georgopoulos, 1957; Zammuto, 1982). Organizational effectiveness has been defined in the literature in a variety of ways. Attempts to define effectiveness, develop criteria, and apply them to a variety of organizations continue to be noted in the literature (Brassard, 1993; Cameron, 1980; Georgopoulos, 1957; Zammuto, 1982). Yuchtman (1967) points out two assumptions that are either implicitly or explicitly made: (a) complex organizations have an ultimate goal or function, and (b) the ultimate goal can be identified empirically and progress toward it measured. How organizational effectiveness is
defined is related to the theoretical model from which it was developed. Three models emerge in the literature (Brassard, 1993; Zammuto, 1982).
As early as the 1930s, the emergence of goal-based approaches to organizational effectiveness can be seen (Brassard, 1993; Cameron, 1980; Zammuto, 1982). These models are often referred to as rational models and are functional rather than conceptual (Georgopoulos, 1957). Organizational effectiveness is seen as the degree of achievement of multiple goals or the degree of congruence between organizational goals and observable outcomes (Zammuto, 1982). The focus of the rational organization is goal orientation (Cameron, 1980; Patterson, Purkey & Parker, 1986). The design, articulation, and achievement of goals are emphasized in organizations applying this model. The assumptions in this model are that goals remain stable over time, goals are determined by the leaders of the organization, and goals become translated into objectives within the sub-units of the organization (Patterson et al., 1986). The organization is seen as an entity rationally structured to achieve the goals to which it subscribes. The goals are typically created to help the organization achieve its expected performance.
The focus of evaluating effectiveness from this model is on the outputs produced by the attainment of goals (Cameron, 1980). The development of efficiency-related criteria to ensure the accomplishment of goals is often designed to influence the use of resources to achieve optimal performance, productivity, and profits for the organization (Brassard, 1993). This model led to such practices as management by objectives, which remained popular through the 1970s. The focus of management in this model is the accomplishment of the organization's goals (Hersey & Blanchard, 1982). The emphasis of management within this model is setting goals and objectives that are accomplished by motivating and controlling
others in the organization to carry them out. In practice, however, organizations often do not reflect these goals in their daily activities, at either a micro or a macro level (Brassard, 1993).
There are limitations to a goals approach pointed out in the literature (Cameron, 1980; Etzioni, 1960; Katz & Kahn, 1978). Success may be overlooked if there is no goal to measure it. Goals may be too low, misplaced, or harmful in the long term. Goals are usually expressed as idealized states and are often not realistically assessed. The nature and effects of the social systems from which goals emerge often are not considered in the attainment of goals. The goals model may be useful when organizational goals are clear, consensual, and measurable (Cameron, 1980; Patterson, Purkey & Parker, 1986). The criteria for determining effectiveness then become unique to that organization and its goals.
Systems-based approaches emerged during the 1950s, according to Owens (1970) and Zammuto (1982). These models draw on the emerging body of general systems theory discussed earlier in this chapter. Applying this theory, organizational effectiveness is viewed as the extent to which an organization as a social system fulfills its objectives without incapacitating its means and resources and without placing a strain upon its members (Zammuto, 1982). W. Edwards Deming believed that systems are developed to perform repetitive tasks (Deming, 1982). Most problems within organizations, he believed, came from sub-optimization of the system, meaning the system was performing these tasks below its capability. The inconsistencies and contradictions that become apparent upon analysis of the system can be used to detect and isolate the flaws of that system (Bradley, 1993).
Other models, appearing during the 1970s, are referred to as multiple constituent definitions of effectiveness. The organization is effective insofar as it meets the expectations of actors associated with it in one way or another, who try to promote their own objectives and
interests (Brassard, 1993; Cameron, 1980). An organization is effective insofar as the majority of those participating in it perceive that they can use it to satisfy their interests. This model stresses the importance of satisfying the expectations of the actors who agree to support the organization and who influence its ability to obtain the resources it needs and conserve its legitimacy (Brassard, 1993). This model acknowledges up front the influence that these constituents have on the organization. Related to these models are approaches driven by the requirements of external organizations such as accreditation agencies, laws, and regulations (Brassard, 1993).
Dubin (1976) pointed out that organizational effectiveness has a different meaning depending on whether the organization is viewed from the outside or the inside. The inside perspective of an organization tends to be a traditional managerial viewpoint which emphasizes return on investment and efficient use of resources. The perspective from the outside evaluates the output of the organization relative to its contribution to the environment or the context outside the organization. Dubin (1976) further points out that there is no correlational relationship between these two perspectives. In fact, he says, they are worlds apart and cannot be reconciled. “We must face squarely the fact that organizations live under conflicting demands regarding their effectiveness” (Dubin, 1976, p. 8). Bass (1952) suggested that the criterion of organizational success needed to be expanded to include measures relevant to employees, society as a whole, and the organization’s management. He suggested organizational performance should be assessed based on: (a) the degree to which an organization’s performance was profitable and productive, (b) the degree to which an organization was of value to its employees, and (c) the degree to which an organization and its members were of value to society. Campbell et al. (1974) found over 25 different variables
that were used as measures of effectiveness in organizations prior to 1973 (see Table 1). The most commonly recurring criteria were adaptability/flexibility, productivity, and satisfaction.
Determining Effectiveness in School Systems
MacLellan (1994) conducted a meta-analysis of organizational effectiveness criteria used in school systems. He found fifteen criteria: goals, environment, leadership, structure, work force, interaction, process, decision-making, workplace, culture, change, communication, curriculum, accountability, and politics. The studies reviewed ranged from 1967 through 1991.
Determining the effectiveness of schools is even more elusive than for other organizations. Schools, universities, and colleges have been referred to as organized anarchies in the literature of organizational study (Cameron, 1980). Some typical characteristics are:
1. Goals are ill-defined, complex, changing, and often contradictory. Goals of some sub-units may be unrelated to the broader organizational goals.
2. There is often no clear connection between the way work is done and the outcome.
3. More than one strategy can produce the same outcome.
4. There is little or no feedback from the output to the input.
5. Sub-units are not tightly connected, so it is easier to ignore outside influences.
6. Widely differing criteria of success may be operating simultaneously in various parts of the organization.
There is often an ambiguous connection between the organizational structure and the activities of the organization. It is typical to find rigid structures and hierarchies imposed upon loose, fuzzy processes.
Table 1. Factors of Organizational Effectiveness
Overall effectiveness of the organization
Productivity
Efficiency
Profit
Quality
Accidents
Growth
Absenteeism
Turnover
Motivation
Control
Flexibility/adaptation
Role and norm compliance
Readiness
Utilization of environment
Evaluations by external entities
Internalization of organizational goals
Satisfaction
Morale
Conflict/cohesion
Goal consensus
Managerial task skills
Managerial interpersonal skills
Managerial management communication
Stability
Value of human resources
Note: From The Measurement of Organizational Effectiveness: A Review of Relevant Research and Opinion (pp. 39-40), by J. P. Campbell, E. A. Bownas, N. G. Peterson, and M. D. Dunnette, 1974, San Diego: Naval Personnel Research.
Cameron (1980) makes the point that none of the described models of organizational effectiveness will work for the organized anarchy. Criteria of effectiveness are usually vague and ambiguous, making organizational goals difficult to measure and not necessarily agreed upon by all sub-units. There is often no feedback loop between outputs and inputs, making the systems model an unnatural fit. Cameron (1980) suggests that the multiple constituencies model may be the most appropriate for the organized anarchy. The demands of the constituencies, once defined, can be assessed on the degree to which they are met.
Although there is substantial information regarding the evaluation of schools and school programs in the literature, there is little consistency in the approaches and parameters evaluated. Eight components are reported by Nowakowski (1985): business and finance; curriculum and instruction; policy; planning and evaluation; pupil personnel services; personnel; school-community relations; and school management. MacLellan (1994) ranked components of school systems evaluated in the literature. Table 2 illustrates his findings.
Table 2. Meta-analysis Findings of School System Evaluation Components by Rank as Reported in the Literature.
1. Goals
2. Environment
3. Leadership
4. Structure
5. Workforce
6. Interaction
7. Process
8. Decision-making
9. Work place
10. Culture
11. Change
12. Communication
13. Curriculum
Note: From “Towards a New Approach for School System Evaluation” (p. 159), by David MacLellan, 1994. (Doctoral dissertation, Dalhousie University, Nova Scotia). Dissertation Abstracts International.
There are three methodologies that appear to be represented in a substantial manner in the literature: (a) evaluation research, (b) curriculum audits, and (c) effective schools research. The researcher has also included a discussion of the accreditation process in Idaho. The discussion of the effectiveness of schools is often linked with discussions of accountability or the performance of student learning. Although both factors are relevant to
this study, the researcher focused on models that used multiple indicators of effectiveness.
A key deficit in most educational systems, which is all too frequently pointed out by
the critics of public education, is the lack of effective evaluation (Worthen, 1987). The
public demand for accountability has made many educators fearful of the concept. DeMont
and DeMont (1973) suggested three improvements to the process of demonstrating
accountability: (a) an increased focus on the outputs of education, (b) the production of more
effective evaluative and research models, and (c) the inclusion of non-educators in the
decision-making process. They suggest that an accountability model be a comprehensive
plan for problem solving aimed at improving educational practice. The requirements of this
model include: (a) the designation of the persons responsible for the program operation,
(b) conducting an internal program review, (c) conducting an external program review, and
(d) use of the results to diagnose needs and prescribe action.
Evaluation research has been a tool used frequently in public schools to make
judgments about the merit, value, or worth of educational programs (Gall, Borg, & Gall,
1996). Such studies are most often used to determine the effectiveness of specific programs,
benefit-to-cost ratios, or areas for improvement. Formal evaluation consists of systematic
efforts, using qualitative and/or quantitative designs, to define criteria and obtain accurate
information (Worthen, 1987). Formal evaluation studies are often done as a basis for
decision-making and policy formation, to evaluate curricula, to monitor expenditure of public
funds, or to improve educational programs (Worthen, 1987). Worthen (1987) notes that many
evaluation studies do not lead to significant improvements in school programs. He cites
several reasons, including inadequacies of research design, the use of evaluation information,
and the view of evaluation as a discrete study rather than a system of self-renewal (Worthen,
1987). The Joint Committee on Standards for Educational Evaluation (1994) developed
standards designed for use in judging the quality of educational evaluation. These standards
cover criteria involving the utility of the evaluation, its usefulness to the persons involved,
the feasibility of the design for the setting, legal and ethical factors, and the extent to which the
study yields valid, reliable, and comprehensive information for making judgments.
Guba (1981) outlined four major models of educational evaluation: (a) objectives,
(b) outcomes, (c) effects, and (d) audience concerns. Evaluation approaches that are based
on specific goals and objectives assess the congruence between the standard or goal and
the performance (Provus, 1971). In a discrepancy-based model of evaluation, standards are
defined and developed, performance is assessed, the discrepancy is determined, feedback is
given to the decision-makers, and a decision is made. The critical point in this model is
the establishment of a standard and assessment against that standard. The CIPP
(Context, Input, Process, Product) model is a decision-making approach relying on the
generation of information to be used in making decisions (Stufflebeam, 1983). The context
illustrates the needs and goals; the input, how resources and procedures are
used to reach goals; the process, any defects in the implementation of those goals;
and the product, the measurement of the outcomes. Scriven (1973) proposed the consumer-
oriented model, which includes establishing standards or indicators; comparing effects to
benefits and costs; and making judgments about change, use, and choice. The focus in this
model is the judgment of merit or worth. The countenance model, later called the
responsive model, distinguishes three phases: (a) antecedents, (b) transactions, and
(c) outcomes (Stake, 1967). This model relies on an informal framework in which
observation, judgement, and data matrices are emphasized.
Curriculum audits provide another means of evaluating organizational effectiveness
for schools. Curriculum management audits were first offered by the accounting firm of
Peat, Marwick, Mitchell and Company, where a partner, Fenwick English, adapted the practice from the
financial audit process (Vertiz, 1995). He brought the service to the American Association of
School Administrators, which created the National Curriculum Audit Center. The Center
trains curriculum auditors and contracts with school districts. The first audit was done in
1979 in the Columbus Public Schools in Ohio. As of April 1995, curriculum audits had been
performed in nearly 100 school districts in the United States and two foreign countries
(Vertiz & Bates, 1995). According to Vertiz (1995), the audit became an important data
source in state take-overs of school systems in New Jersey and Kentucky. It is based upon the
concepts of effective instruction, curricular design, and delivery. The audit is designed to
determine the extent to which a sound, valid, and operational system of curriculum
management is implemented (Vertiz & Bates, 1995). According to Vertiz and Bates,
curricular quality control requires: (a) a written curriculum in a clear, translatable form for
application by teachers in classrooms or related instructional settings; (b) a taught curriculum
which is shaped by, and is interactive with, the written curriculum; and (c) a tested curriculum
which includes the tasks, concepts, and skills of pupil learning that are linked to both the
taught and written curricula.
English (1988) described the five standards he created for the auditing process:
1. The school district is able to demonstrate its control of resources, programs, and
personnel. Control refers to the system’s ability to channel and focus its resources toward the
achievement of its goals and its mission (Kamen, 1993). Auditors look for indicators that
demonstrate linkages between the board, central management, and the instructional process
(English, 1988).
2. The school district has established clear and valid objectives for students.
Auditors examine board policy, administrative procedures, courses of study, and the scope and
sequence of the curriculum (English, 1988).
3. The school district has documentation explaining how its programs have been
developed, implemented, and conducted. The district must demonstrate clear and operational
linkages between all layers of the system. Auditors look for alignment between policy,
curriculum, instruction, materials, and assessment (English, 1988).
4. The school district uses the results from district-designed or adopted
assessments to adjust, improve, or terminate ineffective practices. Auditors evaluate the
extent to which the district collects data to evaluate its performance. The data should reflect
the district’s goals and provide usable information that can be used to adjust or improve them
(English, 1988).
5. The school district has been able to improve productivity. Productivity is the
relationship between the inputs and the cost of obtaining any given level of outputs (English,
1988).
Each standard encompasses numerous criteria. These criteria are evaluated through
document reviews, interviews with the board and professionals, and observations by trained
auditors who are school administrators external to the organization. The data are then
triangulated according to agreed-upon conditions. Table 3 illustrates specific criteria and the
findings of a study of audits conducted between 1988 and 1994 by Vertiz and Bates (1995).
The authors conclude that the majority of school districts that participated in the audit
process were deficient in major management structures and functions pertaining to the
design and delivery of curriculum. The investigators found that 90% or more of the findings
were deficient in 80% of the areas investigated.
Kamen (1993) found that the extent of implementation of the audit recommendations
is dramatically affected by the nature of the audit selection method. When the audit is
voluntarily selected by a district, there is a high level of implementation. There are generally
positive effects, as demonstrated by greater empowerment of all personnel and a tendency
towards a systems perspective. Management processes appeared to improve. Results
suggest that there is significantly less implementation of recommendations when the process
is mandated. Under some conditions, it can become a political battleground with resistance,
denial, and defensiveness.
Table 3. Curriculum Audit Findings of 67 School Districts Between 1988 and 1994.
Standard number: Criteria                     Strong rating   Deficient rating
1: Policy design                              6%              94%
1: Policy implementation                      0%              100%
1: Planning design                            10%             90%
1: Planning implementation                    10%             90%
1: Organizational structure                   5%              95%
1: Organizational implementation              27%             73%
1: Personnel practices                        0%              100%
1: Personnel supervision                      14%             86%
2: Instructional goals and objectives         6%              94%
2: Curriculum scope                           22%             78%
2: Curriculum guide: design                   2%              98%
2: Curriculum guide: delivery                 0%              100%
2: Curriculum management structure            3%              97%
3: Internal consistency                       3%              97%
3: Equity: design                             5%              95%
3: Equity: implementation                     7%              93%
3: Monitoring practices                       4%              96%
3: Staff development: design                  2%              98%
3: Staff development: delivery                0%              100%
3: Articulation and coordination              2%              98%
(table continues)
Table 3, cont’d. Curriculum Audit Findings o f 67 School Districts Between 1988 and 1994.
Standard number: Criteria Strong rating Deficient rating
Testing program: scope 2% 98%
Testing program: quality 3% 97%
Use of assessment data 2% 98%
Use of program evaluation data 0% 100%
Curriculum-driven budget 0% 100%
Cost effectiveness 4% 96%
Organizational improvement 0% 100%
Facilities 39% 61%
School climate 83% 17%
Support system functioning 30% 70%
Note: From The Curriculum Management Audit: Revelations About Our School (1995), by
Virginia Vertiz and Glynn Bates. Paper delivered to the American Educational Research
Association, Division B.
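The summary statistic quoted earlier (findings 90% or more deficient in 80% of the areas investigated) can be checked against the deficient-rating column of Table 3. The short script below simply transcribes those ratings from the table and recomputes the share; it is an illustrative check, not part of the original study:

```python
# Deficient ratings transcribed from Table 3 (both parts), in row order.
deficient = [94, 100, 90, 90, 95, 73, 100, 86,           # Standard 1
             94, 78, 98, 100, 97,                        # Standard 2
             97, 95, 93, 96, 98, 100, 98,                # Standard 3
             98, 97, 98, 100, 100, 96, 100, 61, 17, 70]  # continuation
# Share of criteria whose deficient rating was 90% or higher.
share = sum(1 for d in deficient if d >= 90) / len(deficient)
print(f"{share:.0%} of areas had deficient ratings of 90% or more")  # prints: 80% ...
```

The 24 of 30 criteria at or above 90% deficiency reproduce the 80% figure reported by Vertiz and Bates.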
Effective schools research offers a set of criteria for determining organizational
performance. Effective schools have been described by several parameters: value added
through their services, high evaluations from students, high expectations and high norms
of achievement, strong leadership, collaborative decision-making, clear goals, a system-wide
culture, a safe environment, and a dedicated workforce (Mann, 1976; Purkey & Smith, 1982).
Seven characteristics emerged from the body of literature known as
effective schools research (Edmonds, 1980). They include (a) strong instructional
leadership; (b) a safe, orderly climate; (c) high expectations for achievement; (d) emphasis on
basic skills; (e) continual monitoring of progress; (f) goals that are clear and understood; and
(g) culture.
Research began in the mid-1970s to determine why some schools were effective and
others were not. The research of Ron Edmonds and Lawrence Lezotte began in large urban
schools. Their study resulted in the identification of correlates present in effective schools
(Edmonds, 1979; Lezotte & Bancroft, 1985). Strong administrative leadership was found to
exist, particularly focused on planning, supporting, and monitoring the instructional process.
There were high expectations for all students and the staff of the building. A positive school
climate existed in the building, as evidenced by a sense of pride and community. There was a
focus on the instructional program in the total school, with emphasis on training in teaching
practices. Finally, there was a thorough assessment process allowing for continual
monitoring of student progress. Numerous improvement strategies followed that focused on
developing the specific correlates in schools. Stefanich (1983) pointed out that much of the
impetus for these applications was based on intuitive rationale rather than hard data. Lezotte
(1989) has since integrated quality theory into his approach to school improvement, citing
such precepts as the importance of an attitude of continuous improvement, a deliberate
change strategy, and attention to all parts of the system. Some independent contractors have
created an audit-type process using the correlates of effective schools as the standard.
Each state has an accreditation process, usually affiliated with a regional accrediting
organization (Portner, 1997). In Idaho the purpose of accreditation is to help schools achieve
the required Standards for Idaho Schools and enhance school improvement (Idaho State
Department of Education, 1996). There are four options for how Idaho schools seek
accreditation. They may choose one of the following options:
1. The Idaho Elementary/Secondary Accreditation Standards.
2. The Northwest Accreditation Standards.
3. The Idaho School Accreditation School Improvement Model.
4. An alternative school improvement plan.
Regardless of the option selected, the school must demonstrate the standards defined
and the components of thoroughness as specified by Idaho Code 33-119. A thorough system
of education has been defined in Idaho Code as one in which:
1. A safe environment conducive to learning is provided.
2. Educators are empowered to maintain classroom discipline.
3. The basic values of honesty, self-discipline, unselfishness, respect for authority,
and the central importance of work are emphasized.
4. The skills necessary to communicate effectively are taught.
5. A basic curriculum necessary to enable students to enter academic or vocational
post-secondary educational programs is provided.
6. The skills necessary for students to enter the workforce are taught.
7. The students are introduced to current technology.
8. The importance of students’ acquiring the skills to enable them to be responsible
citizens of their homes, schools, communities, state, and nation is emphasized.
Regardless of the option selected, schools must demonstrate on an annual basis the
five required standards:
Standard I: Philosophy/Mission, Vision, Policies: School philosophy and policies
need to be aligned with thoroughness legislation.
Standard II: Personnel and Certification: All educators of Idaho students must be
certified as specified in the State Board of Education Rules for the Public Schools of Idaho.
Standard III: Curriculum/Instruction/School Improvement: This standard is defined
in the thoroughness legislation.
Standard IV: Accountability/Assessments/Measures: Schools must establish
standards for all grade levels and high school exiting standards for graduation, participate in
statewide testing programs, have written plans to reduce dropouts, and report on student
attendance.
Standard V: Safe Learning Environment: Schools must have safe facilities. Each
school must have a comprehensive, district-wide policy and procedure in place encompassing
safe environment and discipline.
There are specific additional standards for each level: elementary, middle, and
secondary. Each standard has specific criteria to which deviation points are assigned.
Schools are accredited annually according to ratings determined by points. If schools receive
a status of not approved for more than one consecutive year, state funds can be withheld and
a report to the public is made. The Northwest accreditation process involves a self-study for
initial accreditation involving staff, students, and community (NASC, 1996). The
accreditation process runs on a ten-year cycle involving a self-study during the ninth year of
the process.
What is unclear in the literature is how the information from any method of
determining organizational effectiveness is used. Brassard (1993) cautions against the urge
to compare the performance of organizations or to identify characteristics of those that are
effective. Having criteria of effectiveness reinforces the notions that: (a) organizations
possess an inherent rationality and (b) criteria become requirements that are often imposed
independent of their purpose. He makes the point that, to be useful, the criteria
adopted must define the performance that the organization must achieve. Hannan and Freeman
(1977) argue that inter-organizational comparisons cannot be accomplished because there is
no scientific basis for analyzing comparative organizational effectiveness.
The above review of approaches to the overwhelming task of determining
organizational effectiveness illustrates the varied strategies used in the past and present.
Such performance assessments are done for different reasons. There is little discussion in the
literature regarding the process of organizational self-study for the ultimate
purpose of improvement. There is an increasing interest in action research, or practitioner-
based research done by practitioners within their own site as a reflective process of
investigation (Anderson et al., 1994). Practitioner research is best done as a collaborative
effort in order to bring multiple perspectives to taking action in a specific
situation. Zammuto (1982) points out that it is useful to remember that organizations are
social inventions created to satisfy human needs. These needs influence how people evaluate
the effectiveness of organizational performance based on their experience with organizations
and the impact of that performance on them or their preferences. The purpose of assessment
in anything is to determine the performance and then improve it.
Quality Theory
The importance of theory as it relates to the areas cited above is explored here, since
assumptions, models, practices, and tools emerge from theory (Skrtic, 1991). Many companies
today use total quality theory, or continuous improvement theory, both as a conceptual
framework and as an operational approach (Walton, 1990). Total quality, or continuous improvement, is an
approach to organizational development that has both historic roots and evolving tenets. It
involves both reflective and active components for organizational development. It has been
defined as a people-focused management system that aims at the continual increase of customer
satisfaction at continually lower real cost (Crosby, 1984; Deming, 1986; Imai, 1986). This
theoretical model is a systems approach to organizational improvement, meaning that
improvements should be made with the whole organization in mind (Deming, 1986, 1994;
Senge, 1990). The terms total quality control and total quality management were coined by
Feigenbaum (1983). He defined total quality control as “an effective system for integrating
quality development, quality maintenance, and quality improvement efforts of various groups
in an organization so as to enable marketing, engineering, production, and service at the most
economical levels to allow for full customer satisfaction” (Feigenbaum, 1983, p. 823). He
used the term total to mean a systems approach to achieving excellence. He defined quality in
terms of the specific requirements of the customer.
Japanese management theory has significantly influenced quality theory in the Western
world. Referred to as Kaizen in Japan, it is the single most important concept
influencing Japanese management (Imai, 1986). The Kaizen philosophy means ongoing
improvement through the involvement of everyone, in all aspects of life. Imai remarks, “I
came to the conclusion that the key difference between how change is understood in Japan
and how it is viewed in the West lies in the Kaizen concept; a concept that is so natural and
obvious to many Japanese managers that they often do not realize that they possess it!”
(Imai, 1986, p. 3). He concludes that this concept is either very weak or non-existent in
American and European business, based on his many years of studying the differences.
Consistent fundamental principles of quality theory emerge upon
review of the literature produced by those credited as the major experts in quality.
The researcher will focus on these principles rather than an in-depth analysis of the historical
perspectives of quality theory. Upon review of the literature, it is clear that quality theory
has emerged from a variety of historical management approaches, economic contexts of the
times, cross-cultural influences of both East and West, and an ever-increasing body of
knowledge that is evolving through practice (Crosby, 1984; Danne, 1991; Deming, 1986).
W. Edwards Deming, often considered the “father of quality,” developed a theory of
profound knowledge that incorporates the major tenets of quality theory (Deming, 1986,
1989, 1994). He believed that not only skill but also knowledge about management was
paramount. Deming stated, “Hard work and best efforts, put forth without guidance of
profound knowledge, leads to ruin in the world that we are in today. There is no substitute
for knowledge” (Deming, 1994, p. 10). The system of profound knowledge includes four
principles, each related to and interacting with the others.
The first principle is appreciation for a system, which Deming defined as “a network
of interdependent components that work together to try to accomplish the aim of the system”
(Deming, 1994, p. 50). He stressed the interdependencies within a system and the necessity
of cooperation among the parts. The greater the interdependence between the components, the
greater the need for communication and cooperation between them. The system needs to
have an aim that is clear to all in the organization. Without this clear purpose, says Deming,
the aim becomes a value judgment made on an individual basis (Deming, 1994). Deming
often used the example of a good orchestra to illustrate a well-optimized system. “The
players are not there to play solos as prima donnas, to catch the ear of the listener. They are
there to support each other. They need not be the best players in the country” (Deming,
1994, p. 15). According to Deming, management of a system is action based on prediction.
The prediction needs to be rational and based on the information that the system teaches
people in the organization. Therefore, the performance of any part of the system must be
judged in relationship to the other parts and the aim of the system.
A second element is knowledge of variation, or statistical theory. Deming believed
that without statistical analysis methods, attempts to improve a process would be hit or miss.
Understanding that variation will always exist in all components of a system (people,
processes, results) is fundamental. He called for an understanding of the capability of a
process. Developing stable processes, meaning processes in a state of statistical
control, is the goal in determining a system’s capability. He makes the distinction between
two types of variation: special cause and common cause. Common causes he defines as the
variations that occur by chance and can be attributed to the system. Special causes, on the
other hand, arise from events outside of the system. Deming felt that these were important
to understand before one attempted to work on a system (Deming, 1986, 1989). If these
distinctions are not understood, he suggested, mistakes can be made that are costly and
ineffective.
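The common-cause versus special-cause distinction is typically operationalized with Shewhart-style control charts. The sketch below is illustrative only: the function names and sample data are invented, and it uses the sample standard deviation where a production individuals chart would normally use moving ranges. Points within three standard deviations of the baseline mean are treated as common-cause variation; points outside are flagged as candidate special causes:

```python
from statistics import mean, stdev

def control_limits(baseline):
    """Estimate three-sigma control limits from measurements of a
    process assumed to be stable (in statistical control)."""
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m + 3 * s

def classify(points, lcl, ucl):
    """Label each point: inside the limits = common-cause (chance)
    variation; outside = a likely special cause worth investigating."""
    return ["special" if p < lcl or p > ucl else "common" for p in points]

baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.7, 10.3, 10.0]
lcl, ucl = control_limits(baseline)
print(classify([10.1, 9.9, 12.5], lcl, ucl))  # ['common', 'common', 'special']
```

Acting on common-cause variation means changing the system itself; reacting to individual common-cause points as if they were special causes is the costly mistake Deming warns about.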
The prevention of errors and of nonconformance to specifications are key principles
resulting from the knowledge of variation (Crosby, 1984; Deming, 1986). Philip Crosby, a
recognized quality expert, coined the term zero defects, by which he meant that there is no
acceptable rate of defects for products or services that fail to meet customers’ requirements (Crosby,
1984). The emphasis is on prevention, rather than on inspection, the process of detecting the good
and the bad.
Deming’s third principle builds on the need to use the system and its variation to
generate what he called the theory of knowledge (Deming, 1986, 1989, 1994). He believed
that good intentions were not enough for management. Managers, he insisted, need to
continually build knowledge and theories based on that knowledge. Deming believed that
this was the only basis for management’s ability to predict. When processes are in a state
of statistical control, statistical theory can assist in prediction. Theories then emerge from
this knowledge. Theories, professed Deming, are necessary to generate questions. Without
questions, there may be only examples of successes, and if these are duplicated under the
pretense of a solution, they can lead to failure. The continual application of narrow solutions can
lead to a need for more and more of the solution (Senge, 1990). Theory is critical in optimizing a
system so that it can meet the customer’s expectations the first time (Deming, 1986, 1989;
Crosby, 1984).
Joseph Juran, another quality expert, extended Deming’s beliefs as they pertained to
knowledge-based decisions to the role of management (Juran, 1988). He believed that it was
the responsibility of top management to lead the company through massive training in
quality. Juran placed an emphasis on planning, customer satisfaction, and the use of data
collection and analysis, and he has been credited with being the first to address the broader
issues of management as they relate to quality (Danne, 1991; Miller, 1993).
The fourth principle of profound knowledge is psychology. Deming felt that this
body of knowledge was critical in understanding the interaction between people and circumstances, the
interaction between people, and the interaction between people and the system (Deming,
1986, 1989). He emphasized the importance of leaders in recognizing the differences in
people and using these differences to optimize the system. Recognition of differences in how
people learn, how they work, and how they relate to each other is an additional factor that a
manager should understand. Leaders are obligated to make changes in the system which will
result in improvement (Crosby, 1984; Deming, 1986).
Critical to this system of profound knowledge are the following:
1. People have an innate need for self-esteem and respect.
2. Circumstances can either provide or deny people opportunities for dignity and
self-esteem.
3. Management that denies opportunities for dignity and self-esteem will smother
intrinsic motivation.
4. Some extrinsic motivators rob employees of dignity and self-esteem.
5. Management should recognize the innate inclination of people to learn and invent.
Deming believed that new systems of rewards needed to be established to restore respect for
the individual and release the potential of human resources (Deming, 1986, 1989, 1994).
Organizational behavior can affect the quality of services, products, and, in the case of
schools, the quality of instruction (Deming, 1994; Patterson et al., 1986).
What has emerged from the Deming system of profound knowledge is an evolving
body of knowledge that incorporates systems theory, the scientific method, management by fact,
and participation of everyone within the system. Each of the quality experts mentioned has a
similar message emphasizing different concepts. Table 4 provides a matrix of key quality
principles and the interpretation of each offered by Deming, Juran, and Crosby.
Table 4. A Comparison Among the Teachings of Quality Experts.
Concept                   Deming                      Juran                    Crosby
Definition of quality     Predictable degree of       Fitness for use          Conformance to
                          dependability suitable                               requirements
                          to market
Performance standards     Use of statistics to        Avoidance of             Zero defects
                          measure performance         campaigns to do
                          in all areas                perfect work
Approach to               Optimization of system;     Management must          Prevention; process
improvement               elimination of goals        consider human           development
                          without methods             side of quality
Statistical process       Use SPC for                 Use could lead to        Rejection of
control                   quality control             “tool-driven”            statistically acceptable
                                                      approach                 levels of quality
Employee                  Employee participation      Use of teams;            Quality improvement
participation             in decision-making          quality circles          teams
Continuous Improvement in Business
The history of the recent movement to improve performance in the private sector is
relevant to current and future applications in other settings. The origins of the quality
movement can be traced to the 1940s and the context of World War II (Pines, 1990). The
United States War Department established a Quality Control section in 1942 in response to
an increased demand for mass production of weapons and other war materials. Staff from
Bell Telephone Laboratories were used, primarily two statisticians, Walter A. Shewhart and
W. Edwards Deming. Their approach was to predict the performance of production by
measuring manufacturing processes and stabilizing their performance. When these statistical
techniques were applied, America’s defense production was exemplary. At that time, the progressive
approach to manufacturing was referred to as acceptable quality levels (AQL), which
assumed that there was an acceptable level of allowable failures. Shewhart and Deming
suggested that this approach was one of the reasons the United
States was seeing a decline in productivity compared to other countries.
Garvin (1985) reports four major quality eras. Prior to and during the 1930s, the
emphasis was on inspection. Processes for detecting defects, such as grading, counting, and
repairing, were common in American businesses. From the 1930s to the 1950s, statistical
quality control became popular. This strategy assumed that the principles of probability and
statistics would allow managers to control the variation in a production process and to determine
whether the cause of the variation was inherent in the process or the result of a special cause.
During the 1950s and 1960s, the quality assurance movement emphasized the planning
function, and the concept of continuous process improvement originated. The linkage
between quality and controlling costs was made. Beginning in the 1980s, the quality
management period was significantly influenced by W. Edwards Deming. An NBC-TV
documentary that aired on June 24, 1980, If Japan Can, Why Can’t We?, explored how
Japanese products came to be perceived as far superior to those of the United States (Walton,
1990). During an interview, Deming, age 79 at that time, shared how he taught Japanese
management and engineers how to use quality as a system. These techniques enabled them
to detect and eliminate defects, cut down on waste, reduce costs, and increase productivity.
He used methods referred to as statistical process control (SPC). Although they had been
used in America after World War II, their use faded when volume overruled quality.
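The statistical logic behind SPC can be sketched in a few lines of code. The sketch below is purely illustrative (the measurements and helper names are invented, not drawn from this study): control limits are computed from a stable baseline at three standard deviations around the mean, and later readings outside those limits are flagged as potential special causes.

```python
# Illustrative Shewhart control-chart logic (invented data, hypothetical names).
# A process is treated as stable when readings stay within the mean +/- 3 sigma
# limits computed from an in-control baseline; readings outside those limits
# suggest a "special cause" rather than common process variation.

def control_limits(baseline):
    """Return (lower, upper) 3-sigma control limits for a baseline sample."""
    n = len(baseline)
    mean = sum(baseline) / n
    sigma = (sum((x - mean) ** 2 for x in baseline) / n) ** 0.5
    return mean - 3 * sigma, mean + 3 * sigma

def special_causes(readings, lcl, ucl):
    """Return the readings that fall outside the control limits."""
    return [x for x in readings if x < lcl or x > ucl]

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.1, 9.9]
lcl, ucl = control_limits(baseline)
print(special_causes([10.0, 10.3, 13.0], lcl, ucl))  # only 13.0 is flagged
```

A reading of 13.0 is flagged because it lies far outside the roughly 9.6-10.4 band the baseline supports, whereas ordinary fluctuations pass unremarked; this is the distinction Shewhart drew between special and common causes of variation.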
Since 1980, many companies have adopted quality principles and practices. Curt
Reimann, recently retired Director of the Malcolm Baldrige National Quality Award,
reported in a telephone interview that all of the high-performing companies today are, in one
way or another, applying quality principles and practices. He further related that there were
many examples of failed attempts, but companies that have successfully applied these
principles and become learning organizations are realizing results. Brown (1994) found that
some executives felt quality peaked in 1992 and that many companies have abandoned quality
to resume a back-to-basics approach emphasizing results. Reimann pointed out in the interview
that the only reason to attend to processes was to improve results. This unfortunate but
common misunderstanding in the application of quality has been substantiated by the
literature (Brown, 1994).
To help encourage United States companies and reward them for providing high
quality products and services, the Malcolm Baldrige National Quality Award was created in
1987 under President Reagan. The award was named after Malcolm Baldrige, the Secretary
of Commerce credited for his managerial approach to long-term improvement in economy,
efficiency, and effectiveness in government (National Institute of Standards and Technology,
1994). Through the Malcolm Baldrige National Quality Improvement Act of 1987, an
amendment to the Stevenson-Wydler Technology Innovation Act of 1980, Congress
established the Baldrige Award, which created a public-private partnership designed to
encourage quality in American companies (Brown, 1994). Garvin (1991) stated, "In just four
years, the Malcolm Baldrige National Quality Award has become the most important catalyst
for transforming American business. More than any other initiative, public or private, it has
reshaped managers' thinking and behavior" (p. 80).
The MBNQA (1994) was designed to promote:
1. Awareness of quality as an increasingly important element in competitiveness.
2. Understanding of the requirements for quality excellence.
3. Sharing of information on successful quality strategies and the benefits derived
from the implementation of those strategies.
The Council on Competitiveness (1995) compiled a report after studying the Baldrige
Award and quality in the United States. Their findings were as follows:
1. The quality of American goods and services is getting better. Unfortunately, this
progress has led to the perception that extending quality management principles and practices
is no longer a high national priority. Our competitors are continuously improving their
quality, and the United States cannot afford to be complacent.
2. The Baldrige National Quality Award and its state and local offshoots have been
key in the effort to strengthen United States competitiveness. The annual government
investment of $3.4 million in this program is leveraged by over $100 million in private sector
contributions. The impact of the Baldrige Award on the competitiveness of United States
industry and the dividends it pays to the United States economy far exceed these investments.
3. The United States quality movement faces a new set of challenges. We need to
overcome the confusion of terms and apparently competing approaches (TQM, ISO 9000,
reengineering). New ways to extend quality to more large companies, as well as to small-
and medium-sized enterprises, are needed, and new sectors, such as government, education,
and healthcare, should be included.
4. Although a number of vehicles are available to advance the process of promoting
quality management, including state and local quality award programs, colleges and
universities, and the Manufacturing Extension Partnership, there has been inadequate
coordination among them and with the National Baldrige Award Program.
5. The Baldrige Award Program, having galvanized United States quality efforts, is
now positioned to become the vehicle for stimulating and coordinating efforts to expand
quality as a national priority.
In a telephone interview with Curt Reimann, this researcher inquired about the
development of the specific criteria used. He related that the National Institute of Standards
and Technology (NIST) began by analyzing organizations that were currently succeeding and
isolated characteristics that were present in these organizations. A model was developed
consistent with the prevailing quality theory at that time. In order to ensure that the criteria
and processes remained relevant and reflected current thinking, the designers of the MBNQA
developed a two-year revision cycle (Bemowski, 1996). The process allows for continuous
improvement reflecting what has been learned both in theory and in practice. Reimann
indicated that there has not been any effort on the part of NIST to empirically validate the
criteria. The approach, however, has been one of accumulating the information qualitatively
and drawing inferences. The intent of the criteria and award process is not to be
prescriptive.
A survey was conducted to determine how the criteria are being used, specifically by
those companies that had requested applications but had not applied for the award
(Bemowski & Stratton, 1995). Three findings were reported:
1. The criteria were overwhelmingly used as a source of information on how to
achieve business excellence. Over 48% were using the criteria to improve processes in their
companies, while less than 25% of the respondents used the criteria to apply for the award.
About 50% of the respondents indicated that they used the criteria to promote a common
language within the company.
2. The majority of respondents found that the criteria's usefulness met or exceeded
their expectations.
3. There was great diversity in the enterprises using the criteria. They were
predominantly used by managers in a broad range of industries.
The researchers concluded that the stated purposes of the award were being
accomplished. The difficulties in interpreting the MBNQA criteria are well known
(Bemowski, 1996). According to Reimann in his interview, that factor was a consideration
during the latest revision of the Award criteria.
The 1997 MBNQA criteria categories are as follows:
1. Leadership: Examines how well senior managers provide leadership and sustain
clear values, directions, performance expectations, customer focus, and a leadership system
throughout the company.
2. Strategic Planning: Examines how the company sets strategic directions and
determines key action plans, and how they are translated into an effective performance
management system.
3. Customer and Market Focus: Examines how well a company determines its
customers' expectations and then satisfies customer needs.
4. Information and Analysis: Examines the management and effective use of data and
information to support key company processes and the performance measurement system.
5. Human Resource Development and Management: Examines requirements to
develop full work force potential, i.e., an environment conducive to full participation, quality
leadership, and personal and organizational growth.
6. Process Management: Examines the key aspects of process management, including
the management of product and service processes, support processes, and supplier and
partnering processes.
7. Business Results: Examines performance and improvement made by the
organization, including customer satisfaction, financial and market performance, human
resource results, supplier and partner performance, and operational performance.
The framework from which the criteria are designed is based on a systems
perspective as illustrated in Figure 1. Refer to Table 5 for the organization of the categories.
Pannirselvam (1995) conducted a study to validate the MBNQA model and
evaluation process. Results from 1993 data from state awards following the same criteria
revealed that the model is internally consistent and a reliable measure of quality. Tables 6
and 7 summarize the findings of that study.
Table 5. Baldrige National Quality Award Criteria, 1997.

Category                                Criteria
1. Leadership                           1.1 Leadership system
                                        1.2 Company responsibility and citizenship
2. Strategic Planning                   2.1 Strategy development process
                                        2.2 Company strategy
3. Customer and Market Focus            3.1 Customer and market knowledge
                                        3.2 Customer satisfaction and relationship
                                            enhancement
4. Information and Analysis             4.1 Selection and use of information and data
                                        4.2 Selection and use of comparative
                                            information and data
                                        4.3 Analysis and review of company performance
5. Human Resource Development           5.1 Work systems
   and Management                       5.2 Employee education, training, and development
                                        5.3 Employee well-being and satisfaction
6. Process Management                   6.1 Management of product and service processes
                                        6.2 Management of support processes
                                        6.3 Management of supplier and partnering processes
7. Business Results                     7.1 Customer satisfaction results
                                        7.2 Financial and market results
                                        7.3 Human resource results
                                        7.4 Supplier and partner results
                                        7.5 Company specific results
Note: From Malcolm Baldrige National Quality Award Criteria, 1997, National Institute of
Standards and Technology. (Gaithersburg, MD: United States Department of Commerce and
Technology Administration)
Table 6. Validity of the MBNQA Model.

Research Question                                                        Finding
1. Are the items under each criterion a reliable measure of the
   trait they attempt to measure?                                        Yes
2. Is the MBNQA model a good and complete measure of quality
   management practices?                                                 Yes
3. Do the MBNQA criteria represent an accurate measure of an
   organization's quality management practices?                          Yes
4. Do all the items under each of the seven categories represent
   a single construct?                                                   Yes
5. Is there variability in the assessment of total quality systems?      Yes
6. Is the assessment of some elements more variable than that of
   others?                                                               Yes
7. Is the variability in assessment related to the type of
   organization evaluated?                                               Yes
8. Is the variability in assessment related to the characteristics
   of the evaluator?                                                     Yes
Note: From Statistical Validation of the Malcolm Baldrige National Quality Award Model
and Evaluation Process, 1995, by Pannirselvam. (Doctoral dissertation, Arizona State
University)
Table 7. Accuracy of the MBNQA Weights.

Research Question                                                        Finding
1. Do the weights assigned to the examination items accurately
   reflect the importance of each of these items to a good
   management system?                                                    No
2. Should the weights assigned to the seven criteria be different
   for different sizes and types of businesses?                          Yes

Note: From Statistical Validation of the Malcolm Baldrige National Quality Award Model
and Evaluation Process, 1995, by Pannirselvam. (Doctoral dissertation, Arizona State
University)
There is debate in the literature and in the field regarding the effectiveness of the
MBNQA criteria and the award process. Some criticisms focus on the notion that
companies spend too much time and money on the application process and are distracted
from the work of the company (Crosby, 1991). Concern has also been expressed that there
is no clear definition of quality. The advocates for the criteria stand firmly on the belief
that, since the criteria are not intended to be prescriptive, there should not be a common
definition of quality. Criticism has also centered on the belief that the criteria support the
selection of companies that produce high quality results and are financially successful.
Today quality is seen as a field unto itself, with its own theory, models, practices, and
tools. Although its early applications were predominantly in manufacturing, the applications
have quickly spread to service industries and the public sector.
Continuous Improvement in Education
Deming (1994) described America 2000: An Education Strategy as a ". . . horrible
example of numerical goals, tests, rewards but no method." Can the theories, principles, and
practices of quality or continuous improvement, developed in industry, help in the
transformation of schools (Bradley, 1993; Fields, 1994; Glasser, 1992; Langford, 1993;
Rhodes, 1990; Schmoker, 1996; Tribus, 1993)?
Our current system of education has been influenced historically by industry as well.
Educational administration has its roots in the theory of scientific management, spawned by
Frederick Taylor during the period of 1910-35 (Bradley, 1993; Stempen, 1987). Max Weber
also had considerable influence during that period on the management and administration of
organizations (Owens, 1970). He characterized the ideal bureaucracy as having the
following characteristics:
1. A division of labor based on functional specialization.
2. A well-defined hierarchy of authority.
3. A system of rules covering the rights and duties of employees.
4. A system of procedures for dealing with work situations.
5. Impersonality of interpersonal relations.
6. Selection and promotion based on technical competence.
However, Weber also warned that massive, uncontrollable bureaucracy could be a threat to
free enterprise capitalism (Owens, 1970).
In the 1950s, systems theory was applied to schools as social systems with a
hierarchical role structure (Owens, 1970). These theories attempted to understand the
organization in order to achieve greater productivity and efficiency. Ernest Hartwell, a
superintendent of three different large city school systems in the early 1900s, held that if
administrators applied business principles to progressive educational ideas, schools would
become efficient, stable organizations and would be more profitable to the students and the
community (Thomas & Moran, 1992). Today's organization of teaching, testing, and
judging by grades has its roots in the industrial revolution's theories of mass production,
inspection, and re-work (Deming, 1994).
Deming (1986, 1994) had much to say about the system of education and the types of
changes that need to occur. He believed that not only did business management ignore
psychology, but so did the managers of education. His beliefs about the individual
differences of people and their need for self-esteem, dignity, and intrinsic motivation provided
a basis for his criticism of the educational system. His theory, practices, and tools have been
very appealing to educators who are trying not to lose hope in the on-going battle to improve
schools and reverse the tide of public criticism.
In 1991, the American Society for Quality Control conducted its first Quality in Education
Survey (Klaus, 1996). At that time, 133 K-12 and higher education institutions responded,
indicating that quality had been implemented. Five years later, the study indicated that 451
educational institutions had implemented quality (Klaus, 1996). Educators have wrestled
with the theories of quality and the wisdom of experts in industry and have made applications
in a meaningful way (Bernhardt, 1994; Bonstingl, 1996; English et al., 1994; Fields, 1994;
Glasser, 1992; McClanahan & Wicks, 1994; Rubin, 1994; Tribus, Langford & Cleary, 1995).
Increasingly, efforts are being made to conduct research on the application of quality in the
field of education (Chapell, 1993; Danne, 1991; Fritz, 1993; Louer, 1993; Miller, 1993;
Partin, 1992; Regauld, 1993; Smith, 1996).
The results in organizational improvement can be seen in several schools and school
districts across the country. Improvements in disciplinary action have been reported by Mt.
Edgecumbe High School in Sitka, Alaska (Danne, 1991; Langford et al., 1995). Declines in drop-out rates
have been cited by George Washington Vocational/Technical High School in New York
City (Danne, 1991). Redesign of programs to prevent students from dropping out and to
increase their success is a focus for some schools (Danne, 1991). Improved systems of data
collection, analysis, and benchmarking have been developed in a number of school districts
(Langford et al., 1995; Seigel et al., 1994). Motivated by high failure rates, staff at Parkview
School District identified root causes and implemented multiple systemic solutions, resulting
in a decrease of the failure rate by 50% in just one year (Seigel et al., 1994). The emphasis at
the Christa McAuliffe Elementary School in Prince William County, Virginia, has been on
teaching students quality practices and tools to assist them in working together, being
responsible for their own learning and progress, and involving the larger community (Seigel
et al., 1994). There are 70 elementary schools in the United States and abroad where the
Koalaty Kid model of continuous improvement has been implemented (Green, 1996).
Students use quality tools to monitor their own progress and improvement in mastering new
skills and content.
In 1993 the decision was made to launch the Malcolm Baldrige National Quality
Award program using the Education Criteria Pilot in 1994-95. A pilot approach was taken to
address the many issues involved in extending eligibility to education (National Institute of
Standards and Technology, 1995). During the pilot year, schools that applied were not
eligible for the award.
The objectives of the Education Pilot Program were to:
1. Determine the interest and readiness of educational organizations to participate in
a national-level recognition program based on the ability to demonstrate overall performance
improvement.
2. Evaluate the Pilot Criteria.
3. Determine the capability of the evaluation system, including volunteer experience,
availability, and time commitment.
4. Determine the value of the feedback given to Pilot Program participants.
5. Determine whether or not there should be subcategories of eligibility, taking into
account school type and size.
6. Determine the likely influence of the award on: (a) sharing of best practices
information, (b) cross-sector cooperation, and (c) elevation of educational standards.
The criteria for the Education Pilot are based on core values and concepts. These are
summarized in Table 8.
Table 8. Core Values/Concepts of MBNQA, Education Pilot, 1995.
Core Value Core Concepts
I. Learning-Centered A. Focus on learning and real needs of learners
Education B. High developmental expectations/standards for all students
C. Understanding that student learning rates/styles vary
D. Major emphasis on active learning
E. Regular, extensive formative assessment early in
learning process
F. Periodic use of summative assessment to measure
progress against key relevant external standards/norms
G. Assist students/families chart progress using self-
assessment
H. Focus on key transitions such as school-to-school and
school-to-career
II. Leadership A. Clear, visible directions and high expectations
B. Modeling of strategies for continuous improvement
methods and processes by senior administrators
C. School policies that reinforce learning/improvement
climate and encourage self-directed responsibility
throughout the school
D. Building community support and aligning business and
community leaders with their aims
III. Continuous A. Clearly established goals
Improvement/ B. Fact-based measures/indicators
Organizational C. Systematic cycles of planning/execution/evaluation
Learning D. Focus on improving processes for improved results
E. Embedded approach that involves students
IV. Faculty/Staff A. Increased knowledge of faculty/staff about student
Participation/ learning and assessment strategies
Development B. Improved performance o f faculty/staff
C. Organization tailored to a more diverse workforce and
more flexible, high-performance work practices
V. Partnership A. Internal and external partnerships to better accomplish
Development overall goals
B. Partnerships that seek to develop long-term objectives,
strategies for evaluating progress, and means for
changing conditions
(table continues)
Table 8, cont’d. Core Values/Concepts of MBNQA, Education Pilot, 1995.
Core Value Core Concepts
VI. Management A. Improvement system based on cause-effect thinking,
by Fact measurement, information, data, and analysis
B. Measurements that support school’s mission/strategy
C. Focus on student learning through a comprehensive and
integrated fact-based system
VII. Long-range A. Strong future orientation with long-term commitment
View to students and stakeholders
B. Investment in creating and sustaining assessment
system focused on student learning
C. School leadership familiar with research findings and
practical applications o f assessment/learning
D. School serves as role model in its operations
VIII. Public A. Protection of public health, safety, and environment in
Responsibility all practices
And Citizenship B. Ethical and non-discriminatory in all practices
C. Support of, and leadership in, purposes important to the public
IX. Fast Response A. Faster, more flexible response to customer needs
B. Simultaneous improvement in quality and productivity
C. Strong customer focus
X. Results A. School performance system focused on results
Oriented B. Balanced needs and interests of all stakeholders
C. Student performance demonstrated throughout their careers
in a variety of ways
D. Effective and efficient use of school resources
There are emerging initiatives in states and school districts in which the MBNQA
Education Pilot Criteria provide the framework for school improvement. The researcher is
aware of several efforts. Pinellas County Schools in Florida has implemented the
Superintendent's Quality Challenge, a model based on the Education Pilot Criteria. The state
of New Mexico has initiated a joint private and public sector project, Strengthening Quality
in Schools, which incorporates the criteria and the state award process as a component. The
Pacific Bell Foundation has sponsored the Education for the Future Initiative in several
schools in California, in which a school portfolio process of organizational improvement was
developed based on components of the MBNQA Education Pilot Criteria (Bernhardt, 1994).
The specific criteria are listed in Table 9. Revisions are currently being conducted by
the National Institute of Standards and Technology to align the framework with the 1997
MBNQA for businesses. Since 1995 there has not been funding from the legislature to
continue the development of the Education Pilot, and efforts are actively being made to raise
the capital needed to continue to develop the process. Several states have now included K-12
education in their state quality award processes. New Mexico, Florida, New York, and
Minnesota are four that have done so. Idaho is currently initiating those discussions.
The MBNQA Education Pilot Criteria (Table 9) are described below:
1. Leadership: Examines the personal leadership of senior administrators and their
involvement in creating and sustaining student focus, clear goals, high expectations, and a
leadership system that promotes performance excellence. Also examines how these
objectives and expectations are integrated into the school's management system.
2. Strategic and Operational Planning: Examines how the school sets strategic
directions and determines key plan requirements and how plan requirements are translated
into an effective performance management system with primary focus on student
performance.
3. Student Focus and Student and Stakeholder Satisfaction: Examines how the
school determines student and stakeholder needs and expectations, including levels and
trends in key measures of satisfaction relative to comparable schools and/or appropriately
selected organizations.
4. Information and Analysis: Examines the management and effectiveness of data
and information used to support overall mission-related performance excellence.
5. Human Resource Development and Management: Examines how faculty and staff
development are aligned with the school’s performance objectives. Also examined are the
school’s efforts to build and maintain a climate conducive to performance excellence, full
participation, and personal and organizational growth.
6. Educational and Business Process Management: Examines the key aspects of
process management, including learning-focused education design, education delivery,
school services, and business operations. Examines how key processes are designed,
effectively managed, and improved to achieve higher performance.
7. School Performance Results: Examines improvement of student performance and
of the school's educational climate, services, and business operations, at performance levels
relative to comparable schools and/or appropriately selected organizations.
Table 9. 1995 MBNQA Education Pilot Criteria.

Category                                Criteria
1. Leadership                           1.1 Senior Administration Leadership
                                        1.2 System and Organization
                                        1.3 Public Responsibility and Citizenship
2. Information and Analysis             2.1 Management of Information and Data
                                        2.2 Comparisons and Benchmarking
                                        2.3 Analysis and Use of School Level Data
3. Strategic and Operational            3.1 Strategy Development
   Planning                             3.2 Strategy Deployment
4. Human Resource Development           4.1 Human Resource Planning and Evaluation
   and Management                       4.2 Faculty/Staff Work Systems
                                        4.3 Faculty/Staff Development
                                        4.4 Faculty/Staff Well-being and Satisfaction
5. Educational and Business             5.1 Education Design
   Process Management                   5.2 Education Delivery
                                        5.3 Education Support Service Design/Delivery
                                        5.4 Research, Scholarship, and Service
                                        5.5 Enrollment Management
                                        5.6 Business Operations Management
6. School Performance Results           6.1 Student Performance Results
                                        6.2 School Climate Improvement Results
                                        6.3 Research, Scholarship, and Service
                                        6.4 School Business Performance Results
7. Student Focus and Student/           7.1 Current Student Needs and Expectations
   Stakeholder Satisfaction             7.2 Future Student Needs and Expectations
                                        7.3 Stakeholder Relationship Management
                                        7.4 Student and Stakeholder Satisfaction
                                            Determination
Note: From Malcolm Baldrige National Quality Award, 1997, National Institute of
Standards and Technology. (Gaithersburg, MD: United States Department of Commerce and
Technology Administration)
Summary
The call for increased productivity, efficiency, and effectiveness has remained constant
over the decades of school reform. This chapter reviewed current literature on organizational
effectiveness, effectiveness in schools, and emerging models of quality applications. The
literature reviewed establishes the background and current practices, both in business and in
education, for measuring and improving organizational performance. Quality theory has been
explored as a framework for approaching school improvement. Current traditional practices
for determining comprehensive organizational performance were described. The application
of the Malcolm Baldrige National Quality Award Education Pilot to school improvement is
an emerging area of research and application. School improvement and reform strategies
now recognize the need for a systemic change strategy that acknowledges the comprehensive
and complex nature of school districts (Anderson, 1993; O'Neil, 1993; Wagner, 1993).
Increasingly, businesses are using the criteria of the Malcolm Baldrige National Quality
Award to assess their status and guide them toward improvements that produce sustained
results through an aligned system reflecting the core values of the MBNQA.
Chapter 3
Methodology
Introduction
This research study examined a theory and a framework by which the operation and
performance of a school district can be assessed to determine where improvements might be
necessary. There are three purposes:
1. To determine how participants currently rate the performance of their district in
each of the seven categories of the Performance Analysis for School Districts.
2. To determine if these scores differ by type of educator or size of school district.
3. To determine how participants perceive the usefulness of the instrument as a
framework for self-analysis by a school district in school improvement.
The study involved the development of an instrument to collect information regarding
school district performance. It investigated differences by type of educator and size of
district. It measured the perception of participants about the instrument's usefulness in
approaching school improvement.
The Research Model
The model for the research study was as follows:

Y_ijk = u + a_i + b_j + (ab)_ij + e_ijk

The value of the response variable is the sum of:

u = the effect of the overall mean,
a_i = the effect of the district size,
b_j = the effect of the position type,
(ab)_ij = the effect of the interaction of district size and position,
e_ijk = random error in the model.
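Concretely, the effect terms in this two-factor model can be estimated from cell means. The grid of scores below is invented solely to show the arithmetic; it does not reproduce any data from this study.

```python
# Illustrative estimation of the two-factor model Y_ijk = u + a_i + b_j + (ab)_ij + e_ijk.
# Rows are district sizes, columns are position types; the scores are invented,
# with one observation per cell, so each cell mean is the observation itself.
scores = {
    ("small", "teacher"): 3.2, ("small", "administrator"): 3.8,
    ("large", "teacher"): 2.9, ("large", "administrator"): 3.7,
}
sizes = ["small", "large"]
positions = ["teacher", "administrator"]

u = sum(scores.values()) / len(scores)  # overall mean
# Main effects: each factor level's marginal mean, minus the overall mean.
a = {s: sum(scores[(s, p)] for p in positions) / len(positions) - u for s in sizes}
b = {p: sum(scores[(s, p)] for s in sizes) / len(sizes) - u for p in positions}
# Interaction: what remains in each cell after the additive effects are removed.
ab = {(s, p): scores[(s, p)] - (u + a[s] + b[p]) for s in sizes for p in positions}
```

With real data there are multiple respondents per cell, so each cell mean averages over the k index and the leftover deviations within a cell form the error term e_ijk; the decomposition itself is unchanged.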
Instrumentation
The framework for determining organizational performance was adapted from the
Malcolm Baldrige National Quality Award Education Pilot (1995). The researcher
constructed an instrument based on the Malcolm Baldrige National Quality Award (1997)
and the Education Pilot Criteria (1995) as the primary framework. Since the 1995 Education
Pilot is being revised, the researcher was advised by Curt Reimann, retired Director of NIST,
to consider the current 1997 changes. The researcher also integrated components from the
Northwest Accreditation Standards and the curriculum audit process. The intent of such an
instrument was to reflect the comprehensive system of a school district. Therefore, the
researcher felt that there were elements in both the accreditation process and the curriculum
audit process that could potentially be overlooked by an exclusive approach using only the
MBNQA. There are seven categories of organizational performance used in this instrument:
1.0 Leadership
2.0 Strategic Planning
3.0 Information and Analysis
4.0 Student Focus and Student and Stakeholder Satisfaction
5.0 Human Resource Development and Management
6.0 Educational and Operational Process Management
7.0 School Performance Results
The descriptions in each category are based on the following Likert-type scale, which was
used to construct the language in each of the seven subcategories. The scale for the subcategories of
Leadership, Strategic Planning, Information and Analysis, Student Focus and Student and
Stakeholder Satisfaction, Human Resource Development and Management, and Educational
and Operational Process Management were:
1. No systematic approach evident.
2. Awareness stages of a systematic approach, meeting only minimal requirements.
3. A developing system that emphasizes prevention of problems and meets expectations.
4. A refined, well-developed approach that is deployed with broad applications.
5. A thorough, systematic approach that is fully deployed, institutionalized, and
idealized.
The scale for the subcategory of School Performance Results was:
1. No results or results below expectations.
2. Some improvements; early stages of developing trends.
3. Improvement trends or good performance in some areas.
4. Current performance is good to excellent with trends over time.
5. Superior performance with sustained results; state or national benchmark.
The instrument used to collect data was designed to yield continuous data reflecting
the ordered nature of the items in each category and a weighting for each category (see
Appendix A). The rationale was to: (a) more closely align the instrument with the scoring
design of the MBNQA, in which additional points are awarded for more fully developed
quality practices and performance, and (b) yield continuous data weightings that more
appropriately answer the research questions put forward in the study.
The instrument was reviewed for content validity to ensure it would answer the
research questions. Selected experts in the use of the Baldrige criteria in business and/or
education, both in-state and out-of-state, were consulted. To qualify as a content-area expert, the
individual must be, or have been, a state or national quality award examiner using the criteria
from the Malcolm Baldrige National Quality Award and must have experience applying
quality practices in educational institutions. Appendix B contains a list of the individuals
consulted and their qualifications. A cover letter (Appendix C) addressed the specific
research questions and the nature of the feedback that the researcher was requesting.
Comments were received and items were revised.
The revised instrument was tested on twelve Idaho educators: two superintendents,
five principals, and five teachers. Minor revisions were made, and the category of “I do not
know” was added to each subcategory.
Subjects and Setting
The population used was educators working in Idaho public schools. A proportional
stratified random sample was selected using the 1996-97 database from the Idaho State
Department of Education. The population was stratified by size of student enrollment using
the classifications outlined by the Idaho State Department of Education as follows:
Classification 1 = 5,000+
Classification 2 = 2,500 - 4,999
Classification 3 = 1,000 - 2,499
Classification 4 = 500 - 999
Classification 5 = 1 - 499
A proportional allocation for sample size was used based on the ratio of the number of
districts in each category to the total number of school districts in the state (Wiersma, 1995).
The sample size was determined using a table of recommended sample sizes (Krejcie & Morgan, 1970).
Table 10 and Appendix D provide the matrix for the sample design.
Table 10. Stratified Random Sample Matrix by Classification.

                        Superintendents         Principals              Teachers
Classification          N     %      s**        N     %      s**        N       %      s**
#1  5,000+              13    13.4   10         220   44     96         6,736   51.5   191
#2  2,500-4,999         13    13.4   10         84    16.8   37         2,358   18     67
#3  1,000-2,499         27    27.8   21         106   21.2   46         2,374   18     67
#4  500-999             22    22.6   18         55    11     24         987     8      29
#5  1-499               21    21.6   17         31    6      14         617     5      19
State Totals            97    --     76         500   --     217        13,076  --     373
Collection of Data
The instrument was prepared for electronic scanning. Each sheet was coded by size
of district and type of educator (Appendix E). The instrument was mailed to the individuals
selected in the sample. This method was selected for convenience and to ensure anonymity
of the respondents. Appendix F contains cover letters and directions. The researcher secured
letters of support from the Idaho Association of School Administrators and the Idaho
Education Association to help ensure returns (Appendix H). Each person selected in the
sample was also sent a pen printed with the message, “Thank you for participating in the
Performance Analysis for School Districts,” and a self-addressed, stamped envelope was
enclosed. Mailing was conducted in October of 1997, and respondents were given two
weeks to respond. A reminder postcard was sent immediately following the deadline
(Appendix G). Random phone calls were also made asking for a response.
Data Analysis
There were seven dependent variables, i.e., the scores from each category. There were
two independent variables: (a) type of educator, with three levels (superintendents,
principals, teachers), and (b) size of district, with five levels of enrollment (over 5,000;
4,999-2,500; 2,499-1,000; 999-500; 499-1) (Appendix I). The SAS computer software program was
used to compile the data and generate the statistical analysis. Descriptive data were collected
to determine the characteristics of the sample, the highest degree earned, the number of
years of experience in the current position, and the attitudes of the respondents regarding
the instrument. Frequencies of responses are also illustrated. A two-way factorial analysis of
variance for each category was used to compare the two independent variables (Huck, Cormier
& Bounds, 1974). Cronbach's alpha was computed to test the reliability of the instrument. The final
research question, regarding the potential usefulness of the instrument, was answered with
descriptive statistics. Qualitative analysis of the comments about the instrument was done
using a constant comparative model (Patton, 1983).
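For a balanced design, the two-way factorial analysis of variance partitions the total sum of squares into size, position, interaction, and error components. A minimal sketch of that computation follows; it is an illustration only, since the study's unbalanced data were analyzed in SAS rather than with this code.

```python
import numpy as np

def two_way_anova(y):
    """F statistics for a balanced two-way factorial design.
    y has shape (I, J, K): I district sizes, J positions, K replicates."""
    I, J, K = y.shape
    grand = y.mean()
    mean_a = y.mean(axis=(1, 2))          # marginal means, factor A (size)
    mean_b = y.mean(axis=(0, 2))          # marginal means, factor B (position)
    cell = y.mean(axis=2)                 # cell means
    ss_a = J * K * ((mean_a - grand) ** 2).sum()
    ss_b = I * K * ((mean_b - grand) ** 2).sum()
    ss_ab = K * ((cell - mean_a[:, None] - mean_b[None, :] + grand) ** 2).sum()
    ss_err = ((y - cell[:, :, None]) ** 2).sum()
    ms_err = ss_err / (I * J * (K - 1))   # error mean square
    f_a = (ss_a / (I - 1)) / ms_err
    f_b = (ss_b / (J - 1)) / ms_err
    f_ab = (ss_ab / ((I - 1) * (J - 1))) / ms_err
    return f_a, f_b, f_ab
```

Each F statistic compares the mean square for that effect against the error mean square; a large F for the interaction term would indicate that differences by position depend on district size.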
Summary
Chapter 3 established the procedural design of the study, which investigated how
educators in Idaho perceive their school districts in seven different categories. The sample
and instrument development were described. The statistical analysis used in Chapter 4 was
specified.
Chapter 4
Findings
Introduction
This study was designed to investigate three areas:
1. The perceptions of superintendents, principals, and teachers in Idaho
regarding the performance of their school district in seven areas.
2. The differences in perceptions based on type of position and/or size of school
district.
3. The perceived usefulness of the instrument constructed by the researcher for
self-study.
The dependent variables were the scores for each of the seven constructs. There
were two independent variables, size of district and position of educator. District size
was divided into five levels depending on student enrollment: (a) over 5,000; (b) 4,999-
2,500; (c) 2,499-1,000; (d) 999-500; and (e) 499-1. The positions of the educators were
separated into three types: (a) superintendents, (b) principals, and (c) teachers. The
instrument, the Performance Analysis for School Districts, was designed based on the
1997 Malcolm Baldrige National Quality Award criteria, the 1995 Education Criteria,
curriculum audit standards, and the Northwest Accreditation Standards. The instrument
was piloted, tested, and reviewed for content validity. Revisions were made based on the
results. The sample was selected from the population of educators in Idaho public
schools. The data was collected through a mailed survey. The results were scanned from
the returned individual instruments. The data was analyzed using the Statistical Analysis
System (SAS). Descriptive analysis was done for characteristics of central tendency, and
a factorial analysis of variance was used to test hypotheses. Cronbach's alpha was used to
determine reliability. Qualitative analysis was done using a constant comparative model
for the comments of participants.
Rate of Return
The total sample size was 666. The total number sent was 656. Adjustments
were made for instances in which participants had responsibility for more than
one of the targeted positions or were no longer in the position. A
total of 258 surveys were returned; nine (9) were eliminated due to participant error, and
eleven (11) were not used because they were received too late, leaving 238 usable returns
for a 36% rate of return. Wiersma (1995) reports 70% as a minimally acceptable rate of
return when surveying professional samples. Table 11 illustrates the return rates by
educator position for the total number of surveys sent. Table 12 illustrates the frequencies
and percentages of the returns received by educator position and district size. The highest
percentage of the total returns by position was for teachers, and the highest percentage
by size was for districts with over 5,000 students enrolled.
Table 11. Total Return Rates by Educator Position.

                  Number Sent   Number Received   Percentage Received
Superintendents   76            49                64%
Principals        211           88                42%
Teachers          369           101               27%
Total             656           238               36%
Table 12. Frequencies and Percentages of Returns Received by Educator Position and
District Size.

                        5,000+   2,500-4,999   1,000-2,499   500-999   1-499   Total
Superintendents:
  Frequency             8        6             13            13        9       49
  Percent of total      3.4      2.5           5.5           5.5       3.8
  Percent by size       7.5      14.6          30.2          52.0      40.9
  Percent by position   16.3     12.2          26.5          26.5      18.3    20.6
Principals:
  Frequency             44       17            14            8         5       88
  Percent of total      18.5     7.1           5.9           3.4       2.1
  Percent by size       41.1     41.5          32.6          32.0      22.7
  Percent by position   50.0     19.3          15.9          9.1       5.7     36.9
Teachers:
  Frequency             55       18            16            4         8       101
  Percent of total      23.1     7.6           6.7           1.7       3.6
  Percent by size       51.4     43.9          37.2          16.0      36.6
  Percent by position   54.6     17.8          15.8          3.9       7.9     42.4
Total:
  Frequency             107      41            43            25        22      238
  Percent by size       45.0     17.2          18.1          10.5      9.2
Characteristics of Sample
A proportional, stratified random sample was selected from the statewide
database of certified educators employed in Idaho public schools during the 1996-97 school
year. Participants were asked two demographic questions: (a) their highest terminal
degree, and (b) the length of time in their current position. Table 13 illustrates the
highest degree by size of district and position, and the length of time in current position
by district size and position. Table 14 illustrates the rank order by percent of time in
position and highest degree. The most frequent terminal degree in the sample was a
Masters, with the most frequently occurring range of experience being twelve or more
years. Percentages of terminal degrees varied by size of district, as illustrated in Table 13.
Table 13. Percentage of Highest Degree and Time in Position by Size and Position.

Variables             Degrees                       Years
                      B     M     S     D           <1    1-3   4-7   8-11  12+
5,000+:
Superintendents 37.5 62.5 12.5 25.0 50.0 12.5
Principals 52.3 27.3 13.6 9.1 9.1 29.5 18.2 34.1
Teachers 45.5 50.9 3.6 5.5 12.7 18.2 12.7 50.9
4,999-2,500:
Superintendents 33.3 33.3 33.3 16.7 50.0 16.7 16.7
Principals 47.1 47.1 5.9 23.5 29.4 29.4
Teachers 55.6 38.9 5.6 5.6 22.2 11.1 16.7 44.4
2499-1000:
Superintendents 15.4 69.2 15.4 15.4 7.7 53.8 15.4 7.7
Principals 78.6 14.3 7.1 7.1 21.4 21.4 28.6 21.4
Teachers 87.5 12.5 25.0 12.5 12.5 50.0
999-500:
Superintendents 30.8 53.8 15.4 23.1 15.4 30.8 30.8
Principals 87.5 12.5 25.0 62.5 12.5
Teachers 75.0 25.0 25.0 25.0 50.0
499-1:
Superintendents 22.2 66.7 11.1 11.1 11.1 33.4 44.4
Principals 100 40.0 60.0
Teachers 50.0 37.5 12.5 12.5 37.5 25.0 12.5 12.5
Table 14. Rank Order of Combined Sample.

Percentage   Highest Degree     Percentage   Years in Position
44.1         Bachelors          32.8         12+
23.5         Masters            27.7         4-7
22.3         Specialist         15.5         1-3
8.4          Doctorate                       8-11
                                7.6          <1
Reliability of Performance Analysis for School Districts
Cronbach's alpha coefficient was used to determine the consistency of the
instrument in measuring the seven constructs. Reliability coefficients (Table 15)
suggested internal test consistency existed in each construct, with the Leadership
construct having the highest reliability and the School District Results construct the lowest.
Table 15. Reliability of Instrument.

Category Construct                        Cronbach's Alpha
Leadership                                .851446
Strategic Planning                        .831169
Student and Stakeholder Satisfaction      .768750
Information and Analysis                  .803754
Human Resources                           .842505
Educational Process Management            .828683
School District Results                   .741474
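Cronbach's alpha relates the number of items and their variances to the variance of the total score. A generic sketch of the computation follows; the coefficients in Table 15 were produced by SAS, not by this code.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                            # number of items
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)
```

Perfectly consistent items (each respondent giving the same value on every item) drive alpha to 1, while uncorrelated items drive it toward 0, which is why the coefficient is read as internal consistency.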
Descriptive Analysis
Descriptive statistics were used to illustrate central tendency and variability
among the dependent variables. Tables 16 through 24 illustrate the means and standard
deviations by district size and position of educator. The lowest overall mean occurred in
the Information and Analysis category when combining district size and educator
position, while the highest mean occurred in the Leadership construct.
Table 16. Means For District Size and Positions Combined.
Construct N Mean SD
Leadership 233 3.15 .98
Strategic Planning 233 3.06 1.11
Student /Stakeholder Satisfaction 233 2.82 .90
Information & Analysis 237 2.62 1.13
Human Resources 233 2.71 .99
Educational Process 228 2.77 .93
School District Results 229 2.94 .91
Table 17. Means by District Size for Districts With 5000 or More Students Enrolled.
Construct N Mean SD
Leadership 106 3.15 1.02
Strategic Planning 106 3.22 1.05
Student /Stakeholder Satisfaction 107 2.86 .93
Information & Analysis 107 2.70 1.24
Human Resources 107 2.62 1.02
Educational Process 104 2.84 1.02
School District Results 104 2.95 .96
Table 18. Means by District Size for Districts With Between 4,999 and 2,500 Students
Enrolled.
Construct N Mean SD
Leadership 40 3.17 1.05
Strategic Planning 39 2.91 1.10
Student /Stakeholder Satisfaction 40 2.77 .95
Information & Analysis 40 2.55 1.16
Human Resources 39 2.86 .93
Educational Process 37 2.65 .81
School District Results 37 2.63 .70
Table 19. Means by District Size for Districts With Between 2,499 and 1,000 Students
Enrolled.
Construct N Mean SD
Leadership 42 3.06 .82
Strategic Planning 43 3.00 1.09
Student /Stakeholder Satisfaction 43 2.79 .74
Information & Analysis 43 2.51 .95
Human Resources 43 2.66 .94
Educational Process 43 2.66 .83
School District Results 42 2.97 .82
Table 20. Means by District Size for Districts With Between 999 and 500 Students
Enrolled.
Construct N Mean SD
Leadership 25 3.22 .99
Strategic Planning 23 3.01 1.24
Student /Stakeholder Satisfaction 25 2.71 .95
Information & Analysis 25 2.59 .94
Human Resources 25 2.76 .97
Educational Process 25 2.68 .89
School District Results 25 3.06 1.10
Table 21. Means by District Size for Districts With 499 or Fewer Students Enrolled.
Construct N Mean SD
Leadership 20 3.26 .94
Strategic Planning 22 2.71 1.27
Student /Stakeholder Satisfaction 19 2.88 .96
Information & Analysis 22 2.54 1.10
Human Resources 19 3.02 1.07
Educational Process 19 3.07 .89
School District Results 21 3.23 .89
Table 22. Means by Position for Superintendents.
Construct N Mean SD
Leadership 48 3.52 .67
Strategic Planning 47 3.28 1.09
Student /Stakeholder Satisfaction 49 3.11 .76
Information & Analysis 49 2.90 .96
Human Resources 49 3.24 .77
Educational Process 49 3.03 .72
School District Results 49 3.16 .79
Table 23. Means by Position for Principals.
Construct N Mean SD
Leadership 86 3.47 .89
Strategic Planning 85 3.32 1.04
Student /Stakeholder Satisfaction 87 3.00 .85
Information & Analysis 87 2.85 1.08
Human Resources 87 2.95 .92
Educational Process 82 3.03 .92
School District Results 82 3.25 .85
Table 24. Means by Position for Teachers.
Construct N Mean SD
Leadership 99 2.70 1.00
Strategic Planning 101 2.74 1.11
Student /Stakeholder Satisfaction 98 2.52 .92
Information & Analysis 101 2.27 1.17
Human Resources 97 2.23 .94
Educational Process 97 2.42 .92
School District Results 98 2.57 .90
The frequencies and percentages for each item in the seven constructs are charted
in Tables 25 through 31. Teacher responses tended to be distributed across all six choices
more frequently than those of superintendents and principals. This was consistent across
all constructs. Greater frequencies occurred in items #3 through #5, signifying a more
developed and refined quality approach on the part of superintendents and principals,
while greater frequencies occurred in items #1 through #3, indicating a less developed and
more arbitrary approach by teachers. Teachers also tended to select the “do not know”
response more frequently.
In Table 25, data collected for the Leadership category suggested that, for the
districts with 5,000 or more students, a higher ratio of superintendents and
principals than of teachers saw a clearly communicated, fully deployed direction in their
district. In the largest districts, 63% of superintendents responded to item #4
or #5, compared to 59% of the principals and 42% of the teachers. In districts with 499
or fewer students, 88% of the superintendents responded to item #4 or #5, compared to
60% of the principals and 25% of the teachers. Teacher responses occurred more
frequently in items #1 and #2 regarding the existence of a systematic process for studying
the performance of their school district than did those of superintendents or principals.
Across all sizes of districts, 15% of the superintendents suggested there were only minimal
school improvement efforts on the part of district leaders, compared to 44% of teachers.
There also appeared to be a greater perception on the part of superintendents and
principals that a participatory approach to management exists in their district. More
teachers than superintendents or principals reported that there is little involvement of
stakeholders in policy development before the local board of trustees. There was less
variability among the groups when responding to items on Responsibility to Public or
Legal, Ethical Conduct, with most responses across all positions and district sizes
occurring in items #4 and #5.
In the Strategic Planning construct in Table 26, more superintendents and
principals than teachers responded to items #3 through #5 regarding both Strategic
Development and Focus of the Plan. However, the majority of all three groups perceived
that their district did not have a well-deployed system for implementing or assessing their
strategic plan. In the Student and Stakeholder Satisfaction construct in Table 27,
teachers in all districts responded to item #1 or #2, describing standardized test scores as
the primary means of determining student needs. Superintendents and principals in the
same districts responded with greater frequency to choices #3 through #5. Teachers
reported that there were minimal attempts to determine student and stakeholder
satisfaction, while superintendents reported that more refined attempts existed.
Increased frequencies of “do not know” responses occurred among teachers in the
Information and Analysis construct in Table 28. Teacher responses occurred with greater
frequency in items #1 and #2, compared to superintendents and principals, who replied
with greater frequency to items #3 through #5 regarding the collection, use, and analysis
of information. In the Human Resources construct in Table 29, teacher perceptions of the
learning and working climates, work systems, and employee satisfaction were less
positive than the perceptions of their superintendents and principals. Table 30 illustrates
frequencies and percentages for Educational and Operational Process Management.
Teachers, more often than administrators, perceived that educational programs and
services were primarily designed and delivered based on federal and state regulations,
traditional practices, or test results. Teachers selected the “do not know” response with
greater frequency than superintendents or principals regarding supply and partnering
processes. Table 31 illustrates frequencies for the School District Results construct with
similar patterns of responses.
Table 25. Item Frequency and Percentage of Response by Position and Size of District For Items in the Leadership Category.
                Superintendents         Principals              Teachers
Items           District Size*          District Size*          District Size*
                1   2   3   4   5       1   2   3   4   5       1   2   3   4   5
1. Clearly communicated direction:
Scale: 1 Frequency 1 2 1 3 1 1 1 2 9 3 4 1 2
Percentage 16.7 15.4 7.7 6.8 5.9 7.1 12.5 40.0 16.4 16.7 25.0 25.0 25.0
Scale: 2 Frequency 1 2 4 1 3 2 1 1 7 4 2 2 1
Percentage 12.5 15.4 30.8 11.1 6.8 11.8 7.1 12.5 12.7 22.2 12.5 50.0 12.5
Scale: 3 Frequency 2 1 3 3 11 2 2 15 3 2 3
Percentage 25.0 16.7 23.1 23.1 25.0 11.8 14.3 27.3 16.7 12.5 37.5
Scale: 4 Frequency 4 3 4 3 4 11 8 4 5 3 17 5 7 1 1
Percentage 50.0 50.0 30.8 23.1 44.4 25.0 47.1 28.6 62.5 60.0 30.9 27.8 43.8 25.0 12.5
Scale: 5 Frequency 1 1 2 2 4 15 4 5 1 6 2 1 1
Percentage 12.5 16.7 15.4 15.4 44.4 34.0 23.5 35.7 12.5 10.9 11.1 6.3 12.5
“Do Not Know” Frequency 1 1 1
Percentage 7.1 1.8 5.6
(table continues)
* 1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 25, cont’d. Item Frequency and Percentage of Response by Position and Size o f District For Items in the Leadership Category.
                Superintendents         Principals              Teachers
Items           District Size*          District Size*          District Size*
                1   2   3   4   5       1   2   3   4   5       1   2   3   4   5
2. Process to study performance:
Scale: 1 Frequency 3 1 6 3 1 2 12 3 4 1 3
Percentage 23.1 11.1 13.6 17.6 7.1 25.0 21.8 16.7 25.0 25.0 37.5
Scale: 2 Frequency 1 1 3 4 1 5 6 5 1 3 17 4 4 2 2
Percentage 12.5 16.7 23.1 30.8 11.1 11.4 35.3 35.7 12.5 60.0 30.9 22.2 43.8 50.0 25.0
Scale: 3 Frequency 2 3 7 4 3 12 4 5 1 1 4 4 2 1
Percentage 25.0 50.0 53.8 30.8 33.3 27.3 23.5 35.7 12.5 20.0 7.3 22.2 12.5 25.0
Scale: 4 Frequency 5 2 5 2 8 2 1 2 1 10 2 2 1
Percentage 62.5 33.3 38.5 22.2 18.2 11.8 7.1 25.0 20.0 18.2 11.1 12.5 12.5
Scale: 5 Frequency 2 12 2 2 2 7 2 1 2
Percentage 22.2 27.3 11.8 14.3 25.0 12.7 11.1 6.3 25.0
“Do Not Know” Frequency 5 3
Percentage 9.1 16.7
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 25, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Leadership Category.
                Superintendents         Principals              Teachers
Items           District Size*          District Size*          District Size*
                1   2   3   4   5       1   2   3   4   5       1   2   3   4   5
3. Leadership role in improvement:
Scale: 1 Frequency 6 1 14 6 5 1 3
Percentage 13.6 7.1 25.5 33.3 31.3 25.0 37.5
Scale: 2 Frequency 2 5 2 2 1 2 12 2 5 2 2
Percentage 15.4 11.4 11.8 14.3 12.5 40.0 21.8 11.1 31.3 50.0 25.0
Scale: 3 Frequency 1 1 6 3 10 6 5 2 1 9 4 2 1 1
Percentage 12.5 16.7 46.2 23.1 22.7 35.3 35.7 25.0 20.0 16.4 22.2 12.5 25.0 12.5
Scale: 4 Frequency 4 1 5 4 3 9 3 3 2 1 15 1 2 2
Percentage 50.0 16.7 38.5 30.8 33.3 20.5 17.6 21.4 25.0 20.0 27.3 5.6 12.5 25.0
Scale: 5 Frequency 3 4 2 4 6 13 6 3 2 1 4 4 2
Percentage 37.5 66.7 15.4 30.8 66.7 29.5 35.3 21.4 25.0 20.0 7.3 22.2 12.5
“Do Not Know” Frequency 3 1 1
Percentage 37.5 1.8 5.6
(table continues)
* 1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 25, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Leadership Category.

                Superintendents         Principals              Teachers
Items           District Size*          District Size*          District Size*
                1   2   3   4   5       1   2   3   4   5       1   2   3   4   5

4. Participatory management:
Scale: 1 Frequency 1 3 1 1 13 3 4 2 2
Percentage 12.5 6.8 5.9 12.5 23.6 16.7 25.0 50.0 25.0
Scale: 2 Frequency 1 3 11 3 3 1 18 6 6 1 4
Percentage 7.7 23.1 25.0 17.6 21.3 20.0 32.7 33.3 37.5 25.0 50.0
Scale: 3 Frequency 6 3 8 6 3 7 5 8 4 2 11 4 2
Percentage 75.0 50.0 61.5 46.2 33.3 15.9 29.4 57.1 50.0 40.0 20.0 22.2 12.5
Scale: 4 Frequency 1 4 2 3 11 8 3 3 1 7 2 2 1 1
Percentage 16.7 30.8 15.4 33.3 25.0 47.1 21.4 37.5 20.0 12.7 11.1 12.5 25.0 12.5
Scale: 5 Frequency 1 2 2 3 11 1 4 2 1 1
Percentage 12.5 33.3 15.4 33.3 25.0 20.0 7.3 11.1 6.3 12.5
“Do Not Know” Frequency 2 1 1
Percentage 3.6 5.6 6.3
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 25, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Leadership Category.
                Superintendents         Principals              Teachers
Items           District Size*          District Size*          District Size*
                1   2   3   4   5       1   2   3   4   5       1   2   3   4   5
5. Board policy:
Scale: 1 Frequency 1 1 1 4 3 1 1 3 10 2 5 3 1
Percentage 12.5 7.7 11.1 9.1 17.6 7.7 12.5 60.0 18.2 11.1 31.3 75.0 12.5
Scale: 2 Frequency 1 2 4 6 4 9 3 3 2 21 4 4 4
Percentage 12.5 33.3 30.8 46.2 44.4 20.5 17.6 21.4 25.0 38.2 22.2 25.0 50.0
Scale: 3 Frequency 3 2 2 7 7 1 1 1 4 2 4
Percentage 37.5 33.3 15.4 15.9 15.9 5.9 7.5 12.5 7.3 11.1 25.0
Scale: 4 Frequency 3 4 5 3 14 8 7 2 1 9 5 2 1 1
Percentage 37.5 30.8 38.5 33.3 31.8 47.1 50.0 25.0 20.0 16.4 27.8 12.5 25.0 12.5
Scale: 5 Frequency 2 2 2 1 9 2 1 2 1 4 1 1 1
Percentage 33.3 15.4 15.4 11.1 20.5 11.8 7.1 25.0 20.0 7.3 5.6 6.3 12.5
“Do Not Know” Frequency 1 7 4
Percentage 7.1 12.7 22.2
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 25, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Leadership Category.

                Superintendents         Principals              Teachers
Items           District Size*          District Size*          District Size*
                1   2   3   4   5       1   2   3   4   5       1   2   3   4   5

6. Responsibility to public:
Scale: 1 Frequency 1 4 3 1
Percentage 5.9 7.3 18.8 25.0
Scale: 2 Frequency 1 1 2 2 9 4 2 4 27 6 9 2 4
Percentage 12.5 7.7 15.4 22.2 20.5 28.6 25.0 80.0 49.1 33.3 56.3 50.0 50.0
Scale: 3 Frequency 1 1 4 5 1 8 2 3 2 10 4 2 1 1
Percentage 12.5 16.7 30.8 38.5 11.1 18.2 11.8 21.4 25.0 18.2 22.2 12.5 25.0 12.5
Scale: 4 Frequency 6 5 6 5 5 15 12 4 4 4 3 2 1
Percentage 75.0 83.3 46.2 38.5 55.6 34.1 70.6 28.6 50.0 7.3 16.7 12.5 12.5
Scale: 5 Frequency 2 1 11 1 5 1 9 3
Percentage 15.4 11.1 25.0 5.9 21.4 20.0 16.4 16.7
“Do Not Know” Frequency 1 2
Percentage 1.8 11.1
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 25, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Leadership Category.

                Superintendents         Principals              Teachers
Items           District Size*          District Size*          District Size*
                1   2   3   4   5       1   2   3   4   5       1   2   3   4   5

7. Legal, ethical conduct:
Scale: 1 Frequency 5 2 1
Percentage 9.1 12.5 25.0
Scale: 2 Frequency 1 2 5
Percentage 12.5 4.5 9.1
Scale: 3 Frequency 2 2 1 8 4 1 1 19 5 7 1 2
Percentage 25.0 15.4 11.1 18.2 23.5 7.1 12.5 34.5 27.8 43.8 25.0 25.0
Scale: 4 Frequency 2 4 10 8 7 17 7 8 5 4 10 9 3 1 4
Percentage 25.0 50.0 76.9 61.5 77.8 38.6 41.2 57.1 62.5 80.0 18.2 50.0 18.8 25.0 50.0
Scale: 5 Frequency 3 4 2 3 1 15 5 5 5 1 7 1 4 1
Percentage 37.5 50.0 15.4 23.1 11.1 34.1 29.4 35.7 25.0 20.0 12.7 5.6 25.0 25.0
“Do Not Know” Frequency 1 9 3
Percentage 2.3 16.4 16.7
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 26. Item Frequency and Percentage of Response by Position and Size of District For Items in the Strategic Planning Category.
                Superintendents         Principals              Teachers
Items           District Size*          District Size*          District Size*
                1   2   3   4   5       1   2   3   4   5       1   2   3   4   5
1. Strategic development:
Scale: 1 Frequency 1 1 1 1 1
Percentage 7.7 1.8 5.6 6.3 25.0
Scale: 2 Frequency 2 2 1 4 3 11 1 2 16 5 6 2 3
Percentage 15.4 15.4 11.1 9.1 17.6 78.6 12.5 40.0 29.1 27.8 37.5 50.0 37.5
Scale: 3 Frequency 3 3 9 7 4 12 5 2 2 2 20 8 2 1 4
Percentage 37.5 50.0 69.2 53.8 44.4 27.3 29.4 14.3 25.0 40.0 36.4 44.4 12.5 25.0 50.0
Scale: 4 Frequency 5 4 2 14 5 1 4 6 2 3
Percentage 62.5 30.8 22.2 31.8 29.4 7.1 50.0 10.9 11.1 18.8
Scale: 5 Frequency 3 2 3 2 13 3 1 1 11 1 3
Percentage 50.0 15.4 23.1 22.2 29.5 17.6 12.5 20.0 20.0 5.6 18.8
“Do Not Know” Frequency 1 1 1 1
Percentage 1.8 5.6 6.3 12.5
(table continues)
* 1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 26, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Strategic Planning
Category.
                Superintendents         Principals              Teachers
Items           District Size*          District Size*          District Size*
                1   2   3   4   5       1   2   3   4   5       1   2   3   4   5
2. Focus of plan:
Scale: 1 Frequency 1 1 1 3 2 1 7 2 2 2 3
Percentage 7.7 7.7 11.1 6.8 11.8 7.1 12.7 11.1 12.5 50.0 37.5
Scale: 2 Frequency 1 1 4 3 1 8 4 4 2 2 17 6 5 1
Percentage 12.5 16.7 30.8 23.1 11.1 18.2 23.5 28.6 25.0 40.0 30.9 33.3 31.3 25.0
Scale: 3 Frequency 1 4 4 2 19 2 2 1 2 12 3 5 1
Percentage 12.5 30.8 30.8 22.2 22.7 11.8 14.3 12.5 40.0 21.8 16.7 31.3 25.0
Scale: 4 Frequency 4 3 1 2 1 9 6 5 2 9 2 1 2
Percentage 50.0 50.0 7.7 15.4 11.1 20.5 35.3 35.7 25.0 16.4 11.1 25.0 25.0
Scale: 5 Frequency 2 2 3 3 4 13 1 2 3 1 9 4 4
Percentage 25.0 33.3 23.1 23.1 44.4 29.5 5.9 14.3 37.5 20.0 16.4 22.2 25.0
“Do Not Know” Frequency 1 1 1
Percentage 1.8 5.6 12.5
(table continues)
* 1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 26, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Strategic Planning
Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
3. Implementation and assessment of plan:
Scale: 1 Frequency 2 2 3 5 2 6 2 2 2 3 8 5 5 3 1
Percentage 25.0 33.3 23.1 38.5 22.2 13.6 11.8 14.3 25.0 60.0 14.5 27.8 31.3 75.0 12.5
Scale: 2 Frequency 1 3 1 5 5 4 1 16 3 3 3
Percentage 12.5 23.1 11.1 11.4 29.4 28.6 12.5 29.1 16.7 31.3 37.5
Scale: 3 Frequency 3 2 4 3 3 15 4 3 3 2 13 4 3 1 2
Percentage 37.5 33.3 30.8 23.1 33.3 34.1 23.5 21.4 37.5 40.0 23.6 22.2 18.8 25.0 25.0
Scale: 4 Frequency 1 1 1 8 3 3 1 11 1 1
Percentage 16.7 7.7 11.1 20.5 17.6 21.4 12.5 20.0 6.3 12.5
Scale: 5 Frequency 2 1 1 3 2 9 1 2 1 6 2 1 1
Percentage 25.0 16.7 7.7 23.1 22.2 20.5 5.9 14.3 12.5 10.9 11.1 6.3 12.5
“Do Not Know” Frequency 1 1 1 4 1
Percentage 7.7 2.3 1.8 22.2 6.3
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 27. Item Frequency and Percentage of Response by Position and Size of District For Items in the Student Focus and
Satisfaction/Stakeholder Categories.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
1. How student needs and expectations are determined:
Scale: 1 Frequency 1 2 1 1 3 2 16 8 5 3 4
Percentage 12.5 15.4 7.7 11.1 6.8 25.0 29.1 44.4 31.3 75.0 50.0
Scale: 2 Frequency 1 1 3 6 2 13 10 8 4 2 22 4 5 3
Percentage 12.5 16.7 23.1 46.2 22.2 29.5 58.8 57.1 50.0 40.0 40.0 22.2 31.3 37.5
Scale: 3 Frequency 1 3 2 1 10 2 2 1 3 2 2
Percentage 12.5 23.1 15.4 11.1 22.7 14.3 25.0 20.0 5.5 11.1 12.5
Scale: 4 Frequency 5 5 5 4 3 12 6 4 2 7 2 3 1 1
Percentage 62.5 83.3 38.5 30.8 33.3 27.3 35.3 28.6 40.0 12.7 11.1 18.8 25.0 12.5
Scale: 5 Frequency 2 6 4 2 1
Percentage 22.2 13.6 7.3 11.1 6.3
“Do Not Know” Frequency 3
Percentage 5.5
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 27, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Student Focus and
Satisfaction/Stakeholder Categories.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
2. High expectations for performance of students:
Scale: 1 Frequency 1 3 1 5 2 2 1
Percentage 7.7 6.8 5.9 9.1 11.1 12.5 25.0
Scale: 2 Frequency 3 4 7 2 2 3 4 4 1 3 3
Percentage 23.1 30.8 15.9 11.8 14.3 37.5 7.3 22.2 6.3 75.0 37.5
Scale: 3 Frequency 6 5 7 7 6 21 9 9 4 5 25 5 9 3
Percentage 75.0 83.3 53.8 53.8 66.7 47.7 52.9 64.3 50.0 100.0 45.5 27.8 56.3 37.5
Scale: 4 Frequency 2 1 2 1 2 7 3 2 16 2 4 2
Percentage 25.0 16.7 15.4 7.7 22.2 15.9 17.6 14.3 29.1 11.1 25.0 25.0
Scale: 5 Frequency 1 1 6 1 1 3 2
Percentage 7.7 11.1 13.6 5.9 7.1 5.5 11.1
“Do Not Know” Frequency 1 2 3
Percentage 12.5 3.6 16.7
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 27, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Student Focus and
Satisfaction/Stakeholder Categories.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
3. Student and stakeholder satisfaction:
Scale: 1 Frequency 2 2 1 2 3 2 2 1 1 13 9 5 3
Percentage 25.0 15.4 7.7 22.2 6.8 11.8 14.3 12.5 20.0 23.6 50.0 31.3 75.0
Scale: 2 Frequency 1 2 4 3 1 17 6 5 2 3 21 4 7 1 3
Percentage 12.5 33.3 30.8 23.1 11.1 38.6 35.3 35.7 25.0 60.0 38.2 22.2 43.8 25.0 37.5
Scale: 3 Frequency 2 2 4 5 1 8 3 5 1 7 1 2 1
Percentage 25.0 33.3 30.8 38.5 11.1 18.2 17.6 35.7 12.5 12.7 5.6 12.5 12.5
Scale: 4 Frequency 3 2 3 4 4 14 5 2 2 1 9 1 1
Percentage 37.5 33.3 23.1 30.8 44.4 31.8 29.4 14.3 25.0 20.0 16.4 5.6 12.5
Scale: 5 Frequency 1 2 2 1 2 1
Percentage 11.1 4.5 25.0 1.8 11.1 6.3
“Do Not Know” Frequency 4 1 1
Percentage 7.3 5.6 6.3
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 27, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Student Focus and
Satisfaction/Stakeholder Categories.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
4. Future needs of students and stakeholders:
Scale: 1 Frequency 1 1 1 1 7 1 1 1 9 2 1 3 2
Percentage 12.5 7.7 7.7 11.1 15.9 7.1 12.5 20.0 16.4 11.1 6.3 75.0 25.0
Scale: 2 Frequency 1 1 1 6 5 2 8 5 6 1 2
Percentage 12.5 7.7 11.1 13.6 29.4 40.0 14.5 27.8 37.5 25.0 25.0
Scale: 3 Frequency 2 1 2 8 9 3 4 3 1 18 4 5
Percentage 25.0 16.7 15.4 61.5 20.5 17.6 28.6 37.5 20.0 32.7 22.2 31.3
Scale: 4 Frequency 4 5 8 2 3 15 7 8 3 1 11 3 4
Percentage 50.0 83.3 61.5 15.4 33.3 34.1 41.2 57.1 37.5 20.0 20.0 16.7 25.0
Scale: 5 Frequency 1 2 3 6 1 1 1 5 2 1
Percentage 7.7 15.4 33.3 13.6 5.9 7.1 12.5 9.1 11.1 12.5
“Do Not Know” Frequency 1 1 4 2
Percentage 11.1 2.3 7.3 11.1
* 1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 28. Item Frequency and Percentage of Response by Position and Size of District For Items in the Information and Analysis
Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
1. Selection and use:
Scale: 1 Frequency 1 1 5 2 1 2 13 3 5 3 4
Percentage 7.7 11.1 11.4 11.8 7.1 25.0 23.6 16.7 31.3 75.0 50.0
Scale: 2 Frequency 1 1 4 5 4 15 5 5 2 2 21 6 7 2
Percentage 12.5 16.7 30.8 38.5 44.4 34.1 29.4 35.7 25.0 40.0 38.2 33.3 43.8 25.0
Scale: 3 Frequency 3 2 7 5 10 5 2 1 6 2 2 1 1
Percentage 37.5 33.3 53.8 38.5 22.7 29.4 14.3 20.0 10.9 11.1 12.5 25.0 12.5
Scale: 4 Frequency 3 2 1 1 3 7 5 5 4 2 7 4 1
Percentage 37.5 33.3 7.7 7.7 33.3 15.9 29.4 35.7 50.0 40.0 12.7 22.2 6.3
Scale: 5 Frequency 1 1 1 1 1 6 1 3 1
Percentage 12.5 16.7 7.7 7.7 11.1 13.6 7.1 5.5 6.3
“Do Not Know” Frequency 1 5 3 1
Percentage 2.3 9.1 16.7 12.5
(table continues)
* 1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 28, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Information and
Analysis Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
2. Selection and use of comparative data:
Scale: 1 Frequency 1 1 2 3 2 6 1 1 7 3 3 3 2
Percentage 12.5 16.7 15.4 23.1 22.2 13.6 7.1 20.0 12.7 16.7 18.8 75.0 25.0
Scale: 2 Frequency 1 5 5 2 9 10 5 3 2 15 5 8 1
Percentage 12.5 38.5 38.5 22.2 20.5 58.8 35.7 37.5 40.0 27.3 27.8 50.0 12.5
Scale: 3 Frequency 3 3 6 3 3 13 4 5 2 1 9 4 1 3
Percentage 37.5 33.3 46.2 23.1 33.3 29.5 23.5 35.7 25.0 20.0 16.4 22.2 6.3 37.5
Scale: 4 Frequency 3 1 1 1 6 3 2 3 1 8 2 1 1 1
Percentage 37.5 16.7 7.7 11.1 13.6 17.6 14.3 37.5 20.0 14.5 11.1 6.3 25.0 12.5
Scale: 5 Frequency 2 1 1 8 5 1
Percentage 33.3 7.7 11.1 18.2 9.1 6.3
“Do Not Know” Frequency 2 1 11 4 2 1
Percentage 4.5 7.1 20.0 22.2 12.5 12.5
(table continues)
* 1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 28, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Information and
Analysis Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
3. Analysis and use of school performance data:
Scale: 1 Frequency 1 2 2 6 1 1 1 14 5 5 2 2
Percentage 12.5 15.4 22.2 13.6 5.9 7.1 20.0 25.5 27.8 31.3 50.0 25.0
Scale: 2 Frequency 1 2 6 5 1 10 4 5 3 10 5 2 2 3
Percentage 12.5 33.3 46.2 38.5 11.1 22.7 23.5 35.7 37.5 18.2 27.8 12.5 50.0 37.5
Scale: 3 Frequency 3 1 2 4 2 9 6 3 2 1 7 4 1
Percentage 25.0 16.7 15.4 30.8 22.2 20.5 35.3 21.4 25.0 20.0 12.7 25.0 12.5
Scale: 4 Frequency 4 2 3 9 4 2 3 1 10 3 1 2
Percentage 50.0 15.4 23.1 20.5 23.5 14.3 37.5 20.0 18.2 16.7 6.3 25.0
Scale: 5 Frequency 3 1 1 4 8 1 2 7 2 1
Percentage 50.0 7.7 7.7 44.4 18.2 5.9 14.3 12.7 11.1 6.3
“Do Not Know” Frequency 2 1 2 7 3 3
Percentage 4.5 7.1 40.0 12.7 16.7 18.8
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 29. Item Frequency and Percentage of Response by Position and Size of District For Items in the Human Resource Development
and Management Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
1. Learning and working climate:
Scale: 1 Frequency 4 1 14 5 4 2 2
Percentage 9.1 12.5 25.5 27.8 25.0 50.0 25.0
Scale: 2 Frequency 1 2 1 11 3 2 1 11 4 5 1
Percentage 12.5 15.4 7.7 25.0 17.6 14.3 20.0 20.0 22.2 31.3 12.5
Scale: 3 Frequency 2 1 6 4 1 14 6 4 2 2 19 4 2 1 4
Percentage 25.0 16.7 46.2 30.8 11.1 31.8 35.3 28.6 25.0 40.0 34.5 22.2 12.5 25.0 50.0
Scale: 4 Frequency 5 3 4 6 4 9 4 5 5 2 8 4 5 1 1
Percentage 62.5 50.0 30.8 46.2 44.4 20.5 23.5 35.7 62.5 40.0 14.5 22.2 31.3 25.0 12.5
Scale: 5 Frequency 2 1 2 4 6 3 3 3 1
Percentage 33.3 7.7 15.4 44.4 13.6 17.6 21.4 5.5 5.6
“Do Not Know” Frequency
Percentage
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 29, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Human Resource
Development and Management Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
2. Work systems:
Scale: 1 Frequency 1 6 1 2 1 1 18 5 5 2 5
Percentage 7.7 13.6 5.9 14.3 12.5 20.0 32.7 27.8 31.3 50.0 62.5
Scale: 2 Frequency 4 4 9 3 4 3 1 17 5 5 1 2
Percentage 30.8 30.8 20.5 17.6 28.6 37.5 20.0 30.9 27.8 31.3 25.0 25.0
Scale: 3 Frequency 5 3 6 3 1 10 4 3 1 1 7 5 4
Percentage 62.5 50.0 46.2 23.1 11.1 22.7 23.5 21.4 12.5 20.0 12.7 27.8 25.0
Scale: 4 Frequency 3 1 1 5 7 15 7 5 2 2 11 2 2 1 1
Percentage 37.5 16.7 7.7 38.5 77.8 34.1 41.2 35.7 25.0 40.0 20.0 11.1 12.5 25.0 12.5
Scale: 5 Frequency 2 1 1 4 1 1
Percentage 33.3 7.7 11.1 9.1 5.9 12.5
“Do Not Know” Frequency 2 1 4
Percentage 3.6 5.6 25.0
(table continues)
* 1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 29, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Human Resource
Development and Management Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
3. Personal training and development:
Scale: 1 Frequency 1 9 2 1 17 6 8 3 3
Percentage 7.7 20.5 14.3 12.5 30.9 33.3 50.0 75.0 37.5
Scale: 2 Frequency 1 1 3 6 1 5 3 2 2 2 9 3 1 3
Percentage 12.5 16.7 23.1 46.2 11.1 11.4 17.6 14.3 25.0 40.0 16.4 16.7 6.3 37.5
Scale: 3 Frequency 4 1 5 2 1 10 6 3 1 2 12 4 1
Percentage 50.0 16.7 38.5 15.4 11.1 22.7 35.3 21.4 12.5 40.0 21.8 22.2 6.3
Scale: 4 Frequency 2 1 4 3 6 13 6 5 4 1 8 4 1 1
Percentage 25.0 16.7 30.8 23.1 66.7 29.5 35.3 35.7 50.0 20.0 14.5 22.2 25.0 12.5
Scale: 5 Frequency 1 3 2 1 7 1 2 7 1
Percentage 12.5 50.0 15.4 11.1 15.9 5.9 14.3 12.7 5.6
“Do Not Know” Frequency 2
Percentage 3.6
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 29, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Human Resource
Development and Management Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
4. Performance appraisal:
Scale: 1 Frequency 1 1 4 2 6 2 2 2 1 18 7 5
Percentage 16.7 7.7 30.8 22.2 13.6 11.8 14.3 25.0 20.0 32.7 38.9 31.3
Scale: 2 Frequency 5 1 3 6 1 21 6 5 3 3 20 5 9
Percentage 62.5 16.7 23.1 46.2 11.1 47.7 35.3 35.7 37.5 60.0 36.4 27.8 56.3
Scale: 3 Frequency 2 1 7 1 3 3 4 4 1 1 4 1 1
Percentage 25.0 16.7 53.8 7.7 33.3 6.8 23.5 28.6 12.5 20.0 7.3 5.6 6.3
Scale: 4 Frequency 1 3 2 1 1 11 2 3 1 8 1 1
Percentage 12.5 50.0 15.4 7.7 11.1 25.0 11.8 21.4 12.5 14.5 5.6 6.3
Scale: 5 Frequency 1 2 3 2 1 3 1
Percentage 7.7 22.2 6.8 11.8 12.5 5.5 5.6
“Do Not Know” Frequency 2 2
Percentage 3.6 11.1
(table continues)
* 1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 29, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Human Resource
Development and Management Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
5. Employee satisfaction:
Scale: 1 Frequency 6 3 21 8 7 3 3
Percentage 13.6 21.4 38.2 44.4 43.8 75.0 37.5
Scale: 2 Frequency 2 2 4 7 20 6 3 5 2 23 5 5 3
Percentage 25.0 33.3 30.8 53.8 45.5 35.3 21.4 62.5 40.0 41.8 27.8 31.3 37.5
Scale: 3 Frequency 3 2 6 4 2 6 5 1 1 4 3 1 2
Percentage 37.5 33.3 46.2 30.8 22.2 13.6 29.4 7.1 20.0 7.3 18.8 25.0 25.0
Scale: 4 Frequency 2 2 2 1 4 8 6 7 2 2 4 2 1
Percentage 25.0 33.3 15.4 7.7 44.4 18.2 35.3 50.0 25.0 40.0 7.3 11.1 6.3
Scale: 5 Frequency 1 1 1 3 4 1 3 3
Percentage 12.5 7.7 7.7 33.3 9.1 12.5 5.5 16.7
“Do Not Know” Frequency
Percentage
* 1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 30. Item Frequency and Percentage of Response by Position and Size of District For Items in the Educational and Operational
Process Management Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
1. Design of educational programs:
Scale: 1 Frequency 6 1 1 6 6 3 3 4
Percentage 13.6 5.9 7.1 10.9 33.3 18.8 75.0 50.0
Scale: 2 Frequency 1 5 3 4 4 2 3 1 16 1 7 1 3
Percentage 16.7 38.5 23.1 9.1 23.5 14.3 37.5 20.0 29.1 5.6 43.8 25.0 37.5
Scale: 3 Frequency 5 2 5 7 1 15 7 4 1 2 17 4
Percentage 62.5 33.3 38.5 53.8 11.1 34.1 41.2 28.6 12.5 40.0 30.9 22.2
Scale: 4 Frequency 2 3 3 7 17 3 6 3 2 4 5 4 1
Percentage 25.0 50.0 23.1 77.8 38.6 17.6 42.9 37.5 40.0 7.3 27.8 25.0 12.5
Scale: 5 Frequency 1 3 1 2 1 1 7 1 1
Percentage 12.5 23.1 11.1 4.5 7.1 12.5 12.7 5.6 6.3
“Do Not Know” Frequency 5 1 1
Percentage 9.1 5.6 6.3
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 30, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Educational and
Operational Process Management Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
2. Delivery of educational programs:
Scale: 1 Frequency 1 1 5 1 1 1 9 5 4 2 4
Percentage 7.7 7.7 11.4 5.9 7.1 12.5 16.4 27.8 25.0 50.0 50.0
Scale: 2 Frequency 2 5 3 6 6 3 3 2 20 3 7 2 2
Percentage 33.3 38.5 23.1 13.6 35.3 21.4 37.5 40.0 36.4 16.7 43.8 50.0 25.0
Scale: 3 Frequency 3 1 5 4 1 15 3 4 2 1 10 2 1
Percentage 37.5 16.7 38.5 30.8 11.1 34.1 17.6 28.6 25.0 20.0 18.2 11.1 6.3
Scale: 4 Frequency 3 2 2 3 6 14 5 5 1 2 6 5 4 2
Percentage 37.5 33.3 15.4 23.1 66.7 31.8 29.4 35.7 12.5 40.0 10.9 27.8 25.0 25.0
Scale: 5 Frequency 2 1 2 2 4 1 1 6 1
Percentage 25.0 16.7 15.4 22.2 9.1 7.1 12.5 10.9 5.6
“Do Not Know” Frequency 4 2
Percentage 7.3 11.1
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 30, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Educational and
Operational Process Management Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
3. Design and delivery of educational support services:
Scale: 1 Frequency 2 7 1 1 13 2 4 2 1
Percentage 15.4 15.9 12.5 20.0 23.6 11.1 25.0 50.0 12.5
Scale: 2 Frequency 3 2 7 5 3 7 6 6 3 1 17 6 7 2 4
Percentage 37.5 33.3 53.8 38.5 33.3 15.9 35.3 42.9 37.5 20.0 30.9 33.3 43.8 50.0 50.0
Scale: 3 Frequency 3 1 3 4 5 9 5 2 3 1 3 3 1
Percentage 37.5 16.7 23.1 30.8 55.6 20.5 29.4 14.3 37.5 20.0 5.5 16.7 6.3
Scale: 4 Frequency 1 3 3 1 1 12 5 6 1 11 3 2 2
Percentage 12.5 50.0 23.1 7.7 11.1 27.3 29.4 42.9 12.5 20.0 16.7 12.5 25.0
Scale: 5 Frequency 1 1 8 2 3 1
Percentage 12.5 7.7 18.2 40.0 5.5 5.6
“Do Not Know” Frequency 1 8 3 2
Percentage 2.3 14.5 16.7 12.5
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 30, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Educational and
Operational Process Management Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
4. Data and information processes:
Scale: 1 Frequency 1 1 1 2 4 1 2 2 1 16 5 2 3 2
Percentage 12.5 16.7 7.7 15.4 9.1 5.9 14.3 25.0 20.0 29.1 27.8 12.5 75.0 25.0
Scale: 2 Frequency 2 4 7 2 8 7 4 1 3 18 6 10 1 3
Percentage 25.0 30.8 53.8 22.2 18.2 41.2 28.6 12.5 60.0 32.7 33.3 62.5 25.0 37.5
Scale: 3 Frequency 3 3 7 2 2 14 4 1 1 5 5 1
Percentage 37.5 50.0 53.8 15.4 22.2 31.8 23.5 7.1 12.5 9.1 27.8 6.3
Scale: 4 Frequency 1 2 1 2 3 15 2 6 3 1 8 1 1
Percentage 12.5 33.3 7.7 15.4 33.3 34.1 11.8 42.9 37.5 20.0 14.5 6.3 12.5
Scale: 5 Frequency 1 1 1 2 1 1 3
Percentage 12.5 16.7 11.1 4.5 7.1 12.5 5.5
“Do Not Know” Frequency 1 1 5 2 2
Percentage 11.1 2.3 9.1 11.1 12.5
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 30, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Educational and
Operational Process Management Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
5. Communication processes:
Scale: 1 Frequency 1 1 4 2 1 9 5 3 2 1
Percentage 16.7 7.7 9.1 14.3 20.0 16.4 27.8 18.8 50.0 12.5
Scale: 2 Frequency 1 1 3 8 2 2 2 16 4 7 1
Percentage 12.5 7.7 23.1 18.2 11.8 25.0 40.0 29.1 22.2 43.8 25.0
Scale: 3 Frequency 4 4 8 9 4 13 7 5 3 21 6 3 4
Percentage 50.0 66.7 61.5 69.2 44.4 29.5 41.2 35.7 37.5 38.2 33.3 18.8 50.0
Scale: 4 Frequency 2 1 3 1 4 13 5 4 1 2 6 3 3 1 2
Percentage 25.0 16.7 23.1 7.7 44.4 29.5 29.4 28.6 12.5 40.0 10.9 16.7 18.8 25.0 25.0
Scale: 5 Frequency 1 4 3 2 2
Percentage 11.1 9.1 21.4 25.0 3.6
“Do Not Know” Frequency
Percentage 12.5
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 30, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Educational and
Operational Process Management Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
6. Supplier and partnering processes:
Scale: 1 Frequency 3 1 3 1 1 6 2 2
Percentage 23.1 7.7 6.8 7.1 12.5 10.9 12.5 50.0
Scale: 2 Frequency 2 3 4 5 4 7 8 4 1 3 10 3 11 1 3
Percentage 25.0 50.0 30.8 38.5 44.4 15.9 47.1 28.6 12.5 60.0 18.2 16.7 68.8 25.0 37.5
Scale: 3 Frequency 3 1 5 6 1 12 6 5 2 5 4 1 2
Percentage 37.5 16.7 38.5 46.2 11.1 27.3 35.3 35.7 25.0 9.1 22.2 6.3 25.0
Scale: 4 Frequency 2 1 1 1 1 9 4 2 2 1 6 1
Percentage 25.0 16.7 7.7 7.7 11.1 20.5 11.8 14.3 25.0 20.0 10.9 6.3
Scale: 5 Frequency 1 1 2 9 1 1 3
Percentage 12.5 16.7 22.2 20.5 12.5 20.0 5.5
“Do Not Know” Frequency 1 4 2 1 25 11 1 1
Percentage 11.1 9.1 14.3 12.5 45.5 61.1 6.3 25.0
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 31. Item Frequency and Percentage of Response by Position and Size of District For Items in the School District Performance
Results Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
1. Student performance results:
Scale: 1 Frequency 1 1 1 3 1 3 1 1
Percentage 12.5 7.7 7.7 5.5 5.6 18.8 25.0 12.5
Scale: 2 Frequency 2 3 4 2 7 6 5 2 1 19 8 5 2 5
Percentage 33.3 23.1 30.8 22.2 15.9 35.3 35.7 25.0 20.0 34.5 44.4 31.3 50.0 62.5
Scale: 3 Frequency 2 4 5 7 5 14 6 5 3 2 12 4 4 2
Percentage 25.0 66.7 38.5 53.8 55.6 31.8 35.3 35.7 37.5 40.0 21.8 22.2 25.0 25.0
Scale: 4 Frequency 5 3 1 23 5 3 2 1 18 3 4 1
Percentage 62.5 23.1 11.1 52.3 29.4 21.4 25.0 20.0 32.7 16.7 25.0 25.0
Scale: 5 Frequency 1 1 1 1 1 2
Percentage 7.7 7.7 11.1 12.5 20.0 3.6
“Do Not Know” Frequency 1 1 2
Percentage 7.1 1.8 11.1
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 31, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the School District
Performance Results Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
2. Student conduct results:
Scale: 1 Frequency 1 2 1 2 1 15 2 2 1 1
Percentage 12.5 15.4 7.7 4.5 5.9 27.3 11.1 12.5 25.0 12.5
Scale: 2 Frequency 1 1 4 1 1 8 3 9 4 5 2
Percentage 12.5 16.7 30.8 7.7 11.1 18.2 17.6 16.4 22.2 31.3 25.0
Scale: 3 Frequency 1 3 4 3 2 8 3 5 1 1 11 2 5 1 2
Percentage 12.5 50.0 30.8 23.1 22.2 18.2 17.6 35.7 12.5 20.0 20.0 11.1 31.3 25.0 25.0
Scale: 4 Frequency 4 2 2 4 4 14 6 4 4 3 7 2 1 1
Percentage 50.0 33.3 15.4 30.8 44.4 31.8 35.3 28.6 50.0 60.0 12.7 11.1 6.3 12.5
Scale: 5 Frequency 1 3 2 3 2 2 1 1 1 1 1
Percentage 7.7 23.1 22.2 6.8 14.3 25.0 20.0 1.8 5.6 25.0 12.5
“Do Not Know” Frequency 1 1 9 3 3 1 12 7 3 1 1
Percentage 12.5 7.7 20.5 17.6 21.4 12.5 21.8 38.9 18.8 25.0 12.5
(table continues)
* 1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 31, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the School District
Performance Results Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
3. Student and stakeholder satisfaction results:
Scale: 1 Frequency 1 1 4 5 4 1 1 1
Percentage 16.7 6.3 9.1 9.1 22.2 6.3 25.0 12.5
Scale: 2 Frequency 1 1 7 1 1 2
Percentage 6.3 12.5 12.7 6.3 25.0 25.0
Scale: 3 Frequency 3 3 5 4 5 14 7 4 2 2 13 3 4 2
Percentage 37.5 50.0 38.5 25.0 55.6 31.8 41.2 28.6 25.0 40.0 23.6 16.7 25.0 25.0
Scale: 4 Frequency 4 2 4 1 2 10 8 6 3 3 8 6 1 1 2
Percentage 50.0 33.3 30.8 6.3 22.2 22.7 47.1 42.9 37.5 60.0 14.5 33.3 6.3 25.0 25.0
Scale: 5 Frequency 1 1 9 2 5 2 1 3
Percentage 12.5 7.7 56.3 22.2 11.4 14.3 12.5 5.5
“Do Not Know” Frequency 3 10 1 1 1 19 5 9 1 1
Percentage 23.1 22.7 5.9 7.1 12.5 34.5 27.8 56.3 25.0 12.5
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 31, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the School District
Performance Results Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
4. Human resource results:
Scale: 1 Frequency 1 3 1 6 1 1 2 1
Percentage 6.3 6.8 5.9 10.9 5.6 6.3 50.0 12.5
Scale: 2 Frequency 3 4 6 12 2 18 5 7 4 3 25 7 12 1 2
Percentage 37.5 66.7 46.2 75.0 22.2 40.9 29.4 50.0 50.0 60.0 45.5 38.9 75.0 25.0 25.0
Scale: 3 Frequency 2 1 5 2 2 1 6 2 2 1
Percentage 25.0 7.7 11.4 11.8 14.3 20.0 10.9 11.1 12.5 12.5
Scale: 4 Frequency 1 4 2 3 8 5 1 1 7 3 1
Percentage 12.5 30.8 12.5 33.3 18.2 29.4 12.5 20.0 12.7 16.7 12.5
Scale: 5 Frequency 2 2 2 4 6 4 2 5 1 1 1
Percentage 25.0 33.3 15.4 44.4 13.6 28.6 25.0 9.1 5.6 25.0 12.5
“Do Not Know” Frequency 1 4 1 1 6 4 1 1
Percentage 6.3 9.1 7.1 12.5 10.9 22.2 6.3 12.5
(table continues)
* 1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 31, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the School District
Performance Results Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
5. Educational program and service results:
Scale: 1 Frequency 2 1 5 2 1 2 2 12 5 4 2 2
Percentage 25.0 16.7 11.4 11.8 7.1 25.0 40.0 21.8 27.8 25.0 50.0 25.0
Scale: 2 Frequency 1 3 1 1 8 4 1 1 2
Percentage 12.5 23.1 2.3 5.9 14.5 22.2 6.3 25.0 25.0
Scale: 3 Frequency 3 2 4 3 10 7 5 1 10 3 4 1
Percentage 37.5 33.3 30.8 33.3 22.7 41.2 35.7 20.0 18.2 16.7 25.0 12.5
Scale: 4 Frequency 2 3 5 4 18 4 4 4 1 10 3 3 1 2
Percentage 25.0 50.0 38.5 44.4 40.9 23.5 28.6 50.0 20.0 18.2 16.7 18.8 25.0 25.0
Scale: 5 Frequency 2 2 4 2 2 1 1 1
Percentage 15.4 22.2 9.1 14.3 25.0 20.0 1.8 12.5
“Do Not Know” Frequency 1 6 1 2 13 3 4
Percentage 7.7 13.6 5.9 14.3 23.6 16.7 25.0
(table continues)
*1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Table 31, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the School District
Performance Results Category.
Supervisors Principals Teachers
District Size* District Size* District Size*
Items 1 2 3 4 5 1 2 3 4 5 1 2 3 4 5
6. Educational support services results:
Scale: 1 Frequency 2 1 2 1 4 1 1 1
Percentage 33.3 2.3 11.8 12.5 7.3 5.6 25.0 12.5
Scale: 2 Frequency 1 4 1 1 7 11 3 5 2 3 1 4
Percentage 12.5 66.7 7.7 11.1 15.9 64.7 21.4 9.1 11.1 18.8 25.0 50.0
Scale: 3 Frequency 4 4 3 5 3 1 1 10 4 6 3
Percentage 50.0 30.8 33.3 11.4 17.6 7.1 20.0 18.2 22.2 37.5 37.5
Scale: 4 Frequency 3 8 2 10 4 1 1 10 3
Percentage 37.5 61.5 22.2 22.7 28.6 12.5 20.0 18.2 16.7
Scale: 5 Frequency 1 7 5 4 2 1
Percentage 11.1 15.9 35.7 50.0 40.0 25.0
“Do Not Know” Frequency 2 14 1 2 1 25 8 7 1
Percentage 22.2 31.8 7.1 25.0 20.0 45.5 44.4 43.8 25.0
* 1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled
Inferential Statistical Analysis
A general linear models procedure was applied, and a 3 x 5, two-way factorial
analysis of variance was used to test the hypotheses. The assumptions of homogeneity
of variance, independence of variables, and continuous dependent variables were met.
Type III sums of squares were used for each construct because the cells contained
unequal numbers of respondents.
The null hypothesis for the Leadership construct was, “H01: There are no
significant differences in the Leadership category of the Performance Analysis for School
Districts by type or size.” Table 32 shows a significant difference for the variable
position at .001. No significant differences at the .05 level were found for district size
or the interaction. Scheffé’s post hoc test for multiple comparisons was done; this test
is the most conservative measure and controls the Type I error rate. At the alpha
level of .05, there were no significant differences when comparing
superintendents and principals. However, there was a significant difference at .05 when
teachers were compared to both superintendents and principals. Therefore, the null
hypothesis is rejected.
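The Scheffé criterion applied above can be sketched as a small function. This is an illustrative reconstruction rather than the study’s own computation; the function name and arguments are hypothetical, and the caller supplies the critical F value for (k − 1, N − k) degrees of freedom at the chosen alpha level.

```python
def scheffe_pairwise(mean_i, n_i, mean_j, n_j, mse, k, f_crit):
    """Scheffe test for one pairwise contrast among k group means.

    Returns (statistic, critical_value, significant). The contrast is judged
    significant when the statistic exceeds (k - 1) * f_crit, where f_crit is
    the upper-alpha F quantile with (k - 1, N - k) degrees of freedom.
    """
    # Squared mean difference scaled by its estimated variance,
    # using the mean square error (MSE) from the ANOVA.
    statistic = (mean_i - mean_j) ** 2 / (mse * (1.0 / n_i + 1.0 / n_j))
    critical_value = (k - 1) * f_crit
    return statistic, critical_value, statistic > critical_value
```

Because every pairwise contrast is compared against the same inflated critical value, the family-wise Type I error rate stays at alpha no matter how many comparisons are made, which is why the procedure is described as conservative.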
Reproduced with perm ission of the copyright owner. Further reproduction prohibited without perm ission.
116
Table 32. Two-Way ANOVA Leadership Construct.
Source df SS MS F value Pr > F
Model:
Size 4 1.94558390 0.48639598 0.59 0.6736
Type 2 25.06159973 12.53079986 15.08 0.001
Size x Type 8 2.71734755 0.33966844 0.41 0.9148
Error 218 181.14322972 0.83093225
Total 232 221.53565721
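As an illustrative arithmetic check (not part of the original analysis), the F values in Table 32 can be reproduced from its reported sums of squares, since MS = SS / df and F = MS for the effect divided by MS for error:

```python
# Values copied from Table 32 (two-way ANOVA, Leadership construct).
ss_size, df_size = 1.94558390, 4
ss_type, df_type = 25.06159973, 2
ss_error, df_error = 181.14322972, 218

ms_error = ss_error / df_error           # 0.83093225, as reported
f_size = (ss_size / df_size) / ms_error  # mean square for Size over MS error
f_type = (ss_type / df_type) / ms_error  # mean square for Type over MS error

print(round(f_size, 2), round(f_type, 2))  # 0.59 and 15.08, matching the table
```

The same two-line check reproduces the F values in Table 33 and the later ANOVA tables.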
The null hypothesis for the Strategic Planning construct was, “H02: There are no
significant differences in the Strategic Planning category of the Performance Analysis for
School Districts by type or size.” Table 33 shows a significant difference for the
variable position at .001. No significant differences at the .05 level were
found for district size or the interaction. Scheffé’s post hoc test for multiple comparisons
was done. At the alpha level of .05, there were no significant differences
when comparing superintendents and principals. There was a significant difference,
however, at .05 when teachers were compared to both superintendents and principals.
Therefore, the null hypothesis is rejected.
Table 33. Two-Way ANOVA Strategic Planning Construct.

Source         df    SS             MS             F value   Pr > F
Model:
  Size           4     8.04564589     2.01141147    1.75     0.1397
  Type           2    22.17052545    11.08526273    9.66     0.001
  Size x Type    8     7.73834769     0.96729346    0.84     0.5661
Error          218   250.27735473     1.14806126
Total          232   287.11826419
The null hypothesis for the Student and Stakeholder Satisfaction construct was, "H03: There are no significant differences in the Student and Stakeholder Satisfaction category of the Performance Analysis for School Districts by type or size." Table 34 shows that a significant difference was found for the variable position at .001. No significant differences at the .05 level were found for district size or the interaction. Scheffé's post hoc test for multiple comparisons was performed. At the .05 alpha level, there was no significant difference between superintendents and principals. However, the differences between the mean of the teachers and that of the principals, and between the mean of the teachers and that of the superintendents, were large enough to be significant at .05. Therefore, the null hypothesis is rejected.
The null hypothesis for the Information and Analysis construct was, "H04: There are no significant differences in the Information and Analysis category of the Performance Analysis for School Districts by type or size." Table 35 shows that a significant difference was found for position at .001. No significant differences at the .05 level were found for district size or the interaction. Scheffé's post hoc test for multiple comparisons was performed. At the .05 alpha level, there was no significant difference between superintendents and principals. However, the differences between the mean of the teachers and that of the principals, and between the mean of the teachers and that of the superintendents, were large enough to be significant at .05. Therefore, the null hypothesis is rejected.
Table 34. Two-Way ANOVA Student/Stakeholder Satisfaction Construct.

Source         df    SS             MS             F value   Pr > F
Model:
  Size           4     3.81446595     0.95361649    1.27     0.2820
  Type           2    18.42615628     9.21307814   12.29     0.001
  Size x Type    8     5.01819420     0.62727427    0.84     0.571
Error          219   164.20135256     0.74977787
Total          233   188.16503443
The null hypothesis for the Human Resource Development and Management construct was, "H05: There are no significant differences in the Human Resource Development and Management category of the Performance Analysis for School Districts by type or size." Table 36 shows that a significant difference was found for the factor position at .001. No significant differences at the .05 level were found for district size or the interaction. Scheffé's post hoc test for multiple comparisons was performed. At the .05 alpha level, there was no significant difference between superintendents and principals. The differences between the mean of the teachers and that of the principals, and between the mean of the teachers and that of the superintendents, were large enough to be significant at .05. Therefore, the null hypothesis is rejected.
Table 35. Two-Way ANOVA Information and Analysis Construct.

Source         df    SS             MS             F value   Pr > F
Model:
  Size           4     5.10247757     1.27561939    1.05     0.3822
  Type           2    22.98733438    11.49366719    9.46     0.001
  Size x Type    8     5.87610935     0.73451367    0.60     0.7735
Error          222   269.64388447     1.21461209
Total          236   301.32301922
Table 36. Two-Way ANOVA Human Resource Development / Management Construct.

Source         df    SS             MS             F value   Pr > F
Model:
  Size           4     3.30666687     0.82666672    1.02     0.3975
  Type           2    36.47055806    18.23527903   22.51     0.001
  Size x Type    8     6.18997506     0.77374688    0.96     0.4721
Error          218   176.57388647     0.80997196
Total          232   227.56105150
The null hypothesis for the Educational and Operational Process Management construct was, "H06: There are no significant differences in the Educational and Operational Process Management category of the Performance Analysis for School Districts by type or size." Table 37 shows that a significant difference was found for position at .001. No significant differences at the .05 level were found for district size or the interaction. Scheffé's post hoc test for multiple comparisons was performed. At the .05 alpha level, there were no significant differences between superintendents and principals. The differences between the mean of the teachers and that of the principals, and between the mean of the teachers and that of the superintendents, were large enough to be significant at .05. Therefore, the null hypothesis is rejected.
Table 37. Two-Way ANOVA Educational / Operational Process Management Construct.

Source         df    SS             MS             F value   Pr > F
Model:
  Size           4     4.86845737     1.21711434    1.59     0.1772
  Type           2    21.22474914    10.61237457   13.89     0.001
  Size x Type    8     7.51314414     0.93914302    1.23     0.2830
Error          213   162.69105458     0.76380777
Total          227   196.23901925
The null hypothesis for the School District Results construct was, "H07: There are no significant differences in the School District Results category of the Performance Analysis for School Districts by type or size." Table 38 shows that a significant difference was found for the factor position at .001. No significant differences at the .05 level were found for the factor district size or for the interaction of the factors. Scheffé's post hoc test for multiple comparisons was performed. At the .05 alpha level, there were no significant differences between superintendents and principals. The differences between the mean of the teachers and that of the principals, and between the mean of the teachers and that of the superintendents, were large enough to be significant at .05. Therefore, the null hypothesis is rejected.
Table 38. Two-Way ANOVA School District Performance Results Construct.

Source         df    SS             MS             F value   Pr > F
Model:
  Size           4     4.02605474     1.00651369    1.37     0.2444
  Type           2    19.01940018     9.50970009   12.97     0.001
  Size x Type    8     4.73059064     0.59132383    0.81     0.5974
Error          214   156.88729215     0.73311819
Total          228   189.84039301
Analysis of "Do Not Know" Responses

Chi-square analysis was used to examine the "Do Not Know" responses (see Table 39). There were no significant differences when positions were combined and district sizes were compared on selection of the "Do Not Know" choice. When districts were combined and positions were compared, significant differences were found at the .05 level in all but the Strategic Planning category. Teachers were more likely to select the "Do Not Know" category than superintendents or principals. In the Strategic Planning category, the p value was .061, which could be considered significant if alpha were set at .10.
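A comparison like the one above reduces to a chi-square test of independence on a contingency table of response counts. The sketch below uses scipy with invented counts, purely to show the shape of the computation; the study's actual counts are not reported here.

```python
# Hypothetical sketch of the chi-square test on "Do Not Know" selections
# across positions. Counts are invented for illustration only.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: positions; columns: "Do Not Know" selected vs. not selected.
counts = np.array([
    [4, 46],    # superintendents
    [6, 64],    # principals
    [38, 82],   # teachers
])

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2={chi2:.3f}, df={dof}, p={p:.4f}")
```

A significant result here would indicate that the rate of "Do Not Know" selection depends on position, which is the pattern Table 39 reports for most constructs.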
Table 39. Chi Square Analysis for "Do Not Know" Responses.

Construct                           DF   Value    Prob
Leadership                           8   32.584   .001
Strategic Planning                   4    8.989   .061
Student Stakeholder Satisfaction     4   16.521   .002
Information & Analysis               6   25.136   .001
Human Resource                       4    9.783   .044
Educational Process                 10   47.208   .001
School District Results             10   31.830   .001
Usefulness of the Instrument as a Tool

All educator types and district sizes were combined to analyze the perceived usefulness of the instrument for self-study of a school district (see Table 40). A total of 81% of the sample found the instrument to have some use; 13% found it to have little or no use. Seventy-seven percent of the sample responded positively to the potential use of the instrument for school improvement.
Table 40. Combined Percentages for Usefulness of Instrument.

                                     Extremely   Somewhat   Little   No
As a study tool                        24.4        61.8      13.0     .4
As a tool for school improvement       25.2        52.1      12.6    5.9
Comments

Qualitative analysis was done using a constant comparative approach to identify emerging categories of focus in the respondents' written comments. Two categories of comment emerged:

1. Usefulness of the instrument in the school improvement process.

2. Perceptions of the current climate and approach to school improvement.

For the first category, respondents who felt the instrument had potential utility commented on its thoroughness, scope, and organization. Some suggested it could help focus attention on specific components of the organization and could prompt reflective thought about school improvement. Some indicated that it could serve as a "rubric" for analysis, while others indicated it could show "growth" and development in school improvement. Those who felt the instrument had little or no use commented that it was too long, complicated, and time consuming. Several comments concerned the use of "jargon." Two respondents suggested that, because each district is different, a single instrument would not be appropriate. Responses from participants in the smaller districts indicated that this approach would not be necessary in their district.
For the second category, dealing with perceptions of the current climate and approach to school improvement, the majority of comments concerned respondents' opinions about school improvement and how it is designed in their district. Several comments, mostly by teachers, communicated a lack of confidence that their opinions would be seriously considered in decision making or that their involvement would be desired by the administration. Teachers were also more likely to comment on their lack of access to the information needed to answer some of the items on the instrument. Several comments reflected frustration with school change and the lack of resources to do anything about it. Teachers also commented that they were unaware of how school improvement efforts and decisions were currently made.
Summary

The instrument was determined to be a reliable tool with respect to internal consistency. The results of this study found a significant difference between the perceptions of teachers and those of superintendents and principals on all seven constructs of a school district organization. No significant differences were found between superintendents and principals or among the five sizes of school districts in Idaho. No significant interaction between the factors of position and size of district was found. The majority of respondents felt the instrument could be useful for self-study of their school district.
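The internal-consistency reliability referred to above is conventionally estimated with Cronbach's alpha. The sketch below shows the computation on simulated item responses; the function name and data are illustrative, not taken from the study.

```python
# Minimal sketch of Cronbach's alpha for internal-consistency reliability.
# Data are simulated: 200 respondents answering 8 correlated items.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(2)
latent = rng.normal(0, 1, (200, 1))            # shared trait driving all items
scores = latent + rng.normal(0, 0.7, (200, 8))  # 8 items with correlated scores
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

Alpha rises as items covary more strongly, which is why a construct with weaker inter-item correlation, such as the School District Results construct discussed below, yields a lower coefficient.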
Chapter 5
Summary, Conclusions, and Recommendations
Summary
The evolution of school reform calls for an increased emphasis on a clear sense of focus, clear expectations from stakeholders, analysis of the root causes of problems, and the realization that the call for increased evidence is not subsiding. School reform suffers from too much scatter and from the absence of a framework for defining problems, implementing solutions, and measuring the effectiveness of programs (Consortium on Productivity in Schools, 1995). The lessons learned from reform to date indicate that poorly defined problems, the lack of a process for improvement, the absence of measurement, and the intractability of the educational system are significant factors in the lack of impact (Bernhardt, 1994; Deming, 1994; English & Hill, 1994; Fields, 1994; Fullan, 1992; Fullan, 1993; Fullan, 1997; Fullan & Miles, 1992; Sagor, 1995; Scholtes, 1995; Senge, 1990). The emerging emphasis in improving the performance of organizations is on adopting a systems model. Increasingly, employees and consumers are looking for the interconnectedness of work function, process, and productivity (Senge, 1990).
Continuous improvement is not a new approach for some businesses or for service agencies such as hospitals. Quality theory and practice have gone through an evolution, much like school reform. Early stages involved costly and inefficient inspection-based practices (Wu, 1996). Quality control systems emerged that identified defects upon inspection. As quality theory and application developed, the focus shifted to prevention. This shift led to an emphasis on proactive assessment of the organization in order to improve management systems, service, and product quality. What emerged was a need to learn from the work organization and to understand the problem before a solution was prescribed.
The Malcolm Baldrige National Quality Award was originally developed to motivate businesses to excel in productivity and operations, as well as to recognize their performance. Award winners were obligated to share their lessons with other companies that wished to achieve the same high standard of performance. An unanticipated result of the award process was the development of criteria for organizational self-study of performance for the purpose of improvement (Caravatta, 1997).
Models of school district performance have traditionally been inspection based and have reflected a minimal standard of compliance. Effective models for demonstrating performance were needed, with clearly defined indicators of excellence that reflected the comprehensive scope of the school district as an interconnected system. This study examined the current literature on organizational improvement and the traditional approaches used by schools to assess their effectiveness, and proposed a model for improvement grounded in quality theory. The research also investigated the perceptions of Idaho educators about the performance of their school districts, using an instrument adapted from the Malcolm Baldrige National Quality Award and based on a systems model and quality theory.
Conclusions
The instrument developed, the Performance Analysis for School Districts, was found to be reliable and to demonstrate internal consistency. The study examined perceptual differences in current performance across all seven areas, and the influence of two variables, educator position and size of district, was explored. Hypotheses for the categories of Leadership, Strategic Planning, Student and Stakeholder Satisfaction, Information and Analysis, Human Resource Development, Educational Process Management, and School District Results were tested using a factorial two-way analysis of variance. Each hypothesis was rejected based on the significance of one of the main effects. A significant difference was found in the responses of teachers when compared to both superintendents and principals in each of the seven areas. No significant difference was found between the responses of superintendents and principals. The size of the district was not found to be a factor in responses in any category. There was no significant interaction between size of district and educator position.
Teachers selected the "Do Not Know" response at a significantly higher rate. The majority of responses regarding the potential usefulness of the instrument for organizational analysis and school improvement were positive. However, comments made by respondents suggested that its length, complexity, and language might be obstacles. A lack of knowledge about the scope of district operations was apparent on the part of respondents, who commented on it consistently. Teacher comments also indicated doubts that administrators sincerely desired their involvement or opinions.
The Performance Analysis for School Districts is a reliable tool for consistently assessing perceptions of the current situation in each construct examined. The reliability of the School District Results construct could be improved, but findings were still positive. Although the response rate was only 36%, the total number of responses was large enough to support the power of the statistical analysis. The use of a stratified random sample increased the generalizability of the findings to the larger population of Idaho educators. The findings measured only perceptions of current performance and did not account for varying levels of prior knowledge of quality or organizational performance. The differences in the perceptions of teachers and administrators were statistically highly significant. The instrument has the potential to be useful in discovering differences in perception among various components of operation and performance. Teachers clearly do not have comprehensive knowledge of the system in which they work. Administrators consistently perceived the current situation to be better than the teachers did. Comments suggested that educators desire a simple, quick approach to assessing school district performance and generally feel cynical about their role in effecting school improvement.
The findings of this study are of practical significance for leaders and teachers as school improvement strategies and management systems are designed. The need for a model to serve as a framework for improving school district performance is apparent. Figure 2 suggests such a framework for improving school productivity, based on the theory of quality within a systems perspective.
Recommendations

The Malcolm Baldrige National Quality Award takes into account the systems, multiple-constituencies, and goal-attainment models of organizational effectiveness described in this study. Increasingly, studies are examining the applications of these models in a variety of settings (Danne, 1992; Fritz, 1993; Miller, 1993; Smith, 1995; Thompson, 1996; Wu, 1996). While most studies measured perception and opinion, they often neglected to examine objective measures of actual improvement in organizations that have applied a self-study process, quality applications, and a systems approach.
[Figure 2. A Quality Systems Model for Performance Improvement. Components: Leadership; Strategic Planning; Information & Analysis; Human Resource Development & Management; Process Management; Stakeholder and Student Expectations and Satisfaction; Results; all centered on the K-12 Educational System.]
Continued research is needed to study the application of this approach to improving organizational performance and the outcomes achieved as a result. Consideration might be given to a causal-comparative design for such investigations.
The findings related to differences in the perceptions of educators should be further investigated; the relationship between how administrators lead and teachers' perceptions of the system as a whole is also an important issue. Exemplary companies that have been recognized as benchmarks for human resource practices could be studied to determine possible applications for schools of the 21st century. Studies could also explore how teacher performance is affected by teacher attitudes and levels of pride in the organization. Larger samples selected from other states could further increase the generalizability of the findings. Strategies other than the mail survey approach should also be considered in order to simulate actual use of the instrument.
The instrument should be applied in one or more school districts to determine its potential usefulness as a starting point for organizational improvement. A case study model that simulates the MBNQA preparation and review process may be an effective research design. Case studies providing in-depth analysis could contribute significantly to the body of knowledge about using quality approaches in schools.
The necessity for an agreed-upon framework and criteria to measure productivity and accountability is apparent. Fullan (1980) suggested that, "because it is so difficult to evaluate the effectiveness of organizations with anarchistic characteristics, managers tend to seek simple, uncomplicated indicators to justify their effectiveness." Decision theorists have found that when complex and ambiguous situations are encountered, overly simplistic decisions are applied, and overly simplistic indicators are frequently relied on. The results of this study and the comments made by the respondents support this notion.
References
Anderson, G., Herr, K. & Nihlen, A. (1994). Studying your own school. Thousand Oaks, CA: Corwin Press, Inc.

Bass, B. M. (1952). Ultimate criteria of organizational worth. Personnel Psychology, 5, 57-73.

Bemowski, K. (1996). Baldrige Award celebrates its 10th birthday with a new look. Quality Progress, 29, 49-54.

Bemowski, K. & Stratton, B. (1995). How do people use the Baldrige Award criteria? Quality Progress, 28, 43-47.

Bennis, W. (1976). The planning of change (3rd ed.). New York: Holt, Rinehart and Winston.

Bernhardt, V. (1994). The school portfolio. Princeton, NJ: Eye on Education.

Bonstingl, J. (1996). Schools of quality. Alexandria, VA: Association for Supervision and Curriculum Development.

Bracey, G. (1997). Setting the record straight. Alexandria, VA: Association for Supervision and Curriculum Development.

Bradley, L. (1993). Total quality management for schools. Lancaster, PA: Technomic Publishing.

Brassard, A. (1993). Conceptions of organizational effectiveness revisited. The Alberta Journal of Educational Research, 39, 143-62.

Berman, McLaughlin. (1977).

Brown, M. (1994). Baldrige Award Winning Quality. Milwaukee, WI: ASQC Quality Press.

Bushweller, K., Ed. (1996). Education vital signs. The American School Board, 183, A1-A31.

Cameron, K. Critical questions in assessing organizational effectiveness. Organizational Dynamics, 9, 66-80.

Campbell, J. P., Brownas, E. A., Peterson, N. G. & Dunnette, M. D. (1974). The measurement of organizational effectiveness: A review of relevant research and opinion. San Diego: Naval Personnel Research Center.
Caravatta, M. (1997). Conducting an organizational self-assessment using the Baldrige Award criteria. Quality Progress, 30, 87-91.

Chappell, R. (1993). Effects of the implementation of total quality management on the Rappahannock County, Virginia public schools. (Doctoral dissertation, Virginia Polytechnic Institute and State University, 1993). Dissertation Abstracts International.

Collins, B. & Huge, E. (1993). Management by policy. Milwaukee, WI: ASQC Quality Press.

Consortium on Productivity in Schools. (1995). Using what we have to get the schools we need. New York: Institute on Education and the Economy.

Council on Competitiveness. (1995). Building on the Baldrige: American quality for the 21st century. Washington, DC: Council on Competitiveness.

Crosby, P. (1984). Quality without tears. New York: Plume Press.

Crosby, P. & Reimann, C. (1991). Criticism and support for the Baldrige Award. Quality Progress, May, 41-44.

Danne, D. J. (1991). Total quality management and its implications for secondary education. (Doctoral dissertation, Pepperdine University, 1991). Dissertation Abstracts International.

Deer, C. (1976). 'O.D.' won't work in schools. Education and Urban Society, 8, 227.

Deming, W. E. (1986). Out of the crisis. Cambridge, MA: Massachusetts Institute of Technology, Center for Advanced Engineering Study.

Deming, W. E. (1989). Foundation for management of quality in the western world. Paper presented at the meeting of the Institute of Management Science, Osaka, Japan.

Deming, W. E. (1994). The new economics for industry, government and education. Cambridge, MA: Massachusetts Institute of Technology, Center for Advanced Engineering Study.

DeMont, B. & DeMont, R. (1973). A practical approach to accountability. Education Technology, 40-45.

Dubin, R. (1976). Organizational effectiveness: Some dilemmas of perspective. Organizational and Administrative Sciences, 7, 7-14.

Edmonds, R. R. (1979). Some schools work and more can. Social Policy, 9, 28-32.
Edmonds, R. (1980). Effective schools for the urban poor. Educational Leadership, 40, 15-23.

Eisenberg, E. & Goodall, H. (1993). Organizational communication. New York: St. Martin's Press.

Elam, S., Rose, L., & Gallup, A. (1996). The 28th annual Phi Delta Kappa/Gallup poll of the public's attitudes toward the public schools. Phi Delta Kappan, 78, 41-59.

English, F. (1988). Curriculum auditing. Lancaster, PA: Technomic Publishing Co.

English, F. & Hill, J. (1994). Total quality education. Thousand Oaks, CA: Corwin Press.

Etzioni, A. (1960). Two approaches to organizational analysis: A critique and a suggestion. Administrative Science Journal, 5, 257-278.

Feigenbaum, A. V. (1983). Total quality control. New York: McGraw-Hill.

Fields, J. (1994). Total quality for schools: a guide for implementation. Milwaukee, WI: ASQC Quality Press.

Fritz, S. (1993). A quality assessment using the Baldrige criteria: Non-academic service units in a large university. (Doctoral dissertation, University of Nebraska-Lincoln). Dissertation Abstracts International.

Fullan, M. (1980). Organizational development in schools: The state of the art. Review of Educational Research, 50, 121-183.

Fullan, M. & Miles, M. (1992). Getting reform right: What works, what doesn't. Phi Delta Kappan, 73, 744-52.

Fullan, M. (1993). Innovation, reform, and restructuring strategies. In Cawelti, G. (Ed.), Challenges and achievements of American education: innovation, reform, and restructuring strategies (pp. 116-133). Alexandria, VA: Association for Supervision and Curriculum Development.

Fullan, M. (1997). Emotion and hope: Constructive concepts for complex times. In Hargreaves, A. (Ed.), Rethinking educational change with heart and mind (pp. 216-233). Alexandria, VA: Association for Supervision and Curriculum Development.

Gall, M., Borg, W. & Gall, J. (1996). Educational Research: An Introduction (6th ed.). White Plains, NY: Longman.

Garvin, D. (1988). Managing quality: The strategic and competitive advantage. New York: Free Press.
Georgopoulos, B. & Tannenbaum, A. (1957). The study of organizational effectiveness. American Sociological Review, 22, 534-550.

Gerstner, L., Semerad, R., Doyle, D. & Johnston, W. (1995). Reinventing education: entrepreneurship in America's public schools. New York: Plume Press.

Glasser, W. (1992). The quality school. New York: Harper Collins.

Goodlad, J. (1984). A place called school. New York: McGraw-Hill.

Green, D. (1996). A case for the Koalaty Kid. Quality Progress, 29, 97-99.

Guba, E. (1981). Investigative journalism. In Smith, N. L. (Ed.), New techniques for evaluation. Beverly Hills, CA: Sage Publications.

Hannan, M. & Freeman, J. (1977). Obstacles to the comparative study of organizational effectiveness. In Goodman, P. & Pennings, J. (Eds.), New perspectives on organizational effectiveness. San Francisco: Jossey-Bass.

Hargreaves, A. (1997). Rethinking educational change: going deeper and wider in the quest for success. In Hargreaves, A. (Ed.), Rethinking educational change with heart and mind (pp. 1-26). Alexandria, VA: Association for Supervision and Curriculum Development.

Hersey, P. & Blanchard, K. (1982). Management of organizational behavior: Utilizing human resources. Englewood Cliffs, NJ: Prentice-Hall.

Hodgkinson, H. (1996). Why have Americans never admired their own schools? The School Administrator, 53, 18-22.

Hoy, W. & Ferguson, J. (1985). A theoretical framework and exploration of organizational effectiveness of schools. Educational Administration Quarterly, 21, 117-134.

Houston, P. (1996). From Horace Mann to the contrarians. The School Administrator, 53, 10-13.

Huelskamp, R. (1993). Perspectives on education in America. Phi Delta Kappan, 74, 718-721.

Idaho State Department of Education. (1996). Accreditation Standards and Procedures For Idaho Schools. Boise: Author.

Idaho State Department of Education. (1996). Idaho School Profiles, 1995-96. Boise: Author.
Imai, M. (1986). Kaizen: The key to Japan's competitive success. New York: McGraw-Hill Publishing.

Jaeger, R. & Hattie, J. (1996). Artifact and artifice in education policy analysis: It's not all in the data. The School Administrator, 53, 24-29.

Joint Committee on Standards For Educational Evaluation. (1994). The program evaluation standards (2nd ed.). Thousand Oaks, CA: Sage Publications.

Juran, J. (1988). Juran on planning for quality. Cambridge, MA: Productivity Press.

Kamen, H. (1993). A study of the impact of the curriculum audit process in three school systems. (Doctoral dissertation, University of Cincinnati, 1993). Dissertation Abstracts International.

Katz, D. & Kahn, R. (1978). The social psychology of organizations (2nd ed.). New York: Wiley.

Kearns, D. & Doyle, D. (1988). Winning the brain race. San Francisco, CA: Institute for Contemporary Studies Press.

Kennedy, M. (1995). An analysis and comparison of school improvement planning models. (Doctoral dissertation, University of Central Florida, 1995). Dissertation Abstracts International.

Klaus, L. (1996). Quality Progress sixth quality in education listing. Quality Progress, 29, 29-45.

Krejcie, R. & Morgan, D. (1970). Determining sample size for research activities. Educational and Psychological Measurement, 30, 607-610.

Langford, D. & Cleary, B. (1995). Orchestrating learning with quality. Milwaukee, WI: ASQC Quality Press.

Levine, R. & Fitzgerald, H. (1992). Living systems, dynamical systems, and cybernetics. In Levine, R. & Fitzgerald, H. (Eds.), Analysis of Dynamic Psychological Systems, Volume I. New York: Plenum Press.

Lezotte, L. (1989). Base school improvement on what we know about effective schools. The American School Board Journal, 176, 18-20.

Lezotte, L. W. & Bancroft, B. A. (1985). School improvement based on effective schools research: A promising approach for economically disadvantaged and minority students. Journal of Negro Education, 54, 301-312.
MacLellan, D. (1994). Towards a new approach for school system evaluation. (Doctoral
dissertation, Dalhousie University, Nova Scotia). Dissertational Abstracts
International.
Mann, D. (1976/ Policy decision-making in education: An introduction to calculation and
control. New York: Teachers College Press.
McClanahan, E. & Wicks, C. (1994). Future force. Glendale, CA: Griffen Publishing.
Miller, S. (1993). The applicability of the Malcolm Baldrige National Quality Award
criteria to assessing the quality of student affairs in colleges. (Doctoral dissertation,
Ohio University, Athens, 1993). Dissertation Abstracts International.
National Institute of Standards and Technology. (1994). Malcolm Baldrige National Quality
Award. Gaithersburg, MD: United States Department of Commerce and Technology
Administration.
National Institute of Standards and Technology. (1995). Malcolm Baldrige National Quality
Award education pilot. Gaithersburg, MD: United States Department of Commerce
and Technology Administration.
Nicolis, G. & Prigogine, I. (1977). Self-organization in nonequilibrium systems: From
dissipative structures to order through fluctuations. New York: Wiley.
Northwest Association of Schools and Colleges. (1996). Setting World Standards For
Accreditation.
Nowakowski, J., Bunda, M., Working, B. & Harrington, P. (1985). A handbook of
educational variables. Boston, MA: Kluwer-Nijhoff.
O’Neil, J. (1995). On schools as learning organizations: A conversation with Peter
Senge. Educational Leadership, 52, 20-23.
Owens, R. (1970). Organizational behavior in schools. Englewood Cliffs, NJ: Prentice-
Hall.
Pannirselvam, G. (1995). Statistical validation of the Malcolm Baldrige National Quality
Award model and evaluation process. Unpublished doctoral dissertation, Arizona
State University, Tempe.
Partin, J. (1992). A measurement of total quality management in the two-year college
districts of Texas. (Doctoral dissertation, East Texas State University). Dissertation
Abstracts International.
Patterson, J. (1993). Leadership for tomorrow’s schools. Alexandria, VA: Association for
Supervision and Curriculum Development.
Patterson, J., Purkey, S. & Parker, J. (1986). Productive school systems for a nonrational
world. Alexandria, VA: Association for Supervision and Curriculum Development.
Patton, M. (1983). Qualitative evaluation methods. Beverly Hills, CA: Sage Publications.
Pines, E. (1990). From top secret to top priority: The story of TQM. Aviation Week &
Space Technology, S5-S24.
Portner, J. (1997, March). Once a status symbol for schools, accreditation becomes rote
drill. Education Week, 16(1), 30-31.
Provus, M. (1971). Discrepancy evaluation for educational program improvement and
assessment. Berkeley, CA: McCutchan.
Purkey, S. & Smith, M. (1982). Too soon to cheer?: Synthesis of research on effective
schools. Educational Leadership, 43, 64-69.
Regeuld, M. (1993). A study of continuous improvement processes based on total quality
management principles as applied to the educational environment. (Doctoral
dissertation, Pennsylvania State University). Dissertation Abstracts International.
Rhodes, L. (1990). Beyond your beliefs: Quantum leaps toward quality schools. The School
Administrator, 47, 23-26.
Rotberg, I. (1996). Five myths about test score comparisons. The School Administrator,
53, 30-35.
Rubin, S. (1994). Public schools should learn to ski. Milwaukee, WI: ASQC Press.
Sagor, R. (1995). Overcoming the one-solution syndrome. Educational Leadership, 52, 24-
27.
Sarason, S. (1990). The predictable failure of educational reform. San Francisco, CA:
Jossey-Bass Publishers.
Schmoker, M. (1996). Results: The key to continuous improvement. Alexandria, VA: ASCD.
Scholtes, P. (1993, February). Quality Learning Series. Paper presented by the Quality
Learning Services Division of the U.S. Chamber of Commerce. Madison, WI: Joiner
Associates.
Scriven, M. (1973). Goal-free evaluation. In House, E. R. (Ed.), School evaluation: The
politics and process. Berkeley, CA: McCutchan.
Siegel, P. & Byrne, S. (1994). Using quality to redesign school systems. San Francisco,
CA: Jossey-Bass Publishers.
Senge, P. (1990). The fifth discipline. New York: Currency Doubleday.
Senge, P., Kleiner, A., Roberts, C., Ross, R. & Smith, B. (1994). The fifth discipline
fieldbook. New York: Doubleday.
Sergiovanni, T. (1992). Moral leadership. San Francisco, CA: Jossey-Bass Publishers.
Shipley, J. & Collins, C. (1997). Going to scale with TQM. Tallahassee, FL: Southeastern
Regional Vision for Education.
Skrtic, T. (1991). Behind special education. Denver, CO: Love Publishing.
Smith, R. (1996). The impact o f training based on the Malcolm Baldrige National Quality
Award on employee perceptions. Unpublished doctoral dissertation, University of
Idaho, Moscow.
Stake, R. (1967). Toward a technology for the evaluation of educational programs. In
Tyler, R., Gagne, R. & Scriven, M. (Eds.), Perspectives of curriculum evaluation
(pp. 1-12). Chicago: Rand McNally.
Stampen, J. (1987). Improving the quality of education: W. Edwards Deming and effective
schools. Contemporary Education Review, 3, 423-433.
Stefanich, G. (1983). The relationship of effective schools research to school evaluation.
North Central Association Quarterly, 88, 343-349.
Stufflebeam, D. (1983). The CIPP model for program evaluation. In Madaus, G., Scriven,
M. & Stufflebeam, D. (Eds.), Evaluation models: Viewpoints in educational and
human services evaluation. Boston, MA: Kluwer-Nijhoff Publishing.
Thomas, W. & Moran, K. (1992). Reconsidering the power of the superintendent in the
progressive period. American Educational Research Journal, 29, 22-50.
Timpane, M. & Reich, R. (1997). Revitalizing the ecosystem for youth. Phi Delta Kappan,
464-470.
Tribus, M. (n.d.). The transformation of American education to a system for continuously
improved learning. Hayward, CA: Exergy, Inc.
Vertiz, V. (1995). What the curriculum audit reveals about schools. The School
Administrator, 25-27.
Vertiz, V. & Bates, G. (1995, April). The Curriculum Audit: Revelations About Our
Schools. Paper presented at the meeting o f the American Educational Research
Association, Division B.
Wagner, T. (1996). Bringing school reform back down to Earth. Phi Delta Kappan, 78,
145-149.
Wagner, T. (1993). Systemic change: Rethinking the purpose of school. Educational
Leadership, 51, 24-28.
Walton, M. (1990). Deming management at work. New York: Putnam Publishing Co.
Weick, K. (1976). Educational organizations as loosely coupled systems. Administrative
Science Quarterly, 21, 1-19.
Wiersma, W. (1995). Research methods in education. Needham Heights, MA: Allyn and
Bacon.
Worthen, B. & Sanders, J. (1987). Educational evaluation. White Plains, NY: Longman.
Wu, Hung-Yi. (1996). Development of a self-evaluation system for total quality
management using the Baldrige criteria. Unpublished doctoral dissertation,
University of Missouri, Rolla.
Yuchtman, E. & Seashore, S. (1967). A system resource approach to organizational
effectiveness. American Sociological Review, 32, 891-903.
Zammuto, R. (1982). Assessing organizational effectiveness. Albany, NY: State University
o f New York Press.
Appendix A: Instrument
1.0 Leadership
This category examines the personal leadership and involvement of the school district’s leaders (defined as the board of
trustees and senior administrators at the district level) in creating and sustaining a student focus, clear goals, high
expectations, and a leadership system that promotes performance excellence.
Please select 1 from the five choices in each box which most closely describes your school district in that area. Fill in the circle
completely next to the statement selected with pencil or black ink.
Item # Description
1.1 The extent to which
there is clear direction
throughout the district
Definitions:
*Stakeholder- Individuals or
groups, both internal to the
school (students, all
personnel) and external
(parents, community
members, business) which
are affected by the
conditions and quality of
education and the
preparedness of graduates.
1.1.1 A clearly communicated and consistent direction of the district focus based on stakeholder* needs and expectations does not exist.
1.1.2 A clearly communicated and consistent direction of the district focus based on stakeholder* needs and expectations exists. It is not widely disseminated or used by personnel.
1.1.3 A clearly communicated and consistent direction of the district focus based on stakeholder* needs and expectations exists and is known throughout the district. It does not appear to be used consistently in district decision-making.
1.1.4 A clearly communicated and consistent direction of the district focus based on stakeholder* needs and expectations exists and appears to be considered to some degree when making planning decisions. This information is communicated to parents.
1.1.5 A clear direction of district focus is broadly communicated throughout the community. It guides all major decisions throughout the entire system. The district direction is systematically re-evaluated through an improvement cycle, involving multiple stakeholders* and sources of information.
1.1.6 I do not know.
1.2 The extent to which a
review process exists to
study the performance**
of the district
Definitions:
*Stakeholder- Individuals or
groups, both internal to the
school (students, all
personnel) and external
(parents, community
members, business) which
are affected by the
conditions and quality of
education and the
preparedness of graduates.
**Performance refers to the
results produced by the
school district as illustrated
by multiple sources of
information.
1.2.1 A review process is done to meet state accreditation standards as often as required. There does not appear to be any other structured review process to determine how the school district is meeting or exceeding the needs and expectations of its stakeholders*.
1.2.2 Leaders study performance** information on an annual basis, using standardized test scores, attendance, enrollment and financial information.
1.2.3 A systematic process is being developed district-wide to establish multiple reliable and valid indicators of student and district performance**.
1.2.4 A study process exists to regularly assess the school district’s performance based on multiple sources of information. All personnel regularly assess the performance of their programs, which is aligned to the district study process. This information is used to create district direction and areas for improvement.
1.2.5 A systematic process for review of the district’s performance exists which exceeds the requirements for state accreditation. Multiple stakeholders* are involved. Decisions are based on multiple indicators of performance information, community demographics, and forecasting of future needs. The review process is systematically evaluated and improved as necessary.
1.2.6 I do not know.
1.3 Leadership’s role in
improvement efforts
Definition:
*Stakeholder- Individuals or
groups, both internal to the
school (students, personnel)
1.3.1 Leaders** initiate few district-wide improvements. Most improvements are initiated at the building level.
1.3.2 Leaders** initiate district-wide improvements and allocate existing resources to support those improvements.
and external (parents, community members, business) which are affected by the conditions and quality of education or the preparedness of graduates.
**Leaders defined as Board of Trustees and senior administrators at the district level.
1.3.3 Leaders** personally encourage and advocate for school or district-wide program improvements that are consistent with district directions.
1.3.4 Leaders** are committed to continuous improvement of district performance based on a thorough understanding of the needs and expectations of stakeholders*. District resources are allocated to accomplish the targeted improvement areas.
1.3.5 Leaders** are visibly involved and facilitate improvements throughout the system. Leaders** monitor the system for progress and view themselves as accountable for the performance of the district. There is a systematic review of the role of leadership in improvement efforts.
1.3.6 I do not know.
1.4 The extent to which
there exists a district-
wide collaborative and
participatory approach to
management*
Definitions:
*Defined as jointly working
to identify problems and
determining improvements
with others in the
organization who are
knowledgeable, involved and
affected by any decisions
made.
1.4.1 There appears to be little collaboration or substantive involvement of appropriate personnel in decision-making. Important decisions are made at a district level.
1.4.2 There are vehicles in school district units** to make some decisions. District-level decision making and approval is still the primary vehicle for significant decisions.
1.4.3 Leaders*** value participatory decision-making and are actively working to improve and develop these processes in all school district units**.
1.4.4 Effective processes to involve personnel, parents and community stakeholders are in place in all school district units**. Decision-making responsibilities and expectations are clearly defined for the school district unit**.
**School district units are defined as specific schools, departments, or services of that school district.
***Leaders defined as Board of Trustees and senior administrators at the district level.
1.4.5 Collaborative and participatory management is institutionalized, valued and a part of the culture. All personnel of the school community and significant stakeholders are included. Collaborative processes are systematically evaluated to determine effectiveness and improvements are made as necessary.
1.5. Board policy
Definitions:
*Stakeholder- Individuals or
groups, both internal to the
school (students, personnel)
and external (parents,
community members,
business) which are affected
by the conditions and quality
of education or the
preparedness of graduates
1.5.1 Board policies are developed by the Board of Trustees with minimal involvement of stakeholders* and reviewed as necessary.
1.5.2 There is some involvement of stakeholders* in the development and review of Board policies.
1.5.3 The district is developing a systematic process to review and design board policy involving stakeholders* to support the district direction and strategic plan.
1.5.4 Board policies have been revised to support the district direction and strategic plan. Policies support the delivery of curriculum and instruction. Processes exist to communicate this information to personnel who implement policy.
1.5.5 A systematic review process of board policy exists to ensure alignment to district direction and strategic plan. A review process of district performance is aligned to strategic planning processes. Board policies are clearly communicated to all personnel.
1.5.6 I do not know.
1.6. School District
Responsibility to the
Public
Definition:
*Leaders defined as Board
of Trustees and senior
administrators at the district
level
1.6.1 No defined processes exist to anticipate public concerns or expectations.
1.6.2 Some activities have been undertaken to involve the community, but they are usually reactive to an expressed public concern. There is some involvement of the leadership in activities to strengthen and support their communities.
1.6.3 Leaders* recognize the importance of anticipating public needs and expectations and are developing processes to do so. Leaders* are actively involved in the community.
1.6.4 Leaders* invite stakeholders into operations of the district and are actively involved in community groups to solve community problems.
1.6.5 The district serves as a role model of outreach and public service to the community
through effective processes to anticipate the public’s interests and needs. The district
systematically evaluates its own effectiveness in involving the public and being involved in
the community.
1.6.6 I do not know.
1.7 State, Legal and Ethical
Conduct in Operations
1.7.1 Citations for non-compliance with federal and state regulations have frequently been found. Expectations for ethical conduct are not clarified.
Definition:
*School district units are
defined as specific schools,
departments, or services of
that school district.
1.7.2 Citations for non-compliance are being addressed. Ethical conduct is demonstrated by leadership.
1.7.3 Efforts are made to achieve compliance throughout all school district units*.
1.7.4 Leaders demonstrate high commitment to legal and ethical conduct in all aspects of the school district through existing policies and practices. Prevention processes are in place to ensure performance above minimal requirements.
1.7.5 The school district exceeds minimal requirements for compliance with federal and state regulations. Leaders are role models for ethical conduct in all activities within the district and the community. There is a systematic process in place to review policies and practices related to ethical standards and compliance with legal requirements.
1.7.6 I do not know.
2.0 Strategic Planning
This category examines how the school district sets strategic directions and how it determines
stakeholders* expectations of the school district. Please consider how these stakeholder expectations and requirements are translated
into an effective performance management system, with a primary focus on student performance.
(*Stakeholder- Individuals or groups, both internal to the school (students, personnel) and external (parents, community members,
business) which are affected by the conditions and quality of education or the preparedness of graduates; also includes state
agencies and their requirements)
Please select 1 from the five choices in each box which most closely describes your school district in that area. Fill in the circle completely next to the statement selected with pencil or black ink.
Item # Description
2.1 Strategic Development*

Definitions:
*Defined as the process by which members of an organization clarify the purpose and develop the necessary procedures and operations to achieve that purpose. A strategic plan is designed.

2.1.1 No school district improvement plan exists. There appears to be little organized effort to examine performance of the school district, including student performance and district operations.
2.1.2 The school district does have some goals and objectives, usually associated with state or federal requirements. Traditional indicators including standardized test scores, compliance reviews and accreditation ratings are used to determine district improvements.
2.1.3 District is developing a more comprehensive approach to strategic planning by examining needs of students, analyzing the current performance of students, and focusing on the needs and expectations of stakeholders through the collaborative involvement of all school district units** and the community.
2.1.4 District considers extensive sources of information both internal to the district and external, such as community demographic information. Representatives from a variety of stakeholder groups participate in strategic planning. Clear long and short term goals and objectives exist. Measures are defined, with clearly specified timelines and responsibility. School district units** develop plans consistent with the district strategic plan.
**School district units are defined as specific schools, departments, or services of that school district.

2.1.5 District has specific short and long term goals and objectives in place, established as a result of a comprehensive assessment of student performance and district operations, stakeholder needs and expectations through the collaboration of stakeholders. All aspects of the school district are examined to support the implementation and accomplishment of the goals and objectives. School district units** have plans aligned to the district strategic plan. A systematic process is in place to review the strategic plan to make necessary improvements.
2.1.6 I do not know.
2.2 Focus of strategic
plan
Definition:
*School district units
are defined as specific
schools, departments,
or services of that
school district.
2.2.1 There appears to be a fragmented focus on improved student performance in the district.
2.2.2 Emphasis exists on improving student performance on standardized and state tests.
2.2.3 Improved student performance is the focus of the strategic plan. Improvements are being identified throughout school district units* to support high student performance.
2.2.4 District plan reflects integrated efforts of involved school district units* to support and accomplish improved performance for all students.
2.2.5 District focus on improved student performance is evident throughout the school district strategic plan. The district regularly assesses its plan for its focus on improved student performance.
2.2.6 I do not know.
2.3 Implementation
and assessment of
strategic plans
2.3.1 District strategic plan may exist but is not disseminated and reviewed with all school district units*. Responsibility for implementation is unclear.
2.3.2 District plan is disseminated to all personnel in the school district. Responsibility for implementation is noted. Processes to accomplish the goals are unclear, with no evaluation measures specified.
Definition:
*School district units are defined as specific schools, departments, or services of that school district.
2.3.3 District strategic plan is disseminated and reviewed with all school district units* and with all personnel throughout the school district. Plan is published and disseminated to the community.
2.3.4 A clear implementation process exists, with responsibilities and timelines clearly delineated. Work teams, involving personnel in the appropriate school district unit* and external stakeholders, are established. Evaluation measures are specified. Monitoring processes support successful implementation.
2.3.5 Implementation of district strategic plan is clear, and aligned throughout all school district
units*. District and building plans are implemented and progress is assessed continually
throughout the district. A systematic process exists to evaluate the implementation of strategic
plans throughout the school district for continual improvement.
2.3.6 I do not know.
3.0 Student Focus and Satisfaction/Stakeholder Satisfaction
This category examines how the school district determines student and stakeholder* needs and expectations. Please consider how the school district enhances stakeholder relationships and determines their satisfaction.
*(Stakeholder- Individuals or groups, both internal to the school (students, personnel) and external (parents, community members, business) which are affected by the conditions and quality of education or the preparedness of graduates; also includes state agencies and their requirements)
Please select 1 from the five choices in each box which most closely describes your school district in that area. Fill in the
circle completely next to the statement selected with pencil or black ink.
Item # Description
3.1 The ways in which the
school district determines
students’ needs from and
expectations of educational
programs
Definition:
*School district units are
defined as specific schools,
departments, or services of
that school district.
3.1.1 Standardized test scores are the primary source of information to determine student needs. Little or no analysis of the data is done or disseminated to appropriate school district units*. Students are not asked in any formal way about their needs or expectations.
3.1.2 Besides state test scores, the district uses other sources of locally generated and collected information to determine student needs and expectations. Some analysis is done as determined by the individual school district unit*.
3.1.3 District is developing comprehensive strategies to determine all student needs.
3.1.4 District is in early stages of using multiple strategies, both inside and outside the school district, to determine students’ comprehensive needs throughout all grade levels and at specific times. Information is used in the district’s strategic planning process.
3.1.5 District has a well developed and extensive system of determining student needs and expectations. The emphasis is on prevention of school failure and the analysis of multiple sources of information to determine trends. Information is analyzed, compared and used in all district decision-making processes. The process to determine needs and expectations of students is systematically evaluated.
3.1.6 I do not know.
3.2 High expectations for the
performance of students
Definitions:
*Performance expectations
refer to clearly defined
statements describing specific
academic, behavioral or social
criteria to measure
achievement, often referred to
as standards.
**Stakeholder-Individuals or
groups, both internal to the
school (students, personnel)
and external (parents,
community members, business)
which are affected by the
conditions and quality of
education or the preparedness
of graduates; includes state
agencies and requirements
3.2.1 Performance expectations* for students do not exist at any level.
3.2.2 Performance expectations* are being developed for some areas.
3.2.3 Performance expectations* exist for some grades/subject areas.
3.2.4 Performance expectations*, including exit standards for graduation, exist throughout the K-12 system in all grade levels, courses and social conduct. Strong stakeholder support exists for these criteria.
3.2.5 Performance expectations* are systematically evaluated and adjusted to reflect stakeholder** expectations.
3.2.6 I do not know.
3.3 Student and stakeholder
satisfaction
Definitions:
*School district units are
defined as specific schools,
departments, or services of
that school district.
**Satisfaction with
educational programs and
performance of school district
3.3.1 There are no processes district-wide to determine student or stakeholder satisfaction**. Anecdotal information is often used to determine student or stakeholder satisfaction**.
3.3.2 Some attempts to determine student or stakeholder satisfaction** are made by individual school district units*.
3.3.3 District is developing a process to determine student and stakeholder satisfaction**.
3.3.4 District is making deliberate and consistent efforts to utilize a variety of strategies to determine student and stakeholder satisfaction** throughout the district for all school district units*. This information is disseminated throughout the district to all school district units*.
3.3.5 District has clearly defined processes in place to measure, analyze and compare student and stakeholder satisfaction** results at specific points and in all school district units*. Information is used in the strategic planning process. Processes are systematically evaluated for effectiveness on a regular basis.
3.3.6 I do not know.
3.4 Identifying future needs
and expectations of
students and stakeholders*.
Definition:
*Stakeholder- Individuals or
groups, both internal to the
school (students, personnel)
and external (parents,
3.4.1 District focuses on immediate needs of students as they occur.
3.4.2 District considers local demographic factors and trends which affect enrollment and student needs and stakeholder* expectations.
3.4.3 District considers local, state, and national trend data and demographics to determine future needs of students and expectations of stakeholders*.
community members, business)
which are affected by the
conditions and quality of
education or the preparedness
of graduates; also includes
state agencies and their
requirements
3.4.4 District is in early stages of using demographic factors, changing state and federal requirements or trends, and changing expectations and needs of higher education and the workplace as part of the strategic planning process.
3.4.5 Multiple strategies to consider future needs/expectations of students and stakeholders* from multiple sources are systematically used in all strategic planning processes. These processes are evaluated on a regular basis to determine need for improvement. Processes used are compared to those of other organizations that have demonstrated exemplary ways to forecast student and stakeholder* needs.
3.4.6 I do not know.
4.0 Information and Analysis
This category examines the management of and effective use of data and information to drive mission-related performance excellence in a school district. Please consider how your school district collects and uses information and data to make decisions.
Please select 1 from the five choices in each box which most closely describes your school district in that area. Fill in the circle
completely next to the statement selected with pencil or black ink.
Item # Description
4.1 Selection and Use
of Information and Data
Definitions:
*Conventional
information is defined
as standardized and
state test scores,
enrollment, attendance,
dropout, discipline,
operating budget
**School district units
are defined as specific
schools, departments, or
services of that school
district.
4.1.1 Conventional information* is used primarily by district office and board in planning. This information is not widely or regularly disseminated throughout the district.
4.1.2 Conventional information* is disseminated at least annually to all school district units**. This information is used by school district units** to assess student or operational performance.
4.1.3 A process is being developed to collect, manage, and use specific information and data which are needed to meet the mission and develop key district goals which focus on improving student performance.
4.1.4 District has a systematic process in place to collect, analyze, and disseminate critical information, beyond conventional information*, to all school district units** related to district key goals and objectives. Information used comes from a variety of sources used to determine student performance, student needs and satisfaction, and stakeholder needs and satisfaction.
4.1.5 A systematic process for collecting, analyzing, and using comprehensive information and data from multiple sources is fully deployed throughout all school district units**. All personnel have easy access to the information and use it to evaluate, adjust, and create key goals for school district unit** plans. The information and data process is systematically evaluated and improvements are made as necessary.
4.1.6 I do not know.
Item # Description
4.2 Selection and Use of Comparative Information and Data

Definitions:
*Comparative data, or benchmarking, is an improvement process in which a school district compares its performance against best-in-class school districts and uses the information to improve its own performance.

**School district units are defined as specific schools, departments, or services of that school district.

4.2.1 No process is in place currently to seek or use comparative data*.
4.2.2 Some practices exist to compare conventional data*. Practice is limited to specific school district units** or personnel.
4.2.3 District is developing a process to determine needs and priorities, criteria, and use of comparative data*.
4.2.4 A systematic process exists to use comparative data* and is fully implemented in all school district units**. A process for benchmarking is used to set improvement targets and integrate best practices.
4.2.5 Comparative data* are used widely throughout all school district units** to set targets, integrate new practices, and evaluate performance. The benchmarking* process is systematically evaluated and improved as necessary.
4.2.6 I do not know.
4.3 Analysis and Use of School District Performance* Data

Definitions:
*Performance data includes data or information from all aspects of the organization including student performance measures, enrollment, discipline, human resources, business operations and community.

**Conventional information is defined as standardized and state test scores, enrollment, attendance, dropout, discipline, operating budget.

4.3.1 Analysis of conventional information** is done at the district level.
4.3.2 Analysis of conventional information** is done by the school district unit.
4.3.3 Information and data are collected, disaggregated, analyzed and disseminated district-wide. Information is used to gain understanding of student and student-group performance and school district unit performance.
4.3.4 Performance data* from all school district units are integrated and analyzed to assess overall district performance. Comparative data are used in the analysis.
4.3.5 Comprehensive performance data* are analyzed, disseminated throughout the district and accessible to all school district units. This information is an integral part of the planning process and is used to adjust and establish key objectives essential to all decisions by all school district units.
4.3.6 I do not know.
5.0 Human Resource Development and Management This category examines how administrative, faculty and support
personnel (all non-certified, business and operations staff throughout the district) are empowered to develop and utilize their full
potential in order to enable the school district to realize its mission through the accomplishment of key performance goals. Please
consider the district-driven efforts to build and maintain an environment conducive to performance excellence, full participation, and
personal and organizational growth.
Please select 1 from the five choices in each box which most closely describes your school district in that area. Fill in the circle completely next to the statement selected with pencil or black ink.
Item # Description
5.1 Learning and Working Climate

Definitions:
*Leadership is defined as district level senior administrators and board of trustees.

**School district units are defined as specific schools, departments, or services of that school district.

5.1.1 Insufficient attention is given to the learning and/or working climate by the leadership*.
5.1.2 Some attention is given to creating a high performance environment for both students and personnel, but it is usually driven by the school district unit** administrator.
5.1.3 Efforts to create a positive, productive, safe environment are evident throughout the district.
5.1.4 A strong positive, productive and safe environment is visible throughout the district. High performance is fostered among personnel. Morale is high among personnel district-wide.
5.1.5 Feedback from personnel is systematically obtained and used to guide decisions regarding the working and learning environment. This area is systematically reviewed and aligned with the strategic planning process in the district.
5.1.6 I do not know.
Item # Description
5.2 Work Systems*

Definitions:
*Work systems are defined as how jobs, work and decision-making are designed at all levels within the organization.

**School district units are defined as specific schools, departments, or services of that school district.

5.2.1 Positions and work tasks are organized within traditional positions, roles, and responsibilities, with most of the authority for decision-making with district level leadership.
5.2.2 Efforts are being made to move decision-making to teams within school district units**. There have been some decisions to de-centralize specific functions.
5.2.3 There are deliberate efforts underway to improve the district's work systems through increased opportunities for self-directed responsibility of personnel in all school units** in designing, managing, and improving the district's operations in order to accomplish the district's mission and key goals.
5.2.4 Work processes exist which allow all personnel to contribute optimally in their school unit** through self-directed teams which foster flexibility, communication among school units, and accomplishment of key goals. Labor and management have collaboratively designed personnel practices to support this.
5.2.5 Work and job functions are designed to accomplish district goals. Effective communication and collaboration exist across work functions. Processes for evaluation, compensation, promotion, and recognition are exemplary. Work system processes are systematically evaluated on a regular basis.
5.2.6 I do not know.
Item # Description
5.3 Personnel Education, Training, and Development

Definitions:
*School district units are defined as specific schools, departments, or services of that school district.

5.3.1 Opportunities for staff development are limited to in-service days. Personnel have some input into topics or design.
5.3.2 Needs assessments are conducted for faculty and support personnel. Staff development is based on the needs identified. Topics may or may not be aligned with the district's strategic plan.
5.3.3 Processes are being designed to align education and training decisions to the district's mission and key performance goals. Orientation for new personnel provides training on the district's mission, goals, and work systems.
5.3.4 A process for determining, designing, and evaluating education and training is established and implemented across all school district units*. Education and training are provided in a variety of ways. Application of knowledge and skills is expected and supported through specifically designed strategies. Reaction to training is regularly assessed and evaluated for necessary improvements.
5.3.5 Processes for education and training are aligned to strategic planning processes. Decisions for education and training are made on the basis of key performance goals and district competencies needed to achieve performance expectations. Staff development activities are measured for impact on learning, performance of staff and effect on students' performance. The design and delivery processes for staff development are evaluated on a regular basis.
5.3.6 I do not know.
Item # Description
5.4 Performance Appraisal Systems

Definition:
*Performance appraisal refers to the regular evaluation procedures of the performance of personnel.

5.4.1 The performance appraisal* process is completed on an annual basis involving the individual and the immediate supervisor. Little information is generated from the process to promote further development. Personnel find little value in the process.
5.4.2 Performance appraisals* are done annually and result in the development of specific goals established cooperatively by the individual and supervisor to promote further development. Personnel find some value in the process, but agree it could be improved.
5.4.3 Personnel are engaged in developing performance appraisal* processes to support high performance, stakeholder satisfaction, continuous improvement and collaboration between management and personnel.
5.4.4 Each personnel unit, including leadership, uses a performance appraisal* process which involves feedback from key identified stakeholders with whom they work closely. Results of the appraisal process are linked to the district's professional development plans and continual development of performance.
5.4.5 Performance appraisal* processes meet the professional needs of all personnel and provide useful feedback from identified key stakeholders. Personnel work collaboratively with management to identify areas of growth to further the district's mission. Performance appraisal* processes are systematically reviewed to identify improvement areas.
5.4.6 I do not know.
Item # Description
5.5 Employee Well-Being and Satisfaction

5.5.1 Employee motivation and satisfaction are given little attention.
5.5.2 Employee motivation and satisfaction are addressed through employee recognition events, or individual administrators' efforts.
5.5.3 Processes are being developed to determine employee needs and expectations. The work environment is being improved to accommodate needs to the extent possible.
5.5.4 Employee satisfaction, well-being, and motivation are seen as key requirements for the district to develop capabilities to realize its mission and reach its performance goals. Satisfaction surveys are administered on a regular basis. Personnel are able to identify, recommend, and make improvements. Multiple strategies for reward and recognition exist.
5.5.5 The accomplishment of performance goals is recognized as fundamental to employee satisfaction. Processes to measure employee needs and satisfaction are integrated into the strategic planning process. A variety of opportunities are available to promote well-being, satisfaction and motivation throughout the district. These processes are systematically evaluated to make improvements as necessary.
5.5.6 I do not know.
6.0 Educational and Operational Process* Management This category examines how work is designed, managed, accomplished, and evaluated for effectiveness. Please think about the extent to which all work processes are designed, managed, accomplished and evaluated with a focus on stakeholder satisfaction in all work units. Also consider how processes are designed, effectively managed, and improved to achieve better performance.
(*Process refers to the steps and sequence of steps done in a specific work activity; e.g. enrolling a new student in school; requisition of instructional supplies; hiring new personnel.)
Please select 1 from the five choices in each box which most closely describes your school district in that area. Fill in the circle completely next to the statement selected with pencil or black ink.
Item # Description
6.1 The design and management of educational programs*

Definitions:
*Educational programs are defined as all programs and services provided to students and conducted by professional, certified personnel or non-certified personnel under the supervision of certified personnel.

6.1.1 Design and management of programs are based on federal or state regulations, traditional practices and individual preferences or opinions.
6.1.2 Design and management of some programs are based on student performance on statewide testing, perceived student needs, and/or preferences of management.
6.1.3 Processes are being designed and developed to base decisions for educational programs and services on student needs, performance results, stakeholder satisfaction and research-proven practices. Student performance expectations and curricula are being developed to align with these needs, expectations, and practices.
6.1.4 Many programs and services are designed and managed through established processes based on a review of student needs, performance results, stakeholder satisfaction and research-based best practices. Curricula are aligned to performance standards in most units and levels.
6.1.5 A systematic process exists for the design and management of all educational programs. All educational programs and services meet established standards to assure high quality of programs and services. Programs and services are based on a systematic review of student needs, performance results, community demographics, stakeholder expectations and cost analyses. All curricula are aligned to performance expectations for all grades and courses. This process is systematically evaluated and improvements are made as determined necessary.
6.1.6 I do not know.
6.2 Delivery of educational programs and services

Definition:
*Educational programs and services are defined as all programs and services provided to students and conducted by professional, certified personnel or non-certified personnel under the supervision of certified personnel.

6.2.1 Delivery of educational programs and services* is based on federal or state regulations, traditional practices and individual preferences and opinions. Local curricula are inconsistently delivered.
6.2.2 Delivery of some educational programs and services* is based on student performance information and, in some cases, research-proven effective practices. Curricula are delivered consistently in some school district units.
6.2.3 Processes are being developed to improve the delivery of educational programs* through a systematic review of performance results, research-based best practices and stakeholder expectations.
6.2.4 Educational programs and services* are delivered to meet student needs, prevent school failure, optimize student achievement, integrate research-based practices and respond to stakeholder expectations. Delivery of services is evaluated based on performance information and improvements are made as necessary.
6.2.5 A systematic process exists to ensure delivery of educational programs and services* that optimally meet student needs, ensure student success and exceed stakeholder expectations. Processes are proactive to prevent student failure. The process for program review is systematically evaluated as an integral part of the strategic planning process and improvements made as necessary.
6.2.6 I do not know.
6.3 Design, management and delivery of educational support services

Definitions:
*Educational support services include all programs and services which support the educational programs, such as business operations, transportation, public relations, purchasing, clerk, legal, volunteers, food service, records, buildings and grounds.

**Stakeholder - Individuals or groups, both internal to the school (students, personnel) and external (parents, community members, business) which are affected by the conditions and quality of education or the preparedness of graduates; also includes state agencies and their requirements.

***Process refers to the steps and sequence of steps done in a specific work activity; e.g. enrolling a new student in school; requisition of instructional supplies; hiring new personnel.

6.3.1 Educational support services are designed, managed and delivered based on traditional practices and individual preferences. Frequent complaints from internal and external stakeholders occur with no defined process to address them. Decisions are made with little to no input or involvement from stakeholders**.
6.3.2 Educational support services* are designed with some input from stakeholders**. Cost analyses of operations are routinely done. Some stakeholder** satisfaction information has been acquired. Required audits are performed for district finances.
6.3.3 Processes are being developed to include problem identification, analyses of performance results, cost analyses, and stakeholder** requirements. Stakeholders of services are included in the design. Information is being acquired to determine stakeholder satisfaction.
6.3.4 Most services have processes*** which support the mission of the district through a focus on improved productivity, efficiency and quality to support optimal student performance. Processes include early identification of problems, corrective action processes, and comparing processes to other external organizations. Stakeholder** requirements and satisfaction are regularly assessed and used to make improvements.
6.3.5 A systematic process exists to design, manage and deliver all support services which involves stakeholders** and is based on the educational needs of students, performance results of that service, cost analyses, productivity measures, and stakeholder** satisfaction. This process is systematically reviewed and improvements are made as necessary.
6.3.6 I do not know.
6.4 Data and Information Processes*

Definition:
*Includes the collection, management and dissemination of data on enrollment, achievement, operations, stakeholder satisfaction and other pertinent information which are used in evaluation and planning processes.

6.4.1 Data collected include district, school, and grade-level enrollment, attendance, dropout, and student performance on statewide testing. Information is not widely disseminated throughout the district. Some decisions made by leadership are based on this information.
6.4.2 Additional data regarding enrollment, district demographics and utilization of specific educational programs and services are collected. Data collected are determined by program managers or building principals. Some analyses occur and are used to make recommendations and decisions for improvement.
6.4.3 Processes are being developed to collect, analyze, disseminate and use more comprehensive information necessary to determine improvement areas. Additional measures of student performance are being developed for frequent indicators of learning. Stakeholder needs and satisfaction data are included.
6.4.4 Decision-making processes are based on information on student and operational performance. Information includes all measures of student achievement by grade level, assessment of educational programs and services, assessment of support services, and comprehensive building-level data including school, grade and course enrollment, disciplinary, graduation, drop-out, parent involvement, and stakeholder satisfaction. Processes exist to collect, analyze and disseminate specific data which may be required to make improvements.
6.4.5 A systematic process exists for the collection, analysis, dissemination and use of data and information. Strategic decisions affecting the direction of the district and target goals are made using this information. Comprehensive assessments exist for monitoring student performance, evaluating effectiveness of all programs and services, and for stakeholder satisfaction. Data and information processes are systematically evaluated and necessary improvements are made.
6.4.6 I do not know.
6.5 Internal and external* communication process**

Definitions:
*Internal refers to personnel and students within the school district and external refers to parents and community stakeholders.

**Communication process refers to methods used to inform and seek opinion from others.

6.5.1 There is unclear and inconsistent communication regarding the direction and goals within the district. Some efforts are made to communicate with parents and community.
6.5.2 Communication regarding the direction and goals within the district is usually done on an annual basis, but not frequently. School newsletters or district newsletters are sent to parents.
6.5.3 Internal communication is improving but still inconsistent. Efforts are being made to improve communication with external stakeholders.
6.5.4 Communication is consistent, timely and thorough, both with internal and external stakeholders.
6.5.5 A clear and understood process for thorough communication both internally and externally exists. District personnel, students, parents and community are fully informed in a timely manner. The communication process** is systematically evaluated for improvement opportunities.
6.5.6 I do not know.
6.6 Management of supplier and partnering* processes

Definitions:
*Suppliers are those businesses or individuals with which the district contracts for specific services such as training, consulting, transportation, legal, etc. Partnering processes are defined as those relationships with other organizations, community agencies, businesses to design, implement, provide services for the students or stakeholders of the district.

**Stakeholder - Individuals or groups, both internal to the school (students, personnel) and external (parents, community members, business) which are affected by the conditions and quality of education or the preparedness of graduates; also includes state agencies and their requirements.

6.6.1 No specifications are established by the school district for expectations from suppliers. There is little effort to develop collaborative relationships with stakeholders.
6.6.2 Some specifications and expectations exist for some supplier areas. Problems are addressed as they occur. The district has developed collaborative relationships with parents, local businesses and some community organizations.
6.6.3 Efforts are being made to develop a proactive approach to problem identification and prevention with suppliers*. Requirements are being developed for some suppliers and communicated to achieve improved quality of supplies and materials. The district is making increasing efforts to develop partnerships with stakeholder groups to support mission and goals. Efforts are being made to involve them in decision-making processes.
6.6.4 Established processes exist to determine supplier and partner* requirements for most areas. Requirements are communicated to assure expected performance by supplier and partner. Stakeholders** are asked to evaluate effectiveness.
6.6.5 A systematic process exists to establish and communicate requirements to suppliers and partners*. Processes include regular evaluation of their effectiveness, quality and costs. Suppliers and partners* share in the district's goals. Processes are regularly evaluated to determine improvements necessary.
6.6.6 I do not know.
7.0 School District Performance Results This category examines the district's results in student achievement, quality of programs and services, and operations. Please think about the current results that your school district can demonstrate in student achievement, human resources and stakeholder satisfaction. Please consider how your school district performance results compare with comparable districts.
Please select one of the five choices in each item which most closely describes your school district by filling in the circle next to it with a pencil or black pen.
# Description
7.1 Student performance results

*Stakeholder - Individuals or groups, both internal to the school (students, all personnel) and external (parents, community members, business) which are affected by the conditions and quality of education and preparedness of graduates.

**Benchmarking is an improvement process in which an organization compares its performance against best-in-class organizations, determines how these organizations achieved their performance levels and uses the information to improve its own performance.

7.1.1 Student performance on national, state or local measures reveals significant deficiencies and is below expectations of stakeholders* and other comparable districts.
7.1.2 Performance by some students on national, state or local measures reveals satisfactory results when compared to state or national results. Below average performance results exist for other students.
7.1.3 Student performance on national, state or local measures is improving in specific areas and by increasing numbers of students. Data are being accumulated from the past few years to determine if this is a trend.
7.1.4 Student performance on national, state or local measures is showing consistent upward trends over time. Performance indicates improvement over the past few years compared to other comparable districts, both in and out of state.
7.1.5 Student performance results are sustained on all measures and by all student groups. All student groups recognize the school district for outstanding performance. Other school districts use its performance results for benchmarking**. Graduate follow-up information reveals successful employment or completion of higher education.
7.1.6 I do not know.
# Description
7.2 Student Conduct Results

*Indicators involving student behavior such as disciplinary infractions, suspensions, expulsions, arrests, etc.

**Benchmarking is an improvement process in which an organization compares its performance against best-in-class organizations, determines how these organizations achieved their performance levels and uses the information to improve its own performance.

7.2.1 Unacceptable rates of student absenteeism, tardies, suspensions, and expulsions exist.
7.2.2 Recent improvement exists in at least one student conduct indicator*. Results are consistently reviewed for progress.
7.2.3 Improvement trends are beginning to emerge in 3 or more indicators*. Improvements are emerging over time when compared to other comparable school districts. Some student groups show little improvement.
7.2.4 Significant gains are noted in student conduct areas over a three-year period among most student groups, compared to in-state and out-of-state comparable districts.
7.2.5 Sustained results are demonstrated in all areas of student conduct and among all student groups. District performance in this area is used for state or national benchmarking**. Graduate follow-up information reveals successful employment or completion of higher education.
7.2.6 I do not know.
# Description
7.3 Student and Stakeholder Satisfaction Results

*Stakeholder - Individuals or groups, both internal to the school (students, personnel) and external (parents, community members, business) which are affected by the conditions and quality of education or the preparedness of graduates; also includes state agencies and their requirements.

**Benchmarking is an improvement process in which an organization compares its performance against best-in-class organizations, determines how these organizations achieved their performance levels and uses the information to improve its own performance.

7.3.1 Anecdotal information suggests low satisfaction among students and stakeholders*.
7.3.2 Results collected through a systematic process reveal significantly low satisfaction among students and stakeholders with school district performance and operations.
7.3.3 Some areas reveal improvement in stakeholder* and student satisfaction when compared to baseline data.
7.3.4 Many areas reveal significant gains in stakeholder* and student satisfaction. Long-term trends are emerging. The district is comparing trends with other comparable school districts.
7.3.5 Sustained results over several years reveal high satisfaction of all student and stakeholder* groups for the performance of student achievement, quality of educational programs and services, and support services. The school district is used for benchmarking** in and out of state.
7.3.6 I do not know.
# Description
7.4 Human Resource* Results

*Human resource indicators include employee well-being, labor relations, satisfaction, professional development, work system performance and effectiveness.

**Benchmarking is an improvement process in which an organization compares its performance against best-in-class organizations, determines how these organizations achieved their performance levels and uses the information to improve its own performance.

7.4.1 Anecdotal information reveals low satisfaction of personnel and poor employee-employer relationships. High absence and turnover rates by personnel exist. A high rate of grievances exists.
7.4.2 No formal measurement exists but anecdotal information suggests that most personnel appear satisfied. Personnel accept current terms of employment. Personnel absenteeism, turnover and recruitment are not considered to be problems. Reactions to staff development activities are generally positive.
7.4.3 Systematic attempts to measure human resource indicators* reveal some emerging improvement patterns.
7.4.4 Results indicate improvements in many human resource indicators*. Absenteeism rates for personnel and turnover show steady decline. Employee satisfaction measures reveal consistent upward trends. Staff development activities reveal impact on instructional and work practices.
7.4.5 Employee satisfaction is high among all personnel classifications. The district is recognized as an organization to benchmark**. Personnel absenteeism and turnover are low, with a sustained employment pool of high-quality qualified applicants. Staff development opportunities indicate sustained results on improved student performance.
7.4.6 I do not know.
Description
7.5 Educational Program
and Service Results
Definitions:
*Educationalprograms
and services are defined as
all programs and services
provided to students and
conducted by professional,
certifiedpersonnel or non-
certified personnel under
the supervision o f certified
personnel
**Benchmarking is an
improvement process in
which an organization
compares its performance
against best-in-class
organizations, determines
how these organizations
achieved their performance
levels and uses the
information to improve its
own performance_________
7.5.1
7.5.2
7.5.3
7.5.4
7.5.5
7.5.6
No results, other than state, national or local student achievement measures, are available
to determine effectiveness of educational programs and services*. Enrollment data is used
as indicator of effectiveness for some educational programs or services*.
Some educational programs and services * evaluations exist but demonstrate poor
performance results. Compliance with federal, state, local requirements exists.
Some educational program and service* evaluations are beginning to show improvements
on several indicators compared to baseline data.
Steady gains are being made in enrollment, attendance and student graduation rates when
compared to comparable school districts. Educational programs and services* are
demonstrating improvements in performance results.
Sustained results exist indicating outstanding performance for educational programs and
services*. High student satisfaction exists. High attendance and high graduation rates exist,
with graduate follow-up information revealing successful employment or higher education
after graduation. The district is used as a benchmark** at both state and national levels for
excellence in educational programs and services.
I do not know.
Description
7.6 Educational Support Services
Definitions:
*Educational support services include all programs and services which support the educational programs, such as business operations, transportation, public relations, purchasing, clerical, legal, volunteers, food service, records, buildings and grounds.
**Benchmarking is an improvement process in which an organization compares its performance against best-in-class organizations, determines how these organizations achieved their performance levels, and uses the information to improve its own performance.
7.6.1
7.6.2
7.6.3
7.6.4
7.6.5
7.6.6
Audits of the district reveal some areas of non-compliance with regulatory or legal
requirements. Little or no information is available from educational support services*.
Some performance data exists for some services. Results reveal significant gaps in
stakeholder satisfaction. Regulatory and legal compliance is improving.
Performance results of support services reveal some recent improvement.
Improvements are evident in some educational support service areas. Benchmarks from
other comparable organizations are being used and reveal an upward trend.
Sustained improvement results exist with high customer satisfaction, cost effectiveness,
efficiency and productivity. District educational support services are used as
benchmarks** for other comparable districts.
I do not know.
1. Could this instrument be useful as a tool to study the seven areas of your school district?
Extremely Useful Somewhat Useful Little Use No Use______
Please explain why or why not._________________________________________________________________________________
2. Could this instrument be useful as a tool to help district personnel determine areas for future school district improvement?
Extremely Useful Somewhat Useful Little Use No Use______
Please explain why or why not._________________________________________________________________________________
My most sincere appreciation for your responses.
174
175
Appendix B:
Panel of Experts Used in Content Validation
Dr. Susan Leddick— Trainer, consultant, practitioner of quality and the Malcolm Baldrige
National Quality Award.
Dr. Roland Smith— Consultant in quality applications and Malcolm Baldrige Examiner at
the state and national level.
Dr. Jim Shipley— Trainer in Pinellas County, Florida, schools. Designed a similar
instrument for use in those schools. Malcolm Baldrige Examiner at the national level.
176
Appendix C:
Letter to Panel of Experts
August, 1997
Dear
Thank you for agreeing to be a content area expert for my dissertation study. I have
enclosed the current draft of the instrument for your review and critique. The feedback
that I am requesting is in the following areas.
• Alignment of the intent, design, and content of the instrument with the Malcolm
Baldrige National Quality Award criteria.
• Clarity of items.
• Specific suggestions for improving the instrument.
• Extent to which this instrument could be a guide for self-study in a school district.
The research questions posed in this study are:
1. How do educators perceive their own school district’s performance based on an
instrument designed using the Malcolm Baldrige National Quality Award
Education Criteria?
2. Are there differences in these ratings based on type of educator or size of
district?
3. Do educators find this instrument a useful tool to study these areas of a school
district?
4. Do educators believe this instrument could be useful in determining school
improvement needs?
Please feel free to give your comments to me in whatever way is easiest for you. You can
reach me through these numbers, fax, or e-mail.
I so appreciate your willingness to advise me and share your expertise.
Sincerely,
Sally Anderson
177
Appendix D:
Matrix of Population Sample
                       Superintendents        Principals             Teachers
Grp.  Population       N     %     S**        N     %     S**        N       %     S**
1     5000+            13    13.4  10         220   44    96         6736    51.5  191
2     2500-4999        13    13.4  10         84    16.8  37         2358    18    67
3     1000-2499        27    27.8  21         106   21.2  46         2374    18    67
4     500-999          22    22.6  18         55    11    24         987     8     29
5     1-499            21    21.6  17         31    6     14         617     5     19
Total                              76                     217                      373

                       Superintendents        Principals             Teachers
                       N     S                N     S                N       S
State Totals           97    76               500   217              13,076  373
*Note: Group classifications by size are based on the 1996-97 Annual Statistical Report prepared by the Idaho State Department of Education.
**Note: Proportional sample number from total sample.
178
Appendix E:
Codes on the Instrument
Size of District    Type of Educator:
1 S
1 P
1 T
2 S
2 P
2 T
3 S
3 P
3 T
4 S
4 P
4 T
5 S
5 P
5 T
179
Appendix F:
Cover Letter and Directions
October 6, 1997
Dear Colleague,
As educators, we work diligently to improve student performance and optimize the
effectiveness of our schools. It is on this shared responsibility that I ask you to participate in
my dissertation study.
The enclosed instrument is not a survey. It is an analysis tool designed to assist school
districts in the process of self-study for the purpose of continual improvement of school
systems. Selections you make on this instrument are neither right nor wrong, good nor bad.
They should reflect your perception of where your school district is at this point in time in
that area. It is your honest and realistic professional opinion that will provide me the
information for my study.
This analysis instrument is adapted from the Malcolm Baldrige National Quality
Award - Education Pilot, the curriculum audit process, and the 1996 accreditation standards
from the Northwest Association of Schools and Colleges. The instrument asks you to reflect
on seven categories common to every school district. Within each category are specific items
to which you will respond.
You, specifically, are in a unique position of responsibility to influence and implement
school improvement efforts in your district. I realize that I am asking very busy professionals
to use some of your already limited time towards this endeavor. Since it is not a survey, it
may take you a little longer to thoughtfully respond. I am appealing to your commitment to
the improvement of education in Idaho, your curiosity about the findings of this research, and
your willingness to assist a colleague who shares your interest and desire for continued
excellence in education.
Your returned response is critical to the findings. Read the directions thoroughly to
assist you with this unique project. Please return the completed instrument to me in the
self-addressed stamped envelope provided. I am asking that you send it to me by October 27,
1997.
Please accept the pen as a small token of my appreciation for your participation in this
study. Please feel free to use it when completing the enclosed analysis. I would welcome the
opportunity to discuss my results with you personally and the implications for how school
improvement might be approached. You may reach me at the J.A. & Kathryn Albertson
Foundation, 208-342-7931.
My sincere appreciation,
Sally Anderson, Doctoral Student, University of Idaho
180
Directions
Please Read Carefully! Thank You! Please Read Carefully! Thank You!
• The Performance Analysis for School Districts has been prepared on paper that will be
scanned. Therefore, please fill in the circle next to the item you select completely with
pencil or black ink. Do not fold or tear. Do not staple. Use a clip.
• This instrument is an analysis tool to assist you in a reflective self-study of the major
components of a school district. The questions do NOT ask about a specific school. Rather,
they focus on the whole school district.
• Unlike a survey, this is not intended to be responded to quickly. It is intended to be a
reflective tool for your thoughtful consideration regarding the complete scope of your school
district operations.
• As the sole researcher, I can assure you that all information is confidential and anonymous.
Results will be reported in grouped data only. No individual responses are reported.
• It is not necessary to do any research or consult with others to help you answer the items,
unless you wish to. I am interested in your perspective, your knowledge, your perception
based on what you know from your view in your school district.
• There is the opportunity to select an “I do not know” category if you truly feel you cannot
make a judgment on that item.
• Some terms used are defined in each sub-category. Please refer to the definitions under each
sub-category in the upper left-hand corner for added clarity.
• If you feel that you do not have information about an area, answer based on what it appears
to be from your point of view.
• Please do not write in any comments unless they are asked for.
• Please respond to every section to the best o f your ability.
• Select only one of the five choices in each section that most accurately reflects your
perception of your school district at this point in time. I am interested in knowing your
perception.
• Return the completed Performance Analysis for Schools to me in the self-addressed stamped
envelope provided by October 27, 1997.
• If you have any questions or would like the results of the study, please call me at the J.A. &
Kathryn Albertson Foundation (208-342-7931).
181
Appendix G:
Postcard Reminder
October 25, 1997
Dear Colleague,
Recently, I sent you a school district self-study instrument in the mail to collect
information for my dissertation research study. If you have returned it to me already, I
am most grateful. If you have not, I ask that you take the time to do so.
Assessing the performance of organizations is becoming an extremely important practice
and, as school districts, we will need to consider such a process. The instrument serves
only to find out your perceptions of the current performance of your school district using
the Malcolm Baldrige National Quality Award as the criteria. Please send me the
completed instrument by November 5, 1997.
Thank you for your contribution to my study and your gift of time.
Sally Anderson, Doctoral Candidate (208-342-7931)
University of Idaho
182
Appendix H:
Letters of Support from IASA and IEA
IDAHO ASSOCIATION OF SCHOOL ADMINISTRATORS
777 South Latah, Boise, ID 83705
PHONE: (208) 345-1171  FAX: 345-1172
Michael L. Friend, Executive Director
E-MAIL: idschadm@micron.net
WEBSITE: http://www.lewiston.k12.id.us/organizations/iasa/
“Leadership For Tomorrow’s Leaders”
October 1, 1997

Dear Participant:

The Idaho Association of School Administrators is pleased to endorse this research study and provide the opportunity for you to participate in the project. As the key leaders of school improvement efforts, your perspective is vitally important; this study recognizes that fact. The IASA has great interest in the findings of this particular research study designed by Sally Anderson, a University of Idaho doctoral student. The results of the research will be published in our journal, Perspectives.

Your completion of this survey instrument is of critical importance to the study. The responses will be kept confidential and handled only by Sally. Results will be reported in the aggregate.

We recognize that this task will require valuable time and considerable thought on your part. Consider the use of your time as an investment in our future approach to school improvement and the development of high performance districts. Please take the necessary time to support your colleague in this project!
Thanking you in advance for your assistance.

Sincerely,

Mike Friend
Executive Director

Affiliated Divisions: Idaho School Superintendents’ Association; Idaho Association of Secondary School Principals; Idaho Association of Elementary School Principals; Idaho Association of Special Education Administrators. Allied Associates: Idaho School District Council; Northwest Women for Educational Action; Idaho School Business Officials; Idaho Rural Schools Association; Idaho Middle Level Association; Idaho School Public Relations Association; Idaho Association of Educational Office Professionals; Idaho Association for Supervision and Curriculum Development.
IDAHO EDUCATION ASSOCIATION
P.O. BOX 2638, BOISE, IDAHO 83701; 620 NORTH SIXTH STREET, 83702
INTERNET: http://www.idahoea.org
208/344-1341
FAX 208/336-6967
ROBIN NETTINGA, President
e-mail: rnettinga@nea.org
JAMES A. SHACKELFORD, Executive Director
e-mail: jashack@nea.org
October 1, 1997
Dear Participant:
The Idaho Education Association is pleased to have the opportunity to
encourage you to assist with this important study. As the cornerstone of all
school improvement efforts, your perspective is vitally important and this study
recognizes that. The IEA is interested in the findings of this research study that
has been designed by Sally Anderson, doctoral student with the University of
Idaho.
You have been randomly selected to be a participant in this study. Your
completion of this instrument is both critically important to the study and could be
of practical significance to you and your district. The results of this study are
entirely confidential and handled only by Sally Anderson. Results will only be
reported in the aggregate.
We recognize that it will require some of your time and considerable
thought. Consider the use of your time as an investment towards our future
approach to school improvement and high performance school districts. Please
take the necessary time to support your colleagues in this study by completing
this instrument.
Thank you for your assistance.
Sincerely,
Executive Director
JAS/jh
184
Appendix I:
Districts by Enrollment Size
District District
Number Name
Classification 1— Districts of Over 5,000:
001 Boise
002 Meridian
025 Pocatello
091 Idaho Falls
131 Nampa
271 Coeur d’Alene
093 Bonneville
411 Twin Falls
082 Bonner County
151 Cassia County
331 Minidoka County
340 Lewiston
132 Caldwell
13 districts total for Classification 1
Classification 2— Districts of 2,500 to 4,999:
055 Blackfoot
193 Mountain Home
(table continues)
185
District District
Number Name
Classification 2, continued:
321 Madison
273 Post Falls
251 Jefferson County
272 Lakeland
139 Vallivue
261 Jerome
221 Emmett
061 Blaine County
281 Moscow
215 Fremont County
003 Kuna
13 districts total for Classification 2
Classification 3— Districts of 1,000 to 2,499:
52 Snake River
201 Preston
60 Shelley
241 Grangeville
134 Middleton
371 Payette
(table continues)
186
District
Number
District
Name
(Classification 3, continued)
33 Bear Lake County
101 Boundary County
381 American Falls
171 Orofino
21 Marsh Valley
431 Weiser
391 Kellogg
412 Buhl
322 Sugar-Salem
291 Salmon
41 St. Maries
372 Fruitland
413 Filer
231 Gooding
150 Soda Springs
414 Kimberly
370 Homedale
401 Teton County
421 McCall-Donnelly
232 Wendell
(table continues)
187
District District
Number Name
(Classification 3, continued)
59 Firth
351 Oneida County
28 districts total for Classification 3
Classification 4— Districts of 500 to 999:
137 Parma
58 Aberdeen
371 New Plymouth
392 Wallace
252 Ririe
253 West Jefferson
262 Valley
192 Glenns Ferry
363 Marsing
181 Challis
286 Whitepine
304 Kamiah
136 Melba
285 Potlatch
111 Butte County
(table continues)
188
District
Number
District
Name
(Classification 4, continued)
365 Bruneau-Grand View
148 Grace
202 West Side
044 Plummer-Worley
242 Cottonwood
341 Lapwai
133 Wilder
072 Basin
23 districts total for Classification 4
Classification 5— Districts of 1 to 499:
312 Shoshone
422 Cascade
233 Hagerman
013 Council
283 Kendrick
135 Notus
415 Hansen
417 Castleford
(table continues)
189
District District
Number Name
(Classification 5, continued)
282 Genesee
071 Garden Valley
274 Kootenai
418 Murtaugh
073 Horseshoe Bend
432 Cambridge
305 Highland
182 Mackay
161 Clark County
342 Culdesac
011 Meadows Valley
302 Nez Perce
149 North Gem
314 Dietrich
121 Camas County
316 Richfield
234 Bliss
392 Mullan
292 South Lemhi
(table continues)
190
District
Number
District
Name
(Classification 5, continued)
382 Rockland
433 Midvale
*092 Swan Valley
*394 Avery School
*364 Pleasant Valley
*383 Arbon
*191 Prairie
*416 Three Creek
*Note: Eliminated due to size: fewer than 100 students and no superintendent.
29 districts total for Classification 5

Thesis

  • 1.
    INFORMATION TO USERS Thismanuscript has been reproduced from the microfilm master. UMI films the text directly from the original or copy submitted. Thus, some thesis and dissertation copies are in typewriter face, while others may be from any type ofcomputer printer. The quality of this reproduction is dependent upon the quality of the copy submitted. Broken or indistinct print, colored or poor quality illustrations and photographs, print bleedthrough, substandard margins, and improper alignment can adversely affect reproduction. In the unlikely event that the author did not send UMI a complete manuscript and there are missing pages, these will be noted. Also, if unauthorized copyright material had to be removed, a note will indicate the deletion. Oversize materials (e.g., maps, drawings, charts) are reproduced by sectioning the original, beginning at the upper left-hand comer and continuing from left to right in equal sections with small overlaps. Each original is also photographed in one exposure and is included in reduced form at the back ofthe book. Photographs included in the original manuscript have been reproduced xerographically in this copy. Higher quality 6” x 9” black and white photographic prints are available for any photographs or illustrations appearing in this copy for an additional charge. Contact UMI directly to order. UMIA Bell & Howell Information Company 300 North Zeeb Road, Ann Arbor MI 48106-1346 USA 313/761-4700 800/521-0600 R eproduced with perm ission of the copyright owner. Further reproduction prohibited without perm ission.
  • 2.
    R eproduced withperm ission of the copyright owner. Further reproduction prohibited without permission.
  • 3.
    NOTE TO USERS Theoriginal manuscript received by UMI contains indistinct, slanted and or light print. All efforts were made to acquire the highest quality manuscript from the author or school. Microfilmed as received. This reproduction is the best copy available UMI R eproduced with perm ission of the copyright owner. Further reproduction prohibited without perm ission.
  • 4.
    R eproduced withperm ission of the copyright owner. Further reproduction prohibited without permission.
  • 5.
    Using the MalcolmBaldrige National Quality Award Education Pilot Criteria for Self- Assessment of School Districts Presented in Partial Fulfillment o f the Requirements for the Degree o f Doctor o f Philosophy with a Major in Education in the College o f Graduate Studies University of Idaho By Sally Anderson December, 1997 Major Professor: Dr. Cleve Taylor Reproduced with perm ission of the copyright owner. Further reproduction prohibited without perm ission.
  • 6.
    UMI Number: 9827869 Copyright1998 by Anderson, Sally C. AH rights reserved. UMI Microform 9827869 Copyright 1998, by UMI Company. AH rights reserved. This microform edition is protected against unauthorized copying under Title 17, United States Code. UMI300 North Zeeb Road Ann Arbor, MI 48103 R eproduced with perm ission of the copyright owner. Further reproduction prohibited without perm ission.
  • 7.
    Authorization to SubmitDissertation This dissertation o f Sally Anderson, submitted for the degree o f Doctor o f Education with a major in Education and titled, “Using the Malcom Baldrige National Quality Award Education Pilot Criteria for Self-Assessment of School Districts” has been reviewed in final form, as indicated by the signatures and dates given below. Permission is now granted to submit final copies to the College o f Graduate Studies for approval. Major Professor Date*: ■1 Date:Committee Members Dr. Michael Tomlin r Penny Schweibert Date: # Date."2 Department Administrator Dean, College of Education Dr. Roger Reynobison / — Date: V 9 $ Dr. .ferry T/lbhscherer Date Dr. Dale Gentry Final Approval and Acceptance by the College o f Graduate Studies V I A - — Date: _ _ 5 V / 3 / ^ / Jeanme M. Shreeve R eproduced with perm ission of the copyright owner. Further reproduction prohibited without perm ission.
  • 8.
    Abstract The demand forimprovements in education continues. Failed reform attempts, educational fads, and poor planning designs have been cited as variables affecting the approach to improvements in public schools. This study examines the literature o f failed reforms, current approaches to determine the performance o f schools, and models o f improvement based on quality theory and practices. This study investigated the perceptions o f three types o f educators— superintendents, principals, and teachers— regarding the performance o f their school district in seven categories o f organizational performance. The size o f the district based on student enrollment was used as the second independent variable to determine if there were any significant differences in perceptions based on size o f district. An instrument was developed using the criteria in the Malcolm Baldrige National Quality Award, 1997 version; the Education Pilot criteria; curriculum audit standards; and accreditation standards. The study used a proportional stratified random sampling procedure by size o f district and type of educator. The findings were analyzed using a two-way analysis o f variance for each of the seven categories. The study found the reliability of the instrument to be a low o f .74 for School District Results to a high o f .85 for Leadership. Significant differences in the perceptions of performance o f the school districts in each o f the seven categories were found to exist between superintendents and teachers, as well as principals and teachers. No significant differences were found between superintendents and principals or in any category by the size of district. The study discusses the implications o f the findings for a framework for school improvement. R eproduced with perm ission of the copyright owner. Further reproduction prohibited without perm ission.
  • 9.
    Acknowledgements The ideas, design,and completion o f this project were the result o f many dialogues and the collective knowledge o f many. I wish to express my sincere appreciation for the support and guidance o f my advisor, Dr. Cleve Taylor, and the mentoring o f my committee, Dr. Roger Reynoldson, Dr. Penny Schweibert, and Dr. Mike Tomlin. I wish to thank Dr. Mike Friend o f the Idaho School Administrators Association and Jim Shackleford o f the Idaho Teachers Association for their contributions o f resources and support for this study. My sincere appreciation goes to Dr. Carolyn Keeler, Dr. Del Siegle, and Dr. Bill Parrett for their technical expertise and recommendations. I also wish to thank Eleanor Fisk for her assistance with the laborious task o f scanning the returned instruments and Alice Gould, Stephanie Fox , Dawn Davis, and Chris Latter for their assistance in the details and preparation o f this document and the defense. My most sincere appreciation is to my husband, Mike, and our boys, A. J. and Jon, for countless sacrifices they made so that my goals could be accomplished. R eproduced with perm ission of the copyright owner. Further reproduction prohibited without perm ission.
  • 10.
    Dedication This effort isdedicated to the three men who have taught me the most important lessons in my life. To the memory of my father who gave me the thirst for new knowledge and the potential to seek it; to my husband, whose love is the greatest gift o f my life and whose commitment, support, and patience are true models for all; and to my son, A. J., who inspires me to grow and who will always be a continual source o f pride and enlightenment. R eproduced with perm ission of the copyright owner. Further reproduction prohibited without perm ission.
  • 11.
    vi Table o fContents Page Authorization to Submit Dissertation...................................................................................................... ii Abstract.........................................................................................................................................................iii Acknowledgements.................................................................................................................................... iv Dedication.......................................................................................................................................................v List o f Tables.............................................................................................................................................viii List of Figures..............................................................................................................................................ix Chapter 1: Introduction............................................................................................................................. 1 Background o f the Problem....................................................................................................... 1 The Effectiveness o f School Reform..........................................................................2 Statement of the Problem..............................................................................................................5 Significance o f the Problem.........................................................................................................5 Traditional Methods for Determining School Performance...............................................6 Quality Models for Organizational Effectiveness..................................................................7 Research Questions........................................................................................................................ 
8 Hypotheses.......................................................................................................................................9 Limitations...................................................................................................................................... 9 Delimitations................................................................................................................................. 10 Definitions......................................................................................................................................10 Summary.........................................................................................................................................13 Chapter 2: Literature Review...................................................................................................................14 Introduction....................................................................................................................................14 Reproduced with perm ission of the copyright owner. Further reproduction prohibited without perm ission.
  • 12.
    Education as aSystem ................................................................................................................ 14 Organizational Effectiveness....................................................................................................17 Determining Effectiveness in SchoolSystems.......................................................................21 Quality Theory................................................................................................................34 Continuous Improvement in Business.....................................................................40 Summary.........................................................................................................................................59 Chapter 3: Methodology...........................................................................................................................60 Introduction....................................................................................................................................60 The Research M odel....................................................................................................................60 Instrumentation............................................................................................................................. 61 Subjects and Settings.................................................................................................................. 63 Collection o f Data........................................................................................................................ 
64 Data .Analysis................................................................................................................................65 Summary.........................................................................................................................................65 Chapter 4: Findings....................................................................................................................................6 6 Introduction....................................................................................................................................6 6 Rate o f Return...............................................................................................................................67 Characteristics o f Sam ple......................................................................................................... 68 Reliability o f Performance Analysis for School Districts................................................71 Descriptive Analysis................................................................................................................... 72 Inferential Statistical A nalysis...............................................................................................115 Analysis o f “Do Not Know” Responses.............................................................................. 121 Usefulness o f the Instrument as a Tool.................................................................................122 Reproduced with perm ission of the copyright owner. Further reproduction prohibited without perm ission.
Summary ...................................................................... 124
Chapter 5: Summary, Conclusions, and Recommendations ........................ 125
Summary ...................................................................... 125
Conclusions .................................................................. 126
Recommendations .............................................................. 128
References ................................................................... 131
Appendix A: Instrument ....................................................... 140
Appendix B: Panel of Experts Used in Content Validation ..................... 175
Appendix C: Letter to Panel of Experts ....................................... 176
Appendix D: Matrix of Population Sample ...................................... 177
Appendix E: Codes on the Instrument .......................................... 178
Appendix F: Cover Letter and Directions ...................................... 179
Appendix G: Postcard Reminder ................................................ 181
Appendix H: Letters of Support ............................................... 182
Appendix I: Districts by Enrollment Size ..................................... 184
List of Tables

Page
Table 1. Factors of Organizational Effectiveness .............................. 22
Table 2. Meta-analysis Findings of School System Evaluation Components as Reported in the Literature .............................. 23
Table 3. Curriculum Audit Findings of 67 School Districts Between 1988 and 1994 .............................. 29
Table 4. A Comparison Between Teaching Theories of Quality Experts ............ 40
Table 5. Baldrige National Quality Award Criteria 1997 ........................ 47
Table 6. Validity of the MBNQA Model .......................................... 48
Table 7. Accuracy of the MBNQA Weights ........................................ 49
Table 8. Core Values/Concepts of MBNQA Education Pilot 1995 ................... 54
Table 9. 1995 MBNQA Educational Pilot Criteria ................................ 58
Table 10. Stratified Random Sample Matrix ..................................... 64
Table 11. Total Return Rates by Educator Position ............................. 67
Table 12. Frequencies and Percentages of Returns Received by Educator Position and District Size .............................. 68
Table 13. Percentage of Highest Degree and Time in Position by Size and Position .............................. 70
Table 14. Rank Order of Combined Sample ....................................... 69
Table 15. Reliability of Instrument ........................................... 71
Table 16. Means by District Size and Positions Combined ....................... 72
Table 17. Means by District Size for Districts With More Than 5,000 Students Enrolled .............................. 73
Table 18. Means by District Size for Districts With 4,999 to 2,500 Students Enrolled .............................. 73
Table 19. Means by District Size for Districts With 2,499 to 1,000 Students Enrolled .............................. 74
Table 20. Means by District Size for Districts With 999 to 500 Students Enrolled .............................. 74
Table 21. Means by District Size for Districts With Less Than 499 Students Enrolled .............................. 75
Table 22. Means by Educator Position for Superintendents ...................... 75
Table 23. Means by Educator Position for Principals ........................... 76
Table 24. Means by Educator Position for Teachers ............................. 76
Table 25. Item Frequency and Percentage of Response by Position and Size of District for Items in the Leadership Category .............................. 80
Table 26. Item Frequency and Percentage of Response by Position and Size of District for Items in the Strategic Planning Category .............................. 87
Table 27. Item Frequency and Percentage of Response by Position and Size of District for Items in the Student Focus and Satisfaction/Stakeholder Categories .............................. 90
Table 28. Item Frequency and Percentage of Response by Position and Size of District for Items in the Information and Analysis Category .............................. 94
Table 29. Item Frequency and Percentage of Response by Position and Size of District for Items in the Human Resource Development Category .............................. 97
Table 30. Item Frequency and Percentage of Response by Position and Size of District for Items in the Educational Process Management Category ............. 103
Table 31. Item Frequency and Percentage of Response by Position and Size of District for Items in the School District Results Category ............. 109
Table 32. Two-Way ANOVA Leadership Construct ................................. 116
Table 33. Two-Way ANOVA Strategic Planning Construct ......................... 117
Table 34. Two-Way ANOVA Student/Stakeholder Construct ........................ 118
Table 35. Two-Way ANOVA Information and Analysis Construct ................... 119
Table 36. Two-Way ANOVA Human Resource/Management Construct .................. 119
Table 37. Two-Way ANOVA Educational Process/Operational Management Construct .............................. 120
Table 38. Two-Way ANOVA School District Results Construct .................... 121
Table 39. Chi Square for "Do Not Know" Responses ............................. 122
Table 40. Combined Percentage for Usefulness of Instrumentation .............. 123
List of Figures

Page
Figure 1. An Educational System as an Open System ............................. 16
Figure 2. A Quality Systems Model for Performance Improvement ................ 129
Chapter 1
Introduction

Background of the Problem

The performance of schools is currently determined by a multitude of indicators based on political, traditional, and institutional influences. Public opinion of the performance of the complete system is often based on one or more of these indicators (Bracey, 1997; Bushweller, 1996; Elam, Lowell & Gallup, 1996). Although improvements are occurring in many of the nation's schools, results are still anecdotal, isolated, and far from replicable (Fullan, 1993). Public criticism still abounds, and the perception of inferior quality and poor performance remains (Bushweller, 1996; Hodgkinson, 1996; Houston, 1996; Huelskamp, 1993). The demands for greater accountability for publicly funded institutions have not diminished. The lack of evidence of improved performance and effective planning, and the increased spending of public funds without discernible measures of tangible results, have led to the demand for more business-like strategies (DeMont, 1973; Gerstner, 1995; Kearns & Doyle, 1988).

School improvement and how to achieve it continues to inspire public, political, and professional dialogue and debate. The approaches to improving public schools are as varied as the prophets and their doctrines. Little to no sustainable improvement, public hostility, and disenfranchised teachers are left in the wake of such well-intentioned efforts (English & Hill, 1994). When teachers from the high-performing Willamette Primary School in Oregon were asked why they thought so many schools were failing, they blamed the pursuit of "it" (Sagor, 1995). Solving the problems in education with a one-solution approach perpetuates the notion that "it" will remedy the problem and things will be better once we find "it." These
types of solutions often operate at a visible, obvious level, denying the complexity of other interdependent relationships and root causes (Bernhardt, 1994; Deming, 1994; Scholtes, 1995). Non-systemic interventions to improve organizations merely shift problems from one part of the organization to another (Senge, 1990).

The Effectiveness of School Reform

Upon review of the current literature, factors affecting the efficacy of reform strategies appear to fall into three categories: (a) selection of reform initiatives, (b) implementation of reform initiatives, and (c) the improvement or change strategy selected. The 1960s saw a multitude of reform initiatives influenced significantly by the civil rights movement and a national concern that American education was falling behind foreign accomplishments (Fullan, 1993). Solutions were often superficial, quick-fix remedies made impatiently as a result of various pressures facing the decision makers, or of educational fads (Fields, 1994; Fullan, 1992). The result was often, and still continues to be, an abundance of disjointed, incomplete improvement initiatives (Fullan, 1997). The presumption that developing innovations on a national scale would lead to widespread adoption appeared to be flawed (Fullan, 1993).

Flawed implementation is another source of much discussion in the literature of educational reform. Berman and McLaughlin (1977) conducted a comprehensive study of federally funded programs. They found many examples of failed implementation, including failure to take into account local nuances and capacity, the pursuit of additional funds for political rather than educational reasons, and the presumption that innovations are implemented one at a time, contrary to the reality of schools. Another perspective offered by Fullan and Miles (1992) is the misunderstanding of resistance. They argue that issues of
effective implementation and communication strategy, rather than personal attitudes and resistance, are often at the heart of why reform fails (Fullan & Miles, 1992).

A third reason for failed reform reflected in the literature is the absence of a specific change strategy. It has been found that each stakeholder of education often holds a different, and usually faulty, belief about how change occurs (Fullan, 1993; Fullan & Miles, 1992; Hargreaves, 1997). This results in confusion and conflict during both the design and implementation phases. Denial of the complexity of problems and solutions is often observed. Critics of past reform efforts advocate three things: (a) greater recognition of the complexity of the educational system; (b) deeper, second-order changes in the organization; and (c) the need for reforms to be created, designed, and implemented by those knowledgeable of the institution (Fields, 1994; Hargreaves, 1997; O'Neil, 1995; Sarason, 1990; Wagner, 1993). Reform efforts, particularly those driven through mandated practices contingent on state or federal dollars, often result in symbols of improvement over substance (Berman, 1977; Wagner, 1993).

During the past decade, most people involved in the reform of education have come to advocate a systemic perspective (Fullan, 1992; Timpane & Reich, 1997). The resurgence of interest in systems theory applications is currently resulting in heightened attention to and recognition of the complexity of organizations, particularly educational institutions. A central principle underlying systems thinking is that structure influences behavior (Deming, 1986; Patterson, Purkey & Parker, 1986; Senge, 1990). The structure used to initiate, conduct, and evaluate an improvement process is therefore related to the potential effectiveness of each specific solution deployed.
Sarason (1990) proposes two basic premises to influence the design and implementation of solutions in school reform efforts. The first is the presence of a conceptual framework that recognizes the relatedness of human behavior. The second is a thorough understanding of the context of that improvement. Reformers to date have been criticized for not having an explicit theory about how to achieve change, and they do not always recognize the influence of the intractability of the system (Fullan & Miles, 1992; Sarason, 1990). Ignoring these two factors can result in an approach that seeks the cure and ignores the diagnosis. Focusing on doing the right thing, over doing it in the right way, can result in using the means as the end (Bennis, 1976).

In K-12 educational systems, the development of school improvement plans often becomes a substitute for results (Sergiovanni, 1992). A focus on the outcomes or results of education has rarely been operationalized (Schmoker, 1996). Educators often resist confronting the results and using them to make decisions for school improvement (Bernhardt, 1994; Schmoker, 1996). Schools are traditionally limited in their use of information and have little need to depend on systematic feedback from a variety of their customers (Bernhardt, 1994; Consortium on Productivity in the Schools, 1995; Schmoker, 1996). The capacity of data and information to reveal strengths, weaknesses, successes, and failures is threatening to educators, particularly in a political context (Schmoker, 1996). Schmoker (1996) further states that schools are too poorly organized to see the connection between effort and outcomes.

The theoretical base upon which improvements are determined and made in the total organization, or any part of it, is critically important in demonstrating outcomes (Deming, 1991). The framework which follows from the theory results in the
models, methods, and tools (Skrtic, 1991). This study investigates a theory and the development and use of a framework for a process of analyzing an organization's performance.

Statement of the Problem

The process of improving school performance often lacks an organized strategy, processes for decision making, deployment of those decisions, and a mechanism for evaluating the results of those strategies (Bernhardt, 1994; Fullan & Miles, 1992). The manner in which schools think about the school improvement process determines their ability to deploy it successfully (Bernhardt, 1994). The current methods of determining organizational performance in schools, identifying areas of improvement, and implementing changes lack a conceptual framework which recognizes the relatedness of human behavior (Sarason, 1990). The accreditation process, once intended to be a mechanism for self-study, has become a political formality which focuses on surface indicators with no mechanism for deeper improvements leading to results (Portner, 1997). Education lacks a useful, comprehensive framework for systemic analysis of its performance and its approach to improvement.

Significance of the Problem

Goodlad (1984) remarked that in order to survive, an institution must have the faith of its clients in its usefulness and a measure of satisfaction with its performance. More than 10 years later, public education continues to be challenged by the many constituencies who have similar criticisms (Bushweller, 1996; Hodgkinson, 1996; Houston, 1996; Huelskamp, 1993; Gerstner, 1995; Kearns & Doyle, 1988). The use of measures of satisfaction from the customers of education is limited. Subsequent approaches to improving performance that have the
potential of increasing satisfaction are diverse, scattered, and often politically motivated (Consortium on Productivity in the Schools, 1995). This research explores the existing approaches used to determine the performance of a school district. The arguments germane to this research have been organized into three categories: (a) the investigation of the traditional methods currently used for determining the performance of a school district, (b) the concept of organizational effectiveness, and (c) the potential use of models emerging from quality theory to analyze organizational effectiveness in school districts.

Traditional Methods for Determining School Performance

Determining the performance of education is an undertaking worthy of in-depth analysis of its own. It has not been established in the literature that the measures currently used and analyzed are the appropriate indicators of the performance of the educational system (Huelskamp, 1993). Traditional methods of assessing the successes and failures of public education most typically include multiple constituency models. These models are designed to meet standards or criteria set by various stakeholders for various purposes (Brassard, 1993). Traditional models include financial, management, and curricular audit procedures; program evaluation studies; federal or state compliance reviews; and specific indicators of student performance. Accreditation is currently the most comprehensive practice which purports to determine the performance of a school (Portner, 1997). The accreditation process, once a status symbol for schools, is now viewed as a routine examination with little relevance to school improvement (Portner, 1997). It does not, however, look comprehensively at the entire school district, since schools are accredited as singular units. The performance of schools, and, therefore, school systems, is often inferred by the general
public based on the performance of student scores on annual measures as reported in the media (Rotberg, 1996). Specific units or functions of the school district are often reviewed, as required by state and federal laws, such as financial audits, regulations of Title I, or the Individuals with Disabilities Act. The focus of these processes is to determine compliance with regulations and the need for any corrective action. Non-compliance in some cases can mean financial penalties to the district. Curriculum audits offer a thorough process for assessing the organization, delivery, support, and results of the instructional process. Specific standards have been developed, and criteria are used to determine the degree of effectiveness. Professionally trained auditors, external to the district, conduct the process and prepare a final report. Peer reviews, such as management audits, also occur. They are often designed by administrators to analyze specific parameters of management.

Quality Models for Organizational Effectiveness

According to Fields (1994), the National Education Association and the American Association of School Administrators suggest that few innovations or educational changes stimulated from outside of education will occur without educator commitment. Management is responsible for the design and approach to improving the performance of the system (Crosby, 1984). The people who understand the processes and their outcomes well enough are the educational leaders within school organizations (Fields, 1994; Sarason, 1990). In the absence of a foundational theory upon which to base practices and an organized approach to accomplish the improvements, the educational leader is vulnerable to public criticism and negligent in their duties. There is also a need to bring the practitioner into the creation and design of the practice (Deming, 1993; Glasser, 1992; Imai, 1986). There are ever-increasing examples of classroom teachers who feel helpless against a barrage of public criticism
and the increasing, uncoordinated demands of federal, state, and local officials (Bracey, 1997; Fullan, 1997; Hargreaves, 1997). Quality theory, practices, and tools are being used increasingly by educational and service organizations. Educators are applying quality principles by defining the needs and perceptions of internal and external customers, using information to make decisions, and designing results-oriented strategies for systemic improvement involving people in all parts of the organization. The application of the Malcolm Baldrige National Quality Award, originally designed for business, has been extended to educational institutions and offers a framework for analysis and recognition (National Institute of Standards and Technology, 1995).

Research Questions

The research questions posed in this study are:
1. How do educators perceive their own school district's performance based on an instrument designed using the Malcolm Baldrige National Quality Award Education Criteria?
2. Are there differences in these ratings based on type of educator or size of district?
3. Do educators find this instrument a useful tool to study these areas of a school district?
4. Do educators believe this instrument could be useful in determining school improvement needs?

Hypotheses

The study will test the following null hypotheses:
H01: There are no significant differences in the Leadership category of the Performance Analysis for School Districts by type or size.
H02: There are no significant differences in the Strategic Planning category of the Performance Analysis for School Districts by type or size.
H03: There are no significant differences in the Student Focus and Student and Stakeholder Satisfaction category of the Performance Analysis for School Districts by type or size.
H04: There are no significant differences in the Information and Analysis category of the Performance Analysis for School Districts by type or size.
H05: There are no significant differences in the Human Resource Development and Management category of the Performance Analysis for School Districts by type or size.
H06: There are no significant differences in the Educational and Operational Process Management category of the Performance Analysis for School Districts by type or size.
H07: There are no significant differences in the School District Results category of the Performance Analysis for School Districts by type or size.

Limitations

The study is subject to the following limitations:
1. The study presumes a truthful response and that respondents will understand items.
2. Responses to items are subject to the personal biases, motivations, perspectives, and experience of the respondents.
3. Responses are presumed to be independently made.
4. Respondents' prior knowledge of theoretical constructs behind the instrument is unknown.
5. The design of the study is not experimental; therefore, no causal relationship can be inferred.
6. Responses will be collected through a mailed survey, which decreases the probability of 100% participation.
7. The study presumes that meaningful analyses can be made on less than 100% of returned responses.

Delimitations

1. The study is limited to superintendents, principals, and classroom teachers within Idaho, which affects generalizability of the findings to educators outside of Idaho.
2. The entire population will not be used. A proportional stratified random sample will be drawn from the population of interest. Therefore, the data realized are subject to the limitations of the sample.

Definitions

The following terms are used in the study or in the Performance Analysis for School Districts instrument:
1. Approach refers to the systems in place to improve quality and customer satisfaction (Brown, 1994).
2. Collaborative and participatory approach to management is defined as jointly working to identify problems and determine improvements with others in the organization who are knowledgeable, involved, and affected by any decisions made.
3. Communication processes refer to methods used to inform and seek opinions from others.
4. Comparative data or benchmarking refer to an improvement process in which an organization compares its performance against best-in-class organizations, determines how those organizations achieved their performance levels, and uses the information to improve its own performance (Shipley & Collins, 1997).
5. Conventional information refers to standardized and state test scores, enrollment, attendance, dropout, discipline, and operating budget data.
6. Deployment refers to the extent to which an approach has been implemented across an organization (Brown, 1994).
7. Educational programs and services refer to all programs and services provided to students and conducted by professional, certified personnel or by non-certified personnel supervised by certified personnel.
8. Educational support services refer to all programs and services which support educational programs, such as business operations, transportation, public relations, purchasing, clerical services, legal services, volunteers, food service, records, buildings, and grounds.
9. Expectations refer to clearly defined statements describing specific academic, behavioral, or social criteria used to measure achievement.
10. Data and information processes include the collection, management, and dissemination of data on enrollment, achievement, operations, and stakeholder satisfaction that are used in evaluation and planning processes.
11. Internal communication processes refer to communication with personnel and students within the school district.
12. External communication processes refer to communication with parents and community stakeholders.
13. Human resource area includes employee well-being, satisfaction, professional development, work system performance, and effectiveness.
14. Human resource indicators include employee well-being, labor relations, satisfaction, professional development, work system performance, and effectiveness.
15. Leadership refers to district-level senior administrators and the board of trustees.
16. Organizational effectiveness is a social construct referring to the quality of an organization that achieves the performance expected of it (Brassard, 1993).
17. Performance refers to the results produced by the school district as illustrated by multiple indicators.
18. Performance data include data or information from all aspects of the organization, including student performance measures, enrollment, discipline, human resources, business operations, and community.
19. Results refer to data on the performance of the organization (Brown, 1994).
20. School district units refer to the specific schools, departments, or services of that school district.
21. Stakeholder refers to individuals or groups, both internal to the school (students, all personnel) and external (parents, community members, business), which are affected by the conditions and quality of education and the preparedness of graduates.
22. Student conduct indicators refer to measures of student behavior such as disciplinary infractions, suspensions, expulsions, and arrests.
23. Strategic development refers to the process by which members of an organization clarify the purpose, develop the necessary procedures and operations to achieve that purpose, and design a strategic plan.
24. Suppliers refer to those businesses or individuals with which the district contracts for specific services such as training, consulting, transportation, and legal services.
25. Partnering processes refer to relationships between organizations, community agencies, and businesses through which services for students and stakeholders are designed, implemented, and provided.
26. System refers to a complex of elements in mutual interaction (Owens, 1970).
27. Total quality or continuous improvement refers to a system that elicits organization-wide participation in planning and improving processes in order to meet or exceed customer expectations.
28. Work systems are defined as the way in which jobs, work, and decision making are designed at all levels within the organization.

Summary

There are few organized approaches to the assessment of performance in school districts that apply an integrated analysis of the subsystems. The lack of use of information to make strategic improvement decisions, combined with the absence of a systems-based approach to assessing current effectiveness, contributes to unsuccessful reform initiatives in education. An initial step in any process of examination is to determine what now exists (Goodlad, 1984). This study seeks to determine the usefulness of an assessment process to acquire a baseline perception of the organization's performance as it exists today, using three constituencies of the organization.
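Each of the seven null hypotheses is examined with a two-way analysis of variance by educator position and district size. As a minimal sketch of how such a test is computed, the Python fragment below hand-calculates the sums of squares, F statistics, and p-values for a balanced two-factor design with synthetic ratings; the cell counts, rating scale, and random scores are hypothetical, and a statistics package would normally be used in practice.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
positions = 3   # e.g., superintendent, principal, teacher
sizes = 5       # five district-enrollment strata
n = 10          # hypothetical respondents per cell (balanced design)

# y[i, j, k]: category rating by position i, size stratum j, respondent k
y = rng.normal(loc=3.5, scale=0.6, size=(positions, sizes, n))

grand = y.mean()
ss_total = ((y - grand) ** 2).sum()
# Main effects: deviations of each factor's marginal means from the grand mean
ss_a = sizes * n * ((y.mean(axis=(1, 2)) - grand) ** 2).sum()   # position
ss_b = positions * n * ((y.mean(axis=(0, 2)) - grand) ** 2).sum()  # size
# Interaction: cell-mean variation not explained by the two main effects
cell_means = y.mean(axis=2)
ss_cells = n * ((cell_means - grand) ** 2).sum()
ss_ab = ss_cells - ss_a - ss_b
ss_err = ss_total - ss_cells

df_a, df_b = positions - 1, sizes - 1
df_ab = df_a * df_b
df_err = positions * sizes * (n - 1)

for name, ss, df in [("position", ss_a, df_a),
                     ("size", ss_b, df_b),
                     ("interaction", ss_ab, df_ab)]:
    f_stat = (ss / df) / (ss_err / df_err)
    p = stats.f.sf(f_stat, df, df_err)  # upper-tail probability of F
    print(f"{name}: F = {f_stat:.2f}, p = {p:.3f}")
```

A null hypothesis such as H01 would be rejected only when the p-value for the corresponding effect falls below the chosen significance level.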
Chapter 2
Literature Review

Introduction

Areas of literature previously cited in Chapter 1 are investigated in depth in this chapter. First, it is critical to understand the nature of the educational institution as a system. Defining the nature of educational subsystems and their relationships to each other in the larger system is fundamental to understanding how to study and improve its effectiveness. Second, organizational effectiveness theory and practice are discussed. Third, current approaches to determining school effectiveness and issues of accountability are reviewed. Fourth, quality theory is discussed as foundational to understanding the emerging applications in both business and education. Finally, the applications of the Malcolm Baldrige National Quality Award in business and education are described.

Education as a System

During the past decade there has been increased attention to systems thinking. The field of systems thinking includes cybernetics, chaos theory, and Gestalt theory, and is reflected in the works of Ludwig von Bertalanffy, Russell Ackoff, Gregory Bateson, and Eric Trist (Senge et al., 1994). Ludwig von Bertalanffy recognized the relationships among several important concepts current in biology in the 1930s (Levine & Fitzgerald, 1992). He named the integration of these ideas general systems theory, incorporating cybernetic concepts such as feedback. Miller (1993) describes the theory as a philosophy of science that studies natural phenomena of all sorts as heterogeneous wholes composed of multiple different but interrelated parts, rather than studying each part in isolation. Three types of systems are described in the literature of the biological sciences. Isolated systems are described by
Nicolis and Prigogine (1977) as those which do not exchange matter or energy with their environment. Closed systems do exchange energy with their environment, while open systems exchange both energy and matter with their environment (Nicolis & Prigogine, 1977). Systems thinking is a discipline for seeing wholes and the pattern of interrelationships among key components of a system (Eisenberg & Goodall, 1993; Owens, 1970; Senge, 1990; Senge et al., 1994). A system is a collection of parts that interact to function purposefully as a whole (Deming, 1986; Patterson, 1993; Senge, 1990). Interdependence is the primary quality of a system. It refers both to the completeness of the workings of a system in its environment and to the interrelationships of the individuals that fall within the system. These interdependent relationships between people give the organization its culture. Process, feedback, and contingency are also components of systems (Eisenberg & Goodall, 1993). From a systems perspective, a school district, like other organizations, does not exist as an entity unto itself, yet it often behaves as one (Eisenberg & Goodall, 1993). School districts are both open and social systems (Hoy & Miskel, 1991; Owens, 1970). A social system is defined as an interactive, interrelated, and interdependent network of components and unique organizational properties that form an organized whole and function to serve common goals (MacLellan, 1994). An open system depends on the external environment for its continued existence, requiring resources from external inputs to the system (Consortium on Productivity in the Schools, 1995; Deming, 1986; Eisenberg & Goodall, 1993; Owens, 1970; Senge, 1990). Figure 1 presents a school district as an open system.
Figure 1. An Educational System as an Open System. [Figure: inputs of people, money, and support resources flow into the education system; the output, educated students who can continue to learn, goes to customers such as higher education, taxpayers, and employers.]

In open systems, groups outside the system affect the system's survival and its ability to change. School districts take in political, financial, and human resources and use them to create a service. This service results in a product to the surrounding environment of the workplace, higher education, and community. Open systems theory emphasizes the dynamic aspects of organization; that is, movement in one part leads in predictable fashion to movement in other parts. Such systems are in a constant state of flux because they are open to inputs from the environment (Katz & Kahn, 1978). School districts have also been described as loosely coupled systems (Weick, 1976). Weick (1976) explains that the use of the term is intended to convey the image that coupled events are responsive but that each event also preserves its own identity and some evidence of its physical separateness. There is usually a lack of clarity, coordination, and articulation between and among subsystems within the larger system, despite their interdependence. Such systems often are organizations in which accountability and interdependence between subsystems are low and autonomy is high (Deer, 1976; Fullan, 1980). Subsystems are purposely not closely connected and do little to control each other's activities. They tend to
respond by shutting out environmental threats and to increase the sense of efficacy and autonomy of their members. Theories of bureaucracy, from which schools continue to be organized, have paid little attention to the organization's dependency on internal and external environments (Deming, 1986; Eisenberg & Goodall, 1993). The task of teaching was viewed as clearly understood, routine, and predictable (Katz & Kahn, 1978; Owens, 1970). This mechanistic approach to organizing work in schools is considered to be efficient. This structure does not work when the environment is uncertain and, in fact, interferes with the organization's ability to be adaptive to its inputs (Consortium on Productivity in the Schools, 1995). The influence of systems theory results in emphasis on inputs, processes, how the processes interact, information flow and feedback, management of relationships, and outputs (Deming, 1986; Eisenberg & Goodall, 1993).

Organizational Effectiveness

The body of knowledge of organizational theory and behavior and the pursuit of a model to determine organizational effectiveness is substantial, confusing, and often in conflict (Brassard, 1993; Georgopoulos, 1957; Zammuto, 1982). Organizational effectiveness has been defined in the literature in a variety of ways. Attempts to define effectiveness, develop criteria, and apply them to a variety of organizations continue to be noted in the literature (Brassard, 1993; Cameron, 1980; Georgopoulos, 1957; Zammuto, 1982). Yuchtman (1967) points out two assumptions that are either implicitly or explicitly made: (a) complex organizations have an ultimate goal or function, and (b) the ultimate goal can be identified empirically and progress toward it measured. How organizational effectiveness is
defined is related to the theoretical model from which it was developed. Three models emerge in the literature (Brassard, 1993; Zammuto, 1982). As early as the 1930s the emergence of goal-based approaches to organizational effectiveness can be seen (Brassard, 1993; Cameron, 1980; Zammuto, 1982). These models are often referred to as rational models and are functional rather than conceptual (Georgopoulos, 1957). Organizational effectiveness is seen as the degree of achievement of multiple goals or the degree of congruence between organizational goals and observable outcomes (Zammuto, 1982). The focus of the rational organization is goal orientation (Cameron, 1980; Patterson, Purkey, & Parker, 1986). The design, articulation, and achievement of goals are emphasized in organizations applying this model. The assumptions in this model are that goals remain stable over time, that goals are determined by the leaders of the organization, and that goals become translated into objectives within the sub-units of the organization (Patterson et al., 1986). The organization is seen as an entity rationally structured in order to achieve the goals to which it subscribes. The goals are typically created to help the organization achieve its expected performance. The focus of evaluating effectiveness from this model is on the outputs produced by the attainment of goals (Cameron, 1980). The development of efficiency-related criteria to ensure the accomplishment of goals is often designed to influence the use of resources to achieve optimal performance, productivity, and profits for the organization (Brassard, 1993). This model led to such practices as management by objectives, which remained popular through the 1970s. The focus of management in this model is the accomplishment of the organization's goals (Hersey & Blanchard, 1982).
The emphasis of management within this model is setting goals and objectives that are accomplished by motivating and controlling
others in the organization to carry them out. In practice, however, organizations often do not reflect these goals in their daily activities, either at a micro or macro level (Brassard, 1993). There are limitations to a goals approach pointed out in the literature (Cameron, 1980; Etzioni, 1960; Katz & Kahn, 1978). Success may be overlooked if there is no goal to measure it. Goals may be too low, misplaced, or harmful in the long term. Goals are usually expressed as idealized states and are often not realistically assessed. The nature and effects of the social systems from which goals emerge often are not considered in the attainment of goals. The goals model may be useful when organizational goals are clear, consensual, and measurable (Cameron, 1980; Patterson, Purkey, & Parker, 1986). The criteria for determining effectiveness then become unique to that organization and its goals. Systems-based approaches emerged during the 1950s, according to Owens (1970) and Zammuto (1982). These models draw on the emerging body of general systems theory discussed earlier in this chapter. Applying this theory, organizational effectiveness is viewed as the extent to which an organization as a social system fulfills its objectives without incapacitating its means and resources and without placing a strain upon its members (Zammuto, 1982). W. Edwards Deming believed that systems are developed to perform repetitive tasks (Deming, 1982). Most problems within organizations, he believed, came from sub-optimization of the system, meaning the system was performing these tasks below its capability. The inconsistencies and contradictions that become apparent upon analysis of the system can be used to detect and isolate the flaws of that system (Bradley, 1993). Other models, appearing during the 1970s, are referred to as the multiple constituent definitions of effectiveness.
The organization is effective insofar as it meets the expectations of the actors associated with it in one way or another who try to promote their objectives and
interests (Brassard, 1993; Cameron, 1980). An organization is effective insofar as the majority of those participating in it perceive that they can use it to satisfy their interests. This model stresses the importance of satisfying the expectations of the actors who agree to support the organization and who influence its ability to obtain the resources it needs and to conserve its legitimacy (Brassard, 1993). This model acknowledges up front the influence that these constituents have on the organization. Related to these models are approaches driven by the requirements of external organizations such as accreditation agencies, laws, and regulations (Brassard, 1993). Dubin (1976) pointed out that organizational effectiveness has a different meaning depending on whether the organization is viewed from the outside or the inside. The inside perspective of an organization tends to be a traditional managerial viewpoint which emphasizes return on investment and efficient use of resources. The perspective from the outside evaluates the output of the organization relative to its contribution to the environment or the context outside the organization. Dubin (1976) further points out that there is no correlation between these two perspectives. In fact, he says, they are worlds apart and cannot be reconciled. "We must face squarely the fact that organizations live under conflicting demands regarding their effectiveness" (Dubin, 1976, p. 8). Bass (1952) suggested the criterion of organizational success needed to be expanded to include measures relevant to employees, society as a whole, and the organization's management. He suggested organizational performance should be assessed based on: (a) the degree to which an organization's performance was profitable and productive, (b) the degree to which an organization was of value to its employees, and (c) the degree to which an organization and its members were of value to society.
Campbell et al. (1974) found over 25 different variables
that were used as measures of effectiveness in organizations prior to 1973 (see Table 1). The most commonly recurring criteria were adaptability/flexibility, productivity, and satisfaction.

Determining Effectiveness in School Systems

MacLellan (1994) conducted a meta-analysis of organizational effectiveness criteria used in school systems. He found fifteen criteria: goals, environment, leadership, structure, work force, interaction, process, decision-making, workplace, culture, change, communication, curriculum, accountability, and politics. The studies used ranged from 1967 through 1991. Determining the effectiveness of schools is even more elusive than for other organizations. Schools, universities, and colleges have been referred to as organized anarchies in the literature of organizational study (Cameron, 1980). Some typical characteristics are:

1. Goals are ill-defined, complex, changing, and often contradictory. Goals of some sub-units may be unrelated to the broader organizational goals.

2. There is often no connection between the way work is done and the outcome.

3. More than one strategy can produce the same outcome.

4. There is little or no feedback from the output to the input.

5. Sub-units are not tightly connected, so it is easier to ignore outside influences.

6. Widely differing criteria of success may be operating simultaneously in various parts of the organization.

There is often an ambiguous connection between the organizational structure and the activities of the organization. It is typical to find rigid structures and hierarchies imposed upon loose, fuzzy processes.
Table 1. Factors of Organizational Effectiveness

Overall effectiveness of the organization
Productivity
Efficiency
Profit
Quality
Accidents
Growth
Absenteeism
Turnover
Motivation
Control
Flexibility/adaptation
Role and norm compliance
Readiness
Utilization of environment
Evaluations by external entities
Internalization of organizational goals
Satisfaction
Morale
Conflict/cohesion
Goal consensus
Managerial task skills
Managerial interpersonal skills
Managerial management communication
Stability
Value of human resources

Note: From The Measurement of Organizational Effectiveness: A Review of Relevant Research and Opinion (pp. 39-40), by J. P. Campbell, E. A. Brownas, N. G. Peterson, and M. D. Dunnette, 1974, San Diego: Naval Personnel Research.

Cameron (1980) makes the point that none of the described models of organizational effectiveness will work for organized anarchy. Criteria of effectiveness are usually vague and ambiguous, making organizational goals difficult to measure and not necessarily agreed upon by all sub-units. There is often no feedback loop between outputs and inputs, making the systems model an unnatural fit. Cameron (1980) suggests that the multiple constituencies model may be the most appropriate for the organized anarchy. The demands of the constituencies, once defined, can be assessed on the degree to which they are met.
Although there is substantial information regarding evaluation of schools and school programs in the literature, there is little consistency in the approaches and parameters evaluated. Eight components are reported by Nowakowski (1985). They include business and finance, curriculum and instruction, policy, planning and evaluation, pupil personnel services, personnel, school-community relations, and school management. MacLellan (1994) ranked components of school systems evaluated in the literature. Table 2 illustrates his findings.

Table 2. Meta-analysis Findings of School System Evaluation Components by Rank as Reported in the Literature.

1. Goals
2. Environment
3. Leadership
4. Structure
5. Workforce
6. Interaction
7. Process
8. Decision-making
9. Workplace
10. Culture
11. Change
12. Communication
13. Curriculum

Note: From "Towards a New Approach for School System Evaluation" (p. 159), by David MacLellan, 1994 (Doctoral dissertation, Dalhousie University, Nova Scotia). Dissertation Abstracts International.

There are three methodologies that appear to be represented in a substantial manner in the literature: (a) evaluation research, (b) curriculum audits, and (c) effective schools research. The researcher has also included a discussion of the accreditation process in Idaho. The discussion of the effectiveness of schools is often linked with discussions of accountability or the performance of student learning. Although both factors are relevant to
this study, the researcher focused on models that were using multiple indicators of effectiveness. A key deficit in most educational systems, which is all too frequently pointed out by the critics of public education, is the lack of effective evaluation (Worthen, 1987). The public demand for accountability has made many educators fearful of the concept. DeMont and DeMont (1973) suggested three improvements to the process of demonstrating accountability: (a) an increased focus on the outputs of education, (b) the production of more effective evaluative and research models, and (c) the inclusion of non-educators in the decision-making process. They suggest that an accountability model be a comprehensive plan for problem solving aimed at improving educational practice. The requirements of this model include: (a) the designation of the persons responsible for the program operation, (b) conducting an internal program review, (c) conducting an external program review, and (d) use of the results to diagnose needs and prescribe action. Evaluation research has been a tool used frequently in public schools to make judgments about the merit, value, or worth of educational programs (Gall, Borg, & Gall, 1996). Such studies are most often used to determine the effectiveness of specific programs, benefit-to-cost ratios, or areas for improvement. Formal evaluation consists of systematic efforts, using qualitative and/or quantitative designs, to define criteria and obtain accurate information (Worthen, 1987). Formal evaluation studies are often done as a basis for decision-making and policy formation, to evaluate curricula, to monitor expenditure of public funds, or to improve educational programs (Worthen, 1987). Worthen (1987) notes that many evaluation studies do not lead to significant improvements in school programs. He cites several reasons, including inadequacies of research design, the use of evaluation information,
and the view of evaluation as a discrete study rather than a system of self-renewal (Worthen, 1987). The Joint Committee on Standards for Educational Evaluation (1994) developed standards designed for use in judging the quality of educational evaluation. These standards cover criteria involving the utility of the evaluation, its usefulness to the persons involved, the feasibility of the design for the setting, legal and ethical factors, and the extent to which the study yields valid, reliable, and comprehensive information for making judgments. Guba (1981) outlined four major models of educational evaluation: (a) objectives, (b) outcomes, (c) effects, and (d) audience concerns. Evaluation approaches that are based on specific goals and objectives assess the congruence between the standard or the goal and the performance (Provus, 1971). In a discrepancy-based model of evaluation, standards are defined and developed, the performance is assessed, the discrepancy is determined, there is feedback to the decision-makers, and there is a decision. The critical point in this model is the establishment of a standard and assessment against that standard. The C.I.P.P. (Context, Input, Process, Product) model is a decision-making approach relying on the generation of information to be used in making decisions (Stufflebeam, 1983). The context provides an illustration of the needs and goals; the input, how resources and procedures are used to reach goals; the process focuses on any defects in the implementation of those goals; and the product, the measurement of the outcomes. Scriven (1973) proposed the consumer-oriented model that included establishing standards or indicators; comparing effects to benefits and costs; and making judgments about change, use, and choice. The focus in this model is the judgment of merit or worth.
The countenance model, later called the responsive model, distinguishes three phases: (a) antecedents, (b) transactions, and
(c) outcomes (Stake, 1967). This model relies on an informal framework in which observation, judgment, and data matrices are emphasized. Curriculum audits provide another source of evaluating organizational effectiveness for schools. Curriculum management audits were first offered by the accounting firm of Peat, Marwick, Mitchell and Company, where a partner, Fenwick English, adapted the process from the financial audit (Vertiz, 1995). He brought the service to the American Association of School Administrators, which created the National Curriculum Audit Center. The Center trains curriculum auditors and contracts with school districts. The first audit was done in 1979 in the Columbus Public Schools in Ohio. As of April 1995, curriculum audits had been performed in nearly 100 school districts in the United States and two foreign countries (Vertiz & Bates, 1995). According to Vertiz (1995), the audit became an important data source in state take-overs of school systems in New Jersey and Kentucky. It is based upon the concepts of effective instruction, curricular design, and delivery. The audit is designed to determine the extent to which a sound, valid, and operational system of curriculum management is implemented (Vertiz & Bates, 1995). According to Vertiz and Bates, curricular quality control requires: (a) a written curriculum in a clear, translatable form for application by teachers in classrooms or related instructional settings; (b) a taught curriculum which is shaped by, and is interactive with, the written curriculum; and (c) a tested curriculum which includes the tasks, concepts, and skills of pupil learning that are linked to both the taught and written curricula.
English (1988) described the five standards he created for the auditing process:

1. The school district is able to demonstrate its control of resources, programs, and personnel. Control refers to the system's ability to channel and focus its resources toward the achievement of its goals and its mission (Kamen, 1993). Auditors look for indicators that demonstrate linkages between the board, central management, and the instructional process (English, 1988).

2. The school district has established clear and valid objectives for students. Auditors examine board policy, administrative procedures, courses of study, and the scope and sequence of curriculum (English, 1988).

3. The school district has documentation explaining how its programs have been developed, implemented, and conducted. The district must demonstrate clear and operational linkages between all layers of the system. Auditors look for alignment between policy, curriculum, instruction, materials, and assessment (English, 1988).

4. The school district uses the results from district-designed or adopted assessments to adjust, improve, or terminate ineffective practices. Auditors evaluate the extent to which the district collects data to evaluate its performance. The data should reflect the district's goals and provide usable information that can be used to adjust or improve district goals (English, 1988).

5. The school district has been able to improve productivity. Productivity is the relationship between the inputs and the cost of obtaining any given level of outputs (English, 1988).

Each standard encompasses numerous criteria. These criteria are evaluated through document reviews, interviews with the board and professionals, and observations by trained
auditors who are school administrators external to the organization. The data are then triangulated according to agreed-upon conditions. Table 3 illustrates specific criteria and the findings of a study of audits conducted between 1988 and 1994 by Vertiz and Bates (1995). The authors conclude that the majority of school districts who participated in the audit process were deficient in major management structures and functions that pertain to the design and delivery of curriculum. The investigators found that 90% or more of the findings were deficient in 80% of the areas investigated. Kamen (1993) found that the extent of implementation of the audit recommendations is dramatically affected by the nature of the audit selection method. When the audit is voluntarily selected by a district, there is a high level of implementation. There are generally positive effects, as demonstrated by greater empowerment of all personnel and a tendency towards a systems perspective. Management processes appeared to improve. Results suggest that there is significantly less implementation of recommendations when the process is mandated. Under some conditions, it can become a political battleground with resistance, denial, and defensiveness.
Table 3. Curriculum Audit Findings of 67 School Districts Between 1988 and 1994.

Standard number: Criteria                      Strong rating    Deficient rating
1: Policy design                               6%               94%
1: Policy implementation                       0%               100%
1: Planning design                             10%              90%
1: Planning implementation                     10%              90%
1: Organizational structure                    5%               95%
1: Organizational implementation               27%              73%
1: Personnel practices                         0%               100%
1: Personnel supervision                       14%              86%
2: Instructional goals and objectives          6%               94%
2: Curriculum scope                            22%              78%
2: Curriculum guide: design                    2%               98%
2: Curriculum guide: delivery                  0%               100%
2: Curriculum management structure             3%               97%
3: Internal consistency                        3%               97%
3: Equity: design                              5%               95%
3: Equity: implementation                      7%               93%
3: Monitoring practices                        4%               96%
3: Staff development: design                   2%               98%
3: Staff development: delivery                 0%               100%
3: Articulation and coordination               2%               98%

(table continues)
Table 3, cont'd. Curriculum Audit Findings of 67 School Districts Between 1988 and 1994.

Standard number: Criteria                      Strong rating    Deficient rating
Testing program: scope                         2%               98%
Testing program: quality                       3%               97%
Use of assessment data                         2%               98%
Use of program evaluation data                 0%               100%
Curriculum-driven budget                       0%               100%
Cost effectiveness                             4%               96%
Organizational improvement                     0%               100%
Facilities                                     39%              61%
School climate                                 83%              17%
Support system functioning                     30%              70%

Note: From The Curriculum Management Audit: Revelations About Our School (1995), by Virginia Vertiz and Glynn Bates. Paper delivered to the American Education Research Association, Division B.

Effective schools research offers a set of criteria for determining organizational performance. Effective schools have been described by several parameters: adding value through their services, high evaluations from students, high expectations and high norms of achievement, strong leadership, collaborative decision-making, clear goals, a system-wide culture, a safe environment, and a dedicated workforce (Mann, 1976; Purkey & Smith, 1982). Seven characteristics emerged from the body of literature known as effective schools research (Edmonds, 1980). They include (a) strong instructional leadership; (b) a safe, orderly climate; (c) high expectations for achievement; (d) emphasis on
basic skills; (e) continual monitoring of progress; (f) goals that are clear and understood; and (g) culture. Research began in the mid-1970s to determine why some schools were effective and others were not. The research of Ron Edmonds and Lawrence Lezotte began in large urban schools. Their study resulted in the identification of correlates present in effective schools (Edmonds, 1979; Lezotte & Bancroft, 1985). Strong administrative leadership was found to exist, particularly focused on planning, supporting, and monitoring the instructional process. There were high expectations for all students and the staff of the building. A positive school climate existed in the building, as evidenced by a sense of pride and community. There was a focus on the instructional program in the total school, with emphasis on training in teaching practices. Finally, there was a thorough assessment process allowing for continual monitoring of student progress. Numerous improvement strategies followed that focused on developing the specific correlates in schools. Stefanich (1983) pointed out that much of the impetus for these applications was based on intuitive rationale rather than hard data. Lezotte (1989) has since integrated quality theory into his approach to school improvement, citing such precepts as the importance of an attitude of continuous improvement, a deliberate change strategy, and attention to all parts of the system. Some independent contractors using the correlates of effective schools as the standard have created an audit-type process. Each state has an accreditation process, usually affiliated with a regional accrediting organization (Portner, 1997). In Idaho the purpose of accreditation is to help schools achieve the required Standards for Idaho Schools and enhance school improvement (Idaho State Department of Education, 1996). There are four options for how Idaho schools seek accreditation.
They may choose one of the following options:
1. The Idaho Elementary/Secondary Accreditation Standards.

2. The Northwest Accreditation Standards.

3. The Idaho School Accreditation School Improvement Model.

4. An alternative school improvement plan.

Regardless of the option selected, the school must demonstrate the standards defined and the components of thoroughness as specified by Idaho Code 33-119. A thorough system of education has been defined in Idaho Code as one in which:

1. A safe environment conducive to learning is provided.

2. Educators are empowered to maintain classroom discipline.

3. The basic values of honesty, self-discipline, unselfishness, respect for authority, and the central importance of work are emphasized.

4. The skills necessary to communicate effectively are taught.

5. A basic curriculum necessary to enable students to enter academic or vocational post-secondary educational programs is provided.

6. The skills necessary for students to enter the workforce are taught.

7. The students are introduced to current technology.

8. The importance of students' acquiring the skills to enable them to be responsible citizens of their homes, schools, communities, state, and nation is emphasized.

Regardless of the option selected, schools must demonstrate on an annual basis the five required standards:

Standard I: Philosophy/Mission, Vision, Policies: School philosophy and policies need to be aligned with thoroughness legislation.
Standard II: Personnel and Certification: All educators of Idaho students must be certified as specified in the State Board of Education Rules for the Public Schools of Idaho.

Standard III: Curriculum/Instruction/School Improvement: This standard is defined in the thoroughness legislation.

Standard IV: Accountability/Assessments/Measures: Schools must establish standards for all grade levels and high school exiting standards for graduation, participate in statewide testing programs, have written plans to reduce dropouts, and report on student attendance.

Standard V: Safe Learning Environment: Schools must have safe facilities. Each school must have a comprehensive, district-wide policy and procedure in place encompassing safe environment and discipline.

There are specific additional standards for each level: elementary, middle, and secondary. Each standard has specific criteria to which deviation points are assigned. Schools are accredited annually according to ratings determined by points. If schools receive a status of not approved for more than one consecutive year, state funds can be withheld and a report to the public is made. The Northwest accreditation process involves a self-study for initial accreditation involving staff, students, and community (NASC, 1996). The accreditation process runs on a ten-year cycle, involving a self-study during the ninth year of the process. What is unclear in the literature is how the information from any method of determining organizational effectiveness is used. Brassard (1993) cautions against the need to compare the performance of organizations or to identify characteristics of those that are effective. Having criteria of effectiveness reinforces the notion that: (a) organizations
possess an inherent rationality and (b) criteria become requirements which are often imposed independent of their purpose. He makes the point that the criteria adopted must define the performance the organization must achieve if they are to be useful. Hannan and Freeman (1977) argue that inter-organizational comparisons cannot be accomplished because there is no basis for scientific analysis of comparative organizational effectiveness.
The above review of approaches to the overwhelming task of determining organizational effectiveness illustrates the varied strategies used in the past and present. Such performance assessments are done for different reasons. There is little discussion in the literature regarding the process of evaluating organizational self-study for the ultimate purpose of improvement. There is an increasing interest in action research, or practitioner-based research done by practitioners within their own site as a reflective process of investigation (Anderson et al., 1994). Practitioner research is best done as a collaborative effort to gain multiple perspectives for the purpose of taking action in a specific situation. Zammuto (1982) points out that it is useful to remember that organizations are social inventions created to satisfy human needs. These needs influence how people evaluate the effectiveness of organizational performance, based on their experience with organizations and the impact of that performance on them or their preferences. The purpose of assessment in anything is to determine the performance and then improve it.
Quality Theory
The importance of theory as it relates to the areas cited above is explored, since from theory, assumptions, models, practices, and tools emerge (Skrtic, 1991). Many companies today are using total quality theory or continuous improvement theory both as a conceptual framework and operationally (Walton, 1990).
Total quality or continuous improvement is an
approach to organizational development that has both historic roots and evolving tenets. It involves both reflective and active components for organizational development. It has been defined as a people-focused management system that aims at continual increase of customer satisfaction at continually lower real cost (Crosby, 1984; Deming, 1986; Imai, 1986). This theoretical model is a systems approach to organizational improvement, meaning that improvements should be made with the whole organization in mind (Deming, 1986, 1994; Senge, 1990).
The terms total quality control and total quality management were coined by Feigenbaum (1983). He defined total quality control as “an effective system for integrating quality development, quality maintenance, and quality improvement efforts of various groups in an organization so as to enable marketing, engineering, production, and service at the most economical levels to allow for full customer satisfaction” (Feigenbaum, 1983, p. 823). He used the term total to mean a systems approach to achieving excellence. He defined quality in terms of the specific requirements of the customer.
Japanese management theory has influenced quality theory in the Western world significantly. Referred to as Kaizen in Japan, it is the single most important concept influencing Japanese management (Imai, 1986). The Kaizen philosophy means on-going improvement through the involvement of everyone, in all aspects of life. Imai remarks, “I came to the conclusion that the key difference between how change is understood in Japan and how it is viewed in the West lies in the Kaizen concept; a concept that is so natural and obvious to many Japanese managers that they often do not realize that they possess it!” (Imai, 1986, p. 3). He concludes, based on his many years of studying the differences, that this concept is either very weak or non-existent in American and European business.
There are consistent fundamental principles in quality theory which emerge upon review of the literature by those credited as the major experts in the quality field. The researcher will focus on these principles rather than an in-depth analysis of the historical perspectives of quality theory. Upon review of the literature, it is clear that quality theory has emerged from a variety of historical management approaches, economic contexts of the times, cross-cultural influences of both East and West, and an ever-increasing body of knowledge that is evolving through practice (Crosby, 1984; Danne, 1991; Deming, 1986).
W. Edwards Deming, often considered the “father of quality,” developed a theory of profound knowledge that incorporates the major tenets of quality theory (Deming, 1986, 1989, 1994). He believed that not only skills, but also knowledge about management, was paramount. Deming (1989) stated, “hard work and best efforts, put forth without guidance of profound knowledge, leads to ruin in the world that we are in today. There is no substitute for knowledge” (Deming, 1984, p. 10). The system of profound knowledge includes four principles, each related to and interacting with the others.
The first principle is appreciation for a system, which Deming defined as “a network of interdependent components that work together to try to accomplish the aim of the system” (Deming, 1984, p. 50). He stressed the interdependencies within a system and the necessity of cooperation among the parts. The greater the interdependence between the components, the greater the need for communication and cooperation between them. The system needs to have an aim that is clear to all in the organization. Without this clear purpose, says Deming, the aim becomes a value judgment made on an individual basis (Deming, 1984). Deming often used the example of a good orchestra to illustrate a well-optimized system.
“The players are not there to play solos as prima donnas, to catch the ear of the listener. They are
there to support each other. They need not be the best players in the country” (Deming, 1984, p. 15). According to Deming, management of a system is action based on prediction. The prediction needs to be rational and based on the information that the system teaches people in the organization. Therefore, the performance of any part of the system must be judged in relationship to the other parts and the aim of the system.
A second element is knowledge of variation, or statistical theory. Deming believed that without statistical analysis methods, attempts to improve a process would be hit or miss. Understanding that variation will always exist in all components of a system (people, processes, results) is fundamental. He called for an understanding of the capability of a process. Developing stable processes, meaning processes in a state of statistical control, is the goal in determining a system’s capability. He makes the distinction between two types of variation: special cause and common cause. Common causes he defines as the variations that occur by chance and are attributable to the system itself. Special causes, on the other hand, are caused by events outside of the system. Deming felt it was important to know the difference before one attempted to work on a system (Deming, 1986, 1989). If these distinctions are not understood, he suggested, costly and ineffective mistakes can be made.
The prevention of errors and nonconformance to specifications are key principles resulting from the knowledge of variation (Crosby, 1984; Deming, 1986). Philip Crosby, a recognized quality expert, invented the term zero defects, which he defined as no acceptable rate of defects for products or services that fail to meet the customer’s requirements (Crosby, 1984). The emphasis is on prevention, rather than inspection, the process of detecting the good and the bad.
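Deming’s distinction between common-cause and special-cause variation is usually operationalized with a Shewhart-style control chart. The sketch below is purely illustrative and is not drawn from Deming’s texts: the baseline data, observations, and function names are hypothetical, and a simple three-sigma rule stands in for the fuller set of control-chart tests used in practice.

```python
def control_limits(baseline):
    """Estimate 3-sigma control limits from an in-control baseline run."""
    n = len(baseline)
    mean = sum(baseline) / n
    sigma = (sum((x - mean) ** 2 for x in baseline) / (n - 1)) ** 0.5
    return mean - 3 * sigma, mean + 3 * sigma

def classify(lcl, ucl, observations):
    """Points beyond the limits suggest special-cause variation;
    scatter inside the limits is treated as common-cause."""
    return ["special" if not (lcl <= x <= ucl) else "common" for x in observations]

# Hypothetical process measurements taken while the process was known stable
baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.9, 10.1, 10.0]
lcl, ucl = control_limits(baseline)
print(classify(lcl, ucl, [10.05, 9.95, 12.0]))  # → ['common', 'common', 'special']
```

On Deming’s account, a “special” point calls for investigating an outside event, while reducing the “common” scatter requires changing the system itself; confusing the two is exactly the costly mistake he warned against.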
Deming’s third principle builds on the need to use the system and its variation to generate what he called the theory of knowledge (Deming, 1986, 1989, 1994). He believed that good intentions were not enough for management. Managers, he insisted, need to continually build knowledge and theories based on that knowledge. Deming believed that this was the only basis for management’s ability to predict. When processes are in a state of statistical control, statistical theory can assist in prediction. Theories then emerge from this knowledge. Theories, professed Deming, are necessary to generate questions. Without questions, there may only be examples of successes, and if these are duplicated under the pretense of a solution, they can lead to failure. Continual application of narrow solutions can lead to needing more and more of the solution (Senge, 1990). Theory is critical in optimizing a system so that it can meet the customer’s expectations the first time (Deming, 1986, 1989; Crosby, 1984).
Joseph Juran, another quality expert, extended Deming’s beliefs about knowledge-based decisions to the role of management (Juran, 1988). He believed that it was the responsibility of top management to lead the company through massive training in quality. Juran placed an emphasis on planning, customer satisfaction, and the use of data collection and analysis, and has been credited with being the first to address the broader issues of management as they relate to quality (Danne, 1991; Miller, 1993).
The fourth principle of profound knowledge is psychology. Deming felt that this body of knowledge was critical to the interaction between people and circumstances, the interaction among people, and the interaction between people and the system (Deming, 1986, 1989). He emphasized the importance of leaders in recognizing the differences in people and using these differences to optimize the system.
Recognition of differences in how
people learn, how they work, and how they relate to each other is an additional factor that a manager should understand. Leaders are obligated to make changes in the system which will result in improvement (Crosby, 1984; Deming, 1986). Critical to this system of profound knowledge are the following:
1. People have an innate need for self-esteem and respect.
2. Circumstances can either provide or deny people opportunities for dignity and self-esteem.
3. Management that denies opportunities for dignity and self-esteem will smother intrinsic motivation.
4. Some extrinsic motivators rob employees of dignity and self-esteem.
5. Management should recognize the innate inclination of people to learn and invent.
Deming believed that new systems of rewards needed to be established to restore respect for the individual and release the potential of human resources (Deming, 1986, 1989, 1994). Organizational behavior can affect the quality of services, products, and, in the case of schools, the quality of instruction (Deming, 1994; Patterson et al., 1986).
What has emerged from the Deming system of profound knowledge is an evolving body of knowledge that incorporates systems theory, scientific method, management by fact, and participation of everyone within the system. Each of the quality experts mentioned has a similar message, emphasizing different concepts. Table 4 provides a matrix of key quality principles and the interpretation of each offered by Deming, Juran, and Crosby.
Table 4. A Comparison Among Teaching Theories of Quality Experts.
Definition of quality
  Deming: Predictable degree of dependability suitable to market
  Juran: Fitness for use
  Crosby: Conformance to requirements
Performance standards
  Deming: Use of statistics to measure performance in all areas
  Juran: Avoidance of campaigns to do perfect work
  Crosby: Zero defects
Approach to improvement
  Deming: Optimization of system; elimination of goals without methods
  Juran: Management must consider human side of quality
  Crosby: Prevention; process development
Statistical process control
  Deming: Use SPC for quality control
  Juran: Use could lead to “tool-driven” approach
  Crosby: Rejection of statistically acceptable levels of quality
Employee participation
  Deming: Employee participation in decision-making
  Juran: Use of teams; quality circles
  Crosby: Quality improvement teams
Continuous Improvement in Business
The history of the recent movement to improve performance in the private sector is relevant to current and future applications in other settings. The origins of the quality
movement can be traced to the 1940s and the context of World War II (Pines, 1990). The United States War Department established a Quality Control section in 1942 in response to an increased demand for mass production of weapons and other war materials. Staff from Bell Telephone Laboratories were used, primarily two statisticians, Walter A. Shewhart and W. Edwards Deming. Their approach was to predict the performance of production by measuring manufacturing processes and stabilizing their performance. When these statistical techniques were applied, America’s defense production was exemplary. At that time, the progressive approach to manufacturing was referred to as acceptable quality levels (AQL), which assumed that there was an acceptable level of allowable failures. Shewhart and Deming suggested that this AQL approach was one of the reasons why the United States was seeing a decline in productivity compared to other countries.
Garvin (1985) reports four major quality eras. Prior to and during the 1930s, the emphasis was on inspection. Processes for detecting defects, such as grading, counting, and repairing, were common in American businesses. From the 1930s to the 1950s, statistical quality control became popular. This strategy assumed that the principles of probability and statistics would allow managers to control the variation in a production process and determine whether the cause of the variation was inherent in the process or the result of a special cause. During the 1950s and 1960s, the quality assurance movement emphasized the planning function, and the concept of continuous process improvement originated. The linkage between quality and controlling costs was made. Beginning in the 1980s, the quality management period was significantly influenced by W. Edwards Deming. An NBC-TV documentary that aired on June 24, 1980, If Japan Can, Why Can’t We?
explored how Japanese products came to be perceived as far superior to those of the United States (Walton,
1990). During an interview, Deming, age 79 at that time, shared how he taught Japanese management and engineers to use quality as a system. These techniques enabled them to detect and eliminate defects, cut down on waste, reduce costs, and increase productivity. He used methods referred to as statistical process control (SPC). Although they had been used in America after World War II, their use faded when volume overruled quality.
Since 1980, many companies have adopted quality principles and practices. Curt Reimann, recently retired Director of the Malcolm Baldrige National Quality Award, reported in a telephone interview that all of the high-performing companies today are, in one way or another, applying quality principles and practices. He further related that there were many examples of failed attempts, but companies that have successfully applied these principles and become learning organizations are realizing results. Brown (1994) found that some executives felt quality peaked in 1992 and that many companies have abandoned quality to resume a back-to-basics approach emphasizing results. Reimann pointed out in the interview that the only reason to attend to processes is to improve results. This unfortunate but common misunderstanding in the application of quality has been substantiated by the literature (Brown, 1994).
To encourage United States companies and reward them for providing high-quality products and services, the Malcolm Baldrige National Quality Award was created in 1987 under President Reagan. The award was named after Malcolm Baldrige, the Secretary of Commerce credited for his managerial approach to long-term improvement in economy, efficiency, and effectiveness in government (National Institute of Standards and Technology, 1994).
Through the Malcolm Baldrige National Quality Improvement Act of 1987, which amended the Stevenson-Wydler Technology Innovation Act of 1980, Congress established the Baldrige Award, creating a public-private partnership designed to
encourage quality in American companies (Brown, 1994). Garvin (1991) stated, “In just four years, the Malcolm Baldrige National Quality Award has become the most important catalyst for transforming American business. More than any other initiative, public or private, it has reshaped managers’ thinking and behavior” (Garvin, 1991, p. 80). The MBNQA (1994) was designed to promote:
1. Awareness of quality as an increasingly important element in competitiveness.
2. Understanding of the requirements for quality excellence.
3. Sharing of information on successful quality strategies and the benefits derived from the implementation of those strategies.
The Council on Competitiveness (1995) compiled a report after studying the Baldrige Award and quality in the United States. Their findings were as follows:
1. The quality of American goods and services is getting better. Unfortunately, this progress has led to the perception that extending quality management principles and practices is no longer a high national priority. Our competitors are continuously improving their quality, and the United States cannot afford to be complacent.
2. The Baldrige National Quality Award and its state and local offshoots have been key in the effort to strengthen United States competitiveness. The annual government investment of $3.4 million in this program is leveraged by over $100 million in private sector contributions. The impact of the Baldrige Award on the competitiveness of United States industry and the dividends it pays to the United States economy far exceed these investments.
3. The United States quality movement faces a new set of challenges. We need to overcome the confusion of terms and apparently competing approaches (TQM, ISO 9000, reengineering). New ways to extend quality to more large companies, as well as to small-
and medium-sized enterprises, are needed, and new sectors, such as government, education, and healthcare, should be included.
4. Although a number of vehicles are available to advance the process of promoting quality management, including state and local quality award programs, colleges and universities, and the Manufacturing Extension Partnership, there has been inadequate coordination among them and with the National Baldrige Award Program.
5. The Baldrige Award Program, having galvanized United States quality efforts, is now positioned to become the vehicle for stimulating and coordinating efforts to expand quality as a national priority.
In a telephone interview with Curt Reimann, this researcher inquired about the development of the specific criteria used. He related that the National Institute of Standards and Technology (NIST) began by analyzing organizations that were currently succeeding, and isolated characteristics that were present in those organizations. A model was developed consistent with the prevailing quality theory at that time. In order to ensure that the criteria and processes remained relevant and reflected current thinking, the designers of the MBNQA developed a two-year revision cycle (Bemowski, 1996). The process allows for continuous improvement reflecting what has been learned both in theory and in practice. Reimann indicated that there has not been any effort on the part of NIST to empirically validate the criteria. The approach, however, has been one of accumulating the information qualitatively and drawing inferences. The intent of the criteria and award process is not to be prescriptive.
A survey was conducted to determine how the criteria are being used, specifically by those companies who requested applications but had not applied for the award (Bemowski & Stratton, 1995). Three findings were reported:
1. The criteria were overwhelmingly used as a source of information on how to achieve business excellence. Over 48% of respondents were using the criteria to improve processes in their companies, while less than 25% used the criteria to apply for the award. About 50% of the respondents indicated that they used the criteria to promote a common language within the company.
2. The majority of respondents found that the criteria’s usefulness met or exceeded their expectations.
3. There was great diversity in the enterprises using the criteria. They were predominantly used by managers across a broad range of industries.
The researchers concluded that the stated purposes of the award were being accomplished.
The difficulties in interpreting the MBNQA criteria are well known (Bemowski, 1996). That factor was a consideration during the latest revision of the Award criteria, according to Reimann in his interview. The 1997 MBNQA criteria categories are as follows:
1. Leadership: Refers to how well senior managers provide leadership and sustain clear values, directions, performance expectations, customer focus, and a leadership system throughout the company.
2. Strategic Planning: Examines how the company sets and determines strategic directions and key action plans by translating them into an effective performance system.
3. Customer and Market Focus: Examines how well a company determines its customers’ expectations and then satisfies customer needs.
4. Information and Analysis: Examines the management and effective use of data and information to support key company processes and the performance measurement system.
5. Human Resource Development and Management: Examines requirements to develop full workforce potential, i.e., an environment conducive to full participation, quality leadership, and personal and organizational growth.
6. Business Results: Examines performance and improvement made by the organization, including customer satisfaction, financial and market performance, human resource results, supplier and partner performance, and operational performance.
The framework from which the criteria are designed is based on a systems perspective, as illustrated in Figure 1. Refer to Table 5 for the organization of the categories.
Pannirselvam (1995) conducted a study to validate the MBNQA model and evaluation process. Results from 1993 data from state awards that follow the same criteria revealed that the model is internally consistent and a reliable measure of quality. Tables 5 through 7 summarize the findings of that study.
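Internal consistency of the kind Pannirselvam’s study reports is conventionally quantified with a statistic such as Cronbach’s alpha, which compares the summed variance of individual item scores to the variance of respondents’ total scores. The sketch below only illustrates the computation; the scores are hypothetical, and neither the data nor the resulting value are taken from the study.

```python
def variance(values):
    """Sample variance (n - 1 denominator)."""
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / (len(values) - 1)

def cronbach_alpha(rows):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(rows[0])                  # number of items
    items = list(zip(*rows))          # item scores, column-wise
    totals = [sum(r) for r in rows]   # each respondent's total score
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

# Hypothetical ratings: rows = evaluators, columns = items within one category
scores = [
    [4, 5, 4],
    [2, 3, 2],
    [5, 5, 4],
    [3, 4, 3],
    [1, 2, 2],
]
print(round(cronbach_alpha(scores), 2))  # → 0.97
```

Values near 1 indicate that the items move together and so plausibly measure a single construct, which is the sense in which the MBNQA items were found to be a reliable measure of the traits they target.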
Table 5. Baldrige National Quality Award Criteria, 1997.
1. Leadership
  1.1 Leadership system
  1.2 Company responsibility and citizenship
2. Strategic Planning
  2.1 Strategy development process
  2.2 Company strategy
3. Customer and Market Focus
  3.1 Customer and market knowledge
  3.2 Customer satisfaction and relationship enhancement
4. Information and Analysis
  4.1 Selection and use of information and data
  4.2 Selection and use of comparative information and data
  4.3 Analysis and review of company performance
5. Human Resource Development and Management
  5.1 Work systems
  5.2 Employee education, training, and development
  5.3 Employee well-being and satisfaction
6. Process Management
  6.1 Management of product and service processes
  6.2 Management of support processes
  6.3 Management of supplier and partnering processes
7. Business Results
  7.1 Customer satisfaction results
  7.2 Financial and market results
  7.3 Human resource results
  7.4 Supplier and partner results
  7.5 Company specific results
Note: From Malcolm Baldrige National Quality Award Criteria, 1997, National Institute of Standards and Technology. (Gaithersburg, MD: United States Department of Commerce and Technology Administration)
Table 6. Validity of the MBNQA Model.
Research Question (Finding)
1. Are the items under each criterion a reliable measure of the trait they attempt to measure? (Yes)
2. Is the MBNQA model a good and complete measure of quality management practices? (Yes)
3. Do the MBNQA criteria represent an accurate measure of an organization’s quality management practices? (Yes)
4. Do all the items under each of the seven categories represent a single construct? (Yes)
5. Is there variability in the assessment of total quality systems? (Yes)
6. Is the assessment of some elements more variable than others? (Yes)
7. Is the variability in assessment related to the type of organization evaluated? (Yes)
8. Is the variability in assessment related to the characteristics of the evaluator? (Yes)
Note: From Statistical Validation of the Malcolm Baldrige National Quality Award Model and Evaluation Process, 1995, by Pannirselvam. (Doctoral dissertation, Arizona State University) Doctoral Dissertations.
Table 7. Accuracy of the MBNQA Weights.
Research Question (Finding)
1. Do the weights assigned to the examination items accurately reflect the importance of each of these items to a good management system? (No)
2. Should the weights assigned to the seven criteria be different for different sizes and types of businesses? (Yes)
Note: From Statistical Validation of the Malcolm Baldrige National Quality Award Model and Evaluation Process, 1995, by Pannirselvam. (Doctoral dissertation, Arizona State University) Doctoral Dissertations.
There is debate in the literature and in the field regarding the effectiveness of the MBNQA criteria and the award process. Some criticisms focus on the notion that companies spend too much time and money on the application process and are distracted from the work of the company (Crosby, 1991). Concern has also been expressed that there is no clear definition of quality. Advocates for the criteria stand firmly on the belief that, since the criteria are not intended to be prescriptive, there should not be a common definition of quality. Criticism has also centered on the belief that the criteria favor the selection of companies that produce high-quality results and are financially successful.
Today quality is seen as a field unto itself, with its own theory, models, practices, and tools. Although its early applications were predominantly in manufacturing, the applications have quickly spread to service industries and the public sector.
Continuous Improvement in Education
Deming (1994) described America 2000: An Education Study as a “. . . horrible example of numerical goals, tests, rewards but no method.” Can the theories, principles, and
practices of quality or continuous improvement, developed in industry, help in the transformation of schools (Bradley, 1993; Fields, 1994; Glasser, 1992; Langford, 1993; Rhodes, 1990; Schmoker, 1996; Tribus, 1993)? Our current system of education has been influenced historically by industry as well. Educational administration has its roots in the theory of scientific management, spawned by Frederick Taylor during the period of 1910-35 (Bradley, 1993; Stempen, 1987). Max Weber also had considerable influence during that period on the management and administration of organizations (Owens, 1970). He characterized the ideal bureaucracy as having the following characteristics:
1. A division of labor based on functional specialization.
2. A well-defined hierarchy of authority.
3. A system of rules covering the rights and duties of employees.
4. A system of procedures for dealing with work situations.
5. Impersonality of interpersonal relations.
6. Selection and promotion based on technical competence.
However, Weber also warned that massive, uncontrollable bureaucracy could be a threat to free enterprise capitalism (Owens, 1970).
In the 1950s, systems theory was applied to schools as social systems with a hierarchical role structure (Owens, 1970). These theories attempted to understand the organization as a place of greater productivity and efficiency. Ernest Hartwell, a superintendent of three different large city school systems in the early 1900s, held that if administrators applied business principles to progressive educational ideas, schools would become efficient, stable organizations and would be more profitable to the students and the
community (Thomas & Moran, 1992). Today’s organization of teaching, testing, and judging by grades has its roots in the industrial revolution’s theories of mass production, inspection, and re-work (Deming, 1994).
Deming (1986, 1994) had much to say about the system of education and the type of changes that need to occur. He believed that not only did business management ignore psychology, but so did the managers of education. His beliefs about the individual differences of people and their need for self-esteem, dignity, and intrinsic motivation provided a basis for his criticism of the educational system. His theory, practices, and tools have been very appealing to educators who are trying not to lose hope in the on-going battle to improve schools and reverse the tide of public criticism.
In 1991, the Association of Quality Control conducted its first Quality in Education Survey (Klaus, 1996). At that time, 133 K-12 and higher education institutions responded, indicating that quality had been implemented. Five years later, the study indicated that 451 educational institutions had implemented quality (Klaus, 1996). Educators have wrestled with the theories of quality and the wisdom of experts in industry and have made applications in a meaningful way (Bernhardt, 1994; Bonstingl, 1996; English et al., 1994; Fields, 1994; Glasser, 1992; McClanahan & Wicks, 1994; Rubin, 1994; Tribus, Langford & Cleary, 1995). Increasing efforts are being made to conduct research on the application of quality in the field of education (Chapell, 1993; Danne, 1991; Fritz, 1993; Louer, 1993; Miller, 1993; Partin, 1992; Regauld, 1993; Smith, 1996).
The results in organizational improvement can be seen in several schools and school districts across the country. Improvements in disciplinary action have been reported by Mt. Edgecumbe in Sitka, Alaska (Danne, 1991; Langford et al., 1995). Declines in drop-out rates
Further reproduction prohibited without perm ission.
have been cited by George Washington Vocational/Technical High School in New York City (Danne, 1991). Redesign of programs to prevent students from dropping out and to increase their success is a focus for some schools (Danne, 1991). Improved systems of data collection, analysis, and benchmarking have been developed in a number of school districts (Langford et al., 1995; Seigal et al., 1994). Motivated by high failure rates, staff at the Parkview School District identified root causes and implemented multiple systemic solutions, resulting in a decrease in the failure rate of 50% in just one year (Seigel et al., 1994). The emphasis at Christa McAuliffe Elementary School in Prince William County, Virginia, has been on teaching students quality practices and tools to assist them in working together, being responsible for their own learning and progress, and involving the larger community (Seigal et al., 1994). There are 70 elementary schools in the United States and abroad where the Koalaty Kid model of continuous improvement has been implemented (Green, 1996). Students use quality tools to monitor their own progress and improvement in mastering new skills and content. In 1993 the decision was made to launch the Malcolm Baldrige National Quality Award program using the Education Criteria Pilot in 1994-95. A pilot approach was taken to address the many issues involved in extending eligibility to education (National Institute of Standards and Technology, 1995). During the pilot year, schools that applied were not eligible for the award. The objectives of the Education Pilot Program were to:

1. Determine the interest and readiness of educational organizations to participate in a national-level recognition program based on the ability to demonstrate overall performance improvement.
2. Evaluate the Pilot Criteria.
3. Determine the capability of the evaluation system, including volunteer experience, availability, and time commitment.
4. Determine the value of the feedback given to Pilot Program participants.
5. Determine whether or not there should be subcategories of eligibility, taking into account school type and size.
6. Determine the likely influence of the award on: (a) sharing of best practices information, (b) cross-sector cooperation, and (c) elevation of educational standards.

The criteria for the Education Pilot are based on core values and concepts. These are summarized in Table 8.
Table 8. Core Values/Concepts of MBNQA, Education Pilot, 1995.

I. Learning-Centered Education
   A. Focus on learning and real needs of learners
   B. High developmental expectations/standards for all students
   C. Understanding that student learning rates/styles vary
   D. Major emphasis on active learning
   E. Regular, extensive formative assessment early in the learning process
   F. Periodic use of summative assessment to measure progress against key relevant external standards/norms
   G. Assisting students/families in charting progress using self-assessment
   H. Focus on key transitions such as school-to-school and school-to-career

II. Leadership
   A. Clear, visible directions and high expectations
   B. Modeling of strategies for continuous improvement methods and processes by senior administrators
   C. School policies that reinforce the learning/improvement climate and encourage self-directed responsibility throughout the school
   D. Building community support and aligning business and community leaders with these aims

III. Continuous Improvement/Organizational Learning
   A. Clearly established goals
   B. Fact-based measures/indicators
   C. Systematic cycles of planning/execution/evaluation
   D. Focus on improving processes for improved results
   E. An embedded approach that involves students

IV. Faculty/Staff Participation/Development
   A. Increased knowledge of faculty/staff about student learning and assessment strategies
   B. Improved performance of faculty/staff
   C. Organization tailored to a more diverse workforce and more flexible, high-performance work practices

V. Partnership Development
   A. Internal and external partnerships to better accomplish overall goals
   B. Partnerships that seek to develop long-term objectives, strategies for evaluating progress, and means for changing conditions

(table continues)
Table 8, cont'd. Core Values/Concepts of MBNQA, Education Pilot, 1995.

VI. Management by Fact
   A. Improvement system based on cause-effect thinking, measurement, information, data, and analysis
   B. Measurements that support the school's mission/strategy
   C. Focus on student learning through a comprehensive and integrated fact-based system

VII. Long-range View
   A. Strong future orientation with long-term commitment to students and stakeholders
   B. Investment in creating and sustaining an assessment system focused on student learning
   C. School leadership familiar with research findings and practical applications of assessment/learning
   D. School serves as a role model in its operations

VIII. Public Responsibility and Citizenship
   A. Protection of public health, safety, and environment in all practices
   B. Ethical and non-discriminatory in all practices
   D. Support of, and leadership in, purposes important to the public

IX. Fast Response
   A. Faster, more flexible response to customer needs
   B. Simultaneous improvement in quality and productivity
   C. Strong customer focus

X. Results Oriented
   A. School performance system focused on results
   B. Balanced needs and interests of all stakeholders
   C. Student performance demonstrated throughout their career in a variety of ways
   D. Effective and efficient use of school resources

There are emerging initiatives in states and school districts in which the MBNQA Education Pilot Criteria provide the framework for school improvement. The researcher is aware of several efforts. Pinellas County Schools in Florida has implemented the Superintendent's Quality Challenge, a model based on the Education Pilot criteria. The state of New Mexico has initiated a joint private and public sector project, Strengthening Quality in Schools, which incorporates the criteria and their state award process as a component. The
Pacific Bell Foundation has sponsored the Education for the Future Initiative in several schools in California, in which a school portfolio process of organizational improvement was developed based on components of the MBNQA Education Pilot Criteria (Bernhardt, 1994). The specific criteria are listed in Table 9. Revisions are currently being conducted by the National Institute of Standards and Technology to align the framework with the 1997 MBNQA for businesses. Since 1995 there has not been funding from the legislature to continue the development of the Education Pilot, and efforts are actively being made to raise the capital needed to continue to develop the process. Several states have now included K-12 education in their state quality award process; New Mexico, Florida, New York, and Minnesota are four that have done so. Idaho is currently initiating those discussions. The MBNQA Education Pilot Criteria (Table 9) are described below:

1. Leadership: Examines the personal leadership of senior administrators and their involvement in creating and sustaining student focus, clear goals, high expectations, and a leadership system that promotes performance excellence. Also examines how these objectives and expectations are integrated into the school's management system.

2. Strategic and Operational Planning: Examines how the school sets strategic directions and determines key plan requirements, and how plan requirements are translated into an effective performance management system with a primary focus on student performance.

3. Student Focus and Student and Stakeholder Satisfaction: Examines how the school determines student and stakeholder needs and expectations by defining levels and trends in key measures of satisfaction relative to comparable schools and/or appropriately selected organizations.
4. Information and Analysis: Examines the management and effectiveness of data and information used to support overall mission-related performance excellence.

5. Human Resource Development and Management: Examines how faculty and staff development are aligned with the school's performance objectives. Also examined are the school's efforts to build and maintain a climate conducive to performance excellence, full participation, and personal and organizational growth.

6. Educational and Business Process Management: Examines the key aspects of process management, including learning-focused education design, education delivery, school services, and business operations. Examines how key processes are designed, effectively managed, and improved to achieve higher performance.

7. School Performance Results: Examines improvement of student performance; the school's educational climate, services, and business operations at performance levels relative to comparable schools; and/or appropriately selected organizations.
Table 9. 1995 MBNQA Education Pilot Criteria.

1. Leadership
   1.1 Senior Administration Leadership
   1.2 System and Organization
   1.3 Public Responsibility and Citizenship
2. Information and Analysis
   2.1 Management of Information and Data
   2.2 Comparisons and Benchmarking
   2.3 Analysis and Use of School Level Data
3. Strategic and Operational Planning
   3.1 Strategy Development
   3.2 Strategy Deployment
4. Human Resource Development and Management
   4.1 Human Resource Planning and Evaluation
   4.2 Faculty/Staff Work Systems
   4.3 Faculty/Staff Development
   4.4 Faculty/Staff Well-being and Satisfaction
5. Educational and Business Process Management
   5.1 Education Design
   5.2 Education Delivery
   5.3 Education Support Service Design/Delivery
   5.4 Research, Scholarship, and Service
   5.5 Enrollment Management
   5.6 Business Operations Management
6. School Performance Results
   6.1 Student Performance Results
   6.2 School Climate Improvement Results
   6.3 Research, Scholarship, and Service
   6.4 School Business Performance Results
7. Student Focus and Student/Stakeholder Satisfaction
   7.1 Current Student Needs and Expectations
   7.2 Future Student Needs and Expectations
   7.3 Stakeholder Relationship Management
   7.4 Student and Stakeholder Satisfaction Determination

Note: From Malcolm Baldrige National Quality Award, 1997, National Institute of Standards and Technology (Gaithersburg, MD: United States Department of Commerce and Technology Administration).
Summary

The call for increased productivity, efficiency, and effectiveness has remained constant over the decades of school reform. This chapter reviewed current literature on organizational effectiveness, effectiveness in schools, and emerging models of quality applications. The literature reviewed establishes the background and current practices, both in business and in education, for measuring and improving organizational performance. Quality theory has been explored as a framework for approaching school improvement. Current traditional practices for determining comprehensive organizational performance were described. The application of the Malcolm Baldrige National Quality Award Education Pilot to school improvement is an emerging area of research and application. School improvement and reform strategies now recognize the need for a systemic change strategy that acknowledges the comprehensive and complex nature of school districts (Anderson, 1993; O'Neil, 1993; Wagner, 1993). Increasingly, businesses are using the criteria of the Malcolm Baldrige National Quality Award to assess their status and guide them toward improvements that produce sustained results through an aligned system that illustrates the core values of the MBNQA.
Chapter 3

Methodology

Introduction

This research study examined a theory and a framework by which the operation and performance of a school district can be assessed to determine where improvements might be necessary. There were three purposes:

1. To determine how participants currently rate the performance of their district in each of the seven categories of the Performance Analysis for School Districts.
2. To determine if these scores differ by type of educator or size of school district.
3. To determine how participants perceive the usefulness of the instrument as a framework for self-analysis by a school district in school improvement.

The study involved the development of an instrument to collect information regarding school district performance. It investigated differences by type of educator and size of district. It measured the perception of participants about the instrument's usefulness in approaching school improvement.

The Research Model

The model for the research study was as follows:

Y_ijk = u + a_i + b_j + (ab)_ij + e_ijk

The value of the response variable is the sum of:
u = the effect of the overall mean,
a_i = the effect of district size,
b_j = the effect of position type,
(ab)_ij = the effect of the interaction of district size and position,
e_ijk = random error in the model.
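The additive structure of this model can be illustrated with a short Python sketch. This is a minimal illustration, not the study's SAS analysis; the factor labels and effect values below are hypothetical.

```python
import random

# Hypothetical effect estimates on the 1-5 rating scale (illustrative only).
mu = 3.0                                           # u: overall mean
a = {"large": 0.10, "small": -0.10}                # a_i: district-size effects
b = {"superintendent": 0.30, "teacher": -0.30}     # b_j: position-type effects
ab = {("large", "superintendent"): 0.05,           # (ab)_ij: interaction effects
      ("large", "teacher"): -0.05,
      ("small", "superintendent"): -0.05,
      ("small", "teacher"): 0.05}

def expected_score(size, position):
    """E[Y_ijk] = u + a_i + b_j + (ab)_ij; the error term e_ijk has mean zero."""
    return mu + a[size] + b[position] + ab[(size, position)]

def simulate_score(size, position, sd=0.9):
    """One simulated response Y_ijk: the expected value plus random error e_ijk."""
    return expected_score(size, position) + random.gauss(0.0, sd)

print(expected_score("large", "superintendent"))  # 3.0 + 0.10 + 0.30 + 0.05
```

In the study itself the effects were estimated from survey data via SAS; the sketch only shows how the model's terms combine into a single response value.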
Instrumentation

The framework for determining organizational performance was adapted from the Malcolm Baldrige National Quality Award Education Pilot (1995). The researcher constructed an instrument based on the Malcolm Baldrige National Quality Award (1997) and the Education Pilot Criteria (1995) as the primary framework. Since the 1995 Education Pilot is being revised, the researcher was advised by Curt Reimann, retired Director of NIST, to consider the current 1997 changes. The researcher also integrated components from the Northwest Accreditation Standards and the curriculum audit process. The intent of the instrument was to reflect the comprehensive system of a school district; the researcher felt that there were elements in both the accreditation process and the curriculum audit process that could potentially be overlooked by an exclusive approach using only the MBNQA. There are seven categories of organizational performance used in this instrument:

1.0 Leadership
2.0 Strategic Planning
3.0 Information and Analysis
4.0 Student Focus and Student and Stakeholder Satisfaction
5.0 Human Resource Development and Management
6.0 Educational and Operational Process Management
7.0 School Performance Results

The descriptions in each category are based on the following Likert-type scale used to construct the language in each of the seven subcategories. The scale for the subcategories of Leadership, Strategic Planning, Information and Analysis, Student Focus and Student and
Stakeholder Satisfaction, Human Resource Development and Management, and Educational and Operational Process Management was:

1. No systematic approach evident.
2. Awareness stages of a systematic approach, but minimal requirements.
3. Developing a system that emphasizes prevention of problems and meets expectations.
4. A refined, well-developed approach that is deployed with broad applications.
5. A thorough, systematic approach that is fully deployed, institutionalized, and idealized.

The scale for the subcategory of School Performance Results was:

1. No results, or results below expectations.
2. Some improvements; early stages of developing trends.
3. Improvement trends or good performance in some areas.
4. Current performance is good to excellent, with trends over time.
5. Superior performance with sustained results; state or national benchmark.

The instrument used to collect data was designed to yield continuous data reflecting the ordered nature of the items in each category and a weighting for each category (see Appendix A). The rationale was designed to: (a) more closely align it with the scoring design of the MBNQA, in which additional points are awarded for more fully developed quality practices and performance, and (b) yield continuous data weightings that more appropriately answer the research questions put forward in the study. The instrument was reviewed for content validity to ensure it would answer the research questions. Selected experts in the use of the Baldrige criteria in business and/or education, both in-state and out-of-state, were used. To qualify as a content-area expert, the
individual must be, or have been, a state or national quality award examiner using the criteria from the Malcolm Baldrige National Quality Award and must be experienced in applying quality applications in educational institutions. Appendix B contains a list of the individuals consulted and their qualifications. A cover letter (Appendix C) addressed the specific research questions and the nature of the feedback that the researcher was requesting. Comments were received and items were revised. The revised instrument was tested on twelve Idaho educators: two superintendents, five principals, and five teachers. Minor revisions were made, and the category of "I do not know" was added to each subcategory.

Subjects and Setting

The population used was educators working in Idaho public schools. A proportional stratified random sample was selected using the 1996-97 database from the Idaho State Department of Education. The population was stratified by size of student enrollment using the classifications outlined by the Idaho State Department of Education, as follows:

Classification 1 = 5,000+
Classification 2 = 2,500 - 4,999
Classification 3 = 1,000 - 2,499
Classification 4 = 500 - 999
Classification 5 = 1 - 499

A proportional allocation for sample size was used based on the ratio of the number of districts in each category to the total number of school districts in the state (Wiersma, 1995). The sample size was determined using a table of recommended sample sizes (Krejcie, 1970). Table 10 and Appendix D provide the matrix for the sample design.
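The sampling design combines a recommended total sample size with proportional allocation across strata. The sketch below assumes the closed-form formula behind the Krejcie (1970) table (chi-square = 3.841 for 1 df at the .05 level, P = .5, d = .05); the function names are the writer's own, not the study's.

```python
def krejcie_morgan(N, chi2=3.841, P=0.5, d=0.05):
    """Recommended sample size s for a population of size N (Krejcie, 1970)."""
    return round(chi2 * N * P * (1 - P) / (d ** 2 * (N - 1) + chi2 * P * (1 - P)))

def proportional_allocation(strata, n):
    """Divide a total sample size n across strata in proportion to stratum size."""
    total = sum(strata.values())
    return {name: round(n * size / total) for name, size in strata.items()}

# Teacher counts by enrollment classification (from Table 10).
teachers = {"5,000+": 6736, "2,500-4,999": 2358, "1,000-2,499": 2374,
            "500-999": 987, "1-499": 617}

n = krejcie_morgan(13076)  # state total of teachers
print(n, proportional_allocation(teachers, n))
```

Applied to the 13,076 teachers in Table 10, the formula returns the 373 shown in that table; the per-stratum counts may differ from the table by a unit or two because of rounding.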
Table 10. Stratified Random Sample Matrix by Classification.

                   Superintendents       Principals            Teachers
Classification     N    %     s**        N     %     s**       N       %     s**
#1 5,000+          13   13.4  10         220   44    96        6,736   51.5  191
#2 2,500-4,999     13   13.4  10         84    16.8  37        2,358   18    67
#3 1,000-2,499     27   27.8  21         106   21.2  46        2,374   18    67
#4 500-999         22   22.6  18         55    11    24        987     8     29
#5 1-499           21   21.6  17         31    6     4         617     5     19
State Totals       97   —     76         500   —     217       13,076  —     373

Collection of Data

The instrument was prepared for electronic scanning. Each sheet was coded by size of district and type of educator (Appendix E). The instrument was mailed to the individuals selected in the sample. This method was selected for convenience and to ensure the anonymity of the respondents. Appendix F contains cover letters and directions. The researcher secured letters of support from the Idaho Association of School Administrators and the Idaho Education Association to help ensure returns (Appendix H). Each person selected in the sample was also sent a pen printed with the message, "Thank you for participating in the Performance Analysis for School Districts," and a self-addressed, stamped envelope was enclosed. Mailing was conducted in October of 1997, and respondents were given two weeks to respond. A reminder postcard was sent immediately following the deadline (Appendix G). Random phone calls were also made asking for a response.
Data Analysis

There were seven dependent variables, i.e., the scores from each category. There were two independent variables: (a) type of educator, with three levels (superintendents, principals, teachers), and (b) size of district, with five levels of enrollment (over 5,000; 4,999-2,500; 2,499-1,000; 999-500; 499-1) (Appendix I). The SAS computer software program was used to compile the data and generate the statistical analysis. Descriptive data were collected to determine the characteristics of the sample, the highest degree earned and the number of years of experience in the current position, and the attitudes of the respondents regarding the instrument. Frequencies of responses are also illustrated. A two-way factorial analysis of variance for each category was used to compare the two independent variables (Huck, Cormier & Bounds, 1974). Cronbach's alpha was computed to test the reliability of the instrument. The final research question, regarding the potential usefulness of the instrument, was answered with descriptive statistics. Qualitative analysis of the comments about the instrument was done using a constant comparative model (Patton, 1983).

Summary

Chapter 3 established the procedural design of the study, which investigated how educators in Idaho perceive their school districts in seven different categories. The sample and instrument development were described. The statistical analysis used in Chapter 4 was specified.
Chapter 4

Findings

Introduction

This study was designed to investigate three areas:

1. The perceptions of superintendents, principals, and teachers in Idaho regarding the performance of their school districts in seven areas.
2. The differences in perceptions based on type of position and/or size of school district.
3. The perceived usefulness of the instrument constructed by the researcher for self-study.

The dependent variables were the scores for each of the seven constructs. There were two independent variables: size of district and position of educator. District size was divided into five levels depending on student enrollment: (a) over 5,000; (b) 4,999-2,500; (c) 2,499-1,000; (d) 999-500; and (e) 499-1. The positions of the educators were separated into three types: (a) superintendents, (b) principals, and (c) teachers. The instrument, the Performance Analysis of School Districts, was designed based on the 1997 Malcolm Baldrige National Quality Award criteria, the 1995 Education Criteria, curriculum audit standards, and the Northwest Accreditation Standards. The instrument was piloted, tested, and reviewed for content validity. Revisions were made based on the results. The sample was selected from the population of educators in Idaho public schools. The data were collected through a mailed survey. The results were scanned from returned individual instruments. The data were analyzed using the Statistical Analysis System (SAS). Descriptive analysis was done for characteristics of central tendency, and
a factorial analysis of variance was used to test the hypotheses. Cronbach's alpha was used to determine reliability. Qualitative analysis of the comments of participants was done using a constant comparative model.

Rate of Return

The total sample size was 666. The total number sent was 656. Adjustments were made for instances in which several participants had responsibility for more than one of the targeted positions or because participants were no longer in the position. A total of 258 surveys were returned, for a 36% rate of return. Nine (9) were eliminated due to participant error, and eleven (11) were not used because they were received too late. Wiersma (1995) reports 70% as a minimally acceptable rate of return when surveying professional samples. Table 11 illustrates the return rates by educator position for the total number of surveys sent. Table 12 illustrates the frequencies and percentages of the returns received by educator position and district size. The highest percentage returned from the total sample by position was for teachers, and the highest percentage returned from the total sample by size was for districts with over 5,000 students enrolled.

Table 11. Total Return Rates by Educator Position.

                  Number Sent   Number Received   Percentage Received
Superintendents   76            49                64%
Principals        211           88                42%
Teachers          369           101               27%
Total             656           238               36%
Table 12. Frequencies and Percentages of Returns Received by Educator Position and District Size.

                        5,000+   2,500-4,999   1,000-2,499   500-999   1-499   Total
Superintendents:
  Frequency             8        6             13            13        9       49
  Percent of total      3.4      2.5           5.5           5.5       3.8
  Percent by size       7.5      14.6          30.2          52.0      40.9
  Percent by position   16.3     12.2          26.5          26.5      18.3    20.6
Principals:
  Frequency             44       17            14            8         5       88
  Percent of total      18.5     7.1           5.9           3.4       2.1
  Percent by size       41.1     41.5          32.6          32.0      22.7
  Percent by position   50.0     19.3          15.9          9.1       5.7     36.9
Teachers:
  Frequency             55       18            16            4         8       101
  Percent of total      23.1     7.6           6.7           1.7       3.6
  Percent by size       51.4     43.9          37.2          16.0      36.6
  Percent by position   54.6     17.8          15.8          3.9       7.9     42.4
Total:
  Frequency             107      41            43            25        22      238
  Percent by size       45       17.2          18.1          10.5      9.2

Characteristics of Sample

A proportional, stratified random sample was selected from the state-wide database of certified educators employed in Idaho public schools during the 1996-97 school
year. Participants were asked two demographic questions: (a) their highest terminal degree, and (b) the length of time in their current position. Table 13 illustrates highest degree and length of time in current position, by district size and position. Table 14 illustrates the rank order by percentage of time in position and highest degree. The most frequent terminal degree in the sample was a Masters, with the most frequently occurring range of experience being twelve or more years. Percentages of terminal degrees varied by size of district, as illustrated in Table 13.
Table 13. Percentage of Highest Degree and Time in Position by Size and Position.

Degree columns: B, M, S, D; year columns: <1, 1-3, 4-7, 8-11, 12+. In each row below, degree percentages precede the semicolon and year percentages follow it.

5,000+:
  Superintendents   37.5, 62.5;  12.5, 25.0, 50.0, 12.5
  Principals        52.3, 27.3, 13.6;  9.1, 9.1, 29.5, 18.2, 34.1
  Teachers          45.5, 50.9, 3.6;  5.5, 12.7, 18.2, 12.7, 50.9
4,999-2,500:
  Superintendents   33.3, 33.3, 33.3;  16.7, 50.0, 16.7, 16.7
  Principals        47.1, 47.1, 5.9;  23.5, 29.4, 29.4
  Teachers          55.6, 38.9, 5.6;  5.6, 22.2, 11.1, 16.7, 44.4
2,499-1,000:
  Superintendents   15.4, 69.2, 15.4;  15.4, 7.7, 53.8, 15.4, 7.7
  Principals        78.6, 14.3, 7.1;  7.1, 21.4, 21.4, 28.6, 21.4
  Teachers          87.5, 12.5;  25.0, 12.5, 12.5, 50.0
999-500:
  Superintendents   30.8, 53.8, 15.4;  23.1, 15.4, 30.8, 30.8
  Principals        87.5, 12.5;  25.0, 62.5, 12.5
  Teachers          75.0, 25.0;  25.0, 25.0, 50.0
499-1:
  Superintendents   22.2, 66.7, 11.1;  11.1, 11.1, 33.4, 44.4
  Principals        100;  40.0, 60.0
  Teachers          50.0, 37.5, 12.5;  12.5, 37.5, 25.0, 12.5, 12.5
Table 14. Rank Order of Combined Sample.

Percentage   Highest Degree     Percentage   Years in Position
44.1         Bachelors          32.8         12+
23.5         Masters            27.7         4-7
22.3         Specialist         15.5         1-3, 8-11
8.4          Doctorate          7.6          <1

Reliability of the Performance Analysis for School Districts

Cronbach's alpha coefficient was used to determine the consistency of the instrument in measuring the seven constructs. The reliability coefficients (Table 15) suggested that internal test consistency existed in each construct, with the Leadership construct having the highest reliability and School District Results the lowest.

Table 15. Reliability of Instrument.

Category Construct                     Cronbach's Alpha
Leadership                             .851446
Strategic Planning                     .831169
Student and Stakeholder Satisfaction   .768750
Information and Analysis               .803754
Human Resources                        .842505
Educational Process Management         .828683
School District Results                .741474
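For reference, Cronbach's alpha can be computed directly from item-level scores, as in this minimal pure-Python sketch; the example data are made up for illustration and are not the study's responses.

```python
def variance(xs):
    """Sample variance with n - 1 in the denominator."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list per item, each holding every respondent's score.
    alpha = (k / (k - 1)) * (1 - sum(item variances) / variance(total scores))."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(variance(scores) for scores in items)
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Three hypothetical subcategory items rated 1-5 by four respondents.
items = [[3, 4, 2, 5],
         [3, 5, 2, 4],
         [2, 4, 3, 5]]
print(round(cronbach_alpha(items), 3))  # 0.892
```

Higher values indicate that the items within a construct vary together across respondents, which is the sense in which each construct above shows internal consistency.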
Descriptive Analysis

Descriptive statistics were used to illustrate central tendency and variability among the dependent variables. Tables 16 through 24 illustrate the means and standard deviations by district size and position of educator. The lowest overall mean occurred in the Information and Analysis category when combining district size and educator position, while the highest mean occurred in the Leadership construct.

Table 16. Means for District Size and Positions Combined.

Construct                          N     Mean   SD
Leadership                         233   3.15   .98
Strategic Planning                 233   3.06   1.11
Student/Stakeholder Satisfaction   233   2.82   .90
Information & Analysis             237   2.62   1.13
Human Resources                    233   2.71   .99
Educational Process                228   2.77   .93
School District Results            229   2.94   .91
Table 17. Means by District Size for Districts With 5,000 or More Students Enrolled.

Construct                          N     Mean   SD
Leadership                         106   3.15   1.02
Strategic Planning                 106   3.22   1.05
Student/Stakeholder Satisfaction   107   2.86   .93
Information & Analysis             107   2.70   1.24
Human Resources                    107   2.62   1.02
Educational Process                104   2.84   1.02
School District Results            104   2.95   .96

Table 18. Means by District Size for Districts With Between 4,999 and 2,500 Students Enrolled.

Construct                          N    Mean   SD
Leadership                         40   3.17   1.05
Strategic Planning                 39   2.91   1.10
Student/Stakeholder Satisfaction   40   2.77   .95
Information & Analysis             40   2.55   1.16
Human Resources                    39   2.86   .93
Educational Process                37   2.65   .81
School District Results            37   2.63   .70
Table 19. Means by District Size for Districts With Between 2,499 and 1,000 Students Enrolled.

Construct                          N    Mean   SD
Leadership                         42   3.06   .82
Strategic Planning                 43   3.00   1.09
Student/Stakeholder Satisfaction   43   2.79   .74
Information & Analysis             43   2.51   .95
Human Resources                    43   2.66   .94
Educational Process                43   2.66   .83
School District Results            42   2.97   .82

Table 20. Means by District Size for Districts With Between 999 and 500 Students Enrolled.

Construct                          N    Mean   SD
Leadership                         25   3.22   .99
Strategic Planning                 23   3.01   1.24
Student/Stakeholder Satisfaction   25   2.71   .95
Information & Analysis             25   2.59   .94
Human Resources                    25   2.76   .97
Educational Process                25   2.68   .89
School District Results            25   3.06   1.10
Table 21. Means by District Size for Districts With 499 or Fewer Students Enrolled.

Construct                          N    Mean   SD
Leadership                         20   3.26   .94
Strategic Planning                 22   2.71   1.27
Student/Stakeholder Satisfaction   19   2.88   .96
Information & Analysis             22   2.54   1.10
Human Resources                    19   3.02   1.07
Educational Process                19   3.07   .89
School District Results            21   3.23   .89

Table 22. Means by Position for Superintendents.

Construct                          N    Mean   SD
Leadership                         48   3.52   .67
Strategic Planning                 47   3.28   1.09
Student/Stakeholder Satisfaction   49   3.11   .76
Information & Analysis             49   2.90   .96
Human Resources                    49   3.24   .77
Educational Process                49   3.03   .72
School District Results            49   3.16   .79
  • 93.
Table 23. Means by Position for Principals.

Construct                          N     Mean    SD
Leadership                         86    3.47    .89
Strategic Planning                 85    3.32   1.04
Student/Stakeholder Satisfaction   87    3.00    .85
Information & Analysis             87    2.85   1.08
Human Resources                    87    2.95    .92
Educational Process                82    3.03    .92
School District Results            82    3.25    .85

Table 24. Means by Position for Teachers.

Construct                          N     Mean    SD
Leadership                         99    2.70   1.00
Strategic Planning                 101   2.74   1.11
Student/Stakeholder Satisfaction   98    2.52    .92
Information & Analysis             101   2.27   1.17
Human Resources                    97    2.23    .94
Educational Process                97    2.42    .92
School District Results            98    2.57    .90

The frequencies and percentages for each item in the seven constructs are charted in Tables 25 through 31. Teacher responses tended to be distributed across all six choices more frequently than those of superintendents and principals, and this was consistent across all constructs. For superintendents and principals, greater frequencies occurred in items #3 through #5, signifying a more developed and refined quality approach, while for teachers greater frequencies occurred in items #1 through #3, indicating a less developed and more arbitrary approach. Teachers also tended to select the "do not know" response more frequently.

In Table 25, data collected for the Leadership category suggested that, in districts with 5,000 or more students, a higher proportion of superintendents and principals than of teachers saw a clearly communicated, fully deployed direction in their district. In the largest districts, 63% of superintendents responded to item #3 or #5, compared to 59% of the principals and 42% of the teachers. In districts with 499 or fewer students, 88% of the superintendents responded to item #4 or #5, compared to 60% of the principals and 25% of the teachers. Teacher responses occurred more frequently in items #1 and #2 regarding the existence of a systematic process for studying the performance of their school district than did those of superintendents or principals. Across all sizes of districts, 15% of the superintendents suggested there were only minimal school improvement efforts on the part of district leaders, compared to 44% of the teachers. There also appeared to be a greater perception on the part of superintendents and principals that a participatory approach to management exists in their district. More teachers than superintendents or principals reported that there is little involvement of stakeholders in policy development before the local board of trustees. There was less variability among the groups when responding to items on Responsibility to Public or Legal, Ethical Conduct, with most responses across all positions and district sizes occurring in items #4 and #5.
In the Strategic Planning construct in Table 26, more superintendents and principals responded to items #3 through #5 than did teachers regarding both Strategic Development and Focus of the Plan. However, the majority of all three groups perceived that their district did not have a well-deployed system for implementing or assessing their strategic plan.

In the Student and Stakeholder Satisfaction construct in Table 27, teachers in all districts responded to item #1 or #2, describing standardized test scores as the primary means of determining student needs. Superintendents and principals in the same districts responded with greater frequency to choices #3 through #5. Teachers reported that there were minimal attempts to determine student and stakeholder satisfaction, while superintendents reported that more refined attempts existed.

Increased frequencies of "do not know" responses occurred among teachers in the Information and Analysis construct in Table 28. Teacher responses occurred with greater frequency in items #1 and #2, compared to superintendents and principals, who replied with greater frequency to items #3 through #5 regarding the collection, use, and analysis of information.

In the Human Resources construct in Table 29, teacher perceptions of the learning and working climates, work systems, and employee satisfaction were less positive than the perceptions of their superintendents and principals.

Table 30 illustrates frequencies and percentages for Educational and Operational Process Management. Teachers, more often than administrators, perceived that educational programs and services were primarily designed and delivered based on federal and state regulations, traditional practices, or test results. Teachers selected the "do not know" response with greater frequency than superintendents or principals regarding supply and partnering
processes. Table 31 illustrates frequencies for the School District Results construct, with similar patterns of responses.
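The cell entries in Tables 25 through 31 are simple cross-tabulations: each frequency is a count of responses within a position-by-district-size cell, and each percentage is that count divided by the cell's total respondents. A minimal sketch of that computation follows; the records and variable names are hypothetical illustrations, not data from the survey instrument.

```python
from collections import Counter

# Hypothetical raw records: (position, district_size_band, response),
# where response is a scale choice 1-5 or "do not know".
records = [
    ("superintendent", 1, 4), ("superintendent", 1, 5),
    ("principal", 1, 4), ("principal", 1, 3),
    ("teacher", 1, 2), ("teacher", 1, 1), ("teacher", 1, "do not know"),
]

def frequency_table(records):
    """Count responses per (position, size band) cell, as in Tables 25-31."""
    counts = {}
    for position, size, response in records:
        counts.setdefault((position, size), Counter())[response] += 1
    return counts

def percentage_table(counts):
    """Convert each cell's counts to percentages of that cell's total N."""
    pct = {}
    for cell, counter in counts.items():
        total = sum(counter.values())
        pct[cell] = {resp: round(100 * n / total, 1) for resp, n in counter.items()}
    return pct

freq = frequency_table(records)
pct = percentage_table(freq)
print(pct[("teacher", 1)])  # each of the 3 teacher responses is 33.3% of that cell
```

Because percentages are computed within each cell, a "do not know" reply shrinks the share of every scale choice in that cell, which is why the teacher rows in the tables often sum to 100% only when the "do not know" row is included.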
Table 25. Item Frequency and Percentage of Response by Position and Size of District For Items in the Leadership Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

1. Clearly communicated direction:
   Scale 1        Frequency:  1 2 1 3 1 1 1 2 9 3 4 1 2
                  Percentage: 16.7 15.4 7.7 6.8 5.9 7.1 12.5 40.0 16.4 16.7 25.0 25.0 25.0
   Scale 2        Frequency:  1 2 4 1 3 2 1 1 7 4 2 2 1
                  Percentage: 12.5 15.4 30.8 11.1 6.8 11.8 7.1 12.5 12.7 22.2 12.5 50.0 12.5
   Scale 3        Frequency:  2 1 3 3 11 2 2 15 3 2 3
                  Percentage: 25.0 16.7 23.1 23.1 25.0 11.8 14.3 27.3 16.7 12.5 37.5
   Scale 4        Frequency:  4 3 4 3 4 11 8 4 5 3 17 5 7 1 1
                  Percentage: 50.0 50.0 30.8 23.1 44.4 25.0 47.1 28.6 62.5 60.0 30.9 27.8 43.8 25.0 12.5
   Scale 5        Frequency:  1 1 2 2 4 15 4 5 1 6 2 1 1
                  Percentage: 12.5 16.7 15.4 15.4 44.4 34.0 23.5 35.7 12.5 10.9 11.1 6.3 12.5
   "Do Not Know"  Frequency:  1 1 1
                  Percentage: 7.1 1.8 5.6

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 25, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Leadership Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

2. Process to study performance:
   Scale 1        Frequency:  3 1 6 3 1 2 12 3 4 1 3
                  Percentage: 23.1 11.1 13.6 17.6 7.1 25.0 21.8 16.7 25.0 25.0 37.5
   Scale 2        Frequency:  1 1 3 4 1 5 6 5 1 3 17 4 4 2 2
                  Percentage: 12.5 16.7 23.1 30.8 11.1 11.4 35.3 35.7 12.5 60.0 30.9 22.2 43.8 50.0 25.0
   Scale 3        Frequency:  2 3 7 4 3 12 4 5 1 1 4 4 2 1
                  Percentage: 25.0 50.0 53.8 30.8 33.3 27.3 23.5 35.7 12.5 20.0 7.3 22.2 12.5 25.0
   Scale 4        Frequency:  5 2 5 2 8 2 1 2 1 10 2 2 1
                  Percentage: 62.5 33.3 38.5 22.2 18.2 11.8 7.1 25.0 20.0 18.2 11.1 12.5 12.5
   Scale 5        Frequency:  2 12 2 2 2 7 2 1 2
                  Percentage: 22.2 27.3 11.8 14.3 25.0 12.7 11.1 6.3 25.0
   "Do Not Know"  Frequency:  5 3
                  Percentage: 9.1 16.7

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 25, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Leadership Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

3. Leadership role in improvement:
   Scale 1        Frequency:  6 1 14 6 5 1 3
                  Percentage: 13.6 7.1 25.5 33.3 31.3 25.0 37.5
   Scale 2        Frequency:  2 5 2 2 1 2 12 2 5 2 2
                  Percentage: 15.4 11.4 11.8 14.3 12.5 40.0 21.8 11.1 31.3 50.0 25.0
   Scale 3        Frequency:  1 1 6 3 10 6 5 2 1 9 4 2 1 1
                  Percentage: 12.5 16.7 46.2 23.1 22.7 35.3 35.7 25.0 20.0 16.4 22.2 12.5 25.0 12.5
   Scale 4        Frequency:  4 1 5 4 3 9 3 3 2 1 15 1 2 2
                  Percentage: 50.0 16.7 38.5 30.8 33.3 20.5 17.6 21.4 25.0 20.0 27.3 5.6 12.5 25.0
   Scale 5        Frequency:  3 4 2 4 6 13 6 3 2 1 4 4 2
                  Percentage: 37.5 66.7 15.4 30.8 66.7 29.5 35.3 21.4 25.0 20.0 7.3 22.2 12.5
   "Do Not Know"  Frequency:  3 1 1
                  Percentage: 37.5 1.8 5.6

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 25, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Leadership Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

4. Participatory management:
   Scale 1        Frequency:  1 3 1 1 13 3 4 2 2
                  Percentage: 12.5 6.8 5.9 12.5 23.6 16.7 25.0 50.0 25.0
   Scale 2        Frequency:  1 3 11 3 3 1 18 6 6 1 4
                  Percentage: 7.7 23.1 25.0 17.6 21.3 20.0 32.7 33.3 37.5 25.0 50.0
   Scale 3        Frequency:  6 3 8 6 3 7 5 8 4 2 11 4 2
                  Percentage: 75.0 50.0 61.5 46.2 33.3 15.9 29.4 57.1 50.0 40.0 20.0 22.2 12.5
   Scale 4        Frequency:  1 4 2 3 11 8 3 3 1 7 2 2 1 1
                  Percentage: 16.7 30.8 15.4 33.3 25.0 47.1 21.4 37.5 20.0 12.7 11.1 12.5 25.0 12.5
   Scale 5        Frequency:  1 2 2 3 11 1 4 2 1 1
                  Percentage: 12.5 33.3 15.4 33.3 25.0 20.0 7.3 11.1 6.3 12.5
   "Do Not Know"  Frequency:  2 1 1
                  Percentage: 3.6 5.6 6.3

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 25, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Leadership Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

5. Board policy:
   Scale 1        Frequency:  1 1 1 4 3 1 1 3 10 2 5 3 1
                  Percentage: 12.5 7.7 11.1 9.1 17.6 7.7 12.5 60.0 18.2 11.1 31.3 75.0 12.5
   Scale 2        Frequency:  1 2 4 6 4 9 3 3 2 21 4 4 4
                  Percentage: 12.5 33.3 30.8 46.2 44.4 20.5 17.6 21.4 25.0 38.2 22.2 25.0 50.0
   Scale 3        Frequency:  3 2 2 7 7 1 1 1 4 2 4
                  Percentage: 37.5 33.3 15.4 15.9 15.9 5.9 7.5 12.5 7.3 11.1 25.0
   Scale 4        Frequency:  3 4 5 3 14 8 7 2 1 9 5 2 1 1
                  Percentage: 37.5 30.8 38.5 33.3 31.8 47.1 50.0 25.0 20.0 16.4 27.8 12.5 25.0 12.5
   Scale 5        Frequency:  2 2 2 1 9 2 1 2 1 4 1 1 1
                  Percentage: 33.3 15.4 15.4 11.1 20.5 11.8 7.1 25.0 20.0 7.3 5.6 6.3 12.5
   "Do Not Know"  Frequency:  1 7 4
                  Percentage: 7.1 12.7 22.2

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 25, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Leadership Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

6. Responsibility to public:
   Scale 1        Frequency:  1 4 3 1
                  Percentage: 5.9 7.3 18.8 25.0
   Scale 2        Frequency:  1 1 2 2 9 4 2 4 27 6 9 2 4
                  Percentage: 12.5 7.7 15.4 22.2 20.5 28.6 25.0 80.0 49.1 33.3 56.3 50.0 50.0
   Scale 3        Frequency:  1 1 4 5 1 8 2 3 2 10 4 2 1 1
                  Percentage: 12.5 16.7 30.8 38.5 11.1 18.2 11.8 21.4 25.0 18.2 22.2 12.5 25.0 12.5
   Scale 4        Frequency:  6 5 6 5 5 15 12 4 4 4 3 2 1
                  Percentage: 75.0 83.3 46.2 38.5 55.6 34.1 70.6 28.6 50.0 7.3 16.7 12.5 12.5
   Scale 5        Frequency:  2 1 11 1 5 1 9 3
                  Percentage: 15.4 11.1 25.0 5.9 21.4 20.0 16.4 16.7
   "Do Not Know"  Frequency:  1 2
                  Percentage: 1.8 11.1

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 25, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Leadership Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

7. Legal, ethical conduct:
   Scale 1        Frequency:  5 2 1
                  Percentage: 9.1 12.5 25.0
   Scale 2        Frequency:  1 2 5
                  Percentage: 12.5 4.5 9.1
   Scale 3        Frequency:  2 2 1 8 4 1 1 19 5 7 1 2
                  Percentage: 25.0 15.4 11.1 18.2 23.5 7.1 12.5 34.5 27.8 43.8 25.0 25.0
   Scale 4        Frequency:  2 4 10 8 7 17 7 8 5 4 10 9 3 1 4
                  Percentage: 25.0 50.0 76.9 61.5 77.8 38.6 41.2 57.1 62.5 80.0 18.2 50.0 18.8 25.0 50.0
   Scale 5        Frequency:  3 4 2 3 1 15 5 5 5 1 7 1 4 1
                  Percentage: 37.5 50.0 15.4 23.1 11.1 34.1 29.4 35.7 25.0 20.0 12.7 5.6 25.0 25.0
   "Do Not Know"  Frequency:  1 9 3
                  Percentage: 2.3 16.4 16.7

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 26. Item Frequency and Percentage of Response by Position and Size of District For Items in the Strategic Planning Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

1. Strategic development:
   Scale 1        Frequency:  1 1 1 1 1
                  Percentage: 7.7 1.8 5.6 6.3 25.0
   Scale 2        Frequency:  2 2 1 4 3 11 1 2 16 5 6 2 3
                  Percentage: 15.4 15.4 11.1 9.1 17.6 78.6 12.5 40.0 29.1 27.8 37.5 50.0 37.5
   Scale 3        Frequency:  3 3 9 7 4 12 5 2 2 2 20 8 2 1 4
                  Percentage: 37.5 50.0 69.2 53.8 44.4 27.3 29.4 14.3 25.0 40.0 36.4 44.4 12.5 25.0 50.0
   Scale 4        Frequency:  5 4 2 14 5 1 4 6 2 3
                  Percentage: 62.5 30.8 22.2 31.8 29.4 7.1 50.0 10.9 11.1 18.8
   Scale 5        Frequency:  3 2 3 2 13 3 1 1 11 1 3
                  Percentage: 50.0 15.4 23.1 22.2 29.5 17.6 12.5 20.0 20.0 5.6 18.8
   "Do Not Know"  Frequency:  1 1 1 1
                  Percentage: 1.8 5.6 6.3 12.5

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 26, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Strategic Planning Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

2. Focus of plan:
   Scale 1        Frequency:  1 1 1 3 2 1 7 2 2 2 3
                  Percentage: 7.7 7.7 11.1 6.8 11.8 7.1 12.7 11.1 12.5 50.0 37.5
   Scale 2        Frequency:  1 11 4 3 1 8 4 4 2 2 17 6 5 1
                  Percentage: 12.5 16.7 30.8 23.1 11.1 18.2 23.5 28.6 25.0 40.0 30.9 33.3 31.3 25.0
   Scale 3        Frequency:  1 4 4 2 19 2 2 1 2 12 3 5 1
                  Percentage: 12.5 30.8 30.8 22.2 22.7 11.8 14.3 12.5 40.0 21.8 16.7 31.3 25.0
   Scale 4        Frequency:  4 3 1 2 1 9 6 5 2 9 2 1 2
                  Percentage: 50.0 50.0 7.7 15.4 11.1 20.5 35.3 35.7 25.0 16.4 11.1 25.0 25.0
   Scale 5        Frequency:  2 2 3 3 4 13 1 2 3 1 9 4 4
                  Percentage: 25.0 33.3 23.1 23.1 44.4 29.5 5.9 14.3 37.5 20.0 16.4 22.2 25.0
   "Do Not Know"  Frequency:  1 1 1
                  Percentage: 1.8 5.6 12.5

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 26, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Strategic Planning Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

3. Implementation and assessment of plan:
   Scale 1        Frequency:  2 2 3 5 2 6 2 2 2 3 8 5 5 3 1
                  Percentage: 25.0 33.3 23.1 38.5 22.2 13.6 11.8 14.3 25.0 60.0 14.5 27.8 31.3 75.0 12.5
   Scale 2        Frequency:  1 3 1 5 5 4 1 16 3 3 3
                  Percentage: 12.5 23.1 11.1 11.4 29.4 28.6 12.5 29.1 16.7 31.3 37.5
   Scale 3        Frequency:  3 2 4 3 3 15 4 3 3 2 13 4 3 1 2
                  Percentage: 37.5 33.3 30.8 23.1 33.3 34.1 23.5 21.4 37.5 40.0 23.6 22.2 18.8 25.0 25.0
   Scale 4        Frequency:  1 1 1 8 3 3 1 11 1 1
                  Percentage: 16.7 7.7 11.1 20.5 17.6 21.4 12.5 20.0 6.3 12.5
   Scale 5        Frequency:  2 1 1 3 2 9 1 2 1 6 2 1 1
                  Percentage: 25.0 16.7 7.7 23.1 22.2 20.5 5.9 14.3 12.5 10.9 11.1 6.3 12.5
   "Do Not Know"  Frequency:  1 1 1 4 1
                  Percentage: 7.7 2.3 1.8 22.2 6.3

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 27. Item Frequency and Percentage of Response by Position and Size of District For Items in the Student Focus and Satisfaction/Stakeholder Categories.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

1. How student needs and expectations are determined:
   Scale 1        Frequency:  1 2 1 1 3 2 16 8 5 3 4
                  Percentage: 12.5 15.4 7.7 11.1 6.8 25.0 29.1 44.4 31.3 75.0 50.0
   Scale 2        Frequency:  1 1 3 6 2 13 10 8 4 2 22 4 5 3
                  Percentage: 12.5 16.7 23.1 46.2 22.2 29.5 58.8 57.1 50.0 40.0 40.0 22.2 31.3 37.5
   Scale 3        Frequency:  1 3 2 1 10 2 2 1 3 2 2
                  Percentage: 12.5 23.1 15.4 11.1 22.7 14.3 25.0 20.0 5.5 11.1 12.5
   Scale 4        Frequency:  5 5 5 4 3 12 6 4 2 7 2 3 1 1
                  Percentage: 62.5 83.3 38.5 30.8 33.3 27.3 35.3 28.6 40.0 12.7 11.1 18.8 25.0 12.5
   Scale 5        Frequency:  2 6 4 2 1
                  Percentage: 22.2 13.6 7.3 11.1 6.3
   "Do Not Know"  Frequency:  3
                  Percentage: 5.5

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 27, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Student Focus and Satisfaction/Stakeholder Categories.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

2. High expectations for performance of students:
   Scale 1        Frequency:  1 3 1 5 2 2 1
                  Percentage: 7.7 6.8 5.9 9.1 11.1 12.5 25.0
   Scale 2        Frequency:  3 4 7 2 2 3 4 4 1 3 3
                  Percentage: 23.1 30.8 15.9 11.8 14.3 37.5 7.3 22.2 6.3 75.0 37.5
   Scale 3        Frequency:  6 5 7 7 6 21 9 9 4 5 25 5 9 3
                  Percentage: 75.0 83.3 53.8 53.8 66.7 47.7 52.9 64.3 50.0 100.0 45.5 27.8 56.3 37.5
   Scale 4        Frequency:  2 1 2 1 2 7 3 2 16 2 4 2
                  Percentage: 25.0 16.7 15.4 7.7 22.2 15.9 17.6 14.3 29.1 11.1 25.0 25.0
   Scale 5        Frequency:  1 1 6 1 1 3 2
                  Percentage: 7.7 11.1 13.6 5.9 7.1 5.5 11.1
   "Do Not Know"  Frequency:  1 2 3
                  Percentage: 12.5 3.6 16.7

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 27, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Student Focus and Satisfaction/Stakeholder Categories.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

3. Student and stakeholder satisfaction:
   Scale 1        Frequency:  2 2 1 2 3 2 2 1 1 13 9 5 3
                  Percentage: 25.0 15.4 7.7 22.2 6.8 11.8 14.3 12.5 20.0 23.6 50.0 31.3 75.0
   Scale 2        Frequency:  1 2 4 3 1 17 6 5 2 3 21 4 7 1 3
                  Percentage: 12.5 33.3 30.8 23.1 11.1 38.6 35.3 35.7 25.0 60.0 38.2 22.2 43.8 25.0 37.5
   Scale 3        Frequency:  2 2 4 5 1 8 3 5 1 7 1 2 1
                  Percentage: 25.0 33.3 30.8 38.5 11.1 18.2 17.6 35.7 12.5 12.7 5.6 12.5 12.5
   Scale 4        Frequency:  3 2 3 4 4 14 5 2 2 1 9 1 1
                  Percentage: 37.5 33.3 23.1 30.8 44.4 31.8 29.4 14.3 25.0 20.0 16.4 5.6 12.5
   Scale 5        Frequency:  1 2 2 1 2 1
                  Percentage: 11.1 4.5 25.0 1.8 11.1 6.3
   "Do Not Know"  Frequency:  4 1 1
                  Percentage: 7.3 5.6 6.3

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 27, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Student Focus and Satisfaction/Stakeholder Categories.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

4. Future needs of students and stakeholders:
   Scale 1        Frequency:  1 1 1 1 7 1 1 1 9 2 1 3 2
                  Percentage: 12.5 7.7 7.7 11.1 15.9 7.1 12.5 20.0 16.4 11.1 6.3 75.0 25.0
   Scale 2        Frequency:  1 1 1 6 5 2 8 5 6 1 2
                  Percentage: 12.5 7.7 11.1 13.6 29.4 40.0 14.5 27.8 37.5 25.0 25.0
   Scale 3        Frequency:  2 1 2 8 9 3 4 3 1 18 4 5
                  Percentage: 25.0 16.7 15.4 61.5 20.5 17.6 28.6 37.5 20.0 32.7 22.2 31.3
   Scale 4        Frequency:  4 5 8 2 3 15 7 8 3 1 11 3 4
                  Percentage: 50.0 83.3 61.5 15.4 33.3 34.1 41.2 57.1 37.5 20.0 20.0 16.7 25.0
   Scale 5        Frequency:  1 2 3 6 1 1 1 5 2 1
                  Percentage: 7.7 15.4 33.3 13.6 5.9 7.1 12.5 9.1 11.1 12.5
   "Do Not Know"  Frequency:  1 1 4 2
                  Percentage: 11.1 2.3 7.3 11.1

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 28. Item Frequency and Percentage of Response by Position and Size of District For Items in the Information and Analysis Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

1. Selection and use:
   Scale 1        Frequency:  1 1 5 2 1 2 13 3 5 3 4
                  Percentage: 7.7 11.1 11.4 11.8 7.1 25.0 23.6 16.7 31.3 75.0 50.0
   Scale 2        Frequency:  1 1 4 5 4 15 5 5 2 2 21 6 7 2
                  Percentage: 12.5 16.7 30.8 38.5 44.4 34.1 29.4 35.7 25.0 40.0 38.2 33.3 43.8 25.0
   Scale 3        Frequency:  3 2 7 5 10 5 2 1 6 2 2 1 1
                  Percentage: 37.5 33.3 53.8 38.5 22.7 29.4 14.3 20.0 10.9 11.1 12.5 25.0 12.5
   Scale 4        Frequency:  3 2 1 1 3 7 5 5 4 2 7 4 1
                  Percentage: 37.5 33.3 7.7 7.7 33.3 15.9 29.4 35.7 50.0 40.0 12.7 22.2 6.3
   Scale 5        Frequency:  1 1 1 1 1 6 1 3 1
                  Percentage: 12.5 16.7 7.7 7.7 11.1 13.6 7.1 5.5 6.3
   "Do Not Know"  Frequency:  1 5 3 1
                  Percentage: 2.3 9.1 16.7 12.5

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 28, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Information and Analysis Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

2. Selection and use of comparative data:
   Scale 1        Frequency:  1 1 2 3 2 6 1 1 7 3 3 3 2
                  Percentage: 12.5 16.7 15.4 23.1 22.2 13.6 7.1 20.0 12.7 16.7 18.8 75.0 25.0
   Scale 2        Frequency:  1 5 5 2 9 10 5 3 2 15 5 8 1
                  Percentage: 12.5 38.5 38.5 22.2 20.5 58.8 35.7 37.5 40.0 27.3 27.8 50.0 12.5
   Scale 3        Frequency:  3 3 6 3 3 13 4 5 2 1 9 4 1 3
                  Percentage: 37.5 33.3 46.2 23.1 33.3 29.5 23.5 35.7 25.0 20.0 16.4 22.2 6.3 37.5
   Scale 4        Frequency:  3 1 1 1 6 3 2 3 1 8 2 1 1 1
                  Percentage: 37.5 16.7 7.7 11.1 13.6 17.6 14.3 37.5 20.0 14.5 11.1 6.3 25.0 12.5
   Scale 5        Frequency:  2 1 1 8 5 1
                  Percentage: 33.3 7.7 11.1 18.2 9.1 6.3
   "Do Not Know"  Frequency:  2 1 11 4 2 1
                  Percentage: 4.5 7.1 20.0 22.2 12.5 12.5

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 28, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Information and Analysis Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

3. Analysis and use of school performance data:
   Scale 1        Frequency:  1 2 2 6 1 1 1 14 5 5 2 2
                  Percentage: 12.5 15.4 22.2 13.6 5.9 7.1 20.0 25.5 27.8 31.3 50.0 25.0
   Scale 2        Frequency:  1 2 6 5 1 10 4 5 3 10 5 2 2 3
                  Percentage: 12.5 33.3 46.2 38.5 11.1 22.7 23.5 35.7 37.5 18.2 27.8 12.5 50.0 37.5
   Scale 3        Frequency:  3 1 2 4 2 9 6 3 2 1 7 4 1
                  Percentage: 25.0 16.7 15.4 30.8 22.2 20.5 35.3 21.4 25.0 20.0 12.7 25.0 12.5
   Scale 4        Frequency:  4 2 3 9 4 2 3 1 10 3 1 2
                  Percentage: 50.0 15.4 23.1 20.5 23.5 14.3 37.5 20.0 18.2 16.7 6.3 25.0
   Scale 5        Frequency:  3 1 1 4 8 1 2 7 2 1
                  Percentage: 50.0 7.7 7.7 44.4 18.2 5.9 14.3 12.7 11.1 6.3
   "Do Not Know"  Frequency:  2 1 2 7 3 3
                  Percentage: 4.5 7.1 40.0 12.7 16.7 18.8

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 29. Item Frequency and Percentage of Response by Position and Size of District For Items in the Human Resource Development and Management Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

1. Learning and working climate:
   Scale 1        Frequency:  4 1 14 5 4 2 2
                  Percentage: 9.1 12.5 25.5 27.8 25.0 50.0 25.0
   Scale 2        Frequency:  1 2 1 11 3 2 1 11 4 5 1
                  Percentage: 12.5 15.4 7.7 25.0 17.6 14.3 20.0 20.0 22.2 31.3 12.5
   Scale 3        Frequency:  2 1 6 4 1 14 6 4 2 2 19 4 2 1 4
                  Percentage: 25.0 16.7 46.2 30.8 11.1 31.8 35.3 28.6 25.0 40.0 34.5 22.2 12.5 25.0 50.0
   Scale 4        Frequency:  5 3 4 6 4 9 4 5 5 2 8 4 5 1 1
                  Percentage: 62.5 50.0 30.8 46.2 44.4 20.5 23.5 35.7 62.5 40.0 14.5 22.2 31.3 25.0 12.5
   Scale 5        Frequency:  2 1 2 4 6 3 3 3 1
                  Percentage: 33.3 7.7 15.4 44.4 13.6 17.6 21.4 5.5 5.6
   "Do Not Know"  Frequency:
                  Percentage:

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 29, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Human Resource Development and Management Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

2. Work systems:
   Scale 1        Frequency:  1 6 1 2 1 1 18 5 5 2 5
                  Percentage: 7.7 13.6 5.9 14.3 12.5 20.0 32.7 27.8 31.3 50.0 62.5
   Scale 2        Frequency:  4 4 9 3 4 3 1 17 5 5 1 2
                  Percentage: 30.8 30.8 20.5 17.6 28.6 37.5 20.0 30.9 27.8 31.3 25.0 25.0
   Scale 3        Frequency:  5 3 6 3 1 10 4 3 1 1 7 5 4
                  Percentage: 62.5 50.0 46.2 23.1 11.1 22.7 23.5 21.4 12.5 20.0 12.7 27.8 25.0
   Scale 4        Frequency:  3 1 1 5 7 15 7 5 2 2 11 2 2 1 1
                  Percentage: 37.5 16.7 7.7 38.5 77.8 34.1 41.2 35.7 25.0 40.0 20.0 11.1 12.5 25.0 12.5
   Scale 5        Frequency:  2 1 1 4 1 1
                  Percentage: 33.3 7.7 11.1 9.1 5.9 12.5
   "Do Not Know"  Frequency:  2 1 4
                  Percentage: 3.6 5.6 25.0

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 29, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Human Resource Development and Management Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

3. Personal training and development:
   Scale 1        Frequency:  1 9 2 1 17 6 8 3 3
                  Percentage: 7.7 20.5 14.3 12.5 30.9 33.3 50.0 75.0 37.5
   Scale 2        Frequency:  1 1 3 6 1 5 3 2 2 2 9 3 1 3
                  Percentage: 12.5 16.7 23.1 46.2 11.1 11.4 17.6 14.3 25.0 40.0 16.4 16.7 6.3 37.5
   Scale 3        Frequency:  4 1 5 2 1 10 6 3 1 2 12 4 1
                  Percentage: 50.0 16.7 38.5 15.4 11.1 22.7 35.3 21.4 12.5 40.0 21.8 22.2 6.3
   Scale 4        Frequency:  2 1 4 3 6 13 6 5 4 1 8 4 1 1
                  Percentage: 25.0 16.7 30.8 23.1 66.7 29.5 35.3 35.7 50.0 20.0 14.5 22.2 25.0 12.5
   Scale 5        Frequency:  1 3 2 1 7 1 2 7 1
                  Percentage: 12.5 50.0 15.4 11.1 15.9 5.9 14.3 12.7 5.6
   "Do Not Know"  Frequency:  2
                  Percentage: 3.6

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 29, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Human Resource Development and Management Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

4. Performance appraisal:
   Scale 1        Frequency:  1 1 4 2 6 2 2 2 1 18 7 5
                  Percentage: 16.7 7.7 30.8 22.2 13.6 11.8 14.3 25.0 20.0 32.7 38.9 31.3
   Scale 2        Frequency:  5 1 3 6 1 21 6 5 3 3 20 5 9
                  Percentage: 62.5 16.7 23.1 46.2 11.1 47.7 35.3 35.7 37.5 60.0 36.4 27.8 56.3
   Scale 3        Frequency:  2 1 7 1 3 3 4 4 1 1 4 1 1
                  Percentage: 25.0 16.7 53.8 7.7 33.3 6.8 23.5 28.6 12.5 20.0 7.3 5.6 6.3
   Scale 4        Frequency:  1 3 2 1 1 11 2 3 1 8 1 1
                  Percentage: 12.5 50.0 15.4 7.7 11.1 25.0 11.8 21.4 12.5 14.5 5.6 6.3
   Scale 5        Frequency:  1 2 3 2 1 3 1
                  Percentage: 7.7 22.2 6.8 11.8 12.5 5.5 5.6
   "Do Not Know"  Frequency:  2 2
                  Percentage: 3.6 11.1

(table continues)

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 29, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Human Resource Development and Management Category.

Supervisors / Principals / Teachers; columns within each position group are District Size* 1-5.

5. Employee satisfaction:
   Scale 1        Frequency:  6 3 21 8 7 3 3
                  Percentage: 13.6 21.4 38.2 44.4 43.8 75.0 37.5
   Scale 2        Frequency:  2 2 4 7 20 6 3 5 2 23 5 5 3
                  Percentage: 25.0 33.3 30.8 53.8 45.5 35.3 21.4 62.5 40.0 41.8 27.8 31.3 37.5
   Scale 3        Frequency:  3 2 6 4 2 6 5 1 1 4 3 1 2
                  Percentage: 37.5 33.3 46.2 30.8 22.2 13.6 29.4 7.1 20.0 7.3 18.8 25.0 25.0
   Scale 4        Frequency:  2 2 2 1 4 8 6 7 2 2 4 2 1
                  Percentage: 25.0 33.3 15.4 7.7 44.4 18.2 35.3 50.0 25.0 40.0 7.3 11.1 6.3
   Scale 5        Frequency:  1 1 1 3 4 1 3 3
                  Percentage: 12.5 7.7 7.7 33.3 9.1 12.5 5.5 16.7
   "Do Not Know"  Frequency:
                  Percentage:

* 1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
    Reproducedwithpermissionofthecopyrightowner.Furtherreproductionprohibitedwithoutpermission. Table 30. ItemFrequency and Percentage of Response by Position and Size of District For Items in the Educational and Operational Process Management Category. Supervisors Principals Teachers Items 1 District Size* 2 3 4 5 1 District Size* 2 3 4 5 1 District Size* 2 3 4 5 1. Design of educational programs: Scale: 1 Frequency Percentage 6 13.6 1 1 5.9 7.1 6 10.9 6 33.3 3 3 4 18.8 75.0 50.0 Scale: 2 Frequency Percentage 1 5 3 16.7 38.5 23.1 4 9.1 4 2 3 23.5 14.3 37.5 1 20.0 16 29.1 1 5.6 7 1 3 43.8 25.0 37.5 Scale: 3 Frequency Percentage 5 62.5 2 5 7 33,3 38.5 53,8 1 11.1 15 34.1 7 4 1 41.2 28.6 12.5 2 40.0 17 30.9 4 22.2 Scale: 4 Frequency Percentage 2 25.0 3 3 50.0 23.1 7 77.8 17 38.6 3 6 3 17.6 42.9 37.5 2 40.0 4 7.3 5 27.8 4 1 25.0 12.5 Scale: 5 Frequency Percentage 1 12.5 3 23.1 1 11.1 2 4.5 1 1 7.1 12.5 7 12.7 1 5.6 1 6.3 “Do Not Know” Frequency Percentage 5 9.1 1 5.6 1 6.3 (table continues) *1 = 5,000 + enrolled; 2 = 2,500 - 4,999 enrolled; 3 = 1,000 - 2,499 enrolled; 4 = 500 - 999 enrolled; 5 = 1 - 499 enrolled o U>
Table 30, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Educational and Operational Process Management Category.
Columns: Supervisors, Principals, Teachers (District Size* 1-5 within each position).
2. Delivery of educational programs:
Scale: 1  Frequency: 1 1 5 1 1 1 9 5 4 2 4  Percentage: 7.7 7.7 11.4 5.9 7.1 12.5 16.4 27.8 25.0 50.0 50.0
Scale: 2  Frequency: 2 5 3 6 6 3 3 2 20 3 7 2 2  Percentage: 33.3 38.5 23.1 13.6 35.3 21.4 37.5 40.0 36.4 16.7 43.8 50.0 25.0
Scale: 3  Frequency: 3 1 5 4 1 15 3 4 2 1 10 2 1  Percentage: 37.5 16.7 38.5 30.8 11.1 34.1 17.6 28.6 25.0 20.0 18.2 11.1 6.3
Scale: 4  Frequency: 3 2 2 3 6 14 5 5 1 2 6 5 4 2  Percentage: 37.5 33.3 15.4 23.1 66.7 31.8 29.4 35.7 12.5 40.0 10.9 27.8 25.0 25.0
Scale: 5  Frequency: 2 1 2 2 4 1 1 6 1  Percentage: 25.0 16.7 15.4 22.2 9.1 7.1 12.5 10.9 5.6
“Do Not Know”  Frequency: 4 2  Percentage: 7.3 11.1
(table continues)
*1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 30, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Educational and Operational Process Management Category.
Columns: Supervisors, Principals, Teachers (District Size* 1-5 within each position).
3. Design and delivery of educational support services:
Scale: 1  Frequency Percentage  2 15.4  7 15.9  1 1 12.5 20.0  13 2 23.6 11.1  4 2 25.0  1 50.0 12.5
Scale: 2  Frequency Percentage  3 37.5  2 33.3  7 53.8  5 38.5  3 33.3  7 15.9  6 35.3  6 42.9  3 37.5  1 20.0  17 30.9  6 33.3  7 43.8  2 50.0  4 50.0
Scale: 3  Frequency Percentage  3 37.5  1 16.7  3 23.1  4 30.8  5 55.6  9 20.5  5 29.4  2 14.3  3 37.5  1 20.0  3 5.5  3 16.7  1 6.3
Scale: 4  Frequency Percentage  1 12.5  3 50.0  3 23.1  1 7.7  1 11.1  12 27.3  5 29.4  6 42.9  1 12.5  11 20.0  3 16.7  2 12.5  2 25.0
Scale: 5  Frequency Percentage  1 12.5  1 7.7  8 18.2  2 40.0  3 5.5  1 5.6
“Do Not Know”  Frequency Percentage  1 2.3  8 14.5  3 16.7  2 12.5
(table continues)
*1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 30, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Educational and Operational Process Management Category.
Columns: Supervisors, Principals, Teachers (District Size* 1-5 within each position).
4. Data and information processes:
Scale: 1  Frequency: 1 1 1 2 4 1 2 2 1 16 5 2 3 2  Percentage: 12.5 16.7 7.7 15.4 9.1 5.9 14.3 25.0 20.0 29.1 27.8 12.5 75.0 25.0
Scale: 2  Frequency: 2 4 7 2 8 7 4 1 3 18 6 10 1 3  Percentage: 25.0 30.8 53.8 22.2 18.2 41.2 28.6 12.5 60.0 32.7 33.3 62.5 25.0 37.5
Scale: 3  Frequency: 3 3 7 2 2 14 4 1 1 5 5 1  Percentage: 37.5 50.0 53.8 15.4 22.2 31.8 23.5 7.1 12.5 9.1 27.8 6.3
Scale: 4  Frequency: 1 2 1 2 3 15 2 6 3 1 8 1 1  Percentage: 12.5 33.3 7.7 15.4 33.3 34.1 11.8 42.9 37.5 20.0 14.5 6.3 12.5
Scale: 5  Frequency: 1 1 1 2 1 1 3  Percentage: 12.5 16.7 11.1 4.5 7.1 12.5 5.5
“Do Not Know”  Frequency: 1 1 5 2 2  Percentage: 11.1 2.3 9.1 11.1 12.5
(table continues)
*1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 30, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Educational and Operational Process Management Category.
Columns: Supervisors, Principals, Teachers (District Size* 1-5 within each position).
5. Communication processes:
Scale: 1  Frequency: 1 1 4 2 1 9 5 3 2 1  Percentage: 16.7 7.7 9.1 14.3 20.0 16.4 27.8 18.8 50.0 12.5
Scale: 2  Frequency: 1 1 3 8 2 2 2 16 4 7 1  Percentage: 12.5 7.7 23.1 18.2 11.8 25.0 40.0 29.1 22.2 43.8 25.0
Scale: 3  Frequency: 4 4 8 9 4 13 7 5 3 21 6 3 4  Percentage: 50.0 66.7 61.5 69.2 44.4 29.5 41.2 35.7 37.5 38.2 33.3 18.8 50.0
Scale: 4  Frequency: 2 1 3 1 4 13 5 4 1 2 6 3 3 1 2  Percentage: 25.0 16.7 23.1 7.7 44.4 29.5 29.4 28.6 12.5 40.0 10.9 16.7 18.8 25.0 25.0
Scale: 5  Frequency: 1 4 3 2 2  Percentage: 11.1 9.1 21.4 25.0 3.6
“Do Not Know”  Frequency:  Percentage: 12.5
(table continues)
*1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 30, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the Educational and Operational Process Management Category.
Columns: Supervisors, Principals, Teachers (District Size* 1-5 within each position).
6. Supplier and partnering processes:
Scale: 1  Frequency: 3 1 3 1 1 6 2 2  Percentage: 23.1 7.7 6.8 7.1 12.5 10.9 12.5 50.0
Scale: 2  Frequency: 2 3 4 5 4 7 8 4 1 3 10 3 11 1 3  Percentage: 25.0 50.0 30.8 38.5 44.4 15.9 47.1 28.6 12.5 60.0 18.2 16.7 68.8 25.0 37.5
Scale: 3  Frequency: 3 1 5 6 1 12 6 5 2 5 4 1 2  Percentage: 37.5 16.7 38.5 46.2 11.1 27.3 35.3 35.7 25.0 9.1 22.2 6.3 25.0
Scale: 4  Frequency: 2 1 1 1 1 9 4 2 2 1 6 1  Percentage: 25.0 16.7 7.7 7.7 11.1 20.5 11.8 14.3 25.0 20.0 10.9 6.3
Scale: 5  Frequency: 1 1 2 9 1 1 3  Percentage: 12.5 16.7 22.2 20.5 12.5 20.0 5.5
“Do Not Know”  Frequency: 1 4 2 1 25 11 1 1  Percentage: 11.1 9.1 14.3 12.5 45.5 61.1 6.3 25.0
*1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 31. Item Frequency and Percentage of Response by Position and Size of District For Items in the School District Performance Results Category.
Columns: Supervisors, Principals, Teachers (District Size* 1-5 within each position).
1. Student performance results:
Scale: 1  Frequency Percentage  1 12.5  1 7.7  1 7.7  3 5.5  1 5.6  3 18.8  1 25.0  1 12.5
Scale: 2  Frequency Percentage  2 33.3  3 21.1  4 30.8  2 22.2  7 15.9  6 35.3  5 35.7  2 25.0  1 20.0  19 34.5  8 44.4  5 31.3  2 50.0  5 62.5
Scale: 3  Frequency Percentage  2 25.0  4 66.7  5 38.5  7 53.8  5 55.6  14 31.8  6 35.3  5 35.7  3 37.5  2 40.0  12 21.8  4 22.2  4 25.0  2 25.0
Scale: 4  Frequency Percentage  5 62.5  3 23.1  1 11.1  23 52.3  5 29.4  3 21.4  2 25.0  1 20.0  18 32.7  3 16.7  4 25.0  1 25.0
Scale: 5  Frequency Percentage  1 7.7  1 7.7  1 11.1  1 12.5  1 20.0  2 3.6
“Do Not Know”  Frequency Percentage  1 7.1  1 1.8  2 11.1
(table continues)
*1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 31, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the School District Performance Results Category.
Columns: Supervisors, Principals, Teachers (District Size* 1-5 within each position).
2. Student conduct results:
Scale: 1  Frequency Percentage  1 12.5  2 15.4  1 7.7  2 4.5  1 5.9  15 27.3  2 11.1  2 12.5  1 25.0  1 12.5
Scale: 2  Frequency Percentage  1 12.5  1 16.7  4 30.8  1 7.7  1 11.1  8 18.2  3 17.6  9 16.4  4 22.2  5 31.3  2 25.0
Scale: 3  Frequency Percentage  1 12.5  3 50.0  4 30.8  3 23.1  2 22.2  8 18.2  3 17.6  5 35.7  1 12.5  1 20.0  11 20.0  2 11.1  5 31.3  1 25.0  2 25.0
Scale: 4  Frequency Percentage  4 50.0  2 33.3  2 15.4  4 30.8  4 44.4  14 31.8  6 35.3  4 28.6  4 50.0  3 60.0  7 12.7  2 11.1  1 6.3  1 12.5
Scale: 5  Frequency Percentage  1 7.7  3 23.1  2 22.2  3 6.8  2 14.3  2 25.0  1 20.0  1 1.8  1 5.6  1 25.0  1 12.5
“Do Not Know”  Frequency Percentage  1 12.5  1 7.7  9 20.5  3 17.6  3 21.4  1 12.5  12 21.8  7 38.9  3 18.8  1 25.0  1 12.5
(table continues)
*1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 31, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the School District Performance Results Category.
Columns: Supervisors, Principals, Teachers (District Size* 1-5 within each position).
3. Student and stakeholder satisfaction results:
Scale: 1  Frequency Percentage  1 16.7  1 6.3  4 9.1  5 4 9.1  1 22.2  1 6.3  1 25.0 12.5
Scale: 2  Frequency Percentage  1 6.3  1 12.5  7 12.7  1 6.3  1 2 25.0 25.0
Scale: 3  Frequency Percentage  3 37.5  3 50.0  5 38.5  4 25.0  5 55.6  14 31.8  7 41.2  4 28.6  2 25.0  2 40.0  13 23.6  3 16.7  4 25.0  2 25.0
Scale: 4  Frequency Percentage  4 50.0  2 33.3  4 30.8  1 6.3  2 22.2  10 22.7  8 7.1  6 42.9  3 37.5  3 60.0  8 14.5  6 33.3  1 6.3  1 25.0  2 25.0
Scale: 5  Frequency Percentage  1 12.5  1 7.7  9 56.3  2 22.2  5 11.4  2 14.3  1 12.5  3 5.5
“Do Not Know”  Frequency Percentage  3 23.1  10 22.7  1 5.9  1 7.1  1 12.5  19 34.5  5 27.8  9 56.3  1 25.0  1 12.5
(table continues)
*1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 31, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the School District Performance Results Category.
Columns: Supervisors, Principals, Teachers (District Size* 1-5 within each position).
4. Human resource results:
Scale: 1  Frequency: 1 3 1 6 1 1 2 1  Percentage: 6.3 6.8 5.9 10.9 5.6 6.3 50.0 12.5
Scale: 2  Frequency: 3 4 6 12 2 18 5 7 4 3 25 7 12 1 2  Percentage: 37.5 66.7 46.2 75.0 22.2 40.9 29.4 50.0 50.0 60.0 45.5 38.9 75.0 25.0 25.0
Scale: 3  Frequency: 2 1 5 2 2 1 6 2 2 1  Percentage: 25.0 7.7 11.4 11.8 14.3 20.0 10.9 11.1 12.5 12.5
Scale: 4  Frequency: 1 4 2 3 8 5 1 1 7 3 1  Percentage: 12.5 30.8 12.5 33.3 18.2 29.4 12.5 20.0 12.7 16.7 12.5
Scale: 5  Frequency: 2 2 2 4 6 4 2 5 1 1 1  Percentage: 25.0 33.3 15.4 44.4 13.6 28.6 25.0 9.1 5.6 25.0 12.5
“Do Not Know”  Frequency: 1 4 1 1 6 4 1 1  Percentage: 6.3 9.1 7.1 12.5 10.9 22.2 6.3 12.5
(table continues)
*1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 31, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the School District Performance Results Category.
Columns: Supervisors, Principals, Teachers (District Size* 1-5 within each position).
5. Educational program and service results:
Scale: 1  Frequency: 2 1 5 2 1 2 2 12 5 4 2 2  Percentage: 25.0 16.7 11.4 11.8 7.1 25.0 40.0 21.8 27.8 25.0 50.0 25.0
Scale: 2  Frequency: 1 3 1 1 8 4 1 1 2  Percentage: 12.5 23.1 2.3 5.9 14.5 22.2 6.3 25.0 25.0
Scale: 3  Frequency: 3 2 4 3 10 7 5 1 10 3 4 1  Percentage: 37.5 33.3 30.8 33.3 22.7 41.2 35.7 20.0 18.2 16.7 25.0 12.5
Scale: 4  Frequency: 2 3 5 4 18 4 4 4 1 10 3 3 1 2  Percentage: 25.0 50.0 38.5 44.4 40.9 23.5 28.6 50.0 20.0 18.2 16.7 18.8 25.0 25.0
Scale: 5  Frequency: 2 2 4 2 2 1 1 1  Percentage: 15.4 22.2 9.1 14.3 25.0 20.0 1.8 12.5
“Do Not Know”  Frequency: 1 6 1 2 13 3 4  Percentage: 7.7 13.6 5.9 14.3 23.6 16.7 25.0
(table continues)
*1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Table 31, cont’d. Item Frequency and Percentage of Response by Position and Size of District For Items in the School District Performance Results Category.
Columns: Supervisors, Principals, Teachers (District Size* 1-5 within each position).
6. Educational support services results:
Scale: 1  Frequency Percentage  2 33.3  1 2.3  2 11.8  1 12.5  4 7.3  1 5.6  1 25.0  1 12.5
Scale: 2  Frequency Percentage  1 12.5  4 1 66.7 7.7  1 11.1  7 15.9  11 3 64.7 21.4  5 9.1  2 11.1  3 1 18.8 25.0  4 50.0
Scale: 3  Frequency Percentage  4 50.0  4 30.8  3 33.3  5 11.4  3 1 17.6 7.1  1 20.0  10 18.2  4 22.2  6 37.5  3 37.5
Scale: 4  Frequency Percentage  3 37.5  8 61.5  2 22.2  10 22.7  4 28.6  1 12.5  1 20.0  10 18.2  3 16.7
Scale: 5  Frequency Percentage  1 11.1  7 15.9  5 35.7  4 50.0  2 40.0  1 25.0
“Do Not Know”  Frequency Percentage  2 22.2  14 31.8  1 7.1  2 25.0  1 20.0  25 45.5  8 44.4  7 1 43.8 25.0
*1 = 5,000+ enrolled; 2 = 2,500-4,999 enrolled; 3 = 1,000-2,499 enrolled; 4 = 500-999 enrolled; 5 = 1-499 enrolled
Inferential Statistical Analysis

A general linear models procedure was applied, and a 3 x 5, two-way factorial analysis of variance was used to test the hypotheses. The assumptions of homogeneity of variance, independence of variables, and continuous dependent variables were met. Type III sums of squares were used for each construct because the numbers in the cells were unequal. The null hypothesis for the Leadership construct was, “H01: There are no significant differences in the Leadership category of the Performance Analysis for School Districts by type or size.” Table 32 illustrates that a significant difference was found for the variable position at the .001 level. No significant differences at the .05 level were found for district size or for the interaction. Scheffe’s post hoc test for multiple comparisons was done. This test is the most conservative measure and controls the Type I error rate. At the .05 alpha level, there were no significant differences when comparing superintendents and principals. However, there was a significant difference at .05 when teachers were compared to both superintendents and principals. Therefore, the null hypothesis is rejected.
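The factorial procedure described above (a general linear models analysis, which suggests a SAS-style PROC GLM run) can be sketched in code. The sketch below is a minimal, assumption-laden illustration with synthetic, balanced data (equal cell counts), where the classical two-way ANOVA sums of squares coincide with the Type III sums of squares the study needed for its unequal cells. All variable names and values are invented, not the study's data.

```python
import numpy as np

def two_way_anova(y, a_idx, b_idx):
    """Balanced two-way factorial ANOVA.

    y: 1-D array of responses; a_idx, b_idx: integer factor levels per
    observation. Every (a, b) cell must hold the same number of
    observations. Returns {effect: (SS, df, F)}.
    """
    a_levels, b_levels = np.unique(a_idx), np.unique(b_idx)
    a, b = len(a_levels), len(b_levels)
    n = len(y) // (a * b)                      # observations per cell
    grand = y.mean()
    a_mean = {i: y[a_idx == i].mean() for i in a_levels}
    b_mean = {j: y[b_idx == j].mean() for j in b_levels}
    cell = {(i, j): y[(a_idx == i) & (b_idx == j)].mean()
            for i in a_levels for j in b_levels}

    ss_a = n * b * sum((a_mean[i] - grand) ** 2 for i in a_levels)
    ss_b = n * a * sum((b_mean[j] - grand) ** 2 for j in b_levels)
    ss_ab = n * sum((cell[i, j] - a_mean[i] - b_mean[j] + grand) ** 2
                    for i in a_levels for j in b_levels)
    ss_e = sum(((y[(a_idx == i) & (b_idx == j)] - cell[i, j]) ** 2).sum()
               for i in a_levels for j in b_levels)

    df_a, df_b = a - 1, b - 1
    df_ab, df_e = df_a * df_b, a * b * (n - 1)
    ms_e = ss_e / df_e
    return {"position": (ss_a, df_a, (ss_a / df_a) / ms_e),
            "size": (ss_b, df_b, (ss_b / df_b) / ms_e),
            "position x size": (ss_ab, df_ab, (ss_ab / df_ab) / ms_e)}

# Toy 3 x 5 layout: 3 positions, 5 district sizes, 2 responses per cell.
# Only "position" shifts the cell means; within-cell noise is +/-0.5.
positions = np.repeat(np.arange(3), 10)
sizes = np.tile(np.repeat(np.arange(5), 2), 3)
y = positions.astype(float) + np.tile([0.5, -0.5], 15)
results = two_way_anova(y, positions, sizes)
```

With this construction only the position effect carries any sum of squares, mirroring the pattern reported for every construct in the study: a significant position effect, no size effect, and no interaction.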
Table 32. Two-Way ANOVA Leadership Construct.

Source        df            SS            MS    F value    Pr > F
Model:
  Size          4     1.94558390    0.48639598      0.59    0.6736
  Type          2    25.06159973   12.53079986     15.08    0.001
  Size x Type   8     2.71734755    0.33966844      0.41    0.9148
Error         218   181.14322972    0.83093225
Total         232   221.53565721

The null hypothesis for the Strategic Planning construct was, “H02: There are no significant differences in the Strategic Planning category of the Performance Analysis for School Districts by type or size.” Table 33 illustrates that a significant difference was found for the variable position at the .001 level. No significant differences at the .05 level were found for district size or for the interaction. Scheffe’s post hoc test for multiple comparisons was done. At the .05 alpha level, there were no significant differences when comparing superintendents and principals. There was, however, a significant difference at .05 when teachers were compared to both superintendents and principals. Therefore, the null hypothesis is rejected.
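Scheffe's post hoc criterion used in these comparisons can be sketched as follows: a contrast among k group means is declared significant at level alpha when its F statistic exceeds (k - 1) times the F critical value with (k - 1, df_error) degrees of freedom. The sketch below assumes scipy is available; the group means, group sizes, and mean square error in the example are invented for illustration, not taken from the tables.

```python
from scipy.stats import f

def scheffe_threshold(k, df_error, alpha=0.05):
    """Scheffe cutoff: a contrast F must exceed (k-1) * F(1-alpha; k-1, df_error)."""
    return (k - 1) * f.ppf(1 - alpha, k - 1, df_error)

def pairwise_contrast_f(mean1, mean2, n1, n2, ms_error):
    """F statistic for a simple pairwise contrast between two group means."""
    return (mean1 - mean2) ** 2 / (ms_error * (1.0 / n1 + 1.0 / n2))

# Three positions and error df = 218, as in the Leadership ANOVA (Table 32):
threshold = scheffe_threshold(3, 218)

# Hypothetical group means and sizes (illustrative only):
f_stat = pairwise_contrast_f(3.0, 2.3, 60, 80, 0.83)
significant = f_stat > threshold
```

Because the cutoff grows with the number of groups and applies simultaneously to every possible contrast, Scheffe's method is the conservative choice noted in the text.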
Table 33. Two-Way ANOVA Strategic Planning Construct.

Source        df            SS            MS    F value    Pr > F
Model:
  Size          4     8.04564589    2.01141147      1.75    0.1397
  Type          2    22.17052545   11.08526273      9.66    0.001
  Size x Type   8     7.73834769    0.96729346      0.84    0.5661
Error         218   250.27735473    1.14806126
Total         232   287.11826419

The null hypothesis for the Student and Stakeholder Satisfaction construct was, “H03: There are no significant differences in the Student and Stakeholder Satisfaction category of the Performance Analysis for School Districts by type or size.” Table 34 illustrates that a significant difference was found for the variable position at the .001 level. No significant differences at the .05 level were found for district size or for the interaction. Scheffe’s post hoc test for multiple comparisons was done. At the .05 alpha level, there was no significant difference when comparing superintendents and principals. However, the difference between the mean of the teachers and principals and the mean of the teachers and superintendents was large enough to be significant at .05. Therefore, the null hypothesis is rejected.

The null hypothesis for the Information and Analysis construct was, “H04: There are no significant differences in the Information and Analysis category of the Performance Analysis for School Districts by type or size.” Table 35 illustrates that a significant difference was found for position at the .001 level. No significant differences for the p
values at the .05 level were found for district size or for the interaction. Scheffe’s post hoc test for multiple comparisons was done. At the .05 alpha level, there was no significant difference when comparing superintendents and principals. However, the difference between the mean of the teachers and principals and the mean of the teachers and superintendents was large enough to be significant at .05. Therefore, the null hypothesis is rejected.

Table 34. Two-Way ANOVA Student/Stakeholder Satisfaction Construct.

Source        df            SS            MS    F value    Pr > F
Model:
  Size          4     3.81446595    0.95361649      1.27    0.2820
  Type          2    18.42615628    9.21307814     12.29    0.001
  Size x Type   8     5.01819420    0.62727427      0.84    0.571
Error         219   164.20135256    0.74977787
Total         233   188.16503443

The null hypothesis for the Human Resource Development and Management construct was, “H05: There are no significant differences in the Human Resource Development and Management category of the Performance Analysis for School Districts by type or size.” Table 36 illustrates that a significant difference was found for the factor position at the .001 level. No significant differences at the .05 level were found for district size or for the interaction. Scheffe’s post hoc test for multiple comparisons was done. At the .05 alpha level, there was no significant difference when comparing
superintendents and principals. The difference between the mean of the teachers and principals and the mean of the teachers and superintendents was large enough to be significant at .05. Therefore, the null hypothesis is rejected.

Table 35. Two-Way ANOVA Information and Analysis Construct.

Source        df            SS            MS    F value    Pr > F
Model:
  Size          4     5.10247757    1.27561939      1.05    0.3822
  Type          2    22.98733438   11.49366719      9.46    0.001
  Size x Type   8     5.87610935    0.73451367      0.60    0.7735
Error         222   269.64388447    1.21461209
Total         236   301.32301922

Table 36. Two-Way ANOVA Human Resource Development / Management Construct.

Source        df            SS            MS    F value    Pr > F
Model:
  Size          4     3.30666687    0.82666672      1.02    0.3975
  Type          2    36.47055806   18.23527903     22.51    0.001
  Size x Type   8     6.18997506    0.77374688      0.96    0.4721
Error         218   176.57388647    0.80997196
Total         232   227.56105150
The null hypothesis for the Educational and Operational Process Management construct was, “H06: There are no significant differences in the Educational and Operational Process Management category of the Performance Analysis for School Districts by type or size.” Table 37 illustrates that a significant difference was found for position at the .001 level. No significant difference at the .05 level was found for district size or for the interaction. Scheffe’s post hoc test for multiple comparisons was done. At the .05 alpha level, there were no significant differences when comparing superintendents and principals. The difference between the means of the teachers and principals and the teachers and superintendents was large enough to be significant at .05. Therefore, the null hypothesis is rejected.

Table 37. Two-Way ANOVA Educational / Operational Process Management Construct.

Source        df            SS            MS    F value    Pr > F
Model:
  Size          4     4.86845737    1.21711434      1.59    0.1772
  Type          2    21.22474914   10.61237457     13.89    0.001
  Size x Type   8     7.51314414    0.93914302      1.23    0.2830
Error         213   162.69105458    0.76380777
Total         227   196.23901925

The null hypothesis for the School District Results construct was, “H07: There are no significant differences in the School District Results category of the Performance Analysis for School Districts by type or size.” Table 38 illustrates that a significant
difference was found for the factor position at the .001 level. No significant difference at the .05 level was found for the factor district size or for the interaction of the factors. Scheffe’s post hoc test for multiple comparisons was done. At the .05 alpha level, there were no significant differences when comparing superintendents and principals. The difference between the means of the teachers and principals and the teachers and superintendents was large enough to be significant at .05. Therefore, the null hypothesis is rejected.

Table 38. Two-Way ANOVA School District Performance Results Construct.

Source        df            SS            MS    F value    Pr > F
Model:
  Size          4     4.02605474    1.00651369      1.37    0.2444
  Type          2    19.01940018    9.50970009     12.97    0.001
  Size x Type   8     4.73059064    0.59132383      0.81    0.5974
Error         214   156.88729215    0.73311819
Total         228   189.84039301

Analysis of “Do Not Know” Responses

Chi square analysis was done to analyze the “Do Not Know” responses (see Table 39). There were no significant differences when positions were combined and district sizes were compared for responses to the “Do Not Know” choice. When districts were combined and positions were compared, significant differences were found at the .05 level. Significant differences were found in all but the Strategic Planning category.
Teachers were more likely to select the “Do Not Know” category than superintendents or principals. In the Strategic Planning category, the p value was .061, which could be considered significant if alpha were set at .10.

Table 39. Chi Square Analysis for “Do Not Know” Responses.

Construct                           DF     Value    Prob
Leadership                           8    32.584    .001
Strategic Planning                   4     8.989    .061
Student Stakeholder Satisfaction     4    16.521    .002
Information & Analysis               6    25.136    .001
Human Resource                       4     9.783    .044
Educational Process                 10    47.208    .001
School District Results             10    31.830    .001

Usefulness of the Instrument as a Tool

All educator types and district sizes were combined to analyze the perceived usefulness of the instrument for self-study of their school district (see Table 40). A total of 81% of the sample found the instrument to have some use. Thirteen percent found it to have little or no use. Seventy-seven percent of the sample responded positively to the potential use of the instrument for school improvement.
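A chi-square test of independence like those summarized in Table 39 can be sketched with scipy. The contingency counts below are invented for illustration only; they are not the study's "Do Not Know" data.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts of respondents who did / did not choose "Do Not Know"
# on one construct (rows: superintendents, principals, teachers):
observed = [[2, 58],
            [4, 76],
            [30, 70]]
chi2, p, dof, expected = chi2_contingency(observed)

# A perfectly proportional table shows no association (chi-square of 0):
uniform = [[10, 20, 30],
           [20, 40, 60]]
chi2_u, p_u, dof_u, _ = chi2_contingency(uniform)
```

In the first (hypothetical) table the teacher row is far above its expected count, so the test rejects independence, which is the qualitative pattern the study reports for positions.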
Table 40. Combined Percentages for Usefulness of Instrument.

                                   Extremely   Somewhat   Little     No
As a study tool                       24.4       61.8      13.0      .4
As a tool for school improvement      25.2       52.1      12.6     5.9

Comments

Qualitative analysis was done using a constant comparative approach to identify emerging categories of focus in the written comments of the respondents. Two categories of comments emerged:
1. Usefulness of the instrument in the school improvement process.
2. Perceptions of the current climate and approach to school improvement.
For the first category of written comments, respondents who felt the instrument had potential utility commented on its thoroughness, scope, and organization. Some suggested it could assist in focusing attention on specific components of the organization and could force reflective thought about school improvement. Some indicated that it could be a “rubric” for analysis, while others indicated it could show “growth” and development in school improvement. Those who felt that the instrument had little or no use commented that it was too long, complicated, and time consuming. Several comments were related to the use of “jargon.” Two respondents suggested that since each district was different, one instrument would not be appropriate. The responses of participants from the smaller districts indicated that this approach would not be necessary in their districts.
For the second category of written comments, dealing with perceptions of the current climate and approach to school improvement, the majority of comments had to do with respondents’ opinions regarding school improvement and how it is designed in their district. There were several comments, mostly by teachers, which communicated a lack of confidence that their opinions would be seriously considered in decision-making or that their involvement would be desired by the administration. The teachers were also more likely to comment on their lack of access to information needed to answer some of the items on the instrument. There were several comments which reflected frustration with school change and the lack of resources to do anything about it. Teachers also commented that they were unaware of how school improvement decisions were currently made.

Summary

The instrument was determined to be a reliable tool in terms of internal consistency. The results of this study found a significant difference between the perceptions of teachers and those of superintendents and principals related to seven constructs in a school district organization. No significant differences were found between superintendents and principals or among the five sizes of school districts in Idaho. No significant interaction between the factors of position and size of district was found. The majority of respondents felt the instrument could be potentially useful for self-study of their school district.
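The internal-consistency reliability referred to in the summary is conventionally estimated with Cronbach's coefficient alpha; the text does not name the statistic used, so treat the sketch below as an assumption. The response matrix in the example is synthetic.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

    item_scores: 2-D array-like, rows = respondents, columns = survey items.
    """
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()    # sum of per-item sample variances
    total_var = x.sum(axis=1).var(ddof=1)     # variance of respondents' totals
    return k / (k - 1) * (1 - item_var / total_var)

# Five synthetic respondents answering two items on a 1-5 scale:
scores = [[1, 2], [2, 1], [3, 4], [4, 3], [5, 5]]
alpha = cronbach_alpha(scores)
```

Alpha approaches 1.0 as items covary strongly, which is the sense in which the constructs of the Performance Analysis for School Districts were judged internally consistent.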
Chapter 5
Summary, Conclusions, and Recommendations

Summary

The evolution of school reform calls for an increased emphasis on a clear sense of focus, clear expectations from stakeholders, analysis of the root causes of problems, and the realization that the call for increased evidence is not subsiding. School reform suffers from too much scatter and from the absence of a framework, the implementation of solutions, and the measurement of the effectiveness of programs (Consortium on Productivity in Schools, 1995). The lessons learned from reform to date indicate that poorly defined problems, lack of a process for improvement, the absence of measurement, and the intractability of the educational system are significant factors in the lack of impact (Bernhardt, 1994; Deming, 1994; English & Hill, 1994; Fields, 1994; Fullan, 1992; Fullan, 1993; Fullan, 1997; Fullan & Miles, 1992; Sagor, 1995; Scholtes, 1995; Senge, 1990).

The emerging emphasis in improving the performance of organizations is on adopting a systems model. Increasingly, employees and consumers are looking for the interconnectedness of work function, process, and productivity (Senge, 1990). Continuous improvement is not a new approach for some businesses or for service agencies such as hospitals. Quality theory and practice have gone through an evolution, much like school reform. Early stages involved costly and inefficient inspection-based practices (Wu, 1996). Quality control systems emerged that identified defects upon inspection. As quality theory and application developed, a focus on prevention emerged. This focus led to an emphasis on the proactive assessment of the organization in order to improve management systems, service, and product quality. What emerged was a need
to learn from our work organization and understand the problem before the solution was prescribed.

The Malcolm Baldrige National Quality Award was originally developed to motivate businesses to excel in productivity and operations as well as to recognize their performance. Award winners were obligated to share their lessons with other companies who wished to achieve the same high standard of performance. An unanticipated result of the award process was the development of criteria for the organizational self-study of performance for the purpose of improvement (Caravatta, 1997).

Models of school district performance have traditionally been inspection-based and have reflected a minimal standard of compliance. Effective models for demonstrating performance were needed, with clearly defined indicators of excellence that reflected the comprehensive scope of the school district as an interconnected system. This study examined the current literature on organizational improvement and traditional approaches used by schools to assess their effectiveness, and proposed a model for improvement grounded in quality theory. The research also investigated the perceptions of Idaho educators about the performance of their school districts, using an instrument adapted from the Malcolm Baldrige National Quality Award and based on a systems model and quality theory.

Conclusions

The instrument developed, the Performance Analysis for School Districts, was found to be reliable and to demonstrate internal consistency. The study examined perceptual differences in current performance across all seven areas, and the influence of two variables, educator position and size of district, was explored. Hypotheses for the
categories of Leadership, Strategic Planning, Student and Stakeholder Satisfaction, Information and Analysis, Human Resource Development, Educational Process Management, and School District Results were tested using a factorial two-way analysis of variance. Each hypothesis was rejected based on the significance of one of the main effects. A significant difference was found in the responses of teachers when compared to both superintendents and principals in each of the seven areas. No significant difference was found between the responses of superintendents and principals. The size of the district was not found to be a factor in responses in any category. There was no significant interaction between size of district and educator position. Teachers responded at a significantly higher rate with the “Do Not Know” selection.

The majority of respondents answered positively regarding the potential usefulness of the instrument for organizational analysis and school improvement. However, comments made by respondents suggested that its length, complexity, and language might be obstacles. A lack of knowledge about the scope of operations was apparent on the part of respondents, and they commented on that fact consistently. Teacher comments also indicated doubts that administrators desired their sincere involvement or opinion.

The Performance Analysis for School Districts is a reliable tool for consistently assessing perceptions of the current situation in each construct examined. The School District Results construct could be refined to increase reliability, but findings were still positive. Although the response rate was only 36%, the total number of responses was large enough to positively influence the power of the statistical analysis. The use of a stratified random sample increased the generalizability of the findings to the larger
population of Idaho educators. The findings measured only perceptions of current performance and did not account for varying levels of prior knowledge of quality or organizational performance.

The differences in the perceptions of teachers and administrators were statistically highly significant. The instrument has the potential to be useful in discovering differences in perception among various components of operation and performance. Teachers clearly do not have comprehensive knowledge of the system in which they work. Administrators consistently perceived the current situation to be better than the teachers did. Comments suggested that educators desire a simple, quick approach to school district performance and generally feel cynical about their role in impacting school improvement.

The findings of this study are of practical significance for leadership and teachers as school improvement strategies and management systems are designed. The need for a model to be used as a framework for improving school district performance is apparent. Figure 3 suggests such a framework to improve school productivity based on the theory of quality within a systems perspective.

Recommendations

The Malcolm Baldrige National Quality Award takes into account the systems, multiple constituencies, and goal attainment models for organizational effectiveness described in this study. Increasingly, studies are examining the applications of these models in a variety of settings (Danne, 1992; Fritz, 1993; Miller, 1993; Smith, 1995; Thompson, 1996; Wu, 1996). While most studies measured perception and opinion, they often neglected to examine objective measures of actual improvements in organizations that have applied a self-study process, quality applications, and a systems approach.
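The hypothesis tests summarized in the Conclusions used a factorial two-way analysis of variance, which partitions response variance into two main effects (educator position and district size) and their interaction. The following is a minimal sketch in pure Python for a balanced design; the factor levels and all numeric values are hypothetical, chosen only to mirror the reported pattern (a large position effect, no size effect, no interaction), and are not the study's data.

```python
# Minimal balanced two-factor ANOVA in pure Python (no stats libraries).
# Hypothetical data: rows are mean category scores by educator position
# (factor A) and district size (factor B), with 3 observations per cell.
cells = {
    ("teacher", "small"):        [2.1, 2.4, 2.0],
    ("teacher", "large"):        [2.2, 2.0, 2.3],
    ("principal", "small"):      [3.1, 3.3, 3.0],
    ("principal", "large"):      [3.2, 3.4, 3.1],
    ("superintendent", "small"): [3.2, 3.5, 3.1],
    ("superintendent", "large"): [3.3, 3.2, 3.4],
}

def mean(vs):
    return sum(vs) / len(vs)

def two_way_anova(cells):
    """Return F statistics for both main effects and the interaction."""
    a_levels = sorted({a for a, _ in cells})
    b_levels = sorted({b for _, b in cells})
    n = len(next(iter(cells.values())))        # observations per cell (balanced)
    all_vals = [v for vs in cells.values() for v in vs]
    grand = mean(all_vals)
    a_mean = {a: mean([v for b in b_levels for v in cells[(a, b)]]) for a in a_levels}
    b_mean = {b: mean([v for a in a_levels for v in cells[(a, b)]]) for b in b_levels}
    cell_mean = {k: mean(vs) for k, vs in cells.items()}
    # Sums of squares: main effects, interaction, and within-cell error.
    ss_a = n * len(b_levels) * sum((a_mean[a] - grand) ** 2 for a in a_levels)
    ss_b = n * len(a_levels) * sum((b_mean[b] - grand) ** 2 for b in b_levels)
    ss_ab = n * sum((cell_mean[(a, b)] - a_mean[a] - b_mean[b] + grand) ** 2
                    for a in a_levels for b in b_levels)
    ss_w = sum((v - cell_mean[k]) ** 2 for k, vs in cells.items() for v in vs)
    df_a, df_b = len(a_levels) - 1, len(b_levels) - 1
    msw = ss_w / (len(all_vals) - len(a_levels) * len(b_levels))
    return {"position": (ss_a / df_a) / msw,
            "size": (ss_b / df_b) / msw,
            "interaction": (ss_ab / (df_a * df_b)) / msw}

f = two_way_anova(cells)
print({k: round(v, 2) for k, v in f.items()})
```

Each F statistic would then be compared with the critical F value for its degrees of freedom; in the study, only the position main effect reached significance.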
Figure 2. A Quality Systems Model for Performance Improvement. [The figure arranges Leadership, Strategic Planning, Information & Analysis, Human Resource Development & Management, Process Management, Stakeholder and Student Expectations and Satisfaction, and Results within the K-12 Educational System.]

approach to improving organizational performance. Continued research is needed to study the application and the outcomes achieved as a result of this approach. Consideration might be given to a causal-comparative design for such investigations.
The findings related to differences in the perceptions of educators should be further investigated, and the relationship between how administrators lead and how teachers perceive the system as a whole is also an important issue. Exemplary companies that have been recognized as benchmarks for human resource practices could be investigated to determine possible applications for schools of the 21st century. Studies could also explore how teacher performance is affected by teacher attitudes and levels of pride in the organization. Larger samples selected from other states could further increase the generalization potential of the findings.

Other strategies besides the mail survey approach should be considered to simulate actual use of the instrument. The instrument should be applied in one or more school districts to determine its potential usefulness as a starting point for organizational improvement. A case study model that simulates the MBNQA preparation and review process may be an effective research design. Case studies for in-depth analysis could contribute significantly to the body of knowledge about using quality approaches in schools.

The necessity for an agreed-upon framework and criteria to measure productivity and accountability is apparent. Fullan (1980) suggested that, “because it is so difficult to evaluate the effectiveness of organizations with anarchistic characteristics, managers tend to seek simple, uncomplicated indicators to justify their effectiveness.” Decision theorists have found that when complex and ambiguous situations are encountered, overly simplistic decisions are applied, and overly simplistic indicators are frequently relied on. The results of this study and the comments made by the respondents support this notion.
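Any district-level application recommended above would also need to re-establish the instrument's internal consistency, which is commonly estimated with Cronbach's alpha (the dissertation does not name its exact coefficient, so alpha is an assumption here). The sketch below, in pure Python with a hypothetical response matrix rather than the study's data, shows the computation from item and total-score variances.

```python
# Cronbach's alpha from scratch. The response matrix is hypothetical
# (rows = respondents, columns = Likert-scored survey items); it is
# not data from the study.
data = [
    [4, 5, 4, 4],
    [3, 4, 3, 3],
    [5, 5, 4, 5],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
]

def variance(xs):
    """Sample variance with the n - 1 denominator."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(rows[0])                        # number of items
    item_vars = sum(variance(col) for col in zip(*rows))
    totals = [sum(r) for r in rows]         # each respondent's total score
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

print(round(cronbach_alpha(data), 3))  # values near 1 indicate internal consistency
```

A low alpha for a construct (as reported for School District Results) suggests revising or adding items before reusing that scale.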
References

Anderson, G., Herr, K., & Nihlen, A. (1994). Studying your own school. Thousand Oaks, CA: Corwin Press, Inc.

Bass, B. M. (1952). Ultimate criteria of organizational worth. Personnel Psychology, 5, 57-73.

Bemowski, K. (1996). Baldrige Award celebrates its 10th birthday with a new look. Quality Progress, 29, 49-54.

Bemowski, K., & Stratton, B. (1995). How do people use the Baldrige Award criteria? Quality Progress, 28, 43-47.

Bennis, W. (1976). The planning of change (3rd ed.). New York: Holt, Rinehart and Winston.

Bernhardt, V. (1994). The school portfolio. Princeton, NJ: Eye on Education.

Bonstingl, J. (1996). Schools of quality. Alexandria, VA: Association for Supervision and Curriculum Development.

Bracey, G. (1997). Setting the record straight. Alexandria, VA: Association for Supervision and Curriculum Development.

Bradley, L. (1993). Total quality management for schools. Lancaster, PA: Technomic Publishing.

Brassard, A. (1993). Conceptions of organizational effectiveness revisited. The Alberta Journal of Educational Research, 39, 143-162.

Berman, McLaughlin. (1977).

Brown, M. (1994). Baldrige Award winning quality. Milwaukee, WI: ASQC Quality Press.

Bushweller, K. (Ed.). (1996). Education vital signs. The American School Board, 183, A1-A31.

Cameron, K. Critical questions in assessing organizational effectiveness. Organizational Dynamics, 9, 66-80.

Campbell, J. P., Brownas, E. A., Peterson, N. G., & Dunnette, M. D. (1974). The measurement of organizational effectiveness: A review of relevant research and opinion. San Diego: Naval Personnel Research Center.
Caravatta, M. (1997). Conducting an organizational self-assessment using the Baldrige Award criteria. Quality Progress, 30, 87-91.

Chappell, R. (1993). Effects of the implementation of total quality management on the Rappahannock County, Virginia public schools. (Doctoral dissertation, Virginia Polytechnic Institute and State University, 1993). Dissertation Abstracts International.

Collins, B., & Huge, E. (1993). Management by policy. Milwaukee, WI: ASQC Quality Press.

Consortium on Productivity in Schools. (1995). Using what we have to get the schools we need. New York: Institute on Education and the Economy.

Council on Competitiveness. (1995). Building on the Baldrige: American quality for the 21st century. Washington, DC: Council on Competitiveness.

Crosby, P. (1984). Quality without tears. New York: Plume Press.

Crosby, P., & Reimann, C. (1991). Criticism and support for the Baldrige Award. Quality Progress, May, 41-44.

Danne, D. J. (1991). Total quality management and its implications for secondary education. (Doctoral dissertation, Pepperdine University, 1991). Dissertation Abstracts International.

Deer, C. (1976). ‘O.D.’ won’t work in schools. Education and Urban Society, 8, 227.

Deming, W. E. (1986). Out of the crisis. Cambridge, MA: Massachusetts Institute of Technology, Center for Advanced Engineering Study.

Deming, W. E. (1989). Foundation for management of quality in the western world. Paper presented at the meeting of the Institute of Management Science, Osaka, Japan.

Deming, W. E. (1994). The new economics for industry, government and education. Cambridge, MA: Massachusetts Institute of Technology, Center for Advanced Engineering Study.

DeMont, B., & DeMont, R. (1973). A practical approach to accountability. Education Technology, 40-45.

Dubin, R. (1976). Organizational effectiveness: Some dilemmas of perspective. Organizational and Administrative Sciences, 7, 7-14.

Edmonds, R. R. (1979). Some schools work and more can. Social Policy, 9, 28-32.
Edmonds, R. (1980). Effective schools for the urban poor. Educational Leadership, 40, 15-23.

Eisenberg, E., & Goodall, H. (1993). Organizational communication. New York: St. Martin’s Press.

Elam, S., Rose, L., & Gallup, A. (1996). The 28th annual Phi Delta Kappa/Gallup poll of the public’s attitudes toward the public schools. Phi Delta Kappan, 78, 41-59.

English, F. (1988). Curriculum auditing. Lancaster, PA: Technomic Publishing Co.

English, F., & Hill, J. (1994). Total quality education. Thousand Oaks, CA: Corwin Press.

Etzioni, A. (1960). Two approaches to organizational analysis: A critique and a suggestion. Administrative Science Quarterly, 5, 257-278.

Feigenbaum, A. V. (1983). Total quality control. New York: McGraw-Hill.

Fields, J. (1994). Total quality for schools: A guide for implementation. Milwaukee, WI: ASQC Quality Press.

Fritz, S. (1993). A quality assessment using the Baldrige criteria: Non-academic service units in a large university. (Doctoral dissertation, University of Nebraska-Lincoln). Dissertation Abstracts International.

Fullan, M. (1980). Organizational development in schools: The state of the art. Review of Educational Research, 50, 121-183.

Fullan, M., & Miles, M. (1992). Getting reform right: What works, what doesn’t. Phi Delta Kappan, 73, 744-752.

Fullan, M. (1993). Innovation, reform, and restructuring strategies. In G. Cawelti (Ed.), Challenges and achievements of American education: Innovation, reform, and restructuring strategies (pp. 116-133). Alexandria, VA: Association for Supervision and Curriculum Development.

Fullan, M. (1997). Emotion and hope: Constructive concepts for complex times. In A. Hargreaves (Ed.), Rethinking educational change with heart and mind (pp. 216-233). Alexandria, VA: Association for Supervision and Curriculum Development.

Gall, M., Borg, W., & Gall, J. (1996). Educational research: An introduction (6th ed.). White Plains, NY: Longman.

Garvin, D. (1988). Managing quality: The strategic and competitive advantage. New York: Free Press.
Georgopoulos, B., & Tannenbaum, A. (1957). The study of organizational effectiveness. American Sociological Review, 22, 534-550.

Gerstner, L., Semerad, R., Doyle, D., & Johnston, W. (1995). Reinventing education: Entrepreneurship in America’s public schools. New York: Plume Press.

Glasser, W. (1992). The quality school. New York: Harper Collins.

Goodlad, J. (1984). A place called school. New York: McGraw-Hill.

Green, D. (1996). A case for the Koalaty Kid. Quality Progress, 29, 97-99.

Guba, E. (1981). Investigative journalism. In N. L. Smith (Ed.), New techniques for evaluation. Beverly Hills, CA: Sage Publications.

Hannan, M., & Freeman, J. (1977). Obstacles to the comparative study of organizational effectiveness. In P. Goodman & J. Pennings (Eds.), New perspectives on organizational effectiveness. San Francisco: Jossey-Bass.

Hargreaves, A. (1997). Rethinking educational change: Going deeper and wider in the quest for success. In A. Hargreaves (Ed.), Rethinking educational change with heart and mind (pp. 1-26). Alexandria, VA: Association for Supervision and Curriculum Development.

Hersey, P., & Blanchard, K. (1982). Management of organizational behavior: Utilizing human resources. Englewood Cliffs, NJ: Prentice-Hall.

Hodgkinson, H. (1996). Why have Americans never admired their own schools? The School Administrator, 53, 18-22.

Hoy, W., & Ferguson, J. (1985). A theoretical framework and exploration of organizational effectiveness of schools. Educational Administration Quarterly, 21, 117-134.

Houston, P. (1996). From Horace Mann to the contrarians. The School Administrator, 53, 10-13.

Huelskamp, R. (1993). Perspectives on education in America. Phi Delta Kappan, 74, 718-721.

Idaho State Department of Education. (1996). Accreditation standards and procedures for Idaho schools. Boise: Author.

Idaho State Department of Education. (1996). Idaho school profiles, 1995-96. Boise: Author.
Imai, M. (1986). Kaizen: The key to Japan's competitive success. New York: McGraw-Hill Publishing.

Jaeger, R., & Hattie, J. (1996). Artifact and artifice in education policy analysis: It’s not all in the data. The School Administrator, 53, 24-29.

Joint Committee on Standards for Educational Evaluation. (1994). The program evaluation standards (2nd ed.). Thousand Oaks, CA: Sage Publications.

Juran, J. (1988). Juran on planning for quality. Cambridge, MA: Productivity Press.

Kamen, H. (1993). A study of the impact of the curriculum audit process in three school systems. (Doctoral dissertation, University of Cincinnati, 1993). Dissertation Abstracts International.

Katz, D., & Kahn, R. (1978). The social psychology of organizations (2nd ed.). New York: Wiley.

Kearns, D., & Doyle, D. (1988). Winning the brain race. San Francisco, CA: Institute for Contemporary Studies Press.

Kennedy, M. (1995). An analysis and comparison of school improvement planning models. (Doctoral dissertation, University of Central Florida, 1995). Dissertation Abstracts International.

Klaus, L. (1996). Quality Progress sixth quality in education listing. Quality Progress, 29, 29-45.

Krejcie, R., & Morgan, D. (1970). Determining sample size for research activities. Educational and Psychological Measurement, 30, 607-610.

Langford, D., & Cleary, B. (1995). Orchestrating learning with quality. Milwaukee, WI: ASQC Quality Press.

Levine, R., & Fitzgerald, H. (1992). Living systems, dynamical systems, and cybernetics. In R. Levine & H. Fitzgerald (Eds.), Analysis of dynamic psychological systems, Volume 1. New York: Plume Press.

Lezotte, L. (1989). Base school improvement on what we know about effective schools. The American School Board Journal, 176, 18-20.

Lezotte, L. W., & Bancroft, B. A. (1985). School improvement based on effective schools research: A promising approach for economically disadvantaged and minority students. Journal of Negro Education, 54, 301-312.
MacLellan, D. (1994). Towards a new approach for school system evaluation. (Doctoral dissertation, Dalhousie University, Nova Scotia). Dissertation Abstracts International.

Mann, D. (1976). Policy decision-making in education: An introduction to calculation and control. New York: Teachers College Press.

McClanahan, E., & Wicks, C. (1994). Future force. Glendale, CA: Griffin Publishing.

Miller, S. (1993). The applicability of the Malcolm Baldrige National Quality Award criteria to assessing the quality of student affairs in colleges. (Doctoral dissertation, Ohio University-Athens, 1993). Dissertation Abstracts International.

National Institute of Standards and Technology. (1994). Malcolm Baldrige National Quality Award. Gaithersburg, MD: United States Department of Commerce and Technology Administration.

National Institute of Standards and Technology. (1995). Malcolm Baldrige National Quality Award education pilot. Gaithersburg, MD: United States Department of Commerce and Technology Administration.

Nicolis, G., & Prigogine, I. (1977). Self-organization in nonequilibrium systems: From dissipative structures to order through fluctuations. New York: Wiley.

Northwest Association of Schools and Colleges. (1996). Setting world standards for accreditation.

Nowakowski, J., Bunda, M., Working, B., & Harrington, P. (1985). A handbook of educational variables. Boston, MA: Kluwer-Nijhoff.

O’Neil, J. (1995). On schools as learning organizations: A conversation with Peter Senge. Educational Leadership, 52, 20-23.

Owens, R. (1970). Organizational behavior in schools. Englewood Cliffs, NJ: Prentice-Hall.

Pannirselvam, G. (1995). Statistical validation of the Malcolm Baldrige National Quality Award model and evaluation process. Unpublished doctoral dissertation, Arizona State University, Tempe.

Partin, J. (1992). A measurement of total quality management in the two-year college districts of Texas. (Doctoral dissertation, East Texas State University). Dissertation Abstracts International.

Patterson, J. (1993). Leadership for tomorrow’s schools. Alexandria, VA: Association for Supervision and Curriculum Development.
Patterson, J., Purkey, S., & Parker, J. (1986). Productive school systems for a nonrational world. Alexandria, VA: Association for Supervision and Curriculum Development.

Patton, M. (1983). Qualitative evaluation methods. Beverly Hills, CA: Sage Publications.

Pines, E. (1990). From top secret to top priority: The story of TQM. Aviation Week & Space Technology, S5-S24.

Portner, J. (1997, March). Once a status symbol for schools, accreditation becomes rote drill. Education Week, 16(1), 30-31.

Provus, M. (1971). Discrepancy evaluation for educational program improvement and assessment. Berkeley, CA: McCutchan.

Purkey, S., & Smith, M. (1982). Too soon to cheer? Synthesis of research on effective schools. Educational Leadership, 43, 64-69.

Regeuld, M. (1993). A study of continuous improvement processes based on total quality management principles as applied to the educational environment. (Doctoral dissertation, Pennsylvania State University). Dissertation Abstracts International.

Rhodes, L. (1990). Beyond your beliefs: Quantum leaps toward quality schools. The School Administrator, 47, 23-26.

Rotberg, I. (1996). Five myths about test score comparison. The School Administrator, 53, 30-35.

Rubin, S. (1994). Public schools should learn to ski. Milwaukee, WI: ASQC Press.

Sagor, R. (1995). Overcoming the one-solution syndrome. Educational Leadership, 52, 24-27.

Sarason, S. (1990). The predictable failure of educational reform. San Francisco, CA: Jossey-Bass Publishers.

Schmoker, M. (1996). Results: The key to continuous improvement. Alexandria, VA: ASCD.

Scholtes, P. (1993, February). Quality learning series. Paper presented by the Quality Learning Services Division of the U.S. Chamber of Commerce. Madison, WI: Joiner Associates.

Scriven, M. (1973). Goal-free evaluation. In E. R. House (Ed.), School evaluation: The politics and process. Berkeley, CA: McCutchan.

Seigal, P., & Byrne, S. (1994). Using quality to redesign school systems. San Francisco, CA: Jossey-Bass Publishers.
Senge, P. (1990). The fifth discipline. New York: Currency Doubleday.

Senge, P., Kleiner, A., Roberts, C., Ross, R., & Smith, B. (1994). The fifth discipline fieldbook. New York: Doubleday.

Sergiovanni, T. (1992). Moral leadership. San Francisco, CA: Jossey-Bass Publishers.

Shipley, J., & Collins, C. (1997). Going to scale with TQM. Tallahassee, FL: Southeastern Regional Vision for Education.

Skrtic, T. (1991). Behind special education. Denver, CO: Love Publishing.

Smith, R. (1996). The impact of training based on the Malcolm Baldrige National Quality Award on employee perceptions. Unpublished doctoral dissertation, University of Idaho, Moscow.

Stake, R. (1967). Toward a technology for the evaluation of educational programs. In R. Tyler, R. Gagne, & M. Scriven (Eds.), Perspectives of curriculum evaluation (pp. 1-12). Chicago: Rand McNally.

Stampen, J. (1987). Improving the quality of education: W. Edwards Deming and effective schools. Contemporary Education Review, 3, 423-433.

Stefanich, G. (1983). The relationship of effective schools research to school evaluation. North Central Association Quarterly, 88, 343-349.

Stufflebeam, D. (1983). The CIPP model for program evaluation. In G. Madaus, M. Scriven, & D. Stufflebeam (Eds.), Evaluation models: Viewpoints in educational and human services evaluation. Boston, MA: Kluwer-Nijhoff Publishing.

Thomas, W., & Moran, K. (1992). Reconsidering the power of the superintendent in the progressive period. American Educational Research Journal, 29, 22-50.

Timpane, M., & Reich, R. (1997). Revitalizing the ecosystem for youth. Phi Delta Kappan, 464-470.

Tribus, M. (n.d.). The transformation of American education to a system for continuously improved learning. Hayward, CA: Exergy, Inc.

Vertiz, V. (1995). What the curriculum audit reveals about schools. The School Administrator, 25-27.

Vertiz, V., & Bates, G. (1995, April). The curriculum audit: Revelations about our schools. Paper presented at the meeting of the American Educational Research Association, Division B.
Wagner, T. (1996). Bringing school reform back down to Earth. Phi Delta Kappan, 78, 145-149.

Wagner, T. (1993). Systemic change: Rethinking the purpose of school. Educational Leadership, 51, 24-28.

Walton, M. (1990). Deming management at work. New York: Putnam Publishing Co.

Weick, K. (1976). Educational organizations as loosely coupled systems. Administrative Science Quarterly, 21, 1-19.

Wiersma, W. (1995). Research methods in education. Needham Heights, MA: Allyn and Bacon.

Worthen, B., & Sanders, J. (1987). Educational evaluation. White Plains, NY: Longman.

Wu, Hung-Yi. (1996). Development of a self-evaluation system for total quality management using the Baldrige criteria. Unpublished doctoral dissertation, University of Missouri, Rolla.

Yuchtman, E., & Seashore, S. (1967). A system resource approach to organizational effectiveness. American Sociological Review, 32, 891-903.

Zammuto, R. (1982). Assessing organizational effectiveness. Albany, NY: State University of New York Press.
Appendix A: Instrument

1.0 Leadership

This category examines the school district’s leadership (defined as the board of trustees and senior administrators at the district level): personal leadership and involvement in creating and sustaining a student focus, clear goals, high expectations, and a leadership system that promotes performance excellence. Please select 1 from the five choices in each box which most closely describes your school district in that area. Fill in the circle completely next to the statement selected with pencil or black ink.

Item 1.1 The extent to which there is clear direction throughout the district

Definitions: *Stakeholder: Individuals or groups, both internal to the school (students, all personnel) and external (parents, community members, business), which are affected by the conditions and quality of education and the preparedness of graduates.

1.1.1 A clearly communicated and consistent direction of the district focus based on stakeholder* needs and expectations does not exist.
1.1.2 A clearly communicated and consistent direction of the district focus based on stakeholder* needs and expectations exists. It is not widely disseminated or used by personnel.
1.1.3 A clearly communicated and consistent direction of the district focus based on stakeholder* needs and expectations exists and is known throughout the district. It does not appear to be used consistently in district decision-making.
1.1.4 A clearly communicated and consistent direction of the district focus based on stakeholder* needs and expectations exists and appears to be considered to some degree when making planning decisions. This information is communicated to parents.
1.1.5 A clear direction of district focus is broadly communicated throughout the community. It guides all major decisions throughout the entire system. The district direction is systematically re-evaluated through an improvement cycle, involving multiple stakeholders* and sources of information.
1.1.6 I do not know.
Item 1.2 The extent to which a review process exists to study the performance** of the district

Definitions: *Stakeholder: Individuals or groups, both internal to the school (students, all personnel) and external (parents, community members, business), which are affected by the conditions and quality of education and the preparedness of graduates. **Performance refers to the results produced by the school district as illustrated by multiple sources of information.

1.2.1 A review process is done to meet state accreditation standards as often as required. There does not appear to be any other structured review process to determine how the school district is meeting or exceeding the needs and expectations of its stakeholders*.
1.2.2 Leaders study performance** information on an annual basis, using standardized test scores, attendance, enrollment, and financial information.
1.2.3 A systematic process is being developed district-wide to establish multiple reliable and valid indicators of student and district performance**.
1.2.4 A study process exists to regularly assess the school district performance based on multiple sources of information. All personnel regularly assess the performance of their programs, which is aligned to the district study process. This information is used to create district direction and areas for improvement.
1.2.5 A systematic process for review of the district’s performance exists which exceeds the requirements for state accreditation. Multiple stakeholders are involved. Decisions are based on multiple indicators of performance information, community demographics, and forecasting of future needs. The review process is systematically evaluated and improved as necessary.
1.2.6 I do not know.

Item 1.3 Leadership’s role in improvement efforts

1.3.1 Leaders** initiate few district-wide improvements. Most improvements are initiated at the building level.
1.3.2 Leaders** initiate district-wide improvements and allocate existing resources to support those improvements.

Definitions: *Stakeholder: Individuals or groups, both internal to the school (students, personnel)
and external (parents, community members, business), which are affected by the conditions and quality of education or the preparedness of graduates. **Leaders defined as Board of Trustees and senior administrators at the district level.

1.3.3 Leaders** personally encourage and advocate for school or district-wide program improvements that are consistent with district directions.
1.3.4 Leaders** are committed to continuous improvement of district performance based on a thorough understanding of the needs and expectations of stakeholders*. District resources are allocated to accomplish the targeted improvement areas.
1.3.5 Leaders** are visibly involved and facilitate improvements throughout the system. Leaders** monitor the system for progress and view themselves as accountable for the performance of the district. There is a systematic review of the role of leadership in improvement efforts.
1.3.6 I do not know.

Item 1.4 The extent to which there exists a district-wide collaborative and participatory approach to management*

Definitions: *Defined as jointly working to identify problems and determining improvements with others in the organization who are knowledgeable, involved, and affected by any decisions made.

1.4.1 There appears to be little collaboration or substantive involvement of appropriate personnel in decision-making. Important decisions are made at a district level.
1.4.2 There are vehicles in school district units** to make some decisions. District-level decision making and approval is still the primary vehicle for significant decisions.
1.4.3 Leaders*** value participatory decision-making and are actively working to improve and develop these processes in all school district units**.
1.4.4 Effective processes to involve personnel, parents, and community stakeholders are in place in all school district units**. Decision-making responsibilities and expectations are clearly defined for the school district unit**.
**School district units are defined as specific schools, departments, or services of that school district. ***Leaders defined as Board of Trustees and senior administrators at the district level.

1.4.5 Collaborative and participatory management is institutionalized, valued, and a part of the culture. All personnel of the school community and significant stakeholders are included. Collaborative processes are systematically evaluated to determine effectiveness and improvements are made as necessary.

Item 1.5 Board policy

Definitions: *Stakeholder: Individuals or groups, both internal to the school (students, personnel) and external (parents, community members, business), which are affected by the conditions and quality of education or the preparedness of graduates.

1.5.1 Board policies are developed by the Board of Trustees with minimal involvement of stakeholders* and reviewed as becomes necessary.
1.5.2 There is some involvement of stakeholders in the development and review of board policies.
1.5.3 The district is developing a systematic process to review and design board policy involving stakeholders to support the district direction and strategic plan.
1.5.4 Board policies have been revised to support the district direction and strategic plan. Policies support the delivery of curriculum and instruction. Processes exist to communicate this information to personnel who implement policy.
1.5.5 A systematic review process of board policy exists to ensure alignment to the district direction and strategic plan. A review process of district performance is aligned to strategic planning processes. Board policies are clearly communicated to all personnel.
1.5.6 I do not know.
Item 1.6 School District Responsibility to the Public

Definition: *Leaders defined as Board of Trustees and senior administrators at the district level.

1.6.1 No defined processes exist to anticipate public concerns or expectations.
1.6.2 Some activities have been done to involve the community, but they are usually reactive to an expressed public concern. There is some involvement of the leadership in activities to strengthen and support their communities.
1.6.3 Leaders* recognize the importance of anticipating public needs and expectations and are developing processes to do so. Leaders* are actively involved in the community.
1.6.4 Leaders* invite stakeholders into operations of the district and are actively involved in community groups to solve community problems.
1.6.5 The district serves as a role model of outreach and public service to the community through effective processes to anticipate the public’s interests and needs. The district systematically evaluates its own effectiveness in involving the public and being involved in the community.
1.6.6 I do not know.

Item 1.7 State, Legal and Ethical Conduct in Operations

Definition: *School district units are defined as specific schools, departments, or services of that school district.

1.7.1 Citations for non-compliance with required federal and state regulations have frequently been found. Expectations for ethical conduct are not clarified.
1.7.2 Citations for non-compliance are being addressed. Ethical conduct is demonstrated by leadership.
1.7.3 Efforts are made to achieve compliance throughout all school district units*.
1.7.4 Leaders demonstrate high commitment to legal and ethical conduct in all aspects of the school district through existing policies and practices. Prevention processes are in place to ensure performance above minimal requirements.
1.7.5 The school district exceeds minimal requirements for compliance with federal and state regulations. Leaders are role models for ethical conduct in all activities within the district and the community. There is a systematic process in place to review policies and practices related to ethical standards and compliance with legal requirements.
1.7.6 I do not know.
2.0 Strategic Planning

This category examines how the school district sets strategic directions and how it determines stakeholders'* expectations of the school district. Please consider how these stakeholder expectations and requirements are translated into an effective performance management system, with a primary focus on student performance.

(*Stakeholder - Individuals or groups, both internal to the school (students, personnel) and external (parents, community members, business), which are affected by the conditions and quality of education or the preparedness of graduates; also includes state agencies and their requirements.)

Please select 1 from the five choices in each box which most closely describes your school district in that area. Fill in the circle completely next to the statement selected with pencil or black ink.

Item #  Description

2.1 Strategic Development*

Definitions:
*Strategic development is defined as the process by which members of an organization clarify the purpose and develop the necessary procedures and operations to achieve that purpose. A strategic plan is designed.

2.1.1 No school district improvement plan exists. There appears to be little organized effort to examine the performance of the school district, including student performance and district operations.
2.1.2 The school district does have some goals and objectives, usually associated with state or federal requirements. Traditional indicators, including standardized test scores, compliance reviews, and accreditation ratings, are used to determine district improvements.
2.1.3 The district is developing a more comprehensive approach to strategic planning by examining the needs of students, analyzing the current performance of students, and focusing on the needs and expectations of stakeholders through the collaborative involvement of all school district units** and the community.
2.1.4 The district considers extensive sources of information, both internal to the district and external, such as community demographic information. Representatives from a variety of stakeholder groups participate in strategic planning. Clear long- and short-term goals and objectives exist. Measures are defined, with clearly specified timelines and responsibility. School district units** develop plans consistent with the district strategic plan.
**School district units are defined as specific schools, departments, or services of that school district.

2.1.5 The district has specific short- and long-term goals and objectives in place, established as the result of a comprehensive assessment of student performance and district operations and of stakeholder needs and expectations through the collaboration of stakeholders. All aspects of the school district are examined to support the implementation and accomplishment of the goals and objectives. School district units** have plans aligned to the district strategic plan. A systematic process is in place to review the strategic plan to make necessary improvements.
2.1.6 I do not know.

2.2 Focus of strategic plan

Definition:
*School district units are defined as specific schools, departments, or services of that school district.

2.2.1 There appears to be a fragmented focus on improved student performance in the district.
2.2.2 Emphasis exists on improving student performance on standardized and state tests.
2.2.3 Improved student performance is the focus of the strategic plan. Improvements are being identified throughout school district units* to support high student performance.
2.2.4 The district plan reflects the integrated efforts of involved school district units* to support and accomplish improved performance for all students.
2.2.5 District focus on improved student performance is evident throughout the school district strategic plan. The district regularly assesses its plan for its focus on improved student performance.
2.2.6 I do not know.

2.3 Implementation and assessment of strategic plans

2.3.1 A district strategic plan may exist but is not disseminated and reviewed with all school district units*. Responsibility for implementation is unclear.
2.3.2 The district plan is disseminated to all personnel in the school district. Responsibility for implementation is noted. Processes to accomplish the goals are unclear, with no evaluation measures specified.
Definition:
*School district units are defined as specific schools, departments, or services of that school district.

2.3.3 The district strategic plan is disseminated and reviewed with all school district units* and with all personnel throughout the school district. The plan is published and disseminated to the community.
2.3.4 A clear implementation process exists, with responsibilities and timelines clearly delineated. Work teams, involving personnel in the appropriate school district units* and external stakeholders, are established. Evaluation measures are specified. Monitoring processes support successful implementation.
2.3.5 Implementation of the district strategic plan is clear and aligned throughout all school district units*. District and building plans are implemented, and progress is assessed continually throughout the district. A systematic process exists to evaluate the implementation of strategic plans throughout the school district for continual improvement.
2.3.6 I do not know.
3.0 Student Focus and Satisfaction/Stakeholder Satisfaction

This category examines how the school district determines student and stakeholder* needs and expectations. Please consider how the school district enhances stakeholder relationships and determines their satisfaction.

(*Stakeholder - Individuals or groups, both internal to the school (students, personnel) and external (parents, community members, business), which are affected by the conditions and quality of education or the preparedness of graduates; also includes state agencies and their requirements.)

Please select 1 from the five choices in each box which most closely describes your school district in that area. Fill in the circle completely next to the statement selected with pencil or black ink.

Item #  Description

3.1 The ways in which the school district determines students' needs from and expectations of educational programs

Definition:
*School district units are defined as specific schools, departments, or services of that school district.

3.1.1 Standardized test scores are the primary source of information used to determine student needs. Little or no analysis of the data is done or disseminated to the appropriate school district units*. Students are not asked in any formal way about their needs or expectations.
3.1.2 Besides state test scores, the district uses other sources of locally generated and collected information to determine student needs and expectations. Some analysis is done, as determined by the individual school district unit*.
3.1.3 The district is developing comprehensive strategies to determine all student needs.
3.1.4 The district is in the early stages of using multiple strategies, both inside and outside the school district, to determine students' comprehensive needs throughout all grade levels and at specific times. Information is used in the district's strategic planning process.
3.1.5 The district has a well-developed and extensive system for determining student needs and expectations. The emphasis is on the prevention of school failure and the analysis of multiple sources of information to determine trends. Information is analyzed, compared, and used in all district decision-making processes. The process to determine the needs and expectations of students is systematically evaluated.
3.1.6 I do not know.

3.2 High expectations for the performance of students

Definitions:
*Performance expectations refer to clearly defined statements describing specific academic, behavioral, or social criteria to measure achievement, often referred to as standards.
**Stakeholder - Individuals or groups, both internal to the school (students, personnel) and external (parents, community members, business), which are affected by the conditions and quality of education or the preparedness of graduates; includes state agencies and requirements.

3.2.1 Performance expectations* for students do not exist at any level.
3.2.2 Performance expectations* are being developed for some areas.
3.2.3 Performance expectations* exist for some grades/subject areas.
3.2.4 Performance expectations*, including exit standards for graduation, exist throughout the K-12 system in all grade levels, courses, and social conduct. Strong stakeholder support exists for these criteria.
3.2.5 Performance expectations* are systematically evaluated and adjusted to reflect stakeholder** expectations.
3.2.6 I do not know.
3.3 Student and stakeholder satisfaction

Definitions:
*School district units are defined as specific schools, departments, or services of that school district.
**Satisfaction with the educational programs and performance of the school district.

3.3.1 There are no district-wide processes to determine student or stakeholder satisfaction**. Anecdotal information is often used to determine student or stakeholder satisfaction**.
3.3.2 Some attempts to determine student or stakeholder satisfaction are made by individual school district units*.
3.3.3 The district is developing a process to determine student and stakeholder satisfaction.
3.3.4 The district is making deliberate and consistent efforts to utilize a variety of strategies to determine student and stakeholder satisfaction throughout the district for all school district units*. This information is disseminated throughout the district to all school district units*.
3.3.5 The district has clearly defined processes in place to measure, analyze, and compare student and stakeholder satisfaction** results at specific points and in all school district units*. The information is used in the strategic planning process. Processes are systematically evaluated for effectiveness on a regular basis.
3.3.6 I do not know.

3.4 Identifying future needs and expectations of students and stakeholders*

3.4.1 The district focuses on the immediate needs of students as they occur.
3.4.2 The district considers local demographic factors and trends which affect enrollment, student needs, and stakeholder* expectations.
3.4.3 The district considers local, state, and national trend data and demographics to determine the future needs of students and the expectations of stakeholders*.

Definition:
*Stakeholder - Individuals or groups, both internal to the school (students, personnel) and external (parents,
community members, business), which are affected by the conditions and quality of education or the preparedness of graduates; also includes state agencies and their requirements.

3.4.4 The district is in the early stages of using demographic factors; changing state and federal requirements or trends; and the changing expectations and needs of higher education and the workplace as part of the strategic planning process.
3.4.5 Multiple strategies to consider future needs/expectations of students and stakeholders* from multiple sources are systematically used in all strategic planning processes. These processes are evaluated on a regular basis to determine the need for improvement. Processes used are compared to those of other organizations which have demonstrated exemplary ways to forecast student and stakeholder* needs.
3.4.6 I do not know.
4.0 Information and Analysis

This category examines the management and effective use of data and information to drive mission-related performance excellence in a school district. Please consider how your school district collects and uses information and data to make decisions.

Please select 1 from the five choices in each box which most closely describes your school district in that area. Fill in the circle completely next to the statement selected with pencil or black ink.

Item #  Description

4.1 Selection and Use of Information and Data

Definitions:
*Conventional information is defined as standardized and state test scores, enrollment, attendance, dropout, discipline, and operating budget.
**School district units are defined as specific schools, departments, or services of that school district.

4.1.1 Conventional information* is used primarily by the district office and board in planning. This information is not widely or regularly disseminated throughout the district.
4.1.2 Conventional information* is disseminated at least annually to all school district units**. This information is used by school district units to assess student or operational performance.
4.1.3 A process is being developed to collect, manage, and use the specific information and data needed to meet the mission and develop key district goals which focus on improving student performance.
4.1.4 The district has a systematic process in place to collect, analyze, and disseminate critical information, beyond conventional information*, to all school district units** related to district key goals and objectives. The information used comes from a variety of sources used to determine student performance, student needs and satisfaction, and stakeholder needs and satisfaction.
4.1.5 A systematic process for collecting, analyzing, and using comprehensive information and data from multiple sources is fully deployed throughout all school district units**. All personnel have easy access to the information and use it to evaluate, adjust, and create key goals for school district unit** plans. The information and data process is systematically evaluated, and improvements are made as necessary.
4.1.6 I do not know.
4.2 Selection and Use of Comparative Information and Data

Definitions:
*Comparative data, or benchmarking, is an improvement process in which a school district compares its performance against best-in-class school districts and uses the information to improve its own performance.
**School district units are defined as specific schools, departments, or services of that school district.

4.2.1 No process is currently in place to seek or use comparative data*.
4.2.2 Some practices exist to compare conventional data*. Practice is limited to specific school district units** or personnel.
4.2.3 The district is developing a process to determine needs and priorities, criteria, and use of comparative data*.
4.2.4 A systematic process exists to use comparative data* and is fully implemented in all school district units**. A process for benchmarking is used to set improvement targets and integrate best practices.
4.2.5 Comparative data* are widely used throughout all school district units** to set targets, integrate new practices, and evaluate performance. The benchmarking* process is systematically evaluated and improved as necessary.
4.2.6 I do not know.

4.3 Analysis and Use of School District Performance Data*

4.3.1 Analysis of conventional information** is done at the district level.
4.3.2 Analysis of conventional information** is done by the school district unit.
Definitions:
*Performance data includes data or information from all aspects of the organization, including student performance measures, enrollment, discipline, human resources, business operations, and community.
**Conventional information is defined as standardized and state test scores, enrollment, attendance, dropout, discipline, and operating budget.

4.3.3 Information and data are collected, disaggregated, analyzed, and disseminated district-wide. Information is used to gain an understanding of the performance of students and student groups and of school district units.
4.3.4 Performance data* from all school district units are integrated and analyzed to assess overall district performance. Comparative data are used in the analysis.
4.3.5 Comprehensive performance data* are analyzed, disseminated throughout the district, and accessible to all school district units. This information is an integral part of the planning process and is used to adjust and establish key objectives essential to all decisions by all school district units.
4.3.6 I do not know.
5.0 Human Resource Development and Management

This category examines how administrative, faculty, and support personnel (all non-certified, business, and operations staff throughout the district) are empowered to develop and utilize their full potential in order to enable the school district to realize its mission through the accomplishment of key performance goals. Please consider the district-driven efforts to build and maintain an environment conducive to performance excellence, full participation, and personal and organizational growth.

Please select 1 from the five choices in each box which most closely describes your school district in that area. Fill in the circle completely next to the statement selected with pencil or black ink.

Item #  Description

5.1 Learning and Working Climate

Definitions:
*Leadership is defined as district-level senior administrators and the board of trustees.
**School district units are defined as specific schools, departments, or services of that school district.

5.1.1 Insufficient attention is given to the learning and/or working climate by the leadership*.
5.1.2 Some attention is given to creating a high-performance environment for both students and personnel, but it is usually driven by the school district unit** administrator.
5.1.3 Efforts to create a positive, productive, safe environment are evident throughout the district.
5.1.4 A strong positive, productive, and safe environment is visible throughout the district. High performance is fostered among personnel. Morale is high among personnel district-wide.
5.1.5 Feedback from personnel is systematically obtained and used to guide decisions regarding the working and learning environment. This area is systematically reviewed and aligned with the strategic planning process in the district.
5.1.6 I do not know.
5.2 Work Systems*

Definitions:
*Work systems are defined as how jobs, work, and decision-making are designed at all levels within the organization.
**School district units are defined as specific schools, departments, or services of that school district.

5.2.1 Positions and work tasks are organized within traditional positions, roles, and responsibilities, with most of the authority for decision-making resting with district-level leadership.
5.2.2 Efforts are being made to move decision-making to teams within school district units**. There have been some decisions to decentralize specific functions.
5.2.3 There are deliberate efforts underway to improve the district's work systems through increased opportunities for self-directed responsibility of personnel in all school district units** in designing, managing, and improving the district's operations in order to accomplish the district's mission and key goals.
5.2.4 Work processes exist which allow all personnel to contribute optimally in their school district unit** through self-directed teams which foster flexibility, communication among school district units, and accomplishment of key goals. Labor and management have collaboratively designed personnel practices to support this.
5.2.5 Work and job functions are designed to accomplish district goals. Effective communication and collaboration exist across work functions. Processes for evaluation, compensation, promotion, and recognition are exemplary. Work system processes are systematically evaluated on a regular basis.
5.2.6 I do not know.
5.3 Personnel Education, Training, and Development

Definitions:
*School district units are defined as specific schools, departments, or services of that school district.

5.3.1 Opportunities for staff development are limited to in-service days. Personnel have some input into topics or design.
5.3.2 Needs assessments are conducted for faculty and support personnel. Staff development is based on the needs identified. Topics may or may not be aligned with the district's strategic plan.
5.3.3 Processes are being designed to align education and training decisions to the district's mission and key performance goals. Orientation for new personnel provides training on the district's mission, goals, and work systems.
5.3.4 A process for determining, designing, and evaluating education and training is established and implemented across all school district units*. Education and training are provided in a variety of ways. Application of knowledge and skills is expected and supported through specifically designed strategies. Reaction to training is regularly assessed and evaluated for necessary improvements.
5.3.5 Processes for education and training are aligned to strategic planning processes. Decisions for education and training are made on the basis of key performance goals and the district competencies needed to achieve performance expectations. Staff development activities are measured for impact on learning, performance of staff, and effect on students' performance. The design and delivery processes for staff development are evaluated on a regular basis.
5.3.6 I do not know.
5.4 Performance Appraisal Systems

Definition:
*Performance appraisal refers to the regular evaluation procedures of the performance of personnel.

5.4.1 The performance appraisal* process is completed on an annual basis, involving the individual and the immediate supervisor. Little information is generated from the process to promote further development. Personnel find little value in the process.
5.4.2 Performance appraisals* are done annually and result in the development of specific goals established cooperatively by the individual and supervisor to promote further development. Personnel find some value in the process but agree it could be improved.
5.4.3 Personnel are engaged in developing performance appraisal* processes to support high performance, stakeholder satisfaction, continuous improvement, and collaboration between management and personnel.
5.4.4 Each personnel unit, including leadership, uses a performance appraisal* process which involves feedback from key identified stakeholders with whom they work closely. Results of the appraisal process are linked to the district's professional development plans and the continual development of performance.
5.4.5 Performance appraisal* processes meet the professional needs of all personnel and provide useful feedback from identified key stakeholders. Personnel work collaboratively with management to identify areas of growth to further the district's mission. Performance appraisal* processes are systematically reviewed to identify improvement areas.
5.4.6 I do not know.
5.5 Employee Well-Being and Satisfaction

5.5.1 Employee motivation and satisfaction are given little attention.
5.5.2 Employee motivation and satisfaction are addressed through employee recognition events or individual administrators' efforts.
5.5.3 Processes are being developed to determine employee needs and expectations. The work environment is being improved to accommodate needs to the extent possible.
5.5.4 Employee satisfaction, well-being, and motivation are seen as key requirements for the district to develop the capabilities to realize its mission and reach its performance goals. Satisfaction surveys are administered on a regular basis. Personnel are able to identify, recommend, and make improvements. Multiple strategies for reward and recognition exist.
5.5.5 The accomplishment of performance goals is recognized as fundamental to employee satisfaction. Processes to measure employee needs and satisfaction are integrated into the strategic planning process. A variety of opportunities are available to promote well-being, satisfaction, and motivation throughout the district. These processes are systematically evaluated to make improvements as necessary.
5.5.6 I do not know.
6.0 Educational and Operational Process* Management

This category examines how work is designed, managed, accomplished, and evaluated for effectiveness. Please think about the extent to which all work processes are designed, managed, accomplished, and evaluated with a focus on stakeholder satisfaction in all work units. Also consider how processes are designed, effectively managed, and improved to achieve better performance.

(*Process refers to the steps and sequence of steps done in a specific work activity; e.g., enrolling a new student in school, requisitioning instructional supplies, hiring new personnel.)

Please select 1 from the five choices in each box which most closely describes your school district in that area. Fill in the circle completely next to the statement selected with pencil or black ink.

Item #  Description

6.1 The design and management of educational programs*

Definitions:
*Educational programs are defined as all programs and services provided to students and conducted by professional, certified personnel or by non-certified personnel under the supervision of certified personnel.

6.1.1 Design and management of programs are based on federal or state regulations, traditional practices, and individual preferences or opinions.
6.1.2 Design and management of some programs are based on student performance on statewide testing, perceived student needs, and/or preferences of management.
6.1.3 Processes are being designed and developed to base decisions for educational programs and services on student needs, performance results, stakeholder satisfaction, and research-proven practices. Student performance expectations and curricula are being developed to align with these needs, expectations, and practices.
6.1.4 Many programs and services are designed and managed through established processes based on a review of student needs, performance results, stakeholder satisfaction, and research-based best practices. Curricula are aligned to performance standards in most units and levels.
6.1.5 A systematic process exists for the design and management of all educational programs. All educational programs and services meet established standards to assure high quality of programs and services. Programs and services are based on a systematic review of student needs, performance results, community demographics, stakeholder expectations, and cost analyses. All curricula are aligned to performance expectations for all grades and courses. This process is systematically evaluated, and improvements are made as determined necessary.
6.1.6 I do not know.

6.2 Delivery of educational programs and services

Definition:
*Educational programs and services are defined as all programs and services provided to students and conducted by professional, certified personnel or by non-certified personnel under the supervision of certified personnel.

6.2.1 Delivery of educational programs and services* is based on federal or state regulations, traditional practices, and individual preferences or opinions. Local curricula are inconsistently delivered.
6.2.2 Delivery of some educational programs and services* is based on student performance information and, in some cases, research-proven effective practices. Curricula are delivered consistently in some school district units.
6.2.3 Processes are being developed to improve the delivery of educational programs* through a systematic review of performance results, research-based best practices, and stakeholder expectations.
6.2.4 Educational programs and services* are delivered to meet student needs, prevent school failure, optimize student achievement, integrate research-based practices, and respond to stakeholder expectations. Delivery of services is evaluated based on performance information, and improvements are made as necessary.
6.2.5 A systematic process exists to ensure the delivery of educational programs and services* that optimally meet student needs, ensure student success, and exceed stakeholder expectations. Processes are proactive to prevent student failure. The process for program review is systematically evaluated as an integral part of the strategic planning process, and improvements are made as necessary.
6.2.6 I do not know.

6.3 Design, management and delivery of educational support services*

Definitions:
*Educational support services include all programs and services which support the educational programs, such as business operations, transportation, public relations, purchasing, clerical, legal, volunteers, food service, records, and buildings and grounds.

6.3.1 Educational support services are designed, managed, and delivered based on traditional practices and individual preferences. Frequent complaints from internal and external stakeholders occur, with no defined process to address them. Decisions are made with little to no input or involvement from stakeholders**.
6.3.2 Educational support services* are designed with some input from stakeholders**. Cost analyses of operations are routinely done. Some stakeholder** satisfaction information has been acquired. Required audits are performed for district finances.
6.3.3 Processes are being developed to include problem identification, analyses of performance results, cost analyses, and stakeholder** requirements. Stakeholders of services are included in the design. Information is being acquired to determine stakeholder satisfaction.
6.3.4 Most services have processes*** which support the mission of the district through a focus on improved productivity, efficiency, and quality to support optimal student performance. Processes include early identification of problems, corrective action processes, and comparison of processes to those of other external organizations. Stakeholder** requirements and satisfaction are regularly assessed and used to make improvements.

**Stakeholder - Individuals or groups, both internal to the school (students, personnel) and external (parents, community members, business), which
  • 181.
    Reproducedwithpermissionofthecopyrightowner.Furtherreproductionprohibitedwithoutpermission. Item # Description areaffected by the conditions and quality o f education or the preparedness o f graduates; also includes state agencies and their requirements ***Process refers to the steps and sequence o f steps done in a specific work activity; e.g. enrolling a new student in school; requisition of instructional supplies; hiring new personnel 6.3.5 6.3.6 A systematic process exists to design, manage and deliver all support services v/hich involve stakeholders** and is based on the educational needs of students, performance results o f that service, cost analyses, productivity measures, and stakeholder** satisfaction. This process is systematically reviewed and improvements are made as necessary. 1 do not know. 6.4 Data and Information Processes* Definition: *Includes the collection, management and dissemination of data on enrollment, achievement, operations, stakeholder satisfaction and other pertinent information which are used in evaluation and planning processes 6.4.1 6.4.2 6.4.3 Data collected includes district, school, grade-level enrollment, attendance, dropout, and student performance on statewide testing. Information is not widely disseminated throughout the district. Some decisions made by leadership are based on this information. Additional data regarding enrollment, district demographics and utilization of specific educational programs and services is collected. Data collected is determined by program managers or building principals. Some analyses occur and are used to make recommendations and decisions for improvement. Processes are being developed to collect, analyze, disseminate and use more comprehensive information necessary to determine improvement areas. Additional measures of student performance are being developed for frequent indicators of learning. Stakeholder needs and satisfaction data are included. CT 4^
Item #  Description

6.4.4   Decision-making processes are based on information on student and operational performance. Information includes all measures of student achievement by grade level, assessment of educational programs and services, assessment of support services, and comprehensive building-level data including school, grade and course enrollment, disciplinary, graduation, drop-out, parent involvement, and stakeholder satisfaction. Processes exist to collect, analyze and disseminate specific data which may be required to make improvements.

6.4.5   A systematic process exists for the collection, analysis, dissemination and use of data and information. Strategic decisions affecting the direction of the district and target goals are made using this information. Comprehensive assessments exist for monitoring student performance, evaluating effectiveness of all programs and services, and for stakeholder satisfaction. Data and information processes are systematically evaluated and necessary improvements are made.

6.4.6   I do not know.

6.5 Internal and external* communication process**

Definition: *Internal refers to personnel and students within the school district and external refers to parents and community stakeholders.

6.5.1   There is unclear and inconsistent communication regarding the direction and goals within the district. Some efforts are made to communicate with parents and community.

6.5.2   Communication regarding the direction and goals within the district is usually done on an annual basis, but not frequently. School newsletters or district newsletters are sent to parents.

6.5.3   Internal communication is improving but still inconsistent. Efforts are being made to improve communication with external stakeholders.
Item #  Description

**Communication process refers to methods used to inform and seek opinions from others.

6.5.4   Communication is consistent, timely and thorough, both with internal and external stakeholders.

6.5.5   A clear and understood process for thorough communication both internally and externally exists. District personnel, students, parents and community are fully informed in a timely manner. The communication process** is systematically evaluated for improvement opportunities.

6.5.6   I do not know.

6.6 Management of supplier and partnering* processes

Definitions: *Suppliers are those businesses or individuals with which the district contracts for specific services such as training, consulting, transportation, legal, etc. Partnering processes are defined as those relationships with other organizations, community agencies and businesses to design, implement and provide services for the students or stakeholders of the district.

**Stakeholder: Individuals or groups, both internal to the school (students, personnel) and external (parents, community members, business) which are affected by the conditions and quality of education or the preparedness of graduates; also includes state agencies and their requirements.

6.6.1   No specifications are established by the school district for expectations from suppliers. There is little effort to develop collaborative relationships with stakeholders.

6.6.2   Some specifications and expectations exist for some supplier areas. Problems are addressed as they occur. The district has developed collaborative relationships with parents, local businesses and some community organizations.

6.6.3   Efforts are being made to develop a proactive approach to problem identification and prevention with suppliers*. Requirements are being developed for some suppliers and communicated to achieve improved quality of supplies and materials. The district is making increasing efforts to develop partnerships with stakeholder groups to support its mission and goals. Efforts are being made to involve them in decision-making processes.

6.6.4   Established processes exist to determine supplier and partner* requirements for most areas. Requirements are communicated to assure expected performance by supplier and partner. Stakeholders** are asked to evaluate effectiveness.

6.6.5   A systematic process exists to establish and communicate requirements to suppliers and partners*. Processes include regular evaluation of their effectiveness, quality and costs. Suppliers and partners* share in the district's goals. Processes are regularly evaluated to determine improvements necessary.

6.6.6   I do not know.
7.0 School District Performance Results

This category examines the district's results in student achievement, quality of programs and services, and operations. Please think about the current results that your school district can demonstrate in student achievement, human resources and stakeholder satisfaction. Please consider how your school district performance results compare with comparable districts. Please select the one of the five choices in each item which most closely describes your school district by filling in the circle next to it with a pencil or black pen.

#  Description

7.1 Student performance results

*Stakeholder: Individuals or groups, both internal to the school (students, all personnel) and external (parents, community members, business) which are affected by the conditions and quality of education and preparedness of graduates.

**Benchmarking is an improvement process in which an organization compares its performance against best-in-class organizations, determines how these organizations achieved their performance levels and uses the information to improve its own performance.

7.1.1   Student performance, as revealed by national, state or local measures, reveals significant deficiencies and is below expectations of stakeholders* and other comparable districts.

7.1.2   Performance by some students on national, state or local measures reveals satisfactory results when compared to state or national results. Below-average performance results exist for other students.

7.1.3   Student performance on national, state or local measures is improving in specific areas and by increasing numbers of students. Data is being accumulated from the past few years to determine if this is a trend.

7.1.4   Student performance on national, state or local measures is showing consistent upward trends over time. Performance indicates improvement over the past few years compared to other comparable districts, both in and out of state.

7.1.5   Student performance results are sustained on all measures and by all student groups. All student groups recognize the school district for outstanding performance. Other school districts use its performance results for benchmarking**. Graduate follow-up information reveals successful employment or completion of higher education.

7.1.6   I do not know.
#  Description

7.2 Student Conduct Results

*Indicators involving student behavior such as disciplinary infractions, suspensions, expulsions, arrests, etc.

**Benchmarking is an improvement process in which an organization compares its performance against best-in-class organizations, determines how these organizations achieved their performance levels and uses the information to improve its own performance.

7.2.1   Unacceptable rates of student absenteeism, tardies, suspensions and expulsions for students exist.

7.2.2   Recent improvement exists in at least one student conduct indicator*. Results are consistently reviewed for progress.

7.2.3   Improvement trends are beginning to emerge in 3 or more indicators*. Improvements are emerging over time when compared to other comparable school districts. Some student groups show little improvement.

7.2.4   Significant gains are noted in student conduct areas over a three-year period among most student groups, compared to in-state and out-of-state comparable districts.

7.2.5   Sustained results are demonstrated in all areas of student conduct and among all student groups. District performance in this area is used for state or national benchmarking**. Graduate follow-up information reveals successful employment or completion of higher education.

7.2.6   I do not know.
#  Description

7.3 Student and Stakeholder Satisfaction Results

*Stakeholder: Individuals or groups, both internal to the school (students, personnel) and external (parents, community members, business) which are affected by the conditions and quality of education or the preparedness of graduates; also includes state agencies and their requirements.

**Benchmarking is an improvement process in which an organization compares its performance against best-in-class organizations, determines how these organizations achieved their performance levels and uses the information to improve its own performance.

7.3.1   Anecdotal information suggests low satisfaction among students and stakeholders*.

7.3.2   Results collected through a systematic process reveal significantly low satisfaction among students and stakeholders with school district performance and operations.

7.3.3   Some areas reveal improvement in stakeholder* and student satisfaction when compared to baseline data.

7.3.4   Many areas reveal significant gains in stakeholder* and student satisfaction. Long-term trends are emerging. The district is comparing trends with other comparable school districts.

7.3.5   Sustained results over several years reveal high satisfaction of all student and stakeholder* groups with the performance of student achievement, quality of educational programs and services, and support services. The school district is used for benchmarking** in and out of state.

7.3.6   I do not know.
#  Description

7.4 Human Resource* Results

*Human resource indicators include employee well-being, labor relations, satisfaction, professional development, work system performance and effectiveness.

**Benchmarking is an improvement process in which an organization compares its performance against best-in-class organizations, determines how these organizations achieved their performance levels and uses the information to improve its own performance.

7.4.1   Anecdotal information reveals low satisfaction of personnel and poor employee-employer relationships. High absence and turnover rates by personnel exist. A high rate of grievances exists.

7.4.2   No formal measurement exists, but anecdotal information suggests that most personnel appear satisfied. Personnel accept current terms of employment. Personnel absenteeism, turnover and recruitment are not considered to be problems. Reactions to staff development activities are generally positive.

7.4.3   Systematic attempts to measure human resource indicators* reveal some emerging improvement patterns.

7.4.4   Results indicate improvements in many human resource indicators*. Absenteeism rates for personnel and turnover show steady decline. Employee satisfaction measures reveal consistent upward trends. Staff development activities reveal impact on instructional and work practices.

7.4.5   Employee satisfaction is high among all personnel classifications. The district is recognized as an organization to benchmark**. Personnel absenteeism and turnover are low, with a sustained employment pool of high-quality, qualified applicants. Staff development opportunities indicate sustained results on improved student performance.

7.4.6   I do not know.
#  Description

7.5 Educational Program and Service Results

Definitions: *Educational programs and services are defined as all programs and services provided to students and conducted by professional, certified personnel or non-certified personnel under the supervision of certified personnel.

**Benchmarking is an improvement process in which an organization compares its performance against best-in-class organizations, determines how these organizations achieved their performance levels and uses the information to improve its own performance.

7.5.1   No results, other than state, national or local student achievement measures, are available to determine effectiveness of educational programs and services*.

7.5.2   Enrollment data is used as an indicator of effectiveness for some educational programs or services*. Some educational program and service* evaluations exist but demonstrate poor performance results. Compliance with federal, state and local requirements exists.

7.5.3   Some educational program and service* evaluations are beginning to show improvements on several indicators compared to baseline data.

7.5.4   Steady gains are being made in enrollment, attendance and student graduation rates when compared to comparable school districts. Educational programs and services* are demonstrating improvements in performance results.

7.5.5   Sustained results exist indicating outstanding performance for educational programs and services. High student satisfaction exists. High attendance and high graduation rates exist, with graduate follow-up information revealing successful employment or higher education after graduation. The district is used as a benchmark** both at the state level and nationally for excellence in educational programs and services.

7.5.6   I do not know.
#  Description

7.6 Educational support services

Definitions: *Educational support services include all programs and services which support the educational programs, such as business operations, transportation, public relations, purchasing, clerk, legal, volunteers, food service, records, buildings and grounds.

**Benchmarking is an improvement process in which an organization compares its performance against best-in-class organizations, determines how these organizations achieved their performance levels and uses the information to improve its own performance.

7.6.1   Audits of the district reveal some areas of non-compliance with regulatory or legal requirements. Little or no information is available from educational support services*.

7.6.2   Some performance data exists for some services. Results reveal significant gaps in stakeholder satisfaction. Regulatory and legal compliance is improving.

7.6.3   Performance results of support services reveal some recent improvement.

7.6.4   Improvements are evident in some educational support service areas. Benchmarks are being used from other comparable organizations and reveal an upward trend.

7.6.5   Sustained improvement results exist with high customer satisfaction, cost effectiveness, efficiency and productivity. District educational support services are used as benchmarks** by other comparable districts.

7.6.6   I do not know.
1. Could this instrument be useful as a tool to study the seven areas of your school district?

   Extremely Useful    Somewhat Useful    Little Use    No Use

   Please explain why or why not. _________________________________________________

2. Could this instrument be useful as a tool to help district personnel determine areas for future school district improvement?

   Extremely Useful    Somewhat Useful    Little Use    No Use

   Please explain why or why not. _________________________________________________

My most sincere appreciation for your responses.
Appendix B: Panel of Experts Used in Content Validation

Dr. Susan Leddick: Trainer, consultant, practitioner of quality and the Malcolm Baldrige National Quality Award.

Dr. Roland Smith: Consultant in quality applications and Malcolm Baldrige Examiner at the state and national level.

Dr. Jim Shipley: Trainer in Pinellas County, Florida schools. Designed a similar instrument for use in those schools. Malcolm Baldrige Examiner at the national level.
Appendix C: Letter to Panel of Experts

August, 1997

Dear ________,

Thank you for agreeing to be a content area expert for my dissertation study. I have enclosed the current draft of the instrument for your review and critique. The feedback that I am requesting is in the following areas:

• Alignment of the intent, design and content of the instrument with the Malcolm Baldrige National Quality Award criteria.
• Clarity of items.
• Specific suggestions for improving the instrument.
• Extent to which this instrument could be a guide for self-study in a school district.

The research questions posed in this study are:

1. How do educators perceive their own school district's performance based on an instrument designed using the Malcolm Baldrige National Quality Award Education Criteria?
2. Are there differences in these ratings based on type of educator or size of district?
3. Do educators find this instrument a useful tool to study these areas of a school district?
4. Do educators believe this instrument could be useful in determining school improvement needs?

Please feel free to give your comments to me in the easiest way for you. You can reach me through these numbers, fax, or e-mail. I so appreciate your willingness to advise me and share your expertise.

Sincerely,

Sally Anderson
Appendix D: Matrix of Population Sample

Grp.  Population    Superintendents       Principals            Teachers
                    N     %     S**       N     %     S**       N       %     S**
1     5,000+        13    13.4  10        220   44    96        6,736   51.5  191
2     2,500-4,999   13    13.4  10        84    16.8  37        2,358   18    67
3     1,000-2,499   27    27.8  21        106   21.2  46        2,374   18    67
4     500-999       22    22.6  18        55    11    24        987     8     29
5     1-499         21    21.6  17        31    6     14        617     5     19
Total                           76                    217                     373

                Superintendents    Principals    Teachers
                N      S           N      S      N         S
State Totals    97     76          500    217    13,076    373

*Note: Group classifications by size are based on the 1996-97 Annual Statistical Report prepared by the Idaho State Department of Education.
**Note: Proportional sample number from the total sample.
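The S** columns above allocate each fixed total sample (76 superintendents, 217 principals, 373 teachers) across the five size groups in proportion to each group's population. The dissertation does not state which rounding rule was used; the sketch below assumes largest-remainder rounding, which guarantees the per-group samples sum to the total. The function name `proportional_allocation` is my own, not from the study.

```python
def proportional_allocation(counts, total_sample):
    """Split a fixed total sample across strata in proportion to stratum
    size, using largest-remainder rounding so the parts sum exactly to
    the total."""
    total = sum(counts)
    raw = [c * total_sample / total for c in counts]      # exact shares
    alloc = [int(r) for r in raw]                         # floor each share
    leftover = total_sample - sum(alloc)
    # Hand the remaining units to the largest fractional remainders.
    order = sorted(range(len(counts)),
                   key=lambda i: raw[i] - alloc[i], reverse=True)
    for i in order[:leftover]:
        alloc[i] += 1
    return alloc

# Principal strata from the matrix (N = 500, total sample S = 217).
print(proportional_allocation([220, 84, 106, 55, 31], 217))
# Superintendent strata (population 96 listed, total sample S = 76).
print(proportional_allocation([13, 13, 27, 22, 21], 76))
```

With these inputs the allocation reproduces the superintendent and principal S** columns in the matrix (96, 37, 46, 24, 14 and 10, 10, 21, 18, 17); whether this was the exact procedure used in the study is an assumption.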
Appendix E: Codes on the Instrument

Size of District    Type of Educator
1                   1 S    1 P    1 T
2                   2 S    2 P    2 T
3                   3 S    3 P    3 T
4                   4 S    4 P    4 T
5                   5 S    5 P    5 T
Appendix F: Cover Letter and Directions

October 6, 1997

Dear Colleague,

As educators, we work diligently to improve student performance and optimize the effectiveness of our schools. It is on this shared responsibility that I ask you to participate in my dissertation study.

The enclosed instrument is not a survey. It is an analysis tool designed to assist school districts in the process of self-study for the purpose of continual improvement of school systems. Selections you make on this instrument are neither right nor wrong, good nor bad. They should reflect your perception of where your school district is at this point in time in that area. It is your honest and realistic professional opinion that will provide me the information for my study.

This analysis instrument is adapted from the Malcolm Baldrige National Quality Award - Education Pilot, the curriculum audit process, and the 1996 accreditation standards from the Northwest Association of Schools and Colleges. The instrument asks you to reflect on seven categories common to every school district. Within each category are specific items to which you will respond.

You, specifically, are in a unique position of responsibility to influence and implement school improvement efforts in your district. I realize that I am asking very busy professionals to use some of your already limited time towards this endeavor. Since it is not a survey, it may take you a little longer to thoughtfully respond. I am appealing to your commitment to the improvement of education in Idaho, your curiosity about the findings of this research, and your willingness to assist a colleague who shares your interest and desire for continued excellence in education.

Your returned response is critical to the findings. Read the directions thoroughly to assist you with this unique project. Please return the completed instrument to me in the self-addressed stamped envelope provided. I am asking that you send it to me by October 27, 1997.

Please accept the pen as a small token of my appreciation for participation in this study. Please feel free to use it when completing the enclosed analysis.

I would welcome the opportunity to discuss my results with you personally and the implications for how school improvement might be approached. You may reach me at the J.A. & Kathryn Albertson Foundation, 208-342-7931.

My sincere appreciation,

Sally Anderson, Doctoral Student
University of Idaho
Directions

Please Read Carefully! Thank You!

• The Performance Analysis for School Districts has been prepared on paper that will be scanned. Therefore, please fill in the circle next to the item you select completely with pencil or black ink. Do not fold or tear. Do not staple. Use a clip.
• This instrument is an analysis tool to assist you in a reflective self-study of the major components of a school district. The questions do NOT ask about a specific school. Rather, they focus on the whole school district.
• Unlike a survey, this is not intended to be responded to quickly. It is intended to be a reflective tool for your thoughtful consideration regarding the complete scope of your school district operations.
• As the sole researcher, I can assure you that all information is confidential and anonymous. Results will be reported in grouped data only. No individual responses are reported.
• It is not necessary to do any research or consult with others to help you answer the items, unless you wish to. I am interested in your perspective, your knowledge, your perception based on what you know from your view in your school district.
• There is the opportunity to select an "I do not know" category if you truly feel you cannot make a judgement on that item.
• Some terms used are defined in each sub-category. Please refer to the definitions under each sub-category in the upper left-hand corner for added clarity.
• If you feel that you do not have information about an area, answer based on what it appears to be from your point of view.
• Please do not write in any comments unless they are asked for.
• Please respond to every section to the best of your ability.
• Select only the one of the five choices in each section that most accurately reflects your perception of your school district at this point in time. I am interested in knowing your perception.
• Return the completed Performance Analysis for Schools to me in the self-addressed stamped envelope provided by October 27, 1997.
• If you have any questions or would like the results of the study, please call me at the J.A. & Kathryn Albertson Foundation (208-342-7931).
Appendix G: Postcard Reminder

October 25, 1997

Dear Colleague,

Recently, I sent you a school district self-study instrument in the mail to collect information for my dissertation research study. If you have returned it to me already, I am most grateful. If you have not, I ask that you take the time to do so. Assessing the performance of organizations is becoming an extremely important practice, and as school districts, we will need to consider such a process. The instrument serves only to find out your perceptions of the current performance of your school district using the Malcolm Baldrige National Quality Award as the criteria. Please send me the completed instrument by November 5, 1997.

Thank you for your contribution to my study and your gift of time.

Sally Anderson, Doctoral Candidate (208-342-7931)
University of Idaho
Appendix H: Letters of Support from IASA and IEA

IDAHO ASSOCIATION OF SCHOOL ADMINISTRATORS
777 South Latah, Boise, ID 83705
PHONE: (208) 345-1171  FAX: 345-1172
Michael L. Friend, Executive Director
E-MAIL: idschadm@micron.net
WEBSITE: http://www.lewiston.k12.id.us/organizations/iasa/
"Leadership For Tomorrow's Leaders"

October 1, 1997

Dear Participant:

The Idaho Association of School Administrators is pleased to endorse this research study and provide the opportunity for you to participate in the project. As the key leaders of school improvement efforts, our perspective is vitally important; this study recognizes that fact. The IASA has great interest in the findings of this particular research study designed by Sally Anderson, a University of Idaho doctoral student. The results of the research will be published in our journal, Perspectives.

Your completion of this survey instrument is of critical importance to the study. The responses will be kept confidential and handled only by Sally. Results will be reported in the aggregate.

We recognize that this task will require valuable time and considerable thought on your part. Consider the use of your time as an investment in our future approach to school improvement and the development of high performance districts. Please take the necessary time to support your colleague in this project!

Thanking you in advance for your assistance.

Sincerely,

Mike Friend
Executive Director

Affiliated Divisions: Idaho School Superintendents' Association; Idaho Association of Secondary School Principals; Idaho Association of Elementary School Principals; Idaho Association of Special Education Administrators
Allied Associates: Idaho School District Council; Northwest Women for Educational Action; Idaho School Business Officials; Idaho Rural Schools Association; Idaho Middle Level Association; Idaho School Public Relations Association; Idaho Association of Educational Office Professionals; Idaho Association for Supervision and Curriculum Development
IDAHO EDUCATION ASSOCIATION
P.O. BOX 2638, BOISE, IDAHO 83701; 620 NORTH SIXTH STREET, 83702
INTERNET: http://www.idahoea.org
208/344-1341  FAX 208/336-6967
ROBIN NETTINGA, President, e-mail: mettinga@nea.org
JAMES A. SHACKELFORD, Executive Director, e-mail: jashacfc@nea.org

October 1, 1997

Dear Participant:

The Idaho Education Association is pleased to have the opportunity to encourage you to assist with this important study. As the cornerstone of all school improvement efforts, your perspective is vitally important and this study recognizes that. The IEA is interested in the findings of this research study that has been designed by Sally Anderson, doctoral student with the University of Idaho.

You have been randomly selected to be a participant in this study. Your completion of this instrument is both critically important to the study and could be of practical significance to you and your district. The results of this study are entirely confidential and handled only by Sally Anderson. Results will only be reported in the aggregate.

We recognize that it will require some of your time and considerable thought. Consider the use of your time as an investment towards our future approach to school improvement and high performance school districts. Please take the necessary time to support your colleagues in this study by completing this instrument.

Thank you for your assistance.

Sincerely,

James A. Shackelford
Executive Director

JAS/jh
Appendix I: Districts by Enrollment Size

District Number    District Name

Classification 1: Districts of over 5,000

001    Boise
002    Meridian
025    Pocatello
091    Idaho Falls
131    Nampa
271    Coeur D'Alene
093    Bonneville
411    Twin Falls
082    Bonner County
151    Cassia County
331    Minidoka County
340    Lewiston
132    Caldwell

13 districts total for Classification 1.

Classification 2: Districts of 2,500 to 4,999

055    Blackfoot
193    Mountain Home

(table continues)
District Number    District Name

Classification 2, continued:

321    Madison
273    Post Falls
251    Jefferson County
272    Lakeland
139    Vallivue
261    Jerome
221    Emmett
061    Blaine County
281    Moscow
215    Fremont County
003    Kuna

13 districts total for Classification 2.

Classification 3: Districts of 1,000 to 2,499

052    Snake River
201    Preston
060    Shelley
241    Grangeville
134    Middleton
371    Payette

(table continues)
District Number    District Name

Classification 3, continued:

033    Bear Lake County
101    Boundary County
381    American Falls
171    Orofino
021    Marsh Valley
431    Weiser
391    Kellogg
412    Buhl
322    Sugar-Salem
291    Salmon
041    St. Maries
372    Fruitland
413    Filer
231    Gooding
150    Soda Springs
414    Kimberly
370    Homedale
401    Teton County
421    McCall-Donnelly
232    Wendell

(table continues)
District Number    District Name

Classification 3, continued:

059    Firth
351    Oneida County

28 districts total for Classification 3.

Classification 4: Districts of 500 to 999

137    Parma
058    Aberdeen
371    New Plymouth
392    Wallace
252    Ririe
253    West Jefferson
262    Valley
192    Glenns Ferry
363    Marsing
181    Challis
286    Whitepine
304    Kamiah
136    Melba
285    Potlatch
111    Butte County

(table continues)
District Number    District Name

Classification 4, continued:

365    Bruneau-Grand View
148    Grace
202    West Side
044    Plummer-Worley
242    Cottonwood
341    Lapwai
133    Wilder
072    Basin

23 districts total for Classification 4.

Classification 5: Districts of 1 to 499

312    Shoshone
422    Cascade
233    Hagerman
013    Council
283    Kendrick
135    Notus
415    Hansen
417    Castleford

(table continues)
District Number    District Name

Classification 5, continued:

282    Genesee
071    Garden Valley
274    Kootenai
418    Murtaugh
073    Horseshoe Bend
432    Cambridge
305    Highland
182    Mackay
161    Clark County
342    Culdesac
011    Meadows Valley
302    Nez Perce
149    North Gem
314    Dietrich
121    Camas County
316    Richfield
234    Bliss
392    Mullan
292    South Lemhi

(table continues)
District Number    District Name

Classification 5, continued:

382    Rockland
433    Midvale
*092   Swan Valley
*394   Avery School
*364   Pleasant Valley
*383   Arbon
*191   Prairie
*416   Three Creek

*Note: Eliminated due to size (enrollment less than 100) and no superintendent.

29 districts total for Classification 5.