Conflict and Tension:
The Impact of Performance-Based Accountability Policies on Student Affairs and Services
Phil Alexander
Department of Graduate and Undergraduate
Studies in Education
Submitted in partial fulfillment
of the requirements for course EDUC 5Q97
Faculty of Education, Brock University
St. Catharines, Ontario
© Phil Alexander, 2015
Abstract
The age of accountability has arrived. The higher education system in Ontario is in the midst of
a “new accountability” movement as evidenced through the implementation of multiple
performance-based accountability policies. What does this mean for the practice of Student
Affairs & Services (SAS) divisions? Through a historical and theoretical review of both
performance-based accountability policies and Student Affairs & Services, I argue that the
underlying foundational principles of both are in conflict with each other and therefore create
tension in practice. Furthermore, as a Student Affairs and Services practitioner myself, I provide
recommendations on how we can organize ourselves to positively influence performance
indicators without sacrificing the holistic, transformative ideals we have come to embrace.
Acknowledgements
I would like to acknowledge the professors whom I have had the pleasure of learning from
throughout my journey; their experience, knowledge, and direction have been invaluable. In
alphabetical order: Dr. Denise Armstrong, Dr. Jill Grose, Dr. Catherine Hands, Dr. Xiaobin Li, Dr.
Coral Mitchell, Dr. Dolana Mogadime, Dr. Lissa Paul, Dr. Jennifer Rowsell, and Dr. Nicola
Simmons.
Finally, to my family, S, E, and O; thank you for three and a half years of support. This is for
you!
Table of Contents
Abstract
Acknowledgements
Introduction
Accountability and the Ontario Context
    Defining Accountability
    The “New Accountability” Movement
        Types of Performance-Based Accountability Policies
    Performance-Based Accountability in Ontario
        Key Performance Indicators
        Multi-Year Accountability Agreements
        Differentiation Policy Framework
        Maclean’s Rankings
        National Survey of Student Engagement
Accountability and Student Affairs & Services
    Defining Student Affairs and Services
    The Student Development Movement
    The Student Learning Movement
    Conflict & Tension
Accountability and Organization of Student Affairs & Services
    Inclusion of Students
    Assessment
    Leadership
Conclusion
References
Beneath and behind the growing public interest in measures and devices of all kinds,
there was a dream. … It was the dream of education and society as machines, efficient
devices for the attainment of high social objectives on one hand, and inculcation of
measurable knowledge and marketable skills on the other. It was the idea that a machine-
like solution could be found to the ancient problem of assigning the young to their proper
places and professions. It was the hope that a truly scarce good—advanced education—
could mechanically and fairly be distributed, through exact testing and accountancy, to
every deserving person. (Bruneau & Savage, 2002, p. 26)
Over the past fifty years, Ontario’s higher education system has transitioned from an elite
system to a near-universal one (Clark, Moran, Skolnik, & Trick, 2009). With greater access came
an increased interest from stakeholders in the performance of institutions (Clark et al., 2009),
leading to what has been termed the “new accountability” movement in higher education
(Hillman, Tandberg, & Gross, 2014). Within the “new accountability” movement, governments
place greater emphasis on the performance of institutions through accountability policies that
include performance indicators such as graduation rates, retention rates, course completions, and
credits earned. How does the “new accountability” movement impact
Student Affairs and Services (SAS) divisions operating within universities?
Much has been written regarding the mechanistic (Taylor, 2007/1912) principles
underlying performance indicators and performance-based accountability policies in higher
education (Alexander, 2000; Bruneau & Savage, 2002; Shannon, 2009). This raises the question:
what is the purpose of universities? Are they primarily breeding grounds for job-ready
graduates? Are universities accountable to the demands of local, national, or international market
economies? Or are universities places for personal growth and maturation through
transformative learning experiences, with societal ideals such as democracy and responsible
citizenship at their core? Are they accountable to the students they serve? I have worked for ten
years at two universities in Ontario in several jobs situated under the SAS umbrella. I believe the
role of the university is to foster learning through holistic transformation. I believe in the university
as a place for personal growth and maturation where students learn to be contributing members
of a democratic society. I approach my work as a Student Affairs and Services practitioner this
way. However, with the growing use and influence of performance-based accountability policies
in Ontario I believe SAS divisions and the practitioners within them have been forced to “work
to the performance indicator”. Indeed, according to Reason and Broido (2011), the “new
accountability” movement “change[s] what we do and how we see ourselves professionally.
Student affairs professionals now focus on learning outcomes and creating curricula to guide the
achievement of those outcomes” (p. 92). I believe this brings the mechanistic principles of
performance-based accountability policies in direct conflict with the holistic, transformative
approach of SAS. Therefore, through a historical and theoretical review of performance-based
accountability policies and Student Affairs & Services, I argue that the underlying principles of
the two are in conflict with each other, creating tension in practice.
I am also a realist and understand that the age of accountability has not only arrived but is
here to stay. So in addition to arguing that the tension and conflict exist, I also provide
recommendations on how SAS divisions can organize themselves to positively influence
performance indicators without sacrificing the holistic, transformative ideals we have come to
embrace. I present my argument in three parts. In part one I examine what we mean by
accountability, describe the “new accountability” movement in higher education, and situate the
Ontario context within it. In part two I provide an overview of the theoretical foundations for
and evolution of SAS practice followed by an examination of several dynamics of why tension
and conflict exist between performance-based accountability policies and SAS. In part three, I
provide recommendations on organizational characteristics that I believe will position SAS
divisions to influence performance indicators while still allowing for a holistic, transformative
approach.
ACCOUNTABILITY AND THE ONTARIO CONTEXT
The term accountability carries a multitude of possible meanings in higher education.
Burke (2005) states that “accountability is the most advocated and least analyzed word in
higher education” (p. 1). How the meaning of accountability has evolved is examined below.
Defining Accountability
On a basic level accountability is defined as “an obligation or willingness to accept
responsibility or to account for one’s actions” (“Accountability”, 2015). Conceptually,
accountability can be framed through several questions, “who is accountable to whom, for what
purposes, for whose benefit, by which means, and with what consequences?” (Burke, 2005, p. 2).
Within higher education, the answers to these questions changed in the late 1980s and early 1990s.
Prior to the early 1990s, the Ontario higher education system was considered elite, meaning that
fewer than 15% of the population attended postsecondary institutions
(Trow, 1973). Institutional accountability and quality were defined primarily in terms of the
resources available by “referring to such indicators as the faculty-student ratio, operating
expenditures per student, the value of library acquisitions, and the amount of capital
expenditures” (Clark et al., 2009, p. 114). However, the convergence of several factors in the
early 1990s transformed how accountability was conceptualized (Hillman et al., 2014;
McLendon, Hearn, & Deaton, 2006; Rutherford & Rabovsky, 2014). Beginning with an
ideological shift towards neo-liberal social policy, the Ontario government placed greater
emphasis on the belief “in the virtue and infallibility of global markets” (Shanahan, 2009, p. 4).
With the rising popularity of the idea that investment in higher education produced a competitive
advantage in the national and international economy, the government of Ontario increased access
to postsecondary institutions resulting in massification and expansion of the system (Clark et al.,
2009). Of course, with a larger post-secondary system came increased costs to fund it. Pressures
arose, as a consequence of increased cost, for both a “broader conceptualization of accountability
and for the provision of information on university performance to the government and external
stakeholders” (Clark et al., 2009, p. 115).
The “New Accountability” Movement
As economic priorities took centre stage at universities the meaning of accountability
increasingly became defined in market terms. Shanahan (2009) illustrates examples of market
terms used:
Business and private sector criteria are employed to make education decisions. Job
training and meeting labour market needs have become key education priorities.
Economic principles of productivity, efficiency and competitiveness have become
imperatives. And we have seen our accountability frameworks become infused with
market discourse, market principles and market mechanisms. (pp. 4-5)
As a consequence of the reconceptualization, governments and policy-makers turned to
performance-based accountability policies “as the model of choice for resource allocation to
public colleges and universities” (Alexander, 2000, p. 419). The national and international rise
in popularity of performance-based accountability policies has been termed the “new
accountability” movement in higher education (Hillman et al., 2014). These new approaches to
accountability, according to Bruneau and Savage (2002), were billed as a “new wave that would
render old-fashioned statistics obsolete…. A university might boast fine buildings, a celebrated
staff, and an excellent library, but these did not guarantee students would actually learn” (p. 2).
They could not guarantee students were actually learning because they were not measuring whether
students had learned. The lack of comparable indicators to measure learning drove the push
to reform how quality was determined. Rutherford and Rabovsky (2014) summarize the basic
logic of performance-based accountability policies below:
Rather than allocating resources primarily on the basis of inputs such as enrollments,
these reformers seek to shift the funding mechanisms to student outcomes such as
graduation rates and degree production. They argue that, under traditional budget
arrangements, universities have little incentive to care much about student outcomes, and
have thus tended to focus on other priorities including graduate education, research
productivity, and capital investments in new buildings…. By reformulating the
incentives that universities face, so that institutions are rewarded or punished primarily
based on actual performance (outcomes) rather than simple input measures, performance
funding seeks to stimulate shifts in institutional behaviour that will result in greater
efficiency and productivity. (p. 187)
The theoretical principles that performance-based accountability policies embody draw from
“theories of action” (Argyris & Schön, 1996). The first and most prominent theory of action
reflects a resource-dependency perspective wherein “public funding is manipulated to stimulate
market profit incentives that, according to the theory, motivate institutions to improve
performance” (Ziskin, Hossler, Rabourn, Cekic, & Hwang, 2014, p. 14). In addition to resource-
dependency theory, there are three other theories of action that help clarify the behaviour of
institutions:
The second theory of action described supposes that policies persuade institutions to
agree with public policy makers on the importance of improved student outcomes. It
follows then, in the theory, that these institutions change their behaviors to improve
student outcomes. The third theory of action involves raising institutions’ awareness of
their performance, leading naturally to comparisons across institutions that stimulate
institutions’ pride and status-striving and that motivate changes in institutional behaviors
– resulting, theoretically, in improved outcomes. The fourth theory of action…entails
providing institutions with resources to support greater capacity on key performance
indicators and improved practice as learning organizations. (Ziskin et al., 2014, p. 14)
It is important to note the theoretical principles of performance-based accountability policies
because they help explain why institutions react the way they do. They also aid in
determining how and why the conflict and tension cause shifts in the behaviour of SAS divisions.
Types of Performance-Based Accountability Policies
The “new accountability” movement has generated three types of performance-based
accountability policies (Burke & Minassians, 2002; COU, 2013; McLendon et al., 2006). The
first type, performance funding, is an approach that links government funding directly to the
performance of institutions on individual indicators. Under these types of policies, “the
relationship between performance and funding is predetermined and prescribed: if an institution
meets a specified performance target, it receives a designated amount or percentage of funding”
(McLendon et al., 2006, p. 2). The best example of this type of policy in practice is in the state
of Tennessee. Tennessee is the only state in which 100% of the public funding allocated to
institutions through the state funding formula is based on performance indicators (Dougherty & Reddy,
2011; Natow, Pheatt, Dougherty, Jones, Lahr, & Reddy, 2014; Ziskin et al., 2014).
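To make the mechanics concrete, the following is a minimal sketch, in Python, of a hypothetical performance funding rule of the kind McLendon et al. (2006) describe, in which the link between performance and funding is predetermined and prescribed. The indicator names, targets, grant figure, and percentages are illustrative assumptions only; they are not the Tennessee or Ontario formulas.

```python
# Minimal sketch of a hypothetical performance funding rule: if an institution
# meets a specified performance target, it receives a designated percentage of
# funding. Every figure below is an illustrative assumption, not an actual formula.

BASE_GRANT = 100_000_000  # hypothetical annual operating grant, in dollars

# Hypothetical indicators: each carries a target and a share of the grant at stake.
INDICATORS = {
    "graduation_rate": {"target": 0.75, "share": 0.01},
    "employment_rate_6_months": {"target": 0.90, "share": 0.01},
}

def performance_funding(results):
    """Return the performance-linked dollars earned for a set of indicator results."""
    earned = 0.0
    for name, rule in INDICATORS.items():
        # The relationship is predetermined: meeting the target yields the full share.
        if results.get(name, 0.0) >= rule["target"]:
            earned += rule["share"] * BASE_GRANT
    return earned

# Example: an institution that meets only the graduation-rate target
print(performance_funding({"graduation_rate": 0.78, "employment_rate_6_months": 0.85}))
# -> 1000000.0 (1% of the hypothetical grant)
```

Performance budgeting and performance reporting, discussed next, differ precisely in that no such prescribed rule exists: funding consequences are either discretionary or absent.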
Performance budgeting, in contrast, considers institutional performance as only one
factor in determining funding allocations. In this model, “the possibility of additional funding
due to good or improved performance depends solely on the judgment and discretion of state
[provincial], coordinating, or system officials” (Burke & Minassians, 2003, p. 3). Performance
budgeting policies are much more prevalent in practice due to their ‘safer’ characteristics relative to
performance funding policies (Lang, 2013, p. 25). Consequently, there are numerous examples in
use, including in twenty-one US states (Burke & Minassians, 2003, p. 8), in Australia (Lang,
2013, p. 22), and in Ontario (Ziskin et al., 2014, p. 12).
Lastly, performance reporting policies provide stakeholders with reports and/or indicators
of institutional performance but with no formal links to funding. Therefore, this model “relies on
information and publicity rather than funding or budgeting to encourage colleges and universities
to improve their performance” (Burke & Minassians, 2003, p. 3). This policy option is the most
popular in jurisdictions that implement performance-based accountability policies of some sort.
Burke and Minassians (2003) report that forty-six US states (92%) have incorporated some form
of performance reporting (p. 12). Within Canada, all provinces except Nova Scotia, Manitoba,
and Newfoundland have some form of performance reporting (Ziskin et al., 2014, p. 11).
With an understanding of the three types of performance-based accountability policies currently in use
– performance funding, budgeting, and reporting – let us review the Ontario higher education
performance-based accountability policies.
Performance-Based Accountability in Ontario
Three distinct policies are related to accountability and institutional performance in the
Ontario higher education system – Key Performance Indicators (KPI), Multi-Year
Accountability Agreements (MYAA), and the Differentiation Policy Framework for Higher
Education. Furthermore, two non-policy instruments influence institutions’ behaviour through
accountability-based mechanisms – the Maclean’s annual ranking of universities and the
National Survey of Student Engagement (NSSE).
Key Performance Indicators
Beginning in 1995 and coinciding with the rise in popularity of “new accountability”
policies, the Ontario government introduced its first performance-based accountability policy,
key performance indicators (KPI). The first iteration of KPIs was a form of performance
reporting because the basic idea was to provide institutional information to students so that they
could make a more informed decision (Lang, 2013). The specific indicators used were
graduation rate, employment rate six months and twenty-four months after graduation, and
default rates on Ontario Student Assistance Program (OSAP) loans. However, soon after
implementation of the KPIs as informational guides, the government decided to link KPIs to a
portion of the annual operating funding of institutions (2%), thus evolving the KPIs from a
performance reporting policy to a performance budgeting policy. The shift in usage for the KPIs
signaled the beginning of the government’s venture into the performance-based accountability
arena for the explicit purpose of aligning institutional behaviour to government priorities (Lang,
2013, p. 7).
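To give a rough sense of scale (the grant figure here is a hypothetical example, not any institution’s actual allocation), the funding tied to the KPIs under this arrangement is simply:

$$\text{KPI-linked funding} = 0.02 \times \text{annual operating grant}, \qquad \text{e.g., } 0.02 \times \$500\text{M} = \$10\text{M}.$$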
Multi-Year Accountability Agreements
In 2005, as a part of the Reaching Higher plan for Ontario higher education (Ontario,
2005), the government launched the Multi-Year Accountability Agreement (MYAA) process.
The policy had three basic features, “to outline the government’s commitment to stable funding,
[to] articulate each institution’s commitment to accessibility, quality improvements and
measurements of results, and [to] tie the commitment to results” (HEQCO, 2009, p. 1). The
MYAAs are a performance budgeting accountability policy because they represent only one
factor in overall institutional funding. In these agreements, the individual institutions indicate
strategies, programs, and performance targets in regard to goals set by the government. After
goals are set, the institutions subsequently report back on the progress of performance in relation
to those goals over a multi-year span. Therefore, “through the goal setting and review process,
the government [was] able to exercise a degree of control over post-secondary institutions that
did not exist before the MYAA process” (Clark et al., 2009, p. 128). By 2009-10 the Reaching
Higher plan was scheduled to end and the MYAAs offered a basis for a revised accountability
framework. However, the government decided to pursue a much larger systemic transformation
in the form of the Differentiation Policy Framework.
Differentiation Policy Framework
In 2013 the government announced its most comprehensive policy yet aimed at system-
wide transformation. The Differentiation Policy Framework has overall goals of increasing
quality, access, productivity, sustainability, and accountability of the higher education sector
(HEQCO, 2013; MTCU, 2013; Weingarten, Hicks, Jonker, & Liu, 2013). Although those
aspects of the system interact and intersect, the specific interest here is the revamped
accountability framework.
Hillman, Tandberg, and Fryar (2014) have observed that “many of the more recent
performance-based policies in higher education have sought to move away from a one-size-fits-
all approach to include a range of performance indicators that value many different types of
student success” (p. 6). In the performance-based policy literature, these types of policies are
known as Performance Funding 2.0:
Performance funding 2.0 programs, while not sacrificing indicators of ultimate outcomes,
also put considerable emphasis on indicators of intermediate achievement: for example,
course completions; successful completion of developmental education courses or
programs; passage of key gateway courses such as college mathematics or college
English. (Dougherty & Reddy, 2011, p. 6)
Including specific performance indicators that measure intermediate outcomes captures
differences between individual institutions better than general, system-wide indicators do.
Intermediate performance indicators are important for the Ontario
context because of the wide range of institutions (size, type, location) operating within the
province. The Ontario government recognized this diversity while designing the
Differentiation Policy Framework. The first step outlined in the policy was to formally negotiate
Strategic Mandate Agreements (SMAs) with the institutions. Institutions outlined their strengths
and areas they planned to focus on as well as indicated specific performance indicators by which
to measure these strengths. Both sides then agreed on a complete, final set of performance
indicators that measures institutions with system-wide metrics and institution specific metrics.
Currently, the Ontario government has signed SMAs with all publicly funded postsecondary
institutions. The SMAs are in effect until 2017.
The Differentiation Policy Framework is a hybrid accountability framework: it currently
functions as a performance budgeting policy because the negotiated performance indicators determine
only one aspect of overall institutional funding (the base operating grant is still calculated per
full-time enrolment). However, through the SMA exercise, funding formula reform is an
ultimate goal of the government (COU, 2013; HEQCO, 2013; OCUFA, 2013; Ziskin et al.,
2014). Performance-accountability policies that specifically link funding directly to the funding
formula and include performance indicators that attempt to measure intermediate outcomes (as
opposed to just ultimate outcomes like graduation rate) have been termed “performance funding
2.0” (Dougherty & Reddy, 2011). Reforming the Ontario funding formula to fund institutions, in
part, based on outcomes (or performance) would indicate a performance-based funding 2.0
model. Although the government’s intentions are to reform the funding formula, at this point in
time the government is effectively steering the system with mandates through the SMA exercise.
In April 2015 the government of Ontario plans to sit down with all forty-four postsecondary
institutions, as well as representatives of employers in Ontario, to begin discussions on funding
formula reform (Chiose, 2015).
Maclean’s University Rankings and the National Survey of Student Engagement
There are two more accountability instruments that are not formal policy initiatives
introduced by the government but nonetheless are important because they exert a certain level of
influence over institutional behaviour. The first instrument is the Maclean’s university rankings.
Maclean’s first published university rankings in 1991 (interestingly, this also coincided with the
rise of the “new accountability” movement). Because the rankings have a tremendous effect on
prospective students, they have become quite influential. Universities and colleges are aware that
as soon as anything is measured on a common scale, “there will be a temptation on the part of
some not just to compare one institution against another, but also to place all institutions in an
ordinal fashion depending on the results” (Educational Policy Institute, 2008, p. 1). In other
words, when something is measured, “the results can be ranked, and this creates a certain amount
of trepidation among institutions” (Educational Policy Institute, 2008, p. 1). Over time, this
trepidation from the institutions has evolved from boycotting the rankings in the early years
to modifying behaviour for the purpose of improving rank (Scott, 2013, p. 117).
The second accountability instrument is the National Survey of Student Engagement
(NSSE). The NSSE is a questionnaire on student engagement and satisfaction that is given to
first-year and fourth-year students every two years. The influence over institutional behaviour
that the NSSE generates is derived from its data organization structure: “NSSE is organized via
consortia, which allows institutions to compare their performance against selected peer
institutions (it is this aspect which makes it popular among administrators, as it fulfills an
important internal benchmarking role without being a ranking instrument)” (Educational Policy
Institute, 2008, p. 1). Both the Maclean’s rankings and the NSSE encompass the third theory of
action through raising institutions’ awareness of their own performance (Argyris & Schön,
1996). This leads to “comparisons across institutions that stimulate institutions’ pride and status-
striving and that motivate changes in institutional behaviours – resulting, theoretically, in
improved outcomes” (Ziskin et al., 2014, p. 14).
This concludes the review of performance-based accountability policies in Ontario, their
underlying theory, and the reconceptualization of accountability as viewed through a “new
accountability” movement lens. What happens when the mechanistic, market-based performance
accountability frameworks of the “new accountability” movement infringe on SAS divisions? I
tackle the answer in the next section.
ACCOUNTABILITY AND STUDENT AFFAIRS AND SERVICES
In part two, I present the theoretical foundations of SAS practice for the purpose of
understanding how the profession has developed into what it is now. Through an understanding of
the theoretical foundation, it becomes evident that a tension exists between the principles of
performance-based accountability policies and the principles of SAS. I break down the evolution
of SAS practice into two movements – the student development movement and the student
learning movement (Dungy & Gordon, 2011, pp. 91-92). I begin first with a brief definition and
description of Student Affairs & Services.
Defining Student Affairs & Services (SAS)
The term SAS refers to the administrative areas that “provide support
services that facilitate students’ entry, matriculation, engagement, and ultimately post-secondary
education success” (Seifert et al., 2011, p. 7). Examples of these administrative offices include
enrolment management, admissions, registrar services, financial assistance, scholarship services,
orientation and first-year services, housing and residence life, student judicial affairs, counseling
services, student disability services, health and wellness services, career and employment
services, and student leadership, involvement, and service-learning (Hardy Cox & Strange,
2010).
The Student Development Movement
The student development movement began in the 1970s and steadily evolved toward a
universal acceptance of student development theory as the foundation of the profession (Jones &
Abes, 2011, p. 153). The purpose of student development theory is to guide the work of SAS
professionals through describing how students “change and grow during college and what
activities or experiences best influence that growth” (Reason & Broido, 2011, p. 91). It
encompasses a wide range of more specific theories aimed at understanding three basic
developments: “(a) how students’ psychosocial identity formation evolves across the lifespan, (b)
how students make meaning of their experiences through advancement of cognitive-structural
development, and, (c) how students acquire a consistency of approach through development of
personal preferences, styles, and types” (Strange, 2009, p. 20).
As the movement evolved further, it became clear to SAS professionals that student
success is only partially a function of individual development. Therefore, operating under the
assumption that student behaviour and development are functions of both the person and the
environment (Evans & Reason, 2001), a second set of theories emerged from student
development theories but with a particular focus on the campus environment and how it
contributes to student success (Renn & Patton, 2011; Strange, 2009). Campus environment
theory encompasses four essential elements: “its physical components and design; its dominant
human characteristics; the organizational structures that serve its purposes; and participants’
construction of its presses, social climate, and culture” (Strange, 2009, p. 28). It is within these
theoretical constructs that SAS organizational and structural approaches to student success exist.
According to Renn and Patton (2011), campus environment models “offer valuable theoretical
insight into current challenges facing student affairs administrators in an era of increasing
accountability and emphasis on institutional ‘productivity’” (p. 253). Furthermore, “abundant
evidence links campus climate to student retention and graduation rates, a key component of
productivity” (p. 253); retention and graduation rates are also chief indicators used in performance accountability policies.
Campus environment theory played a central role in the design of SAS divisions in the
1970s, but its influence waned by the 1980s. In the mid-1990s it was reinvigorated due to several converging
factors which included a reconceptualization of SAS practice, the rising popularity and use of
performance indicators, and the emergence of the “new accountability” era. In the next section I
expand on this point but first it is important to understand that, although the movements I am
discussing proceed through time in a linear fashion, SAS theory itself is additive. Therefore,
each movement is superimposed on the previous one.
The Student Learning Movement
According to Reason and Broido (2011), “the student development movement has pushed
us as student affairs professionals to see ourselves as educators concerned about holistic student
growth and development” (p. 92). This reconceptualization triggered a shift in how SAS
professionals view themselves and their work and, as a result, effectively launched the student
learning movement. In Canada the main organization for SAS professionals, the Canadian
Association of College and University Student Services (CACUSS) published the Mission of
Student Services (1989) just as both the learning movement and the “new accountability”
movement were gaining steam. Listed in the document is an essential premise that coincides
with the reconceptualization of SAS:
Student Services professionals are educators. They have knowledge and expertise
about students that makes them an invaluable resource in the students’ educational
experience. The expertise of Student Services professionals is important in creating the
climate to help students develop the necessary skills to optimize their learning
opportunities. (CACUSS, 1989)
One might argue Canadian SAS professionals were one step ahead of their US counterparts in
the reconceptualization of the profession. It was not until 1994 that the American College
Personnel Association (ACPA) published the seminal work The Student Learning
Imperative, which, for the first time in the US, clearly situated learning (and not service) at the
forefront of SAS practice.
The concept of linking student development and learning together to form the
foundations of SAS practice gained momentum throughout the 1990s and into the 2000s (AAHE,
1998; UNESCO, 2002). In 2004 the publication of Learning Reconsidered (Keeling, 2004)
marked a further advancement of SAS practice through the addition and application of
transformative learning theory (Mezirow, 2000).
Through the use of transformative learning theory, Keeling (2004) contends that
“transformative education places the student’s reflective processes at the core of the learning
experience and asks the student to evaluate both new information and the frames of reference
through which the information acquires meaning” (p. 9). Furthermore, through transformative
learning theory, the purpose of the educational involvement of SAS is the evolution and
development of student identity including but not limited to their cognitive, affective, behavioral,
and spiritual identities (Keeling, 2004, p. 9). Therefore, in addition to academic learning (in
class), the critical element that transformative learning theory brings to SAS practice is the
conceptual uniting of learning, development, and identity formation. Transformative learning
theory recognizes the “essential integration of personal development with learning; it reflects the
diverse ways through which students may engage, as whole people with multiple dimensions and
unique personal histories, with the tasks and content of learning” (Keeling, 2004, p. 3). SAS
practice then parallels the academic mission and becomes a type of experiential pedagogy (Fried,
2012). Learning becomes a holistic endeavour, and SAS professionals are well situated to aid
students in acquiring and refining life skills, the same life skills that can positively
affect the learning outcomes and student engagement survey results used as performance
indicators.
Currently, SAS practice operates within a holistic, transformative framework rooted in
the belief that a university’s purpose is to educate and guide students’ growth and maturity so
they may be active contributors in a democratic society. However, as I discuss in the next
section, performance-based accountability policies operate under a mechanistic (Taylor,
2007/1912) framework. At a fundamental level, there appears to be conflict between the two.
Conflict and Tension
Performance-based accountability policies and SAS originate from divergent, even
contradictory philosophies. This creates conflict and tension in practice. According to the
Ontario Confederation of University Faculty Associations (OCUFA, 2006),
critics of performance indicators [PI] have expressed a wide range of concerns about how
PIs are used to measure performance in universities today. PIs and the accountability
movement in general have been characterized as a means of implementing public funding
cuts while asserting greater government control over previously autonomous
institutions…. They express concern about an over reliance on quantitative indicators
that serve as a poor proxy for quality and neglect those elements of academia that cannot
be reduced to a simple indicator. (p. 8)
What values, beliefs, assumptions, and philosophies are at play in SAS and performance
accountability policies that create conflict and tension? To answer this question I provide an in-
depth examination of the issue through four dynamics – fundamental designs, differing
paradigms, separate conceptions of process, and organizational theory.
To begin with, tension arises from the fact that SAS practice, as conceptualized through
transformative learning theory (Mezirow, 2000), has an organic, fluid, holistic approach that is
quite often very difficult to measure in quantitative terms. The SAS practitioner approaches
education as an instrument for democracy, for citizenship, for personal discovery. Performance
accountability policies, in contrast, are mechanistic (Taylor, 2007/1912). They measure
performance through standardization of ranking, ratios, and percentages and are concerned with
routinization, efficiency, productivity, and performativity. They approach education as an
instrument for economic competitiveness with emphasis on the technical aspect of learning.
Performance-based accountability policies include performance indicators that are “imbued with
a consumer ideology that encourages the view of education as a commodity” (Shanahan, 2009, p.
9). An inherent assumption within performance indicators (and performance-based
accountability policies) is that as “larger and larger numbers of students enrol in universities, and
as substantial tuition becomes a fact of life for more and more of these students, higher education
is viewed as a consumer good, with a degree or credential or a job as essential outcomes for all
students” (Eaton, 2013, p. 131). Under this assumption, the purpose of the university shifts from
operating within the social (development) and educational (learning) domains to a mechanistic
domain. In other words, “the idea is to make graduates as employable as soon as possible….
Universities thus become cheap training centres for industry…. This has little or nothing to
do with education, but everything to do with markets and management” (Bruneau & Savage,
2002, p. 57). Through standardized and routinized performance indicators and performance-
based accountability policies, universities are stripped of their humanistic and cultural aspects,
and ultimately their autonomy.
Second, at the very basic level the conflict and tension between performance-based
accountability policies and SAS divisions can be understood through differences in philosophical
worldviews. A worldview is defined as a paradigm, or lens, through which experiences are
viewed to interpret, construct, and determine meaning and understanding. Performance-based
accountability policies are derived from the positivist worldview where institutions (and
departments and divisions within them) are assumed to be linear, objectivist, hierarchical, and
measurable (Fried, 2012, p. 49); in a word, mechanistic. On the other hand, SAS practice is
derived from the constructivist worldview where the focus is on relationships and perspectives.
Constructivism tends to be nonlinear, interactive, and unpredictable, and described in terms of
probabilities. It operates in a web-like fashion and looks for interactive patterns and trends rather
than unidirectional cause and effect; in a word, human. According to Fried (2012), the dominant
paradigm in SAS incorporates constructivism by focusing on inquiry into such areas as identity
formation through “increasingly complex forms, intra- and interpersonal competence, practical
competence, persistence and academic achievement, humanitarianism, and civic engagement” (p.
62). The positivist paradigm is scientific in nature and therefore sees truth as unchanging; it
views basic reality as a physical entity so it can be “measured, counted and seen, touched, or
apprehended in some other physical modality” (Fried, 2012, p. 49). When positivist
assumptions are applied to performance indicators that measure student success, an inconsistency develops.
One of the major discrepancies between performance-based accountability policies and
the practice of SAS is in how each conceptualizes student success. Performance-based
accountability policies focus on “numerical measurements that are supposed to show ‘quality’.
They are increasingly focused on outputs in order to facilitate comparisons” (Shanahan, 2009,
p. 10). Performance indicators, therefore, lack an explanation of the process involved in
achieving student success. They are unable to ask how and why a student is learning. As a
result of this oversight, performance indicators are referred to as “over-simplistic” (Lang, 2005, p.
18), “reductionist” (Shanahan, 2009, p. 11), and “narrow” (Bruneau & Savage, 2002, p. 4).
Performance indicators over-simplify the total amount of effort and work students, SAS
practitioners, and faculty put forth to achieve student success.
Conversely, SAS practitioners view student success as a more incremental process,
achieved through the continued attainment of multiple goals throughout a student’s career. Through
this conception success is more than a solitary outcome of completing a credential (degree,
diploma, or certificate). Instead, success is defined in terms of “setting and achieving academic
AND personal goals, developing life skills, becoming career ready, and igniting a passion for
lifelong learning” (Seifert, 2015, para. 3). Success is not seen as an end in itself but as the
accumulation of learning throughout the journey. Seifert (2015) suggests that policy makers “re-
orient their notion of success from that of a discrete outcome (credential completion) to one that
recognizes success in the decisions students make and the opportunities they seek that place them
in good stead towards a long term goal” (para. 4). This is a difficult task as performance
indicators would need to somehow measure success as a journey through the decisions and
opportunities students face. The Differentiation Policy Framework, and its performance funding
2.0 orientation, is attempting to reconcile this issue through the Strategic Mandate Agreement
(SMA) exercise with the use of negotiated performance indicators between institutions and the
government that focus on incremental measures of student success in areas of engagement and
satisfaction (MTCU, 2013, p. 14).
Finally, the last dynamic I explore involves insights from organization theory. Through
the organization theory lens, the tension and conflict occur because SAS and performance-based
accountability policies operate on what Strange (2009) describes as two opposite ends of the
organizational continuum, “at one end are dynamically organized environments: flexible in
design, less centralized, and informal; at the other end are static environments: rigid, centralized,
and formal” (p. 27). Bolman and Deal (1997) developed the four-frame model, which provides
an understanding of how organizations operate through interpretive lenses – structural, human
resource, political, and symbolic.
Performance-based accountability policies operate within and are interpreted through the
structural frame. The structural frame can best be illustrated through the use of the machine
metaphor (Morgan, 1997) and assumes there is a rational system in place where approaches to
organization are standardized and routinized. Performance indicators operate under the
assumption that individual institutional measurements are equally weighted (however the weight
may be defined). This is not always the case. For example, Tam (2001) illustrates the difficulty
of measuring institutional quality with performance indicators:
It examines the quality of education provision against the expressed aspirations of the
individual institution. If the institution has high aspirations, quality is to be measured
against this yardstick. That might make it more difficult for a university to succeed than
another which set itself lower aspirations. Taken to absurdity, a university which aspired
to produce rubbish, and succeeded, would be of higher quality than a university which
claimed intellectual excellence, but narrowly failed. (p. 50)
If I were to apply this logic to the Ontario system, it would be like measuring and then ranking
the quality of the educational experience at a large central urban university compared to a small
rural northern university. I am not saying that any university strives to produce rubbish, just that
there are several other factors at play (including the culture of a university) that distinguish the
quality of an educational experience and that performance indicators are not able to
measure numerically. As a consequence of the difficulties in measuring institutional intangibles
such as culture, the structural frame “tends to dominate, down playing the role of emotions and
completely ignoring unconscious processes that may be shaping organizational behavior and
functioning” (Kezar, 2011, p. 239).
In contrast to the structural frame, SAS divisions exist within and are interpreted through
the human resources frame (Bolman & Deal, 1997). The commonly used metaphor illustrating
this frame is of a family (Morgan, 1997). SAS leaders and practitioners who use the human
resource frame “see themselves as serving and supporting others in the organization” (Komives,
2011, p. 363). They seem to “possess an egalitarian ethic among different groups on campus”
(Kezar, 2011, p. 233). Additionally, they concentrate on growth and engagement of students
through focusing on their motivation, needs, commitment, and learning (Kezar, 2011, p. 228).
Tull and Freeman (2011) conducted a study of 478 SAS administrators and found that the human
resource frame is their frame of choice (p. 39). Their findings are not surprising given the
collaborative and humanistic nature of SAS work.
The multiple dynamics presented above paint a clear picture of conflict and tension
between performance-based accountability policies and SAS. As a consequence, SAS
practitioners are forced to justify their work in terms of its relevance to the economy. They must
show how their departments, offices, programs, services, workshops, and activities provide
students with the skills needed by industry. Therefore, performance-based accountability
policies redefine SAS priorities. To the extent that these accountability policies accomplish the
reordering of SAS priorities, they undermine the holistic, transformative approach. However,
SAS practitioners are resilient! In responding to the call for greater accountability, “student
affairs professionals have continued their focus on learning outcomes and assessment in order to
demonstrate student affairs programs and services’ valuable contributions to the development of
the whole student” (Dungy & Gordon, 2011, p. 74). Performance indicators and performance-
based accountability are a major component of the Differentiation Policy Framework in Ontario
(MTCU, 2013): the age of accountability is firmly entrenched. Furthermore, the government of
Ontario plans to expand performance-based funding policies with the intent of attaching greater
financial incentives to performance indicators (COU, 2013; HEQCO, 2013; OCUFA, 2013;
Ziskin et al., 2014). It becomes beneficial for SAS divisions to accept performance-based
accountability policies as the norm and to organize themselves to best succeed in an
accountability era. In part three of the paper I outline several organizational recommendations
towards this end.
ACCOUNTABILITY AND ORGANIZATION OF SAS
My final aim is to provide recommendations on the organization of SAS divisions.
Within the “new accountability” era the assumption that SAS divisions aid in the transformative
development and learning of students is not good enough. We need to prove it. But simply
providing proof is not good enough either. We need to go one step further. We are now required
to show how this proof improves overall institutional performance as measured through
performance indicators incorporated within performance-based accountability policies. How can
SAS practitioners reduce the conflict and tension apparent between their own mandate and the
mandates of performance accountability to aid in institutional and student success? To answer this
question I provide recommendations on three distinct, yet interrelated, aspects of SAS
organizational characteristics – the inclusion of students in governance, assessment, and
leadership.
The Inclusion of Students
At the beginning of the paper I provided a conception of accountability through a series of
questions. One of those questions was: to whom should universities be held accountable? The
answer to that question is often assumed to be the government and the tax-paying public. Does it
not, however, make sense to also include students? A recent report from the Ontario University
Students Association (2014) asked the same question:
If all those who fund post-secondary institutions are those to whom institutions must be
held accountable…then there is no reason that institutions should not also be held
accountable to their students…. And yet there has been no serious discussion about how
universities can be held accountable to both the public at large and to their students. (pp.
5-6)
The Ontario University Students Association report points to a general lack of meaningful
student participation in the governance of universities in Ontario. When the focus is narrowed to SAS
governance, the same general lack of student participation exists. To me this seems
contradictory to the transformative, holistic approach to learning through which we as SAS
practitioners have conceptualized our field.
In terms of the type of SAS governance I am referring to, I am not suggesting that student
representatives be placed next to the Senior Student Affairs and Services Officer (SSASO) and
participate in daily decision-making. I am suggesting that students and SAS practitioners work
together to decide on and develop learning outcomes that should be achieved through the
participation and engagement of students in SAS programs, workshops, activities, etc. Through
collaboration on the development of learning outcomes, both SAS practitioners and students
accept responsibility for setting priorities and ensuring their achievement. This collaborative
dynamic is particularly important in the “new accountability” era because successful completion
of learning outcomes directly impacts performance indicators such as engagement and
satisfaction metrics, which are in turn used in measuring institutional performance at the system
level. The Differentiation Policy Framework specifically mentions satisfaction and engagement
within the teaching and learning component of proposed metrics (MTCU, 2013, p. 14).
Furthermore, collaboration between students and SAS practitioners in developing
learning outcomes emphasizes transformative, holistic learning through incorporating “reflective
practices that include provocative questions and stimulate students’ assessments of their own
meaning making” (King & Baxter Magolda, 2011, p. 215). SAS divisions become truly student-
centred divisions and, as Strange and Hardy Cox (2009) point out, “if ever there were a first
principle that frames both what we do and why we do what we do as student services
professionals, this would be it” (p. 238).
Assessment
In an era of increased institutional accountability it only makes sense that SAS divisions hold
themselves accountable as well. For SAS divisions this means asking two questions: (1) What
are we doing to facilitate student success? (2) How are we doing in meeting this goal? According
to the Higher Education Quality Council of Ontario’s (HEQCO) Second Annual Report (2009),
the second most common category cited by institutions for addressing and tracking quality was
based on social aspects of student success. The report goes on to define social aspects:
“surveying students for satisfaction in their first and last years; the development of new social
services programs and activities; tracking participation in student services including academic,
personal and career training; tracking participation in orientation/transition programs; and
participation in social events” (HEQCO, 2009, p. 121). This is encouraging because it shows
that institutions are interested in the contributions that SAS divisions make to the overall
educational experience. It also, however, requires SAS divisions to have their own assessment
procedures in place.
Keeling, Wall, Underhile, and Dungy (2008) acknowledge that “assessment is integral to,
perhaps even synonymous with, learning” (p. 6). This is an excellent observation because it re-
conceptualizes assessment, a naturally mechanistic procedure, to fit nicely into the
transformative, holistic approach of SAS:
When one realizes that to learn is to make meaning of events (notice: learning is not just
acquiring and applying new knowledge; it encompasses the transformative process of
making meaning of that knowledge), then, the full breadth of what it means “to learn”
can be understood and conceptualized. Based on that premise, to assess (which is to
observe) then is the foundation of learning. Just as we ask students to make meaning – to
learn – from their experiences, so too must we make meaning of their learning. (Keeling
et al., 2008, p. 6)
Assessment practice can, therefore, be viewed as a form of critical reflection of the learning that
has taken place by both students and SAS divisions. Re-conceptualizing assessment practice in
this way, therefore, alleviates the tension and conflict between SAS divisions and performance-
based accountability policies. The use of performance-based accountability policies makes it
important to embrace assessment practice now more than ever. After all, “making meaning of
how, what, when, and where students learn is a vital, exciting, and inspiring component of higher
education” (Keeling et al., 2008, p. 6).
Leadership
The Senior Student Affairs and Services Officer (SSASO) is situated in the best position
to effect change. At the same time, the SSASO is also in the best (or worst?) position to observe
the conflict and tension between performance-based accountability policies and the divisions
they run. Kezar (2004), through a leadership lens, asks: what is more effective for improving
overall governance of higher education institutions? Organizational structures and processes, or
relationships and trust? Answering this question is beyond the scope of this paper, but the
question itself portrays the tension and conflict between performance-based accountability
policies and SAS divisions nicely. On one end are the rigid, mechanistic, tangible
structures and processes. On the other end are the fluid, organic, intangible aspects of relationships
and trust. To allow SAS practitioners to practice a transformative, holistic approach as well as
“work to the performance indicator” the SSASO needs to incorporate aspects of both sides.
Therefore, there are two dynamics to leadership that I provide recommendations for. The first
dynamic consists of the relationship and trust aspect. The second consists of the organizational
structures and processes aspect.
Leading SAS divisions through relationships and trust fundamentally involves the
philosophical approach that the SSASOs apply to their practice. The type of leadership practice
that I believe would best allow the SSASO to lead with transformative, holistic principles at the
centre is called transformative leadership (Shields, 2011). Transformative leadership emphasizes
“the need for education to focus both on academic excellence and on social transformation”
(Shields, 2011, p. 3). Shields defines transformative leadership as beginning with “questions of
justice and democracy; it critiques inequitable practices and offers the promise not only of
greater individual achievement but of a better life lived in common with others” (Shields, 2010, p. 559). The theory’s historical foundations lie in Burns’ (1978) conception of leadership as a transformative force, and it takes seriously Freire’s (1998) belief “that education is not the ultimate lever for social transformation, but without it transformation cannot occur” (p. 37).
Transformative leaders are people-oriented; they build relationships, develop goals, and identify strategies to accomplish them. Therefore, the essential work of the
transformative leader is to “create learning contexts or communities in which social, political,
and cultural capital is enhanced in such a way as to provide equity of opportunity for students as
they take their place as contributing members of society” (Shields, 2010, p. 572).
The second aspect of leadership deals with organizational structures and processes. The recommendations I make in this area draw from campus ecology/environment theory (Renn & Patton, 2011; Strange, 2009) and are situated within a
student-centred model for SAS divisions. My first recommendation for the SASSO is that the division’s connections and contributions to the university mission must be clearly communicated not only upward to senior administrators but also downward to practitioners within the division. SAS practitioners who can connect their day-to-day work to the overall
mission of the university are better situated to understand the purpose and importance of what
they are doing.
A second recommendation for the SASSO concerns the physical organization of the SAS
division. SAS divisions are horizontal in nature because they address the needs of all students in
all schools. The identification of desired learning outcomes within SAS divisions, then, creates
“a new horizontal force – accountability for producing a group of outcomes for all students”
(Keeling, Underhile, & Wall, 2007, p. 25). Furthermore, this horizontal force “challenges
student affairs leadership to adopt a curricular approach to the assessment, conceptualization,
planning, implementation, and evaluation of programmatic and student learning outcomes” (p.
25). According to a study of 278 SAS practitioners and 14 SASSOs by Seifert et al. (2011):
A student-focused approach to the organizational structure was often characterized by
departments that serve similar or complimentary functions or address common student
issues and concerns being grouped together in a unit such that students were more likely
to have a seamless experience. To the extent that it was practical, a student-focused
approach to the structure physically located these departments in close proximity. (p. 24)
This physical placement of SAS offices and departments is popularly called a “one-stop shop” at universities in Canada and the US, and it is an organizational strategy that I strongly recommend. I realize the above recommendations may be outside the purview of the SASSO due to financial restrictions, space constraints, or institutional limitations. Nonetheless, research has illustrated that these designs increase student success (Keeling, Underhile, & Wall, 2007; Seifert et al., 2011), and I therefore believe they should be implemented where possible.
Conclusion
Beginning in the early 1990s, a “new accountability” movement in higher education arose out of increased pressure for a broader conceptualization of accountability and greater transparency about university performance. As a result, the meaning of accountability was reconceptualized around market mechanisms focused on productivity, efficiency, and competitiveness. The framework of choice for governments in a “new accountability” era
became performance-based accountability policies. Within Ontario, Key Performance
Indicators, Multi-Year Accountability Agreements, and the Differentiation Policy Framework
represented differing forms and degrees of performance-based accountability policies. The
fundamental principles underlying these policies reflect mechanistic characteristics and approaches to higher education. In contrast to this marketized, mechanized view of higher education, SAS divisions fundamentally operate on a transformative, holistic level under the assumption that a university’s purpose is to develop successful democratic citizens through the combination of learning and development, in addition to producing skilled graduates who are prepared to enter the workforce. This has caused conflict and tension between SAS divisions and the application of performance-based accountability policies at the institutions of which they are a part. Through performance-based accountability policies, the SAS practitioner works
“toward the performance indicator”. But how does the mechanistic, quantifiable performance
indicator measure the intangibles of education—critical thinking, creativity, maturation, and
emotional growth?
The age of accountability is here. Fortunately, the student affairs profession has been
“nimble and has adapted to institutional missions and the needs of students” in the past (Dungy
& Gordon, 2011, p. 75). SAS practitioners must now adapt once more to the demands of the
“new accountability” movement. Fried (2012) metaphorically describes the educational mission
of the SAS profession as “border crossing” (p. 108). The idea is that SAS practitioners help
students cross borders between living (their development) and learning (their education). In
doing so, SAS practitioners cross borders themselves, moving from service providers to educators alongside faculty. Performance-based accountability policies represent another border that must be crossed to reduce the conflict and tension. Crossing it will allow SAS practitioners not only to continue providing holistic, transformative learning opportunities to students, but also to experience holistic, transformative learning themselves.
References
Accountability. (2015). In Merriam-Webster’s online dictionary. Retrieved from
http://www.merriam-webster.com/inter?dest=/dictionary/accountability
Alexander, F. K. (2000). The changing face of accountability: Monitoring and assessing institutional performance in higher education. The Journal of Higher Education, 71(4), 411-431.
American Association of Higher Education (AAHE). (1998). Powerful partnerships: A shared
responsibility for learning. Washington, DC: Authors.
American College Personnel Association (ACPA). (1994). The student learning imperative:
Implications for student affairs. Retrieved from
http://www.myacpa.org/sites/default/files/ACPA%27s%20Student%20Learning%20Impe
rative.pdf
Argyris, C., & Schon, D. A. (1996). Organizational learning II: Theory, methods, and practice.
Reading, MA: Addison-Wesley.
Bolman, L. G., & Deal, T. E. (1997). Reframing organizations: Artistry, choice, and leadership.
San Francisco, CA: Jossey-Bass.
Bruneau, W. A., & Savage, D. C. (2002). Counting out the scholars: How performance
indicators undermine universities and colleges. Toronto, ON: James Lorimer & Co. Ltd.
Retrieved from
https://www.academia.edu/2562310/Counting_out_the_scholars_How_performance_indi
cators_undermine_universities_and_colleges
Burke, J. C., & Minassians, H. (2003). Real accountability or accountability ‘lite’: Seventh annual survey, 2003. Albany, NY: Rockefeller Institute of Government. Retrieved from https://www.soe.vt.edu/highered/files/Perspectives_PolicyNews/07-03/7thSurvey.pdf
Burns, J.M. (1978). Leadership. New York, NY: Harper & Row.
Canadian Association of College and University Student Services (CACUSS). (1989). The
mission of student services. Retrieved from
http://www.cacuss.ca/publications_mission_student.htm
Chiose, S. (2015, March 12). New funding formula for Ontario universities to include input
from employers. The Globe and Mail. Retrieved from
http://www.theglobeandmail.com/news/national/new-funding-formula-for-ontario-
universities-to-include-input-from-
employers/article23442771/?ts=150313092306&ord=1
Clark, I. D., Moran, G., Skolnik, M. L., & Trick, D. (2009). Academic transformation: The
forces reshaping higher education in Ontario. Montreal and Kingston: Queen’s Policy
Studies Series, McGill-Queen’s University Press.
Council of Ontario Universities (COU). (2013). Performance-based funding. Retrieved from
http://www.cou.on.ca/publications/reports/pdfs/cou-background-paper---performance-
based-funding--
Dougherty, K. J., & Reddy, V. (2011). The impacts of state performance funding systems on
higher education institutions: Research literature review and policy recommendations.
CCRC Working Paper No. 37. Community College Research Center, Columbia
University. Retrieved from http://ccrc.tc.columbia.edu/media/k2/attachments/impacts-
state-funding-higher-education.pdf
Dungy, G., & Gordon, S. A. (2011). The development of student affairs. In J. H. Schuh, S. R. Jones, & S. R. Harper (Eds.), Student Services: A handbook for the profession (pp. 61-79).
San Francisco, CA: Jossey-Bass.
Educational Policy Institute. (2008). Producing indicators of institutional quality in Ontario
universities and colleges: Options for producing, managing and displaying comparative
data. Toronto, ON: Higher Education Quality Council of Ontario. Retrieved from
http://higheredstrategy.com/wp-content/uploads/2011/07/Quality_Indicators.pdf
Eaton, J. (2013). Rankings, new accountability tools, and quality assurance. In P.T.M.
Marope, P.J. Wells & E. Hazelkorn (Eds.), Rankings and accountability in higher
education: Uses and misuses (pp. 113-127). Paris, France: United Nations Educational,
Scientific, and Cultural Organization (UNESCO). Retrieved from
http://unesdoc.unesco.org/images/0022/002207/220789e.pdf
Evans, N. J., & Reason, R. D. (2001). Guiding principles: A review and analysis of student
affairs philosophical statements. Journal of College Student Development, 42(4), 359-
377.
Freire, P. (1998). Pedagogy of freedom: Ethics, democracy, and civic courage. Lanham,
MD: Rowman & Littlefield.
Fried, J. (2012). Transformative learning through engagement: Student affairs practice
as experiential pedagogy. Sterling, VA: Stylus Publishing.
Hardy Cox, D., & Strange, C. C. (2010). Achieving student success: Effective student
services in Canadian higher education. Montreal, QC: McGill-Queen's University Press.
Higher Education Quality Council of Ontario (HEQCO). (2013). Quality: Shifting the focus. A
report from the expert panel to assess the strategic mandate agreement submissions.
Toronto, ON: Higher Education Quality Council of Ontario.
Higher Education Quality Council of Ontario (HEQCO). (2012). The productivity of the Ontario
public postsecondary system preliminary report. Toronto, ON: Higher Education Quality
Council of Ontario.
Higher Education Quality Council of Ontario (HEQCO). (2009). Second annual review and
research plan. Toronto, ON: Higher Education Quality Council of Ontario. Retrieved
from http://www.heqco.ca/SiteCollectionDocuments/Second%20Annual%20Review%
20and%20Research%20Plan.pdf
Hillman, N. W., Tandberg, D. A., & Gross, J. P. K. (2014). Performance funding in higher education: Do financial incentives impact college completions? The Journal of Higher Education, 85(6), 826-857.
Jones, S.R., & Abes, E.S. (2011). The nature and uses of theory. In J. H. Schuh, S. R. Jones, &
S. R. Harper (Eds.), Student Services: A handbook for the profession (pp. 149-167). San
Francisco, CA: Jossey-Bass.
Keeling, R. P., Wall, A. F., Underhile, R., & Dungy, G. J. (2008). Assessment reconsidered:
Institutional effectiveness for student success. USA: International Center for Student
Success & Institutional Accountability.
Keeling, R. P., Underhile, R., & Wall, A. F. (2007). Horizontal and vertical structures: The
dynamics in higher education. Liberal Education, 93(4), 22-31.
Keeling, R. P. (Ed.). (2004). Learning reconsidered. Washington, DC: American College Personnel Association and National Association of Student Personnel Administrators.
Retrieved from
http://www.naspa.org/images/uploads/main/Learning_Reconsidered_Report.pdf
Kezar, A. (2011). Organization theory. In J.H. Schuh, S.R. Jones, &
S.R. Harper (Eds.), Student Services: A handbook for the profession (pp. 226-241). San
Francisco, CA: Jossey-Bass.
Kezar, A. (2004). What is more important to effective governance: Relationships, trust, and leadership, or structures and formal processes? New Directions for Higher Education, (127), 35-46. Retrieved from http://onlinelibrary.wiley.com/doi/10.1002/he.154/abstract
King, P. M., & Baxter Magolda, M. B. (2011). Student learning. In J. H. Schuh, S. R. Jones, &
S. R. Harper (Eds.), Student Services: A handbook for the profession (pp. 207-225). San
Francisco, CA: Jossey-Bass.
Komives, S. R. (2011). Leadership. In J. H. Schuh, S. R. Jones, & S. R. Harper (Eds.), Student
Services: A handbook for the profession (pp. 353-371). San Francisco, CA: Jossey-Bass.
Lang, D. W. (2013). Performance funding: Past, present and future. CUPA/MTCU/HEQCO
day. January 9, 2013. Retrieved from
http://www.oise.utoronto.ca/lhae/Programs/Higher_Education/Past_Seminar_Presentatio
ns.html
McLendon, M. K., Hearn, J. C., & Deaton, R. (2006). Called to account: Analyzing the origins
and spread of state performance-accountability policies for higher education. Educational
Evaluation and Policy Analysis, 28(1), 1-24.
Mezirow, J., & Associates. (2000). Learning as transformation: Critical perspectives on
a theory in progress. San Francisco, CA: Jossey-Bass.
Ministry of Training, Colleges, and Universities (MTCU). (2013, November). Ontario’s
differentiation policy framework for postsecondary education. Retrieved from
http://www.tcu.gov.on.ca/pepg/publications/PolicyFramework_PostSec.pdf
Morgan, G. (1997). Images of organization. Thousand Oaks, CA: Sage.
Natow, R. S., Pheatt, L., Dougherty, K. J., Jones, S. M., Lahr, H., & Reddy, V. (2014).
Institutional changes to organizational policies, practices, and programs following the
adoption of state-level performance funding policies. CCRC Working Paper Number 76.
Community College Research Center. Columbia University, NY. Retrieved from
http://ccrc.tc.columbia.edu/media/k2/attachments/institutional-changes-following-
performance-funding-policies.pdf
Ontario. Ministry of Finance. (2005). 2005 Ontario budget: Investing in people, strengthening
our economy. Toronto: Author. Retrieved from
http://www.fin.gov.on.ca/en/budget/ontariobudgets/2005/pdf/bke1.pdf
Ontario Confederation of University Faculty Associations (OCUFA). (2006, May). The measured academic: Quality controls in Ontario universities. OCUFA Research Paper.
Retrieved from http://ocufa.on.ca/assets/Measured_Academic_May_2006.pdf
Ontario Undergraduate Student Alliance (OUSA). (2014). Policy paper: Accountability. Retrieved
from http://www.ousa.ca/wordpress/wp-content/uploads/2014/11/Accountability-Fall-
2014.pdf
Reason, R.D., & Broido, E.M. (2011). Philosophies and values. In J.H. Schuh, S.R. Jones, &
S.R. Harper (Eds.), Student Services: A handbook for the profession (pp. 80-95). San
Francisco, CA: Jossey-Bass.
Renn, K. A., & Patton, L. D. (2011). Campus ecology and environments. In J.H. Schuh, S.R.
Jones, & S.R. Harper (Eds.), Student Services: A handbook for the profession (pp. 242-
256). San Francisco, CA: Jossey-Bass.
Scott, P. (2013). Ranking higher education institutions: A critical perspective. In P.T.M.
Marope, P.J. Wells & E. Hazelkorn (Eds.), Rankings and accountability in higher
education: Uses and misuses (pp. 113-127). Paris, France: United Nations Educational,
Scientific, and Cultural Organization (UNESCO). Retrieved from
http://unesdoc.unesco.org/images/0022/002207/220789e.pdf
Seifert, T. A. (2015, January 26). Thoughts from space; wisdom on Earth [Web log post].
Retrieved from https://supportingstudentsuccess.wordpress.com/2015/01/26/thoughts-
from-space-wisdom-on-earth/
Seifert, T.A., & Burrow, J. (2013). Perceptions of student affairs and services practitioners in
Ontario’s post-secondary institutions: An examination of colleges and universities.
Canadian Journal of Higher Education, 43(2), 132-148.
Seifert, T. A., Arnold, C., Burrow, J., & Brown, A. (2011). Supporting student success:
The role of student services within Ontario’s postsecondary institutions. Toronto, ON:
Higher Education Quality Council of Ontario. Retrieved from
http://www.heqco.ca/SiteCollectionDocuments/Supporting%20Student%20SuccessENG.
pdf
Shanahan, T. (2009). Accountability initiatives in higher education: An overview of the impetus
to accountability, its expressions and implications. In Accounting or accountability in
higher education: Proceedings from the January 8, 2009 OCUFA conference (pp. 3–15).
Retrieved from http://edu.apps01.yorku.ca/news/wp-content/uploads/shanahan-ocufa-
articlesmarch10094.pdf
Shields, C. M. (2010). Transformative leadership: Working for equity in diverse contexts.
Educational Administration Quarterly, 46(4), 558-589. doi:10.1177/0013161X10375609
Shields, C. M. (2011). Transformative leadership: A reader. New York, NY: Peter Lang.
Strange, C. C. (2009). Theoretical foundation of student services. In D. Hardy Cox & C. C.
Strange (Eds.), Achieving Student Success: Effective Student Services in Canadian
Higher Education (pp. 18-32). Montreal, QC: McGill-Queen’s University Press.
Sullivan, B. (2009). Organizing, leading, and managing student services. In D. Hardy Cox & C.
C. Strange (Eds.), Achieving Student Success: Effective Student Services in Canadian
Higher Education (pp. 165-191). Montreal, QC: McGill-Queen’s University Press.
Tam, M. (2001). Measuring quality and performance in higher education. Quality in Higher Education, 7(1), 47-54. doi:10.1080/13538320120045076
Taylor, F. W. (2007/1912). Scientific management. In D. S. Pugh (Ed.), Organization theory: Selected classic readings (5th ed., pp. 275-295). London, UK: Penguin. (Original work published 1912).
Trow, M. (1973). Problems in the transition from elite to mass higher education. Carnegie
Commission on Higher Education Reprint.
Tull, A., & Freeman, J. P. (2011). Reframing student affairs leadership: An analysis of
organizational frames of reference and locus of control. Research in the Schools, 18(1),
33-43.
United Nations Educational, Scientific, and Cultural Organization (UNESCO). (2002). The role
of student affairs and services in higher education: A practical manual for developing,
implementing and assessing student affairs programmes and services. Paris, France:
UNESCO. Retrieved from http://unesdoc.unesco.org/images/0012/001281/128118e.pdf
Weingarten, H. P., Hicks, M., Jonker, L., & Liu, S. (2013). The diversity of Ontario’s
universities: A data set to inform the differentiation discussion. Toronto, ON: Higher
Education Quality Council of Ontario.
Ziskin, M. B., Hossler, D., Rabourn, K., Cekic, O., & Hwang, Y. (2014). Outcomes-based funding: Current status, promising practices and emerging trends. Toronto, ON: Higher Education Quality Council of Ontario.
More Related Content

Viewers also liked

Доклад замгубернатора Олега Зарубы
Доклад замгубернатора Олега ЗарубыДоклад замгубернатора Олега Зарубы
Доклад замгубернатора Олега ЗарубыPavel Zaharov
 
Цільова міні-програма "Світ роди, яка читає"
Цільова міні-програма "Світ роди, яка читає"Цільова міні-програма "Світ роди, яка читає"
Цільова міні-програма "Світ роди, яка читає"Ротанова Юлия
 
Next generation monitoring met environmental DNA - RAVON
Next generation monitoring met environmental DNA - RAVONNext generation monitoring met environmental DNA - RAVON
Next generation monitoring met environmental DNA - RAVONSovon Vogelonderzoek
 
organic farming prospects and constraints
organic farming prospects and constraintsorganic farming prospects and constraints
organic farming prospects and constraintsagriculturalchemistry
 
Pt lippo cikarang keseluruhan
Pt lippo cikarang keseluruhanPt lippo cikarang keseluruhan
Pt lippo cikarang keseluruhansulistyo wibowo
 

Viewers also liked (6)

Доклад замгубернатора Олега Зарубы
Доклад замгубернатора Олега ЗарубыДоклад замгубернатора Олега Зарубы
Доклад замгубернатора Олега Зарубы
 
Marketing plan
Marketing planMarketing plan
Marketing plan
 
Цільова міні-програма "Світ роди, яка читає"
Цільова міні-програма "Світ роди, яка читає"Цільова міні-програма "Світ роди, яка читає"
Цільова міні-програма "Світ роди, яка читає"
 
Next generation monitoring met environmental DNA - RAVON
Next generation monitoring met environmental DNA - RAVONNext generation monitoring met environmental DNA - RAVON
Next generation monitoring met environmental DNA - RAVON
 
organic farming prospects and constraints
organic farming prospects and constraintsorganic farming prospects and constraints
organic farming prospects and constraints
 
Pt lippo cikarang keseluruhan
Pt lippo cikarang keseluruhanPt lippo cikarang keseluruhan
Pt lippo cikarang keseluruhan
 

Similar to Accountability_Alexander_2015

An Organizational Development Framework For Assessing Readiness And Capacity ...
An Organizational Development Framework For Assessing Readiness And Capacity ...An Organizational Development Framework For Assessing Readiness And Capacity ...
An Organizational Development Framework For Assessing Readiness And Capacity ...Angie Miller
 
The Role Of External Factors That Affect Student...
The Role Of External Factors That Affect Student...The Role Of External Factors That Affect Student...
The Role Of External Factors That Affect Student...Nicole Gomez
 
Opening the Black Box: Government Teacher Workforce Policy in New York City
Opening the Black Box: Government Teacher Workforce Policy in New York CityOpening the Black Box: Government Teacher Workforce Policy in New York City
Opening the Black Box: Government Teacher Workforce Policy in New York CityKatharine Stevens
 
Response Essays. 012 Essay Example Writing Ged Practice Test Extended Respons...
Response Essays. 012 Essay Example Writing Ged Practice Test Extended Respons...Response Essays. 012 Essay Example Writing Ged Practice Test Extended Respons...
Response Essays. 012 Essay Example Writing Ged Practice Test Extended Respons...Julie Roest
 
Pre-Program Evaluation Essay
Pre-Program Evaluation EssayPre-Program Evaluation Essay
Pre-Program Evaluation EssayKatie Parker
 
Application Of Property Theories Of The Beacon Hill
Application Of Property Theories Of The Beacon HillApplication Of Property Theories Of The Beacon Hill
Application Of Property Theories Of The Beacon HillTheresa Singh
 
Active Experimentation And Its Effects On Reality And The...
Active Experimentation And Its Effects On Reality And The...Active Experimentation And Its Effects On Reality And The...
Active Experimentation And Its Effects On Reality And The...Adriana Wilson
 
The Inspiring Education Document By Dr. Richard Moniuszko
The Inspiring Education Document By Dr. Richard MoniuszkoThe Inspiring Education Document By Dr. Richard Moniuszko
The Inspiring Education Document By Dr. Richard MoniuszkoAlison Reed
 
A model for engaging youth in evidence informed policy and program development
A model for engaging youth in evidence   informed policy and program developmentA model for engaging youth in evidence   informed policy and program development
A model for engaging youth in evidence informed policy and program developmentDr Lendy Spires
 
A model for engaging youth in evidence informed policy and program development
A model for engaging youth in evidence   informed policy and program developmentA model for engaging youth in evidence   informed policy and program development
A model for engaging youth in evidence informed policy and program developmentDr Lendy Spires
 
A model for engaging youth in evidence informed policy and program development
A model for engaging youth in evidence   informed policy and program developmentA model for engaging youth in evidence   informed policy and program development
A model for engaging youth in evidence informed policy and program developmentDr Lendy Spires
 
Organizational Effectiveness of Naval State University: Proposed Institutiona...
Organizational Effectiveness of Naval State University: Proposed Institutiona...Organizational Effectiveness of Naval State University: Proposed Institutiona...
Organizational Effectiveness of Naval State University: Proposed Institutiona...Dr. Amarjeet Singh
 
Assessment in social work
Assessment in social workAssessment in social work
Assessment in social workBimal Antony
 
Assessing Service-Learning And Civic Engagement Principles And Techniques
Assessing Service-Learning And Civic Engagement  Principles And TechniquesAssessing Service-Learning And Civic Engagement  Principles And Techniques
Assessing Service-Learning And Civic Engagement Principles And TechniquesPedro Craggett
 

Similar to Accountability_Alexander_2015 (19)

An Organizational Development Framework For Assessing Readiness And Capacity ...
An Organizational Development Framework For Assessing Readiness And Capacity ...An Organizational Development Framework For Assessing Readiness And Capacity ...
An Organizational Development Framework For Assessing Readiness And Capacity ...
 
Organizational change and development
Organizational change and developmentOrganizational change and development
Organizational change and development
 
The Role Of External Factors That Affect Student...
The Role Of External Factors That Affect Student...The Role Of External Factors That Affect Student...
The Role Of External Factors That Affect Student...
 
Opening the Black Box: Government Teacher Workforce Policy in New York City
Opening the Black Box: Government Teacher Workforce Policy in New York CityOpening the Black Box: Government Teacher Workforce Policy in New York City
Opening the Black Box: Government Teacher Workforce Policy in New York City
 
Response Essays. 012 Essay Example Writing Ged Practice Test Extended Respons...
Response Essays. 012 Essay Example Writing Ged Practice Test Extended Respons...Response Essays. 012 Essay Example Writing Ged Practice Test Extended Respons...
Response Essays. 012 Essay Example Writing Ged Practice Test Extended Respons...
 
Pre-Program Evaluation Essay
Pre-Program Evaluation EssayPre-Program Evaluation Essay
Pre-Program Evaluation Essay
 
Application Of Property Theories Of The Beacon Hill
Application Of Property Theories Of The Beacon HillApplication Of Property Theories Of The Beacon Hill
Application Of Property Theories Of The Beacon Hill
 
Active Experimentation And Its Effects On Reality And The...
Active Experimentation And Its Effects On Reality And The...Active Experimentation And Its Effects On Reality And The...
Active Experimentation And Its Effects On Reality And The...
 
The Inspiring Education Document By Dr. Richard Moniuszko
The Inspiring Education Document By Dr. Richard MoniuszkoThe Inspiring Education Document By Dr. Richard Moniuszko
The Inspiring Education Document By Dr. Richard Moniuszko
 
One On One
One On OneOne On One
One On One
 
A model for engaging youth in evidence informed policy and program development
A model for engaging youth in evidence   informed policy and program developmentA model for engaging youth in evidence   informed policy and program development
A model for engaging youth in evidence informed policy and program development
 
A model for engaging youth in evidence informed policy and program development
A model for engaging youth in evidence   informed policy and program developmentA model for engaging youth in evidence   informed policy and program development
A model for engaging youth in evidence informed policy and program development
 
A model for engaging youth in evidence informed policy and program development
A model for engaging youth in evidence   informed policy and program developmentA model for engaging youth in evidence   informed policy and program development
A model for engaging youth in evidence informed policy and program development
 
Dennis Pruitt, CBMI 2017 - Enrollment Management
Dennis Pruitt, CBMI 2017 - Enrollment ManagementDennis Pruitt, CBMI 2017 - Enrollment Management
Dennis Pruitt, CBMI 2017 - Enrollment Management
 
Ll credit guide
Ll credit guideLl credit guide
Ll credit guide
 
Ece conf2011 ireid
Ece conf2011 ireidEce conf2011 ireid
Ece conf2011 ireid
 
Organizational Effectiveness of Naval State University: Proposed Institutiona...
Organizational Effectiveness of Naval State University: Proposed Institutiona...Organizational Effectiveness of Naval State University: Proposed Institutiona...
Organizational Effectiveness of Naval State University: Proposed Institutiona...
 
Assessment in social work
Assessment in social workAssessment in social work
Assessment in social work
 
Assessing Service-Learning And Civic Engagement Principles And Techniques
Assessing Service-Learning And Civic Engagement  Principles And TechniquesAssessing Service-Learning And Civic Engagement  Principles And Techniques
Assessing Service-Learning And Civic Engagement Principles And Techniques
 

Accountability_Alexander_2015

  • 1. Running head: PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES Conflict and Tension: The Impact of Performance-Based Accountability Policies on Student Affairs and Services Phil Alexander Department of Graduate and Undergraduate Studies in Education Submitted in partial fulfillment of the requirements for course EDUC 5Q97 Faculty of Education, Brock University St. Catharines, Ontario © Phil Alexander, 2015
  • 2. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES ii Abstract The age of accountability has arrived. The higher education system in Ontario is in the midst of a “new accountability” movement as evidenced through the implementation of multiple performance-based accountability policies. What does this mean for the practice of Student Affairs & Services (SAS) divisions? Through a historical and theoretical review of both performance-based accountability policies and Student Affairs & Services, I argue that the underlying foundational principles of both are in conflict with each other and therefore create tension in practice. Furthermore, as a Student Affairs and Services practitioner myself, I provide recommendations on how we can organize ourselves to positively influence performance indicators without sacrificing the holistic, transformative ideals we have come to embrace.
  • 3. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES iii Acknowledgements I would like to acknowledge the professors whom I have had the pleasure of learning from throughout my journey; their experience, knowledge, and direction have been invaluable. In alpha order: Dr. Denise Armstrong, Dr. Jill Grose, Dr. Catherine Hands, Dr. Xiaobin Li, Dr. Coral Mitchell, Dr. Dolana Mogadime, Dr. Lissa Paul, Dr. Jennifer Rowsell, and Dr. Nicola Simmons. Finally, to my family, S, E, and O; thank you for three and a half years of support. This is for you!
  • 4. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES iv Table of Contents Abstract……………………………………………………………………………………………ii Acknowledgements………………………………………………………………………………iii Introduction………………………………………………………………………………………..1 Accountability and the Ontario Context……………………………………………………….….3 Defining Accountability…………………………………………………………………….….3 The “New Accountability” Movement………………………………………………………...4 Types of Performance-Based Accountability Policies……………………………………6 Performance-Based Accountability in Ontario………………………………………………...7 Key Performance Indicators………………………………………………………………8 Multi-Year Accountability Agreements………………………………………….……….8 Differentiation Policy Framework………………………………………………………...9 Maclean’s Rankings……………………………………………………………………...11 National Survey of Student Engagement………………………………………………...11 Accountability and Student Affairs & Services………………………………………………….12 Defining Student Affairs and Services………………………………………………………..13 The Student Development Movement...…………………………………………….…….......13 The Student Learning Movement……………………………………………………………..15 Conflict & Tension……………………………………………………………………………17 Accountability and Organization of Student Affairs & Services……………………….……..…23 Inclusion of Students……………………………………………………………………….…23 Assessment……………………………………………………………………………………25 Leadership…………………………………………………………………………………….26 Conclusion……………………………………………………………………………………….29 References......................................................................................................................................31
  • 5. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES 1 Beneath and behind the growing public interest in measures and devices of all kinds, there was a dream. … It was the dream of education and society as machines, efficient devices for the attainment of high social objectives on one hand, and inculcation of measurable knowledge and marketable skills on the other. It was the idea that a machine- like solution could be found to the ancient problem of assigning the young to their proper places and professions. It was the hope that a truly scarce good—advanced education— could mechanically and fairly be distributed, through exact testing and accountancy, to every deserving person. (Bruneau & Savage, 2002, p. 26) Over the past fifty years access to Ontario’s higher education system has transitioned from an elite system to a near universal system (Clark, Moran, Skolnik, & Trick, 2009). With greater access came an increased interest from stakeholders in the performance of institutions (Clark et al., 2009) leading to what has been termed the “new accountability” movement in higher education (Hillman, Tandberg, & Gross, 2014). Within the “new accountability” movement governments place greater emphasis on the performance of institutions through accountability policies that include performance indicators such as graduation rate, retention rate, course completion, credits earned, etc. How does the “new accountability” movement impact Student Affairs and Services (SAS) divisions operating within universities? Much has been written regarding the mechanistic (Taylor, 2007/1912) principles underlying performance indicators and performance-based accountability policies in higher education (Alexander, 2000; Bruneau & Savage, 2002; Shannon, 2009). This begs the question, what is the purpose of universities? Are they primarily breeding grounds for job-ready graduates? Are universities accountable to the demands of local, national, or international market economies? Or are universities a place for personal growth and maturation through transformative learning experiences with societal ideals such as democracy and responsible citizenship at its core? Are they accountable to the students they serve? I have worked for ten years at two universities in Ontario in several jobs situated under the SAS umbrella. I believe the
  • 6. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES 2 role of university is to foster learning through holistic transformation. I believe in the university as a place for personal growth and maturation where students learn to be contributing members of a democratic society. I approach my work as a Student Affairs and Services practitioner this way. However, with the growing use and influence of performance-based accountability policies in Ontario I believe SAS divisions and the practitioners within them have been forced to “work to the performance indicator”. Indeed, according to Reason and Broido (2011), the “new accountability” movement “change[s] what we do and how we see ourselves professionally. Student affairs professionals now focus on learning outcomes and creating curricula to guide the achievement of those outcomes” (p. 92). I believe this brings the mechanistic principles of performance-based accountability policies in direct conflict with the holistic, transformative approach of SAS. Therefore, through a historical and theoretical review of performance-based accountability policies and Student Affairs & Services I argue that the underlying principles of both are in conflict with each other and therefore create tension in practice. I am also a realist and understand that the age of accountability has not only arrived but is here to stay. So in addition to arguing that the tension and conflict exist, I also provide recommendations on how SAS divisions can organize themselves to positively influence performance indicators without sacrificing the holistic, transformative ideals we have come to embrace. I present my argument in three parts. In part one I examine what we mean by accountability, describe the “new accountability” movement in higher education, and situate the Ontario context within it. In part two I provide an overview of the theoretical foundations for and evolution of SAS practice followed by an examination of several dynamics of why tension and conflict exist between performance-based accountability policies and SAS. In part three, I provide recommendations on organizational characteristics that I believe will position SAS
  • 7. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES 3 divisions to influence performance indicators while still allowing for a holistic, transformative approach. ACCOUNTABILITY AND THE ONTARIO CONTEXT When using the term accountability in higher education there are a multitude of possible meanings. Burke (2005) states “accountability is the most advocated and least analyzed word in higher education” (p. 1). The journey through which the meaning of accountability has traversed is examined below. Defining Accountability On a basic level accountability is defined as “an obligation or willingness to accept responsibility or to account for one’s actions” (“Accountability”, 2015). Conceptually, accountability can be framed through several questions, “who is accountable to whom, for what purposes, for whose benefit, by which means, and with what consequences?” (Burke, 2005, p. 2). Within higher education the answers to these questions changed in the late 1980s to early 1990s. Prior to the early 1990s the Ontario higher education system was considered elitist, meaning the number of people who attended postsecondary institutions was below 15% of the population (Trow, 1973). Institutional accountability and quality were defined primarily in terms of the resources available by “referring to such indicators as the faculty-student ratio, operating expenditures per student, the value of library acquisitions, and the amount of capital expenditures” (Clark et al., 2009, p. 114). However, the convergence of several factors in the early 1990s transformed how accountability was conceptualized (Hillman et al., 2014; McLendon, Hearn, & Deaton, 2006; Rutherford & Rabovsky, 2014). Beginning with an ideological shift towards neo-liberalist social policy, the Ontario government placed greater emphasis on the belief “in the virtue and infallibility of global markets” (Shanahan, 2009, p. 4).
  • 8. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES 4 With the rising popularity of the idea that investment in higher education produced a competitive advantage in the national and international economy, the government of Ontario increased access to postsecondary institutions resulting in massification and expansion of the system (Clark et al., 2009). Of course, with a larger post-secondary system came increased costs to fund it. Pressures arose, as a consequence of increased cost, for both a “broader conceptualization of accountability and for the provision of information on university performance to the government and external stakeholders” (Clark et al., 2009, p. 115). The “New Accountability” Movement As economic priorities took centre stage at universities the meaning of accountability increasingly became defined in market terms. Shanahan (2009) illustrates examples of market terms used: Business and private sector criteria are employed to make education decisions. Job training and meeting labour market needs have become key education priorities. Economic principles of productivity, efficiency and competitiveness have become imperatives. And we have seen our accountability frameworks become infused with market discourse, market principles and market mechanisms. (pp. 4-5) As a consequence of the reconceptualization, governments and policy-makers turned to performance-based accountability policies “as the model of choice for resource allocation to public colleges and universities” (Alexander, 2000, p. 419). The national and international rise in popularity of performance-based accountability policies has been termed the “new accountability” movement in higher education (Hillman et al., 2014). These new approaches to accountability, according to Bruneau and Savage (2002), were billed as a “new wave that would render old-fashioned statistics obsolete…. A university might boast fine buildings, a celebrated
  • 9. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES 5 staff, and an excellent library, but these did not guarantee students would actually learn” (p. 2). They could not guarantee students were actually learning because they were not measuring if students had learned. The lack of comparable indicators to measure learning influenced the need to reform how quality was determined. Rutherford and Rabovsky (2014) summarize the basic logic of performance-based accountability policies below: Rather than allocating resources primarily on the basis of inputs such as enrollments, these reformers seek to shift the funding mechanisms to student outcomes such as graduation rates and degree production. They argue that, under traditional budget arrangements, universities have little incentive to care much about student outcomes, and have thus tended to focus on other priorities including graduate education, research productivity, and capital investments in new buildings…. By reformulating the incentives that universities face, so that institutions are rewarded or punished primarily based on actual performance (outcomes) rather than simple input measures, performance funding seeks to stimulate shifts in institutional behaviour that will result in greater efficiency and productivity. (p. 187) The theoretical principles that performance-based accountability policies embody draw from “theories of action” (Argyris & Schön, 1996). The first and most prominent theory of action reflects a resource-dependency perspective wherein “public funding is manipulated to stimulate market profit incentives that, according to the theory, motivate institutions to improve performance” (Ziskin, Hossler, Rabourn, Cekic, & Hwang, 2014, p. 14). In addition to resource- dependency theory, there are three other theories of action that help clarify the behaviour of institutions:
  • 10. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES 6 The second theory of action described supposes that policies persuade institutions to agree with public policy makers on the importance of improved student outcomes. It follows then, in the theory, that these institutions change their behaviors to improve student outcomes. The third theory of action involves raising institutions’ awareness of their performance, leading naturally to comparisons across institutions that stimulate institutions’ pride and status-striving and that motivate changes in institutional behaviors – resulting, theoretically, in improved outcomes. The fourth theory of action…entails providing institutions with resources to support greater capacity on key performance indicators and improved practice as learning organizations. (Ziskin et al., 2014, p. 14) It is important to note the theoretical principles of performance-based accountability policies because they help to understand why institutions react the way they do. They also aid in determining how and why the conflict and tension causes shifts in behavior of SAS divisions. Types of Performance-Based Accountability Policies The “new accountability” movement has generated three types of performance-based accountability policies (Burke & Minassians, 2002; COU, 2013; McLendon et al., 2006). The first type, performance funding, is an approach that links government funding directly to the performance of institutions on individual indicators. Under these types of policies, “the relationship between performance and funding is predetermined and prescribed: if an institution meets a specified performance target, it receives a designated amount or percentage of funding” (McLendon et al., 2006, p. 2). The best example of this type of policy in practice is in the state of Tennessee. Tennessee is the only state to have 100% of its institutions’ public funding based on performance indicators allocated through the state funding formula (Dougherty & Reddy, 2011; Natow, Pheatt, Dougherty, Jones, Lahr, & Reddy, 2014; Ziskin et al., 2014).
  • 11. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES 7 Performance budgeting, in contrast, considers institutional performance as only one factor in determining funding allocations. In this model, “the possibility of additional funding due to good or improved performance depends solely on the judgment and discretion of state [provincial], coordinating, or system officials” (Burke & Minassians, 2003, p. 3). Performance budgeting policies are much more prevalent in practice due to their ‘safer’ characteristics over performance funding policies (Lang, 2013, p.25). Consequently, there are numerous examples in use including in twenty one US states (Burke & Minassians, 2003, p. 8), in Australia (Lang, 2013, p. 22), and in Ontario (Ziskin et al., 2014, p. 12). Lastly, performance reporting policies provide stakeholders with reports and/or indicators of institutional performance but with no formal links to funding. Therefore, this model “relies on information and publicity rather than funding or budgeting to encourage colleges and universities to improve their performance” (Burke & Minassians, 2003, p. 3). This policy option is the most popular in jurisdictions that implement performance-based accountability policies of some sort. Burke and Minassians (2003) report that forty six US states (92%) have incorporated some form of performance reporting (p. 12). Within Canada, all provinces except Nova Scotia, Manitoba, and Newfoundland have some form of performance reporting (Ziskin et al., 2014, p. 11). With an understanding of the three types of accountability-based policies currently in use – performance funding, budgeting, and reporting – let us review the Ontario higher education performance-based accountability policies. Performance-Based Accountability in Ontario Three distinct policies are related to accountability and institutional performance in the Ontario higher education system – Key Performance Indicators (KPI), Multi-Year Accountability Agreements (MYAA), and the Differentiation Policy Framework for Higher
  • 12. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES 8 Education. Furthermore, two non-policy instruments influence institutions’ behaviour through accountability-based mechanisms – the Maclean’s annual ranking of universities and the National Survey of Student Engagement (NSSE). Key Performance Indicators Beginning in 1995 and coinciding with the rise in popularity of “new accountability” policies, the Ontario government introduced its first performance-based accountability policy, key performance indicators (KPI). The first iteration of KPIs were a form of performance reporting because the basic idea was to provide institutional information to students so that they could make a more informed decision (Lang, 2013). The specific indicators used were graduation rate, employment rate six months and twenty-four months after graduation, and default rates on Ontario Student Assistance Program (OSAP) loans. However, soon after implementation of the KPIs as informational guides, the government decided to link KPIs to a portion of the annual operating funding of institutions (2%), thus evolving the KPIs from a performance reporting policy to a performance budgeting policy. The shift in usage for the KPIs signaled the beginning of the government’s venture into the performance-based accountability arena for the explicit purpose of aligning institutional behaviour to government priorities (Lang, 2013, p. 7). Multi-Year Accountability Agreements In 2005, as a part of the Reaching Higher plan for Ontario higher education (Ontario, 2005), the government launched the Multi-Year Accountability Agreement (MYAA) process. The policy had three basic features, “to outline the government’s commitment to stable funding, [to] articulate each institution’s commitment to accessibility, quality improvements and measurements of results, and [to] tie the commitment to results” (HEQCO, 2009, p. 1). The
  • 13. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES 9 MYAAs are a performance budgeting accountability policy because they represent only one factor in overall institutional funding. In these agreements, the individual institutions indicate strategies, programs, and performance targets in regard to goals set by the government. After goals are set, the institutions subsequently report back on the progress of performance in relation to those goals over a multi-year span. Therefore, “through the goal setting and review process, the government [was] able to exercise a degree of control over post-secondary institutions that did not exist before the MYAA process” (Clark et al., 2009, p. 128). By 2009-10 the Reaching Higher plan was scheduled to end and the MYAAs offered a basis for a revised accountability framework. However, the government decided to pursue a much larger systemic transformation in the form of the Differentiation Policy Framework. Differentiation Policy Framework In 2013 the government announced its most comprehensive policy yet aimed at system- wide transformation. The Differentiation Policy Framework has overall goals of increasing quality, access, productivity, sustainability, and accountability of the higher education sector (HEQCO, 2013; MTCU, 2013; Weingarten, Hicks, Jonker, & Liu, 2013). Although those aspects of the system interact and intersect, the specific interest here is the revamped accountability framework. Hillman, Tandberg, & Fryar (2014) have observed that “many of the more recent performance-based policies in higher education have sought to move away from a one-size-fits- all approach to include a range of performance indicators that value many different types of student success” (p. 6). In the performance-based policy literature, these types of policies are known as Performance Funding 2.0:
  • 14. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES 10 Performance funding 2.0 programs, while not sacrificing indicators of ultimate outcomes, also put considerable emphasis on indicators of intermediate achievement: for example, course completions; successful completion of developmental education courses or programs; passage of key gateway courses such as college mathematics or college English. (Dougherty & Reddy, 2011, p. 6) Including specific performance indicators that measure intermediate outcomes allows for a greater effect on measuring the difference between individual institutions, as opposed to general, system-wide indicators. Intermediate performance indicators are important for the Ontario context because of the wide range of institutions (size, type, location) operating within the province. The Ontario government understood this discrepancy while designing the Differentiation Policy Framework. The first step outlined in the policy was to formally negotiate Strategic Mandate Agreements (SMAs) with the institutions. Institutions outlined their strengths and areas they planned to focus on as well as indicated specific performance indicators by which to measure these strengths. Both sides then agreed on a complete, final set of performance indicators that measures institutions with system-wide metrics and institution specific metrics. Currently, the Ontario government has signed SMAs with all publically funded postsecondary institutions. The SMAs are in effect until 2017. The Differentiation Policy Framework is a type of hybrid accountability framework in that it is a performance budgeting policy because the performance indicators negotiated provide only one aspect of overall institutional funding (the base operating grant is still calculated per full-time enrolments). However, through the SMA exercise, funding formula reform is an ultimate goal of the government (COU, 2013; HEQCO, 2013; OCUFA, 2013; Ziskin et al., 2014). Performance-accountability policies that specifically link funding directly to the funding
  • 15. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES 11 formula and include performance indicators that attempt to measure intermediate outcomes (as opposed to just ultimate outcomes like graduation rate) have been termed “performance funding 2.0” (Dougherty & Reddy, 2011). Reforming the Ontario funding formula to fund institutions, in part, based on outcomes (or performance) would indicate a performance-based funding 2.0 model. Although the government’s intentions are to reform the funding formula, at this point in time the government is effectively steering the system with mandates through the SMA exercise. In April 2015 the government of Ontario plans to sit down with all forty-four postsecondary institutions, as well as representative of employers in Ontario, to begin discussions on funding formula reform (Chiose, 2015). Maclean’s University Rankings and the National Survey of Student Engagement There are two more accountability instruments that are not formal policy initiatives introduced by the government but nonetheless are important because they exert a certain level of influence over institutional behaviour. The first instrument is the Maclean’s university rankings. Maclean’s first published university rankings in 1991 (interestingly this also coincided with the rise of the “new accountability” movement). Due to the tremendous effect the rankings have on prospective students they have become quite influential. Universities and colleges are aware that as soon as anything is measured on a common scale, “there will be a temptation on the part of some not just to compare one institution against another, but also to place all institutions in an ordinal fashion depending on the results” (Educational Policy Institute, 2008, p. 1). In other words, when something is measured, “the results can be ranked, and this creates a certain amount of trepidation among institutions” (Educational Policy Institute, 2008, p. 1). Over time, this trepidation from the institutions has evolved from first boycotting the rankings in the early years to modification of behaviour for the purpose of improving rank (Scott, 2013, p. 117).
  • 16. PERFORMANCE-BASED ACCOUNTABILITY & STUDENT SERVICES 12 The second accountability instrument is the National Survey on Student Engagement (NSSE). The NSSE is a questionnaire on student engagement and satisfaction that is given to first-year and fourth-year students every two years. The influence over institutional behaviour that the NSSE generates is derived from its data organization structure: “NSSE is organized via consortia, which allows institutions to compare their performance against selected peer institutions (it is this aspect which makes it popular among administrators, as it fulfills an important internal benchmarking role without being a ranking instrument)” (Educational Policy Institute, 2008, p. 1). Both the Maclean’s rankings and the NSSE encompass the third theory of action through raising institutions’ awareness of their own performance (Argyris & Schön, 1996). This leads to “comparisons across institutions that stimulate institutions’ pride and status- striving and that motivate changes in institutional behaviours – resulting, theoretically, in improved outcomes” (Ziskin et al., 2014, p. 14). This concludes the review of performance-based accountability policies in Ontario, their underlying theory, and the reconceptualization of accountability as viewed through a “new accountability” movement lens. What happens when the mechanistic, market-based performance accountability frameworks of the “new accountability” movement infringe on SAS divisions? I tackle the answer in the next section. ACCOUNTABILITY AND STUDENT AFFAIRS AND SERVICES In part two, I present the theoretical foundations of SAS practice for the purpose of understanding how the profession has developed to what it is now. Through an understanding of the theoretical foundation it becomes evident that a tension exists between the principles of performance-based accountability policies and the principles of SAS. I breakdown the evolution of SAS practice into two movements – the student development movement and the student
learning movement (Dungy & Gordon, 2011, pp. 91-92). I begin with a brief definition and description of Student Affairs & Services.

Defining Student Affairs & Services (SAS)

The term SAS refers to the administrative areas that “provide support services that facilitate students’ entry, matriculation, engagement, and ultimately post-secondary education success” (Seifert et al., 2011, p. 7). Examples of these administrative offices include enrolment management, admissions, registrar services, financial assistance, scholarship services, orientation and first-year services, housing and residence life, student judicial affairs, counseling services, student disability services, health and wellness services, career and employment services, and student leadership, involvement, and service-learning (Hardy Cox & Strange, 2010).

The Student Development Movement

The student development movement began in the 1970s and steadily evolved toward a universal acceptance of student development theory as the foundation of the profession (Jones & Abes, 2011, p. 153). The purpose of student development theory is to guide the work of SAS professionals by describing how students “change and grow during college and what activities or experiences best influence that growth” (Reason & Broido, 2011, p. 91). It encompasses a wide range of more specific theories aimed at understanding three basic developments: “(a) how students’ psychosocial identity formation evolves across the lifespan, (b) how students make meaning of their experiences through advancement of cognitive-structural development, and, (c) how students acquire a consistency of approach through development of personal preferences, styles, and types” (Strange, 2009, p. 20).
As the movement evolved further, it became clear to SAS professionals that student success is only partially a function of individual development. Therefore, operating under the assumption that student behaviour and development are functions of both the person and the environment (Evans & Reason, 2001), a second set of theories emerged from student development theories, but with a particular focus on the campus environment and how it contributes to student success (Renn & Patton, 2011; Strange, 2009). Campus environment theory encompasses four essential elements: “its physical components and design; its dominant human characteristics; the organizational structures that serve its purposes; and participants’ construction of its presses, social climate, and culture” (Strange, 2009, p. 28). It is within these theoretical constructs that SAS organizational and structural approaches to student success exist. According to Renn and Patton (2011), campus environment models “offer valuable theoretical insight into current challenges facing student affairs administrators in an era of increasing accountability and emphasis on institutional ‘productivity’” (p. 253). Furthermore, “abundant evidence links campus climate to student retention and graduation rates, a key component of productivity” (p. 253); retention and graduation rates are also chief indicators used in performance accountability policies.

Campus environment theory played a central role in the design of SAS divisions in the 1970s, but its influence waned by the 1980s. In the mid-1990s it was reinvigorated by several converging factors, which included a reconceptualization of SAS practice, the rising popularity and use of performance indicators, and the emergence of the “new accountability” era. In the next section I expand on this point, but first it is important to understand that, although the movements I am discussing proceed through time in a linear fashion, SAS theory itself is additive. Therefore, each movement is superimposed on the previous one.
The Student Learning Movement

According to Reason and Broido (2011), “the student development movement has pushed us as student affairs professionals to see ourselves as educators concerned about holistic student growth and development” (p. 92). This reconceptualization triggered a shift in how SAS professionals view themselves and their work and, as a result, effectively launched the student learning movement. In Canada, the main organization for SAS professionals, the Canadian Association of College and University Student Services (CACUSS), published the Mission of Student Services (1989) just as both the learning movement and the “new accountability” movement were gaining steam. Listed in the document is an essential premise that coincides with the reconceptualization of SAS:

Student Services professionals are educators. They have knowledge and expertise about students that makes them an invaluable resource in the students’ educational experience. The expertise of Student Services professionals is important in creating the climate to help students develop the necessary skills to optimize their learning opportunities. (CACUSS, 1989)

One might argue Canadian SAS professionals were one step ahead of their US counterparts in the reconceptualization of the profession. It was not until 1994 that the American College Personnel Association (ACPA, 1994) published the seminal work The Student Learning Imperative, which, for the first time in the US, clearly situated learning (and not service) at the forefront of SAS practice.

The concept of linking student development and learning together to form the foundations of SAS practice gained momentum throughout the 1990s and into the 2000s (AAHE, 1998; UNESCO, 2002). In 2004, the publication of Learning Reconsidered (Keeling, 2004)
marked a further advancement of SAS practice through the addition and application of transformative learning theory (Mezirow, 2000). Drawing on transformative learning theory, Keeling (2004) contends that “transformative education places the student’s reflective processes at the core of the learning experience and asks the student to evaluate both new information and the frames of reference through which the information acquires meaning” (p. 9). Furthermore, through transformative learning theory, the purpose of the educational involvement of SAS becomes the evolution and development of student identity, including but not limited to students’ cognitive, affective, behavioral, and spiritual identities (Keeling, 2004, p. 9). Therefore, in addition to academic learning (in class), the critical element that transformative learning theory brings to SAS practice is the conceptual uniting of learning, development, and identity formation. Transformative learning theory recognizes the “essential integration of personal development with learning; it reflects the diverse ways through which students may engage, as whole people with multiple dimensions and unique personal histories, with the tasks and content of learning” (Keeling, 2004, p. 3). SAS practice then parallels the academic mission and becomes a type of experiential pedagogy (Fried, 2012). Learning is now a holistic endeavour in which SAS professionals are well situated to aid in the acquisition and refinement of life skills: the same life skills that can positively affect the learning outcomes and student engagement survey results used in performance indicators.

Currently, SAS practice operates within a holistic, transformative framework rooted in the belief that a university’s purpose is to educate and guide students’ growth and maturity so they may be active contributors in a democratic society. However, as I discuss in the next
section, performance-based accountability policies operate under a mechanistic (Taylor, 2007/1912) framework. At a fundamental level there appears to be conflict between the two.

Conflict and Tension

Performance-based accountability policies and SAS originate from divergent, even contradictory philosophies. This creates conflict and tension in practice. According to the Ontario Confederation of University Faculty Associations (OCUFA, 2006),

critics of performance indicators [PI] have expressed a wide range of concerns about how PIs are used to measure performance in universities today. PIs and the accountability movement in general have been characterized as a means of implementing public funding cuts while asserting greater government control over previously autonomous institutions…. They express concern about an over reliance on quantitative indicators that serve as a poor proxy for quality and neglect those elements of academia that cannot be reduced to a simple indicator. (p. 8)

What values, beliefs, assumptions, and philosophies are at play in SAS and performance accountability policies that create conflict and tension? To answer this question I provide an in-depth examination of the issue through four dynamics – fundamental designs, differing paradigms, separate conceptions of process, and organizational theory.

To begin with, tension arises from the fact that SAS practice, as conceptualized through transformative learning theory (Mezirow, 2000), takes an organic, fluid, holistic approach that is quite often very difficult to measure in quantitative terms. The SAS practitioner approaches education as an instrument for democracy, for citizenship, for personal discovery. Performance accountability policies, in contrast, are mechanistic (Taylor, 2007/1912). They measure performance through standardized rankings, ratios, and percentages and are concerned with
routinization, efficiency, productivity, and performativity. They approach education as an instrument for economic competitiveness, with emphasis on the technical aspect of learning. Performance-based accountability policies include performance indicators that are “imbued with a consumer ideology that encourages the view of education as a commodity” (Shanahan, 2009, p. 9). An inherent assumption within performance indicators (and performance-based accountability policies) is that as “larger and larger numbers of students enrol in universities, and as substantial tuition becomes a fact of life for more and more of these students, higher education is viewed as a consumer good, with a degree or credential or a job as essential outcomes for all students” (Eaton, 2013, p. 131). Under this assumption, the purpose of the university shifts from operating within the social (development) and educational (learning) domains to a mechanistic domain. In other words, “the idea is to make graduates as employable as soon as possible…. Universities thus become cheap training centres for industry…. This has little or nothing to do with education, but everything to do with markets and management” (Bruneau & Savage, 2002, p. 57). Through standardized and routinized performance indicators and performance-based accountability policies, universities are stripped of their humanistic and cultural aspects, and ultimately their autonomy.

Second, at the most basic level, the conflict and tension between performance-based accountability policies and SAS divisions can be understood through differences in philosophical worldviews. A worldview is a paradigm, or lens, through which experiences are viewed to interpret, construct, and determine meaning and understanding. Performance-based accountability policies are derived from the positivist worldview, where institutions (and departments and divisions within them) are assumed to be linear, objectivist, hierarchical, and measurable (Fried, 2012, p. 49); in a word, mechanistic. On the other hand, SAS practice is
derived from the constructivist worldview, where the focus is on relationships and perspectives. Constructivism tends to be nonlinear, interactive, and unpredictable, and is described in terms of probabilities. It operates in a web-like fashion and looks for interactive patterns and trends rather than unidirectional cause and effect; in a word, human. According to Fried (2012), the dominant paradigm in SAS incorporates constructivism by focusing on inquiry into such areas as identity formation through “increasingly complex forms, intra- and interpersonal competence, practical competence, persistence and academic achievement, humanitarianism, and civic engagement” (p. 62). The positivist paradigm is scientific in nature and therefore sees truth as unchanging; it views basic reality as a physical entity, so it can be “measured, counted and seen, touched, or apprehended in some other physical modality” (Fried, 2012, p. 49). When positivist assumptions are applied to performance indicators that measure student success, an inconsistency develops.

One of the major discrepancies between performance-based accountability policies and the practice of SAS is in how each conceptualizes student success. Performance-based accountability policies focus on “numerical measurements that are supposed to show ‘quality’. They are increasingly focused on outputs in order to facilitate comparisons” (Shanahan, 2009, p. 10). Performance indicators, therefore, lack an explanation of the process involved in achieving student success. They are unable to ask how and why a student is learning. As a result of this oversight, performance indicators are referred to as “over-simplistic” (Lang, 2005, p. 18), “reductionist” (Shanahan, 2009, p. 11), and “narrow” (Bruneau & Savage, 2002, p. 4). Performance indicators over-simplify the total amount of effort and work that students, SAS practitioners, and faculty put forth to achieve student success.

Conversely, SAS practitioners view student success as a more incremental process involving the continued achievement of multiple goals throughout a student’s career. Through
this conception, success is more than a solitary outcome of completing a credential (degree, diploma, or certificate). Instead, success is defined in terms of “setting and achieving academic AND personal goals, developing life skills, becoming career ready, and igniting a passion for lifelong learning” (Seifert, 2015, para. 3). Success is not seen as an end in itself but as the accumulation of learning throughout the journey. Seifert (2015) suggests that policy makers “re-orient their notion of success from that of a discrete outcome (credential completion) to one that recognizes success in the decisions students make and the opportunities they seek that place them in good stead towards a long term goal” (para. 4). This is a difficult task, as performance indicators would need to somehow measure success as a journey through the decisions and opportunities students face. The Differentiation Policy Framework, with its performance funding 2.0 orientation, is attempting to reconcile this issue through the Strategic Mandate Agreement (SMA) exercise, using performance indicators negotiated between institutions and the government that focus on incremental measures of student success in areas of engagement and satisfaction (MTCU, 2013, p. 14).

Finally, the last dynamic I explore involves insights from organization theory. Through the organization theory lens, the tension and conflict occur because SAS and performance-based accountability policies operate at what Strange (2009) describes as two opposite ends of the organizational continuum: “at one end are dynamically organized environments: flexible in design, less centralized, and informal; at the other end are static environments: rigid, centralized, and formal” (p. 27). Bolman and Deal (1997) developed the four-frame model, which provides an understanding of how organizations operate through interpretive lenses – structural, human resource, political, and symbolic.
Performance-based accountability policies operate within and are interpreted through the structural frame. The structural frame can best be illustrated through the machine metaphor (Morgan, 1997) and assumes there is a rational system in place where approaches to organization are standardized and routinized. Performance indicators operate under the assumption that individual institutional measurements are equally weighted (however the weight may be defined). This is not always the case. For example, Tam (2001) illustrates the difficulty of measuring institutional quality with performance indicators:

It examines the quality of education provision against the expressed aspirations of the individual institution. If the institution has high aspirations, quality is to be measured against this yardstick. That might make it more difficult for a university to succeed than another which set itself lower aspirations. Taken to absurdity, a university which aspired to produce rubbish, and succeeded, would be of higher quality than a university which claimed intellectual excellence, but narrowly failed. (p. 50)

If I were to apply this logic to the Ontario system, it would be like measuring and then ranking the quality of the educational experience at a large central urban university compared to a small rural northern university. I am not saying that any university strives to produce rubbish, just that there are several other factors at play (including the culture of a university) that distinguish the quality of an educational experience and that performance indicators are not able to measure numerically. As a consequence of the difficulties in measuring institutional intangibles such as culture, the structural frame “tends to dominate, downplaying the role of emotions and completely ignoring unconscious processes that may be shaping organizational behavior and functioning” (Kezar, 2011, p. 239).
In contrast to the structural frame, SAS divisions exist within and are interpreted through the human resource frame (Bolman & Deal, 1997). The commonly used metaphor illustrating this frame is that of a family (Morgan, 1997). SAS leaders and practitioners who use the human resource frame “see themselves as serving and supporting others in the organization” (Komives, 2011, p. 363). They seem to “possess an egalitarian ethic among different groups on campus” (Kezar, 2011, p. 233). Additionally, they concentrate on the growth and engagement of students by focusing on their motivation, needs, commitment, and learning (Kezar, 2011, p. 228). Tull and Freeman (2011) conducted a study of 478 SAS administrators and found that the human resource frame is their frame of choice (p. 39). Their findings are not surprising given the collaborative and humanistic nature of SAS work.

The multiple dynamics presented above paint a clear picture of conflict and tension between performance-based accountability policies and SAS. As a consequence, SAS practitioners are forced to justify their work in terms of its relevance to the economy. They must show how their departments, offices, programs, services, workshops, and activities provide students with the skills needed by industry. Therefore, performance-based accountability policies redefine SAS priorities. To the extent that these accountability policies accomplish the reordering of SAS priorities, they undermine the holistic, transformative approach. However, SAS practitioners are resilient! In responding to the call for greater accountability, “student affairs professionals have continued their focus on learning outcomes and assessment in order to demonstrate student affairs programs and services’ valuable contributions to the development of the whole student” (Dungy & Gordon, 2011, p. 74). Performance indicators and performance-based accountability are a major component of the Differentiation Policy Framework in Ontario (MTCU, 2013): the age of accountability is firmly entrenched. Furthermore, the government of
Ontario plans to expand performance-based funding policies with the intent of attaching greater financial incentives to performance indicators (COU, 2013; HEQCO, 2013; OCUFA, 2013; Ziskin et al., 2014). It therefore becomes beneficial for SAS divisions to accept performance-based accountability policies as the norm and to develop their divisions to best succeed in an accountability era. In part three of the paper I outline several organizational recommendations towards this end.

ACCOUNTABILITY AND ORGANIZATION OF SAS

My final aim is to provide recommendations on the organization of SAS divisions. Within the “new accountability” era, the assumption that SAS divisions aid in the transformative development and learning of students is not good enough. We need to prove it. But simply providing proof is not good enough either. We need to go one step further. We are now required to show how this proof improves overall institutional performance as measured through the performance indicators incorporated within performance-based accountability policies. How can SAS practitioners reduce the conflict and tension apparent between their own mandate and the mandates of performance accountability in order to aid in institutional and student success? To answer this question I provide recommendations on three distinct, yet interrelated, aspects of SAS organizational characteristics – the inclusion of students in governance, assessment, and leadership.

The Inclusion of Students

At the beginning of the paper I provided a conception of accountability through a series of questions. One of those questions was: to whom should universities be held accountable? The answer to that question is often assumed to be the government and the taxpaying public. Does it
not, however, make sense to also include students? A recent report from the Ontario University Student Alliance (OUSA, 2014) asked the same question:

If all those who fund post-secondary institutions are those to whom institutions must be held accountable…then there is no reason that institutions should not also be held accountable to their students…. And yet there has been no serious discussion about how universities can be held accountable to both the public at large and to their students. (pp. 5-6)

The OUSA report points to a general lack of meaningful student participation in the governance of universities in Ontario. Narrowing the focus to SAS governance, the same general lack of student participation exists as well. To me, this seems contrary to the transformative, holistic approach to learning through which we as SAS practitioners have conceptualized our field.

In terms of the type of SAS governance I am referring to, I am not suggesting that student representatives be placed next to the Senior Student Affairs and Services Officer (SSASO) and participate in daily decision-making. I am suggesting that students and SAS practitioners work together to decide on and develop the learning outcomes that should be achieved through the participation and engagement of students in SAS programs, workshops, activities, and so on. Through collaboration on the development of learning outcomes, both SAS practitioners and students accept responsibility for setting priorities and ensuring their achievement. This collaborative dynamic is particularly important in the “new accountability” era because successful completion of learning outcomes directly impacts performance indicators such as engagement and satisfaction metrics, which are in turn used in measuring institutional performance at the system
level. The Differentiation Policy Framework specifically mentions satisfaction and engagement within the teaching and learning component of its proposed metrics (MTCU, 2013, p. 14). Furthermore, collaboration between students and SAS practitioners in developing learning outcomes emphasizes transformative, holistic learning by incorporating “reflective practices that include provocative questions and stimulate students’ assessments of their own meaning making” (King & Baxter Magolda, 2011, p. 215). SAS divisions become truly student-centred divisions and, as Strange and Hardy Cox (2009) point out, “if ever there were a first principle that frames both what we do and why we do what we do as student services professionals, this would be it” (p. 238).

Assessment

In an era of increased institutional accountability, it only makes sense that SAS divisions hold themselves accountable as well. For SAS divisions this means asking two questions: (1) What are we doing to facilitate student success? (2) How are we doing in meeting this goal? According to the Higher Education Quality Council of Ontario’s (HEQCO, 2009) Second Annual Review and Research Plan, the second most common category cited by institutions for addressing and tracking quality was based on social aspects of student success. The report goes on to define social aspects: “surveying students for satisfaction in their first and last years; the development of new social services programs and activities; tracking participation in student services including academic, personal and career training; tracking participation in orientation/transition programs; and participation in social events” (HEQCO, 2009, p. 121). This is encouraging because it shows that institutions are interested in the contributions that SAS divisions make to the overall educational experience. It also, however, requires SAS divisions to have their own assessment procedures in place.
Keeling, Wall, Underhile, and Dungy (2008) acknowledge that “assessment is integral to, perhaps even synonymous with, learning” (p. 6). This is an excellent observation because it reconceptualizes assessment, a naturally mechanistic procedure, to fit nicely into the transformative, holistic approach of SAS:

When one realizes that to learn is to make meaning of events (notice: learning is not just acquiring and applying new knowledge; it encompasses the transformative process of making meaning of that knowledge), then, the full breadth of what it means “to learn” can be understood and conceptualized. Based on that premise, to assess (which is to observe) then is the foundation of learning. Just as we ask students to make meaning – to learn – from their experiences, so too must we make meaning of their learning. (Keeling et al., 2008, p. 6)

Assessment practice can, therefore, be viewed as a form of critical reflection, by both students and SAS divisions, on the learning that has taken place. Reconceptualizing assessment practice in this way alleviates the tension and conflict between SAS divisions and performance-based accountability policies. The use of performance-based accountability policies makes it more important than ever to embrace assessment practice. After all, “making meaning of how, what, when, and where students learn is a vital, exciting, and inspiring component of higher education” (Keeling et al., 2008, p. 6).

Leadership

The Senior Student Affairs and Services Officer (SSASO) is in the best position to effect change. At the same time, the SSASO is also in the best (or worst?) position to observe the conflict and tension between performance-based accountability policies and the divisions they run. Kezar (2004), through a leadership lens, asks: what is more effective in improving
overall governance of higher education institutions? Organizational structures and processes, or relationships and trust? Answering this question is beyond the scope of this paper, but the question itself nicely portrays the tension and conflict between performance-based accountability policies and SAS divisions. On one end are the rigid, mechanistic, tangible structures and processes. On the other end are the fluid, organic, intangible aspects of relationships and trust. To allow SAS practitioners to practice a transformative, holistic approach as well as “work to the performance indicator”, the SSASO needs to incorporate aspects of both sides. Therefore, there are two dynamics of leadership for which I provide recommendations. The first dynamic consists of the relationship and trust aspect. The second consists of the organizational structures and processes aspect.

Leading SAS divisions through relationships and trust fundamentally involves the philosophical approach that SSASOs apply to their practice. The type of leadership practice that I believe would best allow the SSASO to lead with transformative, holistic principles at the centre is called transformative leadership (Shields, 2011). Transformative leadership emphasizes “the need for education to focus both on academic excellence and on social transformation” (Shields, 2011, p. 3). Shields (2010) defines transformative leadership as beginning with “questions of justice and democracy; it critiques inequitable practices and offers the promise not only of greater individual achievement but of a better life lived in common with others” (p. 559). The theory’s historical foundations evolved from Burns’ (1978) conception of leadership as a transformative force, and it takes seriously Freire’s (1998) belief “that education is not the ultimate lever for social transformation, but without it transformation cannot occur” (p. 37). Transformative leaders are people-oriented; they build relationships, develop goals, and identify strategies for accomplishing them. Therefore, the essential work of the
transformative leader is to “create learning contexts or communities in which social, political, and cultural capital is enhanced in such a way as to provide equity of opportunity for students as they take their place as contributing members of society” (Shields, 2010, p. 572).

The second aspect of leadership deals with organizational structures and processes. The recommendations I make for organizational structures and processes draw from campus ecology/environment theory (Renn & Patton, 2011; Strange, 2009) and are situated within a student-centred model for SAS divisions. My first recommendation for the SSASO is that the division’s connections and contributions to the university mission must be clearly communicated, not only upwards to senior administrators but also downwards to practitioners within the division. SAS practitioners who can connect their day-to-day work to the overall mission of the university are better situated to understand the purpose and importance of what they are doing.

A second recommendation for the SSASO concerns the physical organization of the SAS division. SAS divisions are horizontal in nature because they address the needs of all students in all schools. The identification of desired learning outcomes within SAS divisions, then, creates “a new horizontal force – accountability for producing a group of outcomes for all students” (Keeling, Underhile, & Wall, 2007, p. 25). Furthermore, this horizontal force “challenges student affairs leadership to adopt a curricular approach to the assessment, conceptualization, planning, implementation, and evaluation of programmatic and student learning outcomes” (p. 25). According to a study of 278 SAS practitioners and 14 SSASOs by Seifert et al. (2011):

A student-focused approach to the organizational structure was often characterized by departments that serve similar or complementary functions or address common student issues and concerns being grouped together in a unit such that students were more likely
to have a seamless experience. To the extent that it was practical, a student-focused approach to the structure physically located these departments in close proximity. (p. 24)

This physical placement of SAS offices and departments is popularly called a “one-stop shop” at universities in Canada and the US and is an organizational strategy that I strongly recommend. I realize the above recommendations may be outside the purview of the SSASO due to financial restrictions, space constraints, or institutional limitations. Nonetheless, research has illustrated that these designs increase student success (Keeling, Underhile, & Wall, 2007; Seifert et al., 2011), and I therefore believe they should be implemented where possible.

Conclusion

Beginning in the early 1990s, a “new accountability” movement in higher education arose out of increased pressure for a broader conceptualization of accountability and greater transparency about university performance. As a result, the meaning of accountability was reconceptualized towards market mechanisms focused on productivity, efficiency, and competitiveness. The framework of choice for governments in a “new accountability” era became performance-based accountability policies. Within Ontario, Key Performance Indicators, Multi-Year Accountability Agreements, and the Differentiation Policy Framework represented differing forms and degrees of performance-based accountability policies. The fundamental principles underlying these policies denote mechanistic characteristics and approaches to higher education. In contrast to this view of the marketization and mechanization of higher education, SAS divisions fundamentally operate on a transformative, holistic level under the assumption that a university’s purpose is to develop successful democratic citizens through the combination of learning and development, in addition to developing skilled graduates who are prepared to enter the workforce. This has caused conflict and tension between SAS divisions
and the application of performance-based accountability policies to the institutions of which they are a part. Through performance-based accountability policies, the SAS practitioner works “toward the performance indicator”. But how does the mechanistic, quantifiable performance indicator measure the intangibles of education: critical thinking, creativity, maturation, and emotional growth?

The age of accountability is here. Fortunately, the student affairs profession has been “nimble and has adapted to institutional missions and the needs of students” in the past (Dungy & Gordon, 2011, p. 75). SAS practitioners must now adapt once more to the demands of the “new accountability” movement. Fried (2012) metaphorically describes the educational mission of the SAS profession as “border crossing” (p. 108). The idea is that SAS practitioners help students cross borders between living (their development) and learning (their education). In doing so, SAS practitioners cross borders themselves, moving from service providers to educators alongside faculty. Performance-based accountability policies represent another border that needs to be crossed to reduce the conflict and tension. Accomplishing this will allow SAS practitioners not only to continue providing holistic, transformative learning opportunities to students, but also to gain a holistic, transformative learning experience for themselves.
References
Accountability. (2015). In Merriam-Webster’s online dictionary. Retrieved from http://www.merriam-webster.com/inter?dest=/dictionary/accountability
Alexander, F. K. (2000). The changing face of accountability: Monitoring and assessing institutional performance in higher education. Journal of Higher Education, 411-431.
American Association for Higher Education (AAHE). (1998). Powerful partnerships: A shared responsibility for learning. Washington, DC: Authors.
American College Personnel Association (ACPA). (1994). The student learning imperative: Implications for student affairs. Retrieved from http://www.myacpa.org/sites/default/files/ACPA%27s%20Student%20Learning%20Imperative.pdf
Argyris, C., & Schön, D. A. (1996). Organizational learning II: Theory, methods, and practice. Reading, MA: Addison-Wesley.
Bolman, L. G., & Deal, T. E. (1997). Reframing organizations: Artistry, choice, and leadership. San Francisco, CA: Jossey-Bass.
Bruneau, W. A., & Savage, D. C. (2002). Counting out the scholars: How performance indicators undermine universities and colleges. Toronto, ON: James Lorimer & Co. Ltd. Retrieved from https://www.academia.edu/2562310/Counting_out_the_scholars_How_performance_indicators_undermine_universities_and_colleges
Burke, J. C., & Minassians, H. (2003). Real accountability or accountability ‘lite’: Seventh annual survey, 2003. Albany, NY: Rockefeller Institute of Government. Retrieved from https://www.soe.vt.edu/highered/files/Perspectives_PolicyNews/07-03/7thSurvey.pdf
Burns, J. M. (1978). Leadership. New York, NY: Harper & Row.
Canadian Association of College and University Student Services (CACUSS). (1989). The mission of student services. Retrieved from http://www.cacuss.ca/publications_mission_student.htm
Chiose, S. (2015, March 12). New funding formula for Ontario universities to include input from employers. The Globe and Mail. Retrieved from http://www.theglobeandmail.com/news/national/new-funding-formula-for-ontario-universities-to-include-input-from-employers/article23442771/?ts=150313092306&ord=1
Clark, I. D., Moran, G., Skolnik, M. L., & Trick, D. (2009). Academic transformation: The forces reshaping higher education in Ontario. Montreal and Kingston: Queen’s Policy Studies Series, McGill-Queen’s University Press.
Council of Ontario Universities (COU). (2013). Performance-based funding. Retrieved from http://www.cou.on.ca/publications/reports/pdfs/cou-background-paper---performance-based-funding--
Dougherty, K. J., & Reddy, V. (2011). The impacts of state performance funding systems on higher education institutions: Research literature review and policy recommendations. CCRC Working Paper No. 37. Community College Research Center, Columbia University. Retrieved from http://ccrc.tc.columbia.edu/media/k2/attachments/impacts-state-funding-higher-education.pdf
Dungy, G., & Gordon, S. A. (2011). The development of student affairs. In J. H. Schuh, S. R. Jones, & S. R. Harper (Eds.), Student Services: A handbook for the profession (pp. 61-79). San Francisco, CA: Jossey-Bass.
Educational Policy Institute. (2008). Producing indicators of institutional quality in Ontario universities and colleges: Options for producing, managing and displaying comparative data. Toronto, ON: Higher Education Quality Council of Ontario. Retrieved from http://higheredstrategy.com/wp-content/uploads/2011/07/Quality_Indicators.pdf
Eaton, J. (2013). Rankings, new accountability tools, and quality assurance. In P.T.M. Marope, P.J. Wells & E. Hazelkorn (Eds.), Rankings and accountability in higher education: Uses and misuses (pp. 113-127). Paris, France: United Nations Educational, Scientific, and Cultural Organization (UNESCO). Retrieved from http://unesdoc.unesco.org/images/0022/002207/220789e.pdf
Evans, N. J., & Reason, R. D. (2001). Guiding principles: A review and analysis of student affairs philosophical statements. Journal of College Student Development, 42(4), 359-377.
Freire, P. (1998). Pedagogy of freedom: Ethics, democracy, and civic courage. Lanham, MD: Rowman & Littlefield.
Fried, J. (2012). Transformative learning through engagement: Student affairs practice as experiential pedagogy. Sterling, VA: Stylus Publishing.
Hardy Cox, D., & Strange, C. C. (2010). Achieving student success: Effective student services in Canadian higher education. Montreal, QC: McGill-Queen's University Press.
Higher Education Quality Council of Ontario (HEQCO). (2013). Quality: Shifting the focus. A report from the expert panel to assess the strategic mandate agreement submissions. Toronto, ON: Higher Education Quality Council of Ontario.
Higher Education Quality Council of Ontario (HEQCO). (2012). The productivity of the Ontario public postsecondary system: Preliminary report. Toronto, ON: Higher Education Quality Council of Ontario.
Higher Education Quality Council of Ontario (HEQCO). (2009). Second annual review and research plan. Toronto, ON: Higher Education Quality Council of Ontario. Retrieved from http://www.heqco.ca/SiteCollectionDocuments/Second%20Annual%20Review%20and%20Research%20Plan.pdf
Hillman, N. W., Tandberg, D. A., & Gross, J. P. K. (2014). Performance funding in higher education: Do financial incentives impact college completions? The Journal of Higher Education, 85(6), 826-857.
Jones, S. R., & Abes, E. S. (2011). The nature and uses of theory. In J. H. Schuh, S. R. Jones, & S. R. Harper (Eds.), Student Services: A handbook for the profession (pp. 149-167). San Francisco, CA: Jossey-Bass.
Keeling, R. P., Wall, A. F., Underhile, R., & Dungy, G. J. (2008). Assessment reconsidered: Institutional effectiveness for student success. USA: International Center for Student Success & Institutional Accountability.
Keeling, R. P., Underhile, R., & Wall, A. F. (2007). Horizontal and vertical structures: The dynamics in higher education. Liberal Education, 93(4), 22-31.
Keeling, R. P. (Ed.). (2004). Learning reconsidered. Washington, DC: American College Personnel Association and National Association of Student Personnel Administrators. Retrieved from http://www.naspa.org/images/uploads/main/Learning_Reconsidered_Report.pdf
Kezar, A. (2011). Organization theory. In J. H. Schuh, S. R. Jones, & S. R. Harper (Eds.), Student Services: A handbook for the profession (pp. 226-241). San Francisco, CA: Jossey-Bass.
Kezar, A. (2004). What is more important to effective governance: Relationships, trust, and leadership, or structures and formal processes? New Directions for Higher Education, (127), 35-46. Retrieved from http://onlinelibrary.wiley.com/doi/10.1002/he.154/abstract
King, P. M., & Baxter Magolda, M. B. (2011). Student learning. In J. H. Schuh, S. R. Jones, & S. R. Harper (Eds.), Student Services: A handbook for the profession (pp. 207-225). San Francisco, CA: Jossey-Bass.
Komives, S. R. (2011). Leadership. In J. H. Schuh, S. R. Jones, & S. R. Harper (Eds.), Student Services: A handbook for the profession (pp. 353-371). San Francisco, CA: Jossey-Bass.
Lang, D. W. (2013). Performance funding: Past, present and future. CUPA/MTCU/HEQCO day, January 9, 2013. Retrieved from http://www.oise.utoronto.ca/lhae/Programs/Higher_Education/Past_Seminar_Presentations.html
McLendon, M. K., Hearn, J. C., & Deaton, R. (2006). Called to account: Analyzing the origins and spread of state performance-accountability policies for higher education. Educational Evaluation and Policy Analysis, 28(1), 1-24.
Mezirow, J., & Associates. (2000). Learning as transformation: Critical perspectives on a theory in progress. San Francisco, CA: Jossey-Bass.
Ministry of Training, Colleges, and Universities (MTCU). (2013, November). Ontario’s differentiation policy framework for postsecondary education. Retrieved from http://www.tcu.gov.on.ca/pepg/publications/PolicyFramework_PostSec.pdf
Morgan, G. (1997). Images of organization. Thousand Oaks, CA: Sage.
Natow, R. S., Pheatt, L., Dougherty, K. J., Jones, S. M., Lahr, H., & Reddy, V. (2014). Institutional changes to organizational policies, practices, and programs following the adoption of state-level performance funding policies. CCRC Working Paper No. 76. Community College Research Center, Columbia University, NY. Retrieved from http://ccrc.tc.columbia.edu/media/k2/attachments/institutional-changes-following-performance-funding-policies.pdf
Ontario. Ministry of Finance. (2005). 2005 Ontario budget: Investing in people, strengthening our economy. Toronto: Author. Retrieved from http://www.fin.gov.on.ca/en/budget/ontariobudgets/2005/pdf/bke1.pdf
Ontario Confederation of University Faculty Associations (OCUFA). (2006, May). The measured academic: Quality controls in Ontario universities. OCUFA Research Paper. Retrieved from http://ocufa.on.ca/assets/Measured_Academic_May_2006.pdf
Ontario University Student Alliance (OUSA). (2014). Policy paper: Accountability. Retrieved from http://www.ousa.ca/wordpress/wp-content/uploads/2014/11/Accountability-Fall-2014.pdf
Reason, R. D., & Broido, E. M. (2011). Philosophies and values. In J. H. Schuh, S. R. Jones, & S. R. Harper (Eds.), Student Services: A handbook for the profession (pp. 80-95). San Francisco, CA: Jossey-Bass.
Renn, K. A., & Patton, L. D. (2011). Campus ecology and environments. In J. H. Schuh, S. R. Jones, & S. R. Harper (Eds.), Student Services: A handbook for the profession (pp. 242-256). San Francisco, CA: Jossey-Bass.
Scott, P. (2013). Ranking higher education institutions: A critical perspective. In P.T.M. Marope, P.J. Wells & E. Hazelkorn (Eds.), Rankings and accountability in higher education: Uses and misuses (pp. 113-127). Paris, France: United Nations Educational, Scientific, and Cultural Organization (UNESCO). Retrieved from http://unesdoc.unesco.org/images/0022/002207/220789e.pdf
Seifert, T. A. (2015, January 26). Thoughts from space; wisdom on Earth [Web log post]. Retrieved from https://supportingstudentsuccess.wordpress.com/2015/01/26/thoughts-from-space-wisdom-on-earth/
Seifert, T. A., & Burrow, J. (2013). Perceptions of student affairs and services practitioners in Ontario’s post-secondary institutions: An examination of colleges and universities. Canadian Journal of Higher Education, 43(2), 132-148.
Seifert, T. A., Arnold, C., Burrow, J., & Brown, A. (2011). Supporting student success: The role of student services within Ontario’s postsecondary institutions. Toronto, ON: Higher Education Quality Council of Ontario. Retrieved from http://www.heqco.ca/SiteCollectionDocuments/Supporting%20Student%20SuccessENG.pdf
Shanahan, T. (2009). Accountability initiatives in higher education: An overview of the impetus to accountability, its expressions and implications. In Accounting or accountability in higher education: Proceedings from the January 8, 2009 OCUFA conference (pp. 3–15). Retrieved from http://edu.apps01.yorku.ca/news/wp-content/uploads/shanahan-ocufa-articlesmarch10094.pdf
Shields, C. M. (2010). Transformative leadership: Working for equity in diverse contexts. Educational Administration Quarterly, 46(4), 558-589. doi:10.1177/0013161X10375609
Shields, C. M. (2011). Transformative leadership: A reader. New York, NY: Peter Lang.
Strange, C. C. (2009). Theoretical foundation of student services. In D. Hardy Cox & C. C. Strange (Eds.), Achieving student success: Effective student services in Canadian higher education (pp. 18-32). Montreal, QC: McGill-Queen’s University Press.
Sullivan, B. (2009). Organizing, leading, and managing student services. In D. Hardy Cox & C. C. Strange (Eds.), Achieving student success: Effective student services in Canadian higher education (pp. 165-191). Montreal, QC: McGill-Queen’s University Press.
Tam, M. (2001). Measuring quality and performance in higher education. Quality in Higher Education, 7(1), 47-54. doi:10.1080/13538320120045076
Taylor, F. W. (2007/1912). Scientific management. In D. S. Pugh (Ed.), Organization theory: Selected classic readings (5th ed., pp. 275-295). London, UK: Penguin. (Original work published 1912)
Trow, M. (1973). Problems in the transition from elite to mass higher education. Carnegie Commission on Higher Education Reprint.
Tull, A., & Freeman, J. P. (2011). Reframing student affairs leadership: An analysis of organizational frames of reference and locus of control. Research in the Schools, 18(1), 33-43.
United Nations Educational, Scientific, and Cultural Organization (UNESCO). (2002). The role of student affairs and services in higher education: A practical manual for developing, implementing and assessing student affairs programmes and services. Paris, France: UNESCO. Retrieved from http://unesdoc.unesco.org/images/0012/001281/128118e.pdf
Weingarten, H. P., Hicks, M., Jonker, L., & Liu, S. (2013). The diversity of Ontario’s universities: A data set to inform the differentiation discussion. Toronto, ON: Higher Education Quality Council of Ontario.
Ziskin, M. B., Hossler, D., Rabourn, K., Cekic, O., & Hwang, Y. (2014). Outcomes-based funding: Current status, promising practices and emerging trends. Toronto, ON: Higher Education Quality Council of Ontario.