Reflective practice can sound like a mystical, puzzling concept to practitioners and scholars alike. Yet stepping back from our practice to ask, What just happened?, What is happening now?, Why?, and Would I benefit from changing my current way of doing things?, can greatly benefit evaluation practice. Even so, reflection does not appear to be used often to improve evaluation practice (Patton, 2012). The aim of this work is to offer evaluators a critical and systematic approach to reflective practice that is practical enough to use in regular evaluation work.
This competency belongs to a set of competencies intended to improve training, enhance reflection, advance evaluation research, and support the continual professional development of the field. Similarly, it is worth noting that the Joint Committee on Standards for Educational Evaluation claims that the Program Evaluation Standards, available as a guide for conducting useful and effective evaluations, provide “guidance and [encourage] reflective practice” (Yarbrough, Shulha, Hopson, & Caruthers, 2011, p. xii).
The remainder of this presentation provides an encompassing definition of reflective practice, a practical framework for reflection through the DATA model, and an example of using DATA to critically examine evaluation practice.
The DATA model involves identifying one’s assumptions, beliefs, values, and motivations; considering how they are intuitively associated with practice; and acting on the basis of a practical theory (Peters, 1991, 2009). Using DATA is not merely an introspective process; it is action-based and can be used to enhance professional practice. The model is also recursive: reflection is not linear, and we tend to return to each step of the model at different times during the process.
After this slide, turn over to Gary
F1 Project Management: Evaluations should use effective project management strategies.
F2 Practical Procedures: Evaluation procedures should be practical and responsive to the way the program operates.
F4 Resource Use: Evaluations should use resources effectively and efficiently.
U1 Evaluator Credibility: Evaluations should be conducted by qualified people who establish and maintain credibility in the evaluation context.
U2 Attention to Stakeholders: Evaluations should devote attention to the full range of individuals and groups invested in the program and affected by its evaluation.
U3 Negotiated Purposes: Evaluation purposes should be identified and continually negotiated based on the needs of stakeholders.
U4 Explicit Values: Evaluations should clarify and specify the individual and cultural values underpinning purposes, processes, and judgments.
U8 Concern for Consequences and Influence: Evaluations should promote responsible and adaptive use while guarding against unintended negative consequences and misuse.
P2 Formal Agreements: Evaluation agreements should be negotiated to make obligations explicit and take into account the needs, expectations, and cultural contexts of clients and other stakeholders.
P4 Clarity and Fairness: Evaluations should be understandable and fair in addressing stakeholder needs and purposes.
P7 Fiscal Responsibility: Evaluations should account for all expended resources and comply with sound fiscal procedures and processes.
After this slide, Pat takes the show.
AEA 2013 Demystifying Reflective Practice 101613
DEMYSTIFYING REFLECTIVE PRACTICE:
USING THE DATA MODEL TO ENHANCE
EVALUATORS’ PROFESSIONAL ACTIVITIES
“Insights and innovation await us only if we are capable of stepping
outside the frenzied worlds of data and distraction that wash over us…
time for reflection is an open invitation to discover what awaits us…”
(Forrester, 2011, pp. 216-217)
Tiffany L. Smith
John M. Peters
Gary J. Skolits
Patrick B. Barlow
American Evaluation Association 2013
Friday, October 18th, 1:45-2:30pm
INTRODUCTION: WHO ARE WE?
Tiffany L. Smith: A doctoral candidate in the Evaluation, Statistics, and Measurement program at the University of Tennessee, Knoxville. She has taken two courses on reflective practice and has interned with the Institute for Reflective Practice at the University of Tennessee with John Peters.
John M. Peters: The Director of the Institute for Reflective Practice at the University of Tennessee. The primary mission of the Institute is to promote reflective practice by individuals and organizations served by the University of Tennessee, Knoxville.
Gary J. Skolits: The Director of the Institute for Assessment and Evaluation in the University of Tennessee’s College of Education, Health, and Human Sciences. He is a professor in the Evaluation, Statistics, and Measurement Ph.D. program. His research interests include evaluation methods and P-16 and college access program evaluation.
Patrick B. Barlow: A doctoral candidate in the Evaluation, Statistics, and Measurement program at the University of Tennessee. He works as a Statistical and Research Design Consultant for the University of Tennessee School of Medicine. His primary research areas are higher education assessment and teaching statistics and research methods in the social sciences, epidemiology, and clinical medicine.
“Reflective practice is more than spending ten
minutes at the end of an evaluation
congratulating oneself on getting the damn
thing done” (Patton, 2012, p. 401).
REFLECTIVE PRACTICE IN EVALUATION
• One of six Essential Competencies for program evaluators, defined as:
– “being acutely aware of personal evaluation preferences, strengths, and limitations; self-monitoring the results of actions intended to facilitate effective evaluation studies; and planning how to enhance future endeavors” (Stevahn, King, Ghere, & Minnema, 2005, p. 46)
HOW DOES REFLECTION IMPROVE PRACTICE?
• Can be used as a means for self-awareness, professional growth and development, improved ethical practice, dialogue and stakeholder communication, and learning.
• The reflection process can happen while engaging in daily practice as well as after the fact, either alone or with others.
• Reflection involves being a student of the actions you take.
Patton (2012) reports that “in speeches and
workshops at professional evaluation
association meetings, I like to ask for a show
of hands of those who systematically reflect
on evaluations they have conducted for
learning and further professional
development. Few hands go up; typically, in
fact, no one raises a hand” (p. 400).
A SHOW OF HANDS?
Who systematically reflects on evaluations they have conducted for learning and further professional development?
Why is reflective practice not a part of regular evaluation practice?
WHY IS RP NOT A PART OF REGULAR EVALUATION PRACTICE?
• Perhaps there is a lack of awareness of the purpose of reflection in evaluation?
• Perhaps it is unclear what reflective practice is in the first place?
WHAT IS RP, ANYWAY?
According to Peters (1991), “reflective practice
involves more than simply thinking about what
one is doing and what one should do next. It
involves identifying one’s assumptions and
feelings associated with practice, theorizing
about how these assumptions and feelings are
functionally or dysfunctionally associated with
practice, and acting on the basis of the resulting
theory of practice” (p. 89).
THE DATA MODEL FOR REFLECTION
HOW HAS DATA BEEN USED AS A REFLECTIVE TOOL?
• Mainly to guide individual and group reflection on a variety of work-related issues and to train teams of workers.
• Also as a major component of action research projects in community education programs, business applications, and higher education.
DESCRIBE
• A detailed account of the situation, task, or incident that has happened in one’s practice.
• Concerned with identifying the specifics of the situation at hand.
– The context of the situation, the setting in which it occurred, who was involved, etc.
• Use as much detail as possible, so as to reflect on
the whole of the problem and its setting.
• Everything you know about the situation and context is key to this step of the reflective process.
• This might prove harder than one might expect.
– A description involves only accounts of what actually happened.
• Description is devoid of any why inclinations or explanations, so that a clear picture of the problem is painted (Peters, 2009).
• This portion of the DATA model can be especially beneficial if one includes colleagues, whether evaluators or stakeholders.
ANALYZE
• The why that it was necessary to avoid during the Describe phase gets its turn.
• Identification of the factors contributing to the situation.
• Examination of one’s assumptions, biases, and feelings.
– Beliefs, rules, motives, and facts should be things the practitioner is aware of during this stage of reflection.
• Through the first two steps of DATA, the problem is at least clearly identified.
– The practitioner should have a very clear and holistic understanding of the problem, its context, and why it has come up in their practice.
• This will in turn produce a practical question of how to solve the problem.
– How, specifically, can I change my practice to produce a better outcome?
THEORIZE
• The focus is on answering the practical question introduced in the analysis.
• Refers to deriving a practical theory from the description and analysis in order to improve one’s practice.
– A practical theory refers to understanding the structure and implications of a theory’s use in professional practice.
• This does not need to be a scholarly theory derived from the literature, although previous literature may inform it.
• This theory comes from the practice in which one engages.
• At the end of Theorize, the practitioner should be able to answer: What am I going to do, and why is this solution better than other potential solutions?
• How many options do you have?
• Is there relevant literature or research to indicate that one option is better than the others?
• In practice, what has worked before, and how well?
• The Theorize portion of DATA is intended to provide practitioners with a theoretical solution to their problem.
ACT
• Taking action.
• What are you going to do about it, specifically?
• Moving forward based on a practical theory which is
derived from reflection on the situation, including the
what and the why.
• When one takes action, it is in the context and for the
betterment of their practice.
• This action tests the practical theory that was
developed via this reflective process.
• This can lead to further reflection on one’s practice and
further professional development.
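For evaluators who keep digital notes, the four steps above could be captured in a simple journal template. A minimal sketch in Python, assuming nothing beyond the model itself (the class name, field prompts, and sample entry are all illustrative, not part of the DATA model):

```python
from dataclasses import dataclass


@dataclass
class DataReflection:
    """One reflection entry structured by the four DATA steps."""

    describe: str = ""  # What happened? Context, setting, who was involved
    analyze: str = ""   # Why? Assumptions, biases, feelings, contributing factors
    theorize: str = ""  # Practical theory: what to do, and why this option
    act: str = ""       # The action taken, which then tests the theory

    def practical_question(self) -> str:
        # The Analyze step should end in a practical "how" question
        return f"How, specifically, can I change my practice? ({self.analyze})"


# Illustrative entry based on the case presented later in the talk
entry = DataReflection(
    describe="Funding agency changed reporting requirements a year in.",
    analyze="I assumed no major policy change would occur.",
    theorize="Renegotiate the agreement using the Program Evaluation Standards.",
    act="Hold a face-to-face discussion with primary stakeholders.",
)
print(entry.practical_question())
```

Because the model is recursive, an evaluator might revisit and revise any field of an entry as reflection continues; the template simply keeps the four steps distinct.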
Michael is a seasoned evaluator who has worked on many different evaluation projects over
the past 20 years. The following is a case of
Michael utilizing the DATA model of reflective
practice in order to understand and solve a
dilemma in his evaluation work.
At the beginning of the evaluation project:
• Contract awarded for 3 years of funding
– Serving a true community need
• Stakeholders and evaluators come together to define the
evaluation design, data collection, schedules, procedures
• Client not familiar with evaluation or working with
evaluators and very reluctant and cautious
– Displaying signs of confusing evaluator with auditor
• Client staffs project and project data collection and
reporting based on evaluation design
• About a year in, the funding agency changes its requirements:
– Reporting is vastly increased.
– The amount of data to be collected increases tenfold.
– Complications in collecting the data are greatly increased.
– The project itself is being redefined by the new data collection requirements.
• Changes mandated at the policy level resulting from inner turmoil in the funding agency.
• This issue requires a total renegotiation of the client’s data collection process as well as the evaluation design, breaking all of the prior agreements.
• Data collection is complex, requiring training that the client never anticipated.
Due to this issue, the evaluation is struggling.
Evaluative progress is not being made.
• Why is this situation occurring?
– Evaluator’s assumption that no major policy change
would occur was totally inaccurate; after 20 years of
practice, never has such a change been encountered
– Examination of how evaluator’s behavior could have
helped to create the situation
– Examine one’s anger and frustration
– Evaluator and client, and now multiple stakeholders
come to new understandings about what is expected
• Sense of a weak relationship, a weak tie, and a
mistrust in the stakeholders
Practical Question: How can we work to
reestablish trust in the evaluation and the program?
• Utilization of Program Evaluation Standards (as well as ethical
guidelines, Essential Competencies, Utilization Focused evaluation)
to decide on the most effective means of establishing trust.
– Some specific Standards: F1, F2, F4, U1, U2, U3, U4, U8, P2, P4, P7
• Discusses options with trusted colleagues
• After thinking about other options, including termination of the
project, Michael decides that the best strategy would be:
– Face to face discussion with primary stakeholders
• Review the status of the relationship prior to the change (contracting, working together)
• Review the relationship after the change
• Gauge the perspectives of stakeholders
• Is there a basis for continuing? Is there a desire to continue?
• Both the evaluation staff and the client must be willing to put the cards on the
table and understand that this was an imposed change.
• Michael acts on the basis of a practical theory.
• Renegotiation of agreement
• Re-establishment of trust through critical
conversation with stakeholders
USING DATA IN EVALUATION
• How did reflection on the situation help
Michael to come up with a plan of action?
– How could it have improved his professional
practice? Way of thinking?
• When is DATA practical for evaluators?
• Should RP be done alone, or is it best done with others?
USING DATA IN ACTION RESEARCH: THE DATA-DATA MODEL
• DATA-DATA has been used to plan and conduct more
than 125 projects in higher education, community
education, and business.
• Use the second DATA when you want more confidence in the results of the thinking and action that came out of the first DATA.
• The systematic design work and careful analysis in the
second DATA accomplishes this.
WHAT DOES REFLECTION PROVIDE FOR EVALUATORS?
• A way to step back from daily practice and to
create an encompassing understanding of the
issues and situations that we face
• Reflecting helps us to learn more about context
• Reflection provides us an opportunity to
investigate our own assumptions and how they
influence our actions.
• How can RP be integrated into evaluation work?
Can anyone think of a way that the DATA
model can be used in their evaluation work?
Tiffany Smith: firstname.lastname@example.org
John Peters: email@example.com
Gary Skolits: firstname.lastname@example.org
Patrick Barlow: email@example.com
Institute for Reflective Practice: firstname.lastname@example.org