1. Impact Assessment Workshop
   John Sener
   Joan D. McMahon
   July 27 - August 5, 2011 -- Module 1

2. Workshop Outcomes
   - Justify the relevance of conducting impact assessment.
   - Identify the tools and strategies for building impact assessment into courses.
   - Develop a project plan to incorporate impact assessment into a course.

3. Module 1: What Is Impact Assessment and Why Is It Valuable?

4. Module 1 Objectives
   - Justify the relevance of conducting impact assessment.
   - Differentiate “impact” terms.
   - Clarify how action research supports impact data collection methods.

5. Traditional “Institution-Driven” Assessment
   - Separate student and faculty assessment
   - Measures KSAs (somewhat)
   - Often aims for “objectively” comparable information
   - Extracts data primarily for institutional purposes
   - Traditional assessment techniques (grades, completion rates, standardized tests, student and faculty satisfaction surveys, et al.)

6. Impact Defined
   - Measure of the tangible and intangible effects (consequences) of one thing's or entity's action or influence upon another. (http://www.businessdictionary.com/definition/impact.html)
   - That which impresses, or exercises an effect, action, or agency; appearance; phenomenon.
   - Influence or effect on the senses or the intellect; hence, interest, concern. (http://www.brainyquote.com/words/im/impression176793.html)

7. What Is Impact Assessment?
   Educational impact:
   - Force exerted by a new idea, concept, or ideology
   - Relevance to the real world (adult learning theory)
   - The “so-what” factor
   - Focus on effects or influence outside the institution (learners, lives, the world)

8. Impact Assessment: Key Characteristics
   - Constructive (rather than extractive)
   - Produces individualized, subjective outcomes
   - Engages learners and instructors in the process
     - Often action research-based
     - Collects insights and the student reflective voice
   - Treats the faculty and learner experiences as interconnected

9. What Is Action Research?
   - A practical approach to professional inquiry
     - Practice = professional actions (doing + the why of doing)
     - Action = changing practice by finding solutions to problems
     - Research = understanding what’s going on in the process
   - Cuts across the theory-practice divide
   - Improves practice, understanding, and the situation
   - Dialogues between action and intention; practice and values

10. How Action Research Supports Impact Data Collection
    - Bridges theory and practice
    - Professionalizes teaching
    - Supports reflection and insight generation

11. Student Reflective Voice
    - Guided reflection processes and protocols
    - Scholarly personal narrative

12. Impact Is Not Just Reflective
    Reflection ≠ Impact

13. Reflection + Insight = Impact
    Insightful reflection => more permanent impact
    Reflection → Insight → Impact

14. We’re Doing It!
    - We’re already collecting impact data, and we don’t recognize it.
    - We aren’t categorizing it in terms of impact as it pertains to our disciplines.

15. Proof of Student Learning Cut Short
    - We give team assignments without team training.
    - We have discussions but don’t develop communities.
    - We give examples but fail to ask “so what?”
    - We go on field trips and then dismiss the class.

16. Huh?
    - It would be like having you take this workshop and asking if you liked it.
    - That is a happiness index...

17. Kirkpatrick Scale
    - L1: Reaction (learner satisfaction, “smile tests”)
    - L2: Learning (learning effectiveness, perceived/real)
    - L3: Behavior (application of learning to the environment)
    - L4: Results (effects on the environment resulting from learner performance)

18. Why Is Impact Assessment Valuable?
    - Enhance the value of education
    - Prepare students for future challenges
    - Obtain more useful institutional data
    - Engage students more deeply by meeting their needs
    “Success depends on knowing what works.” -- Bill Gates, Co-Chair, Bill & Melinda Gates Foundation
Slide notes:
• These objectives are not necessarily covered in this order; they may be addressed concurrently.
• Traditional learning assessment is useful, but it is also limited and limiting because it focuses primarily on institution-centered outcomes:
  - Grades, course completion rates, and other student achievement measures are framed from an institutional point of view.
  - Most traditional learning assessment techniques, such as tests and student satisfaction surveys, are extractive, designed to extract data for institutional purposes; benefits to students are indirect, and the impact beyond this frame of reference is largely ignored.
  - Traditional learning assessment often aims for “objectively” comparable information through standardization, for instance through a common test. Results are commonly used to serve institution-oriented needs such as comparing results across courses or programs, increasing retention and graduation rates, etc.
  - Traditional learning assessment also tends to measure KSAs (knowledge, skills, attitudes) in institution-oriented terms. Numerical and letter grades are comparative sorting mechanisms which mask individual learner achievement, and measurement of skill attainment in higher education is infrequent.
  - Most traditional learning assessment does not really engage learners in the process. Student satisfaction surveys limit themselves to getting reactions from students. When it asks learners to be reflective (which is relatively rare), traditional learning assessment usually limits learners to reporting their reflections. Learner attitudes are otherwise largely ignored.
  - Traditional learning assessment usually considers assessment of the faculty experience (if it considers it at all) as totally separate from assessment of the student experience.
• So what makes impact assessment different? First, let’s start by reviewing some definitions of the word “impact.” As these definitions illustrate, “impact” implies several key concepts: an agency which acts or influences in a way that has measurable effects.
• The term “impact assessment” is most commonly used in the context of environmental studies, specifically in terms of “environmental impact assessment.” There the term has a somewhat different meaning, as illustrated by this definition from the International Association for Impact Assessment (http://www.iaia.org/publicdocuments/special-publications/Principles%20of%20IA_web.pdf): “The process of identifying, predicting, evaluating and mitigating the biophysical, social, and other relevant effects of development proposals prior to major decisions being taken and commitments made.” This definition focuses on trying to guess (predict) possible bad consequences of actions ahead of time and trying to figure out ways to lessen (mitigate) those negative consequences. Our definition of impact assessment as applied to education has some common elements, but it also has a rather different focus. Educational impact assessment focuses on identifying and evaluating effects, but it does not try to predict effects ahead of time or focus on identifying and mitigating possible negative consequences. Instead, it focuses on identifying, assessing, and even generating the impacts of the educational experience on its participants. In this sense, the concept of “impact” as applied to (educational) impact assessment has these important qualities:
  - It views “impact” as the product of the force exerted by a new idea, concept, or ideology, with the underlying assumption that education is fundamentally about exposing learners to new ideas and confronting them with those ideas.
  - It focuses on real-world relevance, borrowing from adult learning theory; that is, it contextualizes the educational experience within a larger frame of reference.
  - It focuses on making meaning (the “so-what” factor): what is the meaning of the educational experience? What is its significance, and why does it matter? (In other words, “so what?”)
  - Since it views the educational experience in a larger context, it also concerns itself with the effects or influence of the educational experience beyond the classroom walls. Specifically, impact assessment considers the impact of the educational experience on the learners themselves, on their lives (career, future education, other life spheres), and even on the world itself (how has this educational experience made the world a better or different place?).
• As a result, impact assessment has several key characteristics:
  - It is primarily constructive, enabling learners and faculty to build assessments which they find personally valuable.
  - It aims to produce individualized outcomes based on the subjective experience of each student, but within common frames of reference (course, career, life, etc.).
  - It depends on engaging learners in the process. More specifically, impact assessment depends not only on learner reflection but also on how learners mine or produce insights from that reflection, and how that enables them to make meaning from their learning in the process.
  - It also requires instructors to engage in the process.
  - It considers the faculty experience and the learner experience as two integrated aspects of the same process; while they can be considered separately, their interconnectedness is not ignored.
  The key ingredients of impact assessment are action research, engaging students in a supported reflection process, and collecting and mining insights.
• Impact assessment can be a form of action research. Action research is a practical approach to professional inquiry. The aim is to improve one’s practice by analyzing it in context, identifying elements for substantive change, and developing ways to enact those changes. “Practice” simply means one’s professional actions: both what you do and why you think you should do things the way you do. Action research can be thought of as “research on action which uses action as a research tool.” Confusing? Here’s a simpler explanation: the “action” part of action research comes from its focus on changing practice; the “research” part comes from its focus on using the process as a deliberate attempt to better understand one’s practice. Thus, action research is an effective means of bridging the “theory-practice” divide which so often plagues education.

Action research aims to improve practice and one’s understanding of that practice, and it also aims to improve the “situation” itself, that is, the environment in which education is situated. This includes not just teaching and learning, but all the elements which interact with that environment. In the case of education, this can include interacting life spheres such as career, work, or family, as well as various stakeholders. Thus, action research can extend impact assessment to measure broader impacts which have value to learners and instructors, as well as to others who care about the related outcomes.

Action research is arguably a superior method for educational assessment relative to experimental and quasi-experimental methods. In recent years, those methods have come to be labeled (misleadingly, in our opinion) “evidence-based” and unfortunately are often viewed as the best or most “rigorous” methods. Action research simply broadens the range of acceptable “evidence” and provides a different form of rigor when done well. But the most important characteristic of action research which can make it a superior approach is that it recognizes that practice is always influenced by context. As psychologist Fritz Perls noted some decades ago, “Nothing has a meaning without a context.” Action research provides a way of working through concerns which offers practical solutions derived from the specific circumstances of your practice. Other solutions may have merit but not exactly fit your individual situation.

Action research can also be viewed as a process consisting of two important dialogues: one between the action and the intentions behind that action, and one between the practice and the values which animate that practice. There are a variety of action research methods which can be used. For some examples and more details, see Action Research in Education, parts 1-4, at http://www.edu.plymouth.ac.uk/resined/actionresearch/arhome.htm . For a more advanced theoretical description of some of the thinking behind impact assessment, also see What is Action Science? at http://www.actionscience.com/actinq.htm .
• Action research supports impact data collection in the following key ways: 1) It provides an effective means of bridging the gap between theory and practice by combining the two in the same process. 2) It professionalizes teaching by putting the means of research and knowledge production in the hands of instructors as practitioners, who can use what they learn to improve their own teaching and share it with others. 3) The process can also be a form of impact assessment in and of itself, in that it provides a means for instructors to reflect and generate insights as they conduct the action research process.
• Impact assessment uses a guided reflection process centered on student responses to statements. You’ll learn more about how to compose such statements in Module 2. There are many approaches and techniques for developing a guided reflection process. For example, ASCD offers a simple guided reflection protocol (see http://www.ascd.org/publications/educational-leadership/may99/vol56/num08/Reflection-Is-at-the-Heart-of-Practice.aspx for details). Another technique for supporting guided reflection is scholarly personal narrative (SPN), a research methodology that blends the rigor of traditional scholarship with the writer’s personal experience. Sometimes called personal ethnography or autobiographical scholarship, SPN starts with the writer at the center and radiates outward. To read more about scholarly personal narrative (as developed by Robert Nash at the University of Vermont), see this article at http://www.wihe.com/displayNews.jsp?id=455 .
• However, focusing on reflection alone is not always enough to elicit impact. An additional element is needed...
• Reflection needs to generate insight in order to have a more permanent impact. The heart of impact assessment is enabling learners to identify insights that reveal the impact of their learning. This is necessary because, according to Harvard Business School Professor Emeritus Chris Argyris, insight is not something that can be taught (i.e., attained via direct instruction). Techniques such as scholarly personal narrative help to get at the insight that comes with learning. Module 2 of this workshop will focus on techniques for building impact assessment into courses by composing questions which enable learners to mine and generate insights.
• Perhaps at this point, you may be thinking that impact assessment sounds impossibly abstract or complicated. Let us assure you: it’s not. It’s just -- subtle. Impact assessment is a relatively small shift from traditional learning assessment, but it is one which can make a huge difference, the way the right spices can make all the difference between a bland, forgettable dish and a truly tasty, memorable one. The reality is that impact assessment is not really anything new. No doubt you have experienced impact assessment many times in your educational career. Anytime we reminisce about teachers who Changed Our Lives, or recall a pivotal moment in a lecture, seminar, or project which changed our thinking or our career direction, or simply take the time to derive insights about how our learning or teaching experience has been important to us, we are doing impact assessment. In fact, it’s likely that you’re already collecting impact data without realizing it; it’s just that you aren’t categorizing it in terms of impact as it pertains to your discipline.
• Here are some examples of what Joan likes to call “proof of student learning cut short.” In each case, we lay the groundwork for doing impact assessment but stop short of actually following through on the process. For example, we set up learning experiences with real-world relevance (e.g., field trips) but neglect to ask students to figure out how that relevance impacted them, or to tell us about it. We sometimes provide relevant examples but fail to ask about the significance of those examples. We develop processes for eliciting impact assessment but often fail to support those processes adequately or completely. The result is that we limit our chances to prove student learning, particularly those deeper forms which have impact not only in the course, but in other important ways.
  • Student satisfaction surveys also commonly stop short of impact assessment. For example, if we were to ask you at the end of this workshop how you liked it, that would be a happiness index (such surveys are often called “smile tests”) -- we would learn about your reaction to the workshop, but we wouldn’t learn very much about how the workshop impacted you (unless you happened to be so kind as to volunteer that information).
• Instead, we could ask questions which get at deeper levels of impact. One common scale for doing this is the Kirkpatrick Scale, which posits four levels of learning. Note, for example, how the following feedback from one class participant illustrates impact at Kirkpatrick’s levels 3 (behavior) and 4 (results): “After my first online team building course, I learned a great deal about others through the experiences I shared with my teammates. Most importantly, I learned that when you’re part of a team, you must put the team first and establish your commitment level. This was very similar to the sports teams I’ve played on throughout my life. Initially, this course was not a top priority. Teaching, coaching, and being a new husband consumed most of my time. However, after weeks of seeing the commitment level from my team members, whose lives are just as busy, if not more hectic than mine, I wanted to make the class a priority. I’ve reached a point by the end of the semester where I want to work hard for my team members and not let them down. Overall, there are little adjustments we could make to improve our performance. For example, I like Kim’s idea of reviewing roles on a regular basis. However, for the most part I would not change much for the simple fact that our team went through all stages discussed in the course and it allowed us to get to the level of performance we are at currently. I don’t like confrontation, but without the storming we experienced, I don’t think we would be as strong of a team. I think Team Eagle went through a natural progression and is on the verge of becoming a high performance team. We understand what everyone brings to the table and could complete any task assigned to us in a very efficient manor [sic].”
• The purpose of our workshop is to show you how to cultivate impact assessment. At present, such moments are random, haphazard, and too infrequent; we tend to ‘harvest them wild’ if we pay attention to them at all. As a result, we lose out on a lot of opportunities to enhance the value of our educational experiences by failing to do impact assessment. So why is impact assessment valuable?
  - We think the ability to do impact assessment is far more important to cultivate in today’s world because education itself is more important, and impact assessment is one of many techniques which can enhance the value of education for both learners and instructors.
  - We also believe that impact assessment will help prepare today’s students for the challenges they will face in their futures. Developing the capacity to reflect, to mine and produce insights, and to make meaning out of one’s experience is better preparation to encounter jobs not yet invented, to solve problems not yet formulated, to invent knowledge, and to collaborate with a diversity of partners in a global, multicultural world.
  - We also believe that impact assessment can enhance institutional assessment by providing methods to obtain more useful data which serves institutional purposes, while engaging students more deeply by meeting their needs.

Impact assessment is ultimately a way of increasing the success of education and its participants: learners, instructors, and ultimately institutions and other stakeholders. As Bill Gates noted, “success depends on knowing what works.” Impact assessment is a way to figure out ‘what works’ for education’s participants in a deeper and more meaningful way.

But perhaps the best reason why impact assessment is valuable is that it is engaging. A story in Daniel Pink’s recent book Drive (pp. 137-138) illustrates the power of this principle at work. A University of Pennsylvania psychology professor did a study of call center representatives for a university fundraising operation. Callers were divided into three groups:
  - One group read stories from previous employees about the personal benefits of working on this job.
  - One group read stories from people who had received scholarships from the university fund and who described how their lives were improved as a result.
  - One group read no stories (the control group).
The first group was no more successful than the control group in raising funds, but participants in the second group obtained more than twice as many pledges and raised more than twice as much money as the other groups. As Pink notes, “A brief reminder of the purpose of their work doubled their performance.” This finding mirrors other research findings and theories such as Herzberg’s motivation-hygiene theory (see http://www.netmba.com/mgmt/ob/motivation/herzberg/ for details) and Deci and Ryan’s self-determination theory (see http://www.psych.rochester.edu/SDT/ for details). Also see Chapters 1 and 3 of Drive for a more detailed description of how what Pink calls “Motivation 3.0” works. Impact assessment expands the domain of inquiry to include possible motivating factors outside the classroom. But more importantly, it is built on the assumption that activating learners’ intrinsic motivation by engaging them more deeply in the process is a more valuable way to assess their educational experience.