Program evaluation and outdoor education: An overview

This presentation discusses program evaluation in outdoor education. What is it? Why do it? What methods are there? How can data be analysed? How can results be used? We will consider several example program evaluation studies and available tools and resources. There will also be opportunity to workshop your own program evaluation needs.

Main presentation page: http://wilderdom.com/wiki/Neill_2010_Program_evaluation_and_outdoor_education:_An_overview

Slide notes
  • Session abstract: This session will discuss program evaluation in outdoor education. What is it? Why do it? What methods are there? How can data be analysed? How can results be used? We will consider several example program evaluation studies and available tools and resources. There will also be an opportunity to workshop your own program evaluation needs. More info: http://wilderdom.com/wiki/Category:NOEC/2010
    Biography: James currently lectures in the Centre for Applied Psychology at the University of Canberra and conducts research in outdoor education, experiential learning, the natural environment, and new technologies. He also edits . James previously taught outdoor education at the University of New Hampshire and was the Research Coordinator and a senior instructor at Outward Bound Australia. More info: http://wilderdom.com/JamesNeill.htm
  • Image source: http://commons.wikimedia.org/wiki/File:Information_icon4.svg (license: public domain). Image source: http://commons.wikimedia.org/wiki/File:Autoroute_icone.svg (license: CC-BY-A 2.5; author: http://commons.wikimedia.org/wiki/User:Doodledoo)
  • Image name: “Blackboard”. Image source: http://www.flickr.com/photos/8078381@N03/3279725831/. Image author: pareeerica, http://www.flickr.com/people/8078381@N03/. Image license: CC-BY-A 2.0, http://creativecommons.org/licenses/by/2.0/deed.en
  • The necessity argument runs something like this: if we don't start building systematic evaluation of our programs now, we will eventually (probably) be increasingly forced by outside providers to do so. The moral argument runs something like this: if we purport to affect psychosocial aspects of people's lives, then we have a moral obligation to maximise, in a rigorous and thorough way, the processes and outcomes of our programs. (If we teach canoeing badly but the kids survive, does it matter? But if we teach about personal lives badly, we potentially damage core aspects of a person's being. Therefore we are as morally obligated, if not more so, to be as thorough and rigorous in our educational design, our training of facilitators, and our evaluation of outcomes as we are in thoroughly researching appropriate safety procedures.) Safety audits are now common, but educational auditing is not, though it is becoming more common. Note that in Brookes' examination of fatalities in Australian outdoor education since the 1960s, staff were identified as one of the highest risks for fatalities because they are inclined to go outside program policies and normal behaviours. Likewise, whilst we discipline our students through systematic processes of feedback and reflection, we tend not to subject the educational quality of our programs to the same rigour of analysis.
  • We are all natural researchers and evaluators; e.g., we are always assessing other people and judging value. If you are asked to buy new harnesses for a ropes course, you naturally “research” the answer to the question. But when it comes to assessing the quality of our programming, do we investigate as systematically?
  • Source: Priest, S. (1999). National life cycles in outdoor adventure programming. The Outdoor Network, 10(1), 16-17, 34-35. Priest (1999) figure and table.
  • Progression through the 7 stages can start from any point and need not progress linearly (e.g., a change in leadership can cause a big leap up or down).
  • Other options: grants; a minimalist “just do it” approach – just experiment with an end-of-program survey at the end of your next program (see the sketch below).
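    As a concrete illustration of the minimalist option, here is a minimal sketch (in Python) of tallying a hypothetical one-off end-of-program survey. The items, ratings, and the 1-8 scale are illustrative assumptions, not taken from the presentation.

      # Minimal sketch: tallying a hypothetical end-of-program survey.
      # Item names and ratings below are illustrative only.
      from statistics import mean

      # Each item maps to the ratings (1-8 scale) given by participants.
      responses = {
          "I enjoyed the program":        [7, 6, 8, 5, 7, 6],
          "I learned new outdoor skills": [6, 5, 7, 6, 8, 7],
          "I would recommend it":         [8, 7, 8, 6, 7, 8],
      }

      # Report the response count and mean rating for each item.
      for item, ratings in responses.items():
          print(f"{item}: n={len(ratings)}, mean={mean(ratings):.1f}")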
  • http://managementhelp.org/evaluatn/fnl_eval.htm#anchor1585345

Transcript

  • 1. Program evaluation & outdoor education: An overview. Dr James Neill, Centre for Applied Psychology, University of Canberra. 16th National Outdoor Education Conference, Perth, Western Australia, Jan 10-13, 2010
  • 2. Overview
    • Discuss program evaluation in outdoor education, e.g.:
      • What is it?
      • 3. Why do it?
      • 4. What methods are there?
      • 5. How can data be analysed?
      • 6. How can results be used?
  • 7. Overview
    • Consider example program evaluation studies and available tools and resources.
    • 8. Opportunity to workshop your own program evaluation needs.
  • 9. Contacts & resources: http://wilderdom.com/wiki/Evaluation and http://wilderdom.com/wiki/Research . Email: [email_address]
  • 10. Resources
  • 11. Resources http://managementhelp.org/evaluatn/fnl_eval.htm
  • 12. What is it?
  • 13. What is evaluation? E-value-ation (a systematic process of determining value)
  • 14. Research vs. evaluation
    • Research and evaluation are ways of answering questions .
    • 15. Research = aims to generalise findings to outside world
    • 16. Evaluation = findings are specific and restricted (more info: Priest, 2001)
  • 17. Why do it?
  • 18. Outdoor education life cycle: Role of research & evaluation. As the field matures, there is a trend towards more programs becoming involved in research and evaluation.
  • 19. International Life Cycle (Priest, 1999). Source: Priest, S. (1999). National life cycles in outdoor adventure programming. The Outdoor Network, 10(1), 16-17, 34-35.
  • 20. Organisational use of experiential learning cycle principles
  • 21. Why evaluate? Two main motivations:
    • NECESSITY
      • we have to (for others)
    • MORALITY
      • we want to (to improve/develop)
  • 22. Hierarchy of research/evaluation motivations:
    • -1. Intentional disinterest / non-engagement
    • 0. Denial or non-awareness
    • 1. Forced / compulsory
    • 2. Marketing and funding purposes
    • 3. Improve the quality of the program
    • 4. Contribute to developing profession
    • 5. For the sake of humanity & the cosmos
  • 23. Types/models of evaluation (Priest, 2001)
    • Needs assessment : What are some gaps that the program will fill?
    • 24. Feasibility study : Given the constraints, can the program succeed?
    • 25. Process evaluation : How is the implemented program progressing?
    • 26. Outcome evaluation : Were program goals and objectives achieved?
    • 27. Cost analysis : Was the program financially worthwhile or valuable?
    • 28. Research : Will the program work elsewhere?
  • 29. Ways of gathering data
    • Questionnaires, surveys, checklists
    • 30. Interviews
    • 31. Documentation review
    • 32. Observation
    • 33. Focus groups
    • 34. Case studies
  • 35. Models for getting started
    • Internal
      • Use existing staff resources
      • 36. Advantages: cost-efficient; high level of specific program knowledge
    • External
      • Consultant, university, another organisation, graduate student, etc.
      • 37. Advantages: independent; expertise
    • Collaborative funding applications
  • 38. Example evaluation studies
    • Outward Bound Australia Colonial Foundation Study (Neill, 2001)
    • 39. Young Endeavour (Berman, Finkelstein, & Powell, 2005)
    • 40. Outdoor Education Group (Learning Journeys)
    • 41. Melbourne Children's Institute Resilience/Mental Health Study (2010-2012)
  • 42. A typical evaluation process
    • Define purpose of evaluation
    • 43. Audience – who needs to know?
    • 44. Identify stakeholders
    • 45. Establish program objectives & their operational definitions
    • 46. Identify data collection methods
    • 47. Establish research designs
    • 48. Develop & pilot measures
    • 49. Collect data
    • 50. Analyse data (see the analysis sketch after this list)
    • 51. Report / disseminate - & get feedback
    • 52. Consider/Act on recommendations
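    For the “analyse data” step above, a common first pass on outcome data is a pre/post comparison with a standardised effect size. The sketch below is illustrative only: the ratings are hypothetical, the cohens_d helper is our own, and the formula is the standard Cohen's d using a pooled standard deviation.

      # Illustrative sketch: pre/post outcome analysis with Cohen's d.
      # The data and helper below are hypothetical, not from the presentation.
      from statistics import mean, stdev

      pre  = [4, 5, 3, 5, 4, 6, 5, 4]   # participant ratings before the program
      post = [6, 6, 5, 7, 6, 7, 6, 5]   # the same participants' ratings afterwards

      def cohens_d(before, after):
          """Standardised mean difference using a pooled standard deviation."""
          n1, n2 = len(before), len(after)
          pooled_var = ((n1 - 1) * stdev(before) ** 2 +
                        (n2 - 1) * stdev(after) ** 2) / (n1 + n2 - 2)
          return (mean(after) - mean(before)) / pooled_var ** 0.5

      print(f"pre M={mean(pre):.2f}, post M={mean(post):.2f}, "
            f"change={mean(post) - mean(pre):.2f}, d={cohens_d(pre, post):.2f}")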
  • 53. Define purpose of evaluation
    • What is your motivation?
      • Evaluation or research?
      • 54. Why do you want to evaluate?
    • What do you want to do with the evaluation?
    • 56. What is the research question?
    • 57. Note: It's not research if you can't be surprised by the results.
  • 58. Audience: Who needs to know?
  • 66. Identify stakeholders: Who has valuable information to help develop a comprehensive picture?
  • 72. Program objectives & operational definitions
  • 73. Ways of gathering data
    • Questionnaires, surveys, checklists
    • 74. Interviews
    • 75. Documentation review
    • 76. Observation
    • 77. Focus groups
    • 78. Case studies
  • 79. Program objectives & operational definitions (purposes/outcomes and their descriptions):
    • Recreational, Physical: Leisure (fun, relaxation, enjoyment), physical fitness, outdoor skills training
    • Educational: Direct (subject knowledge) and indirect (e.g., academic self-concept)
    • Developmental: Personal and social development, life skills and functionality of behaviour
    • Therapeutic, Redirectional: Improve dysfunctional personal and group behaviour patterns
    • Environmental: Environmental attitude, knowledge, and behaviour
  • 80. Dissemination / Reporting forms
    • Academic article
    • 81. Conference presentation
    • 82. Report: Technical? Non-technical?
    • 83. Executive summary
    • 84. Seminar / briefing
    • 85. News release / Popular article
    • 86. Student thesis
    • 87. Website
    • 88. Video
  • 89. References
    • Berman, M., Finkelstein, J., & Powell, M. (2005). Tall ships and social capital: A study of their interconnections. International Journal of the Humanities, 2(2).
    • 90. Priest, S. (2001). A program evaluation primer. Journal of Experiential Education, 24(1), 34-40. Retrieved January 10, 2010, from http://academic.evergreen.edu/curricular/atpsmpa/Priest.pdf