Grace Kimble
8-10 October 2013
House of Astronomy, Heidelberg, Germany
Workshop 1 introduction, case studies and context for wiki


Published: UNAWE EU Astronomy Education Evaluation Workshop, Session 1
House of Astronomy, Heidelberg, October 8th 2013
Facilitator: Grace Kimble
Content can be freely used.
Slide 1. EU Universe Awareness International Workshop: Evaluation sessions
Grace Kimble, 8-10 October 2013
House of Astronomy, Heidelberg, Germany
Welcome! Please write: name, country, what you want from the workshop
Slide 2. Introductions
Name, country, what you want from the workshop
Slide 3. Workshop goals
Participants will:
1.1 Understand evaluation key ideas
1.2 Share evaluation case studies
1.3 Learn about evaluation context
2.1 Recap evaluation methods
2.2 Carry out (video) interviews
2.3 Consider data analysis
3.1 Review evaluation reports
3.2 Present demographic information using mapping software
3.3 Consider evaluation strategies
3.4 Present evaluation information
Slide 4. Workshop 1 Aims
1.1 Understand evaluation key ideas
1.2 Share evaluation case studies
1.3 Learn about evaluation context
Slide 5. Workshop 1 activities
-Using the wiki
-What you already know
1.1 To understand evaluation key ideas
-Overview: key words and organisation (GK)
1.2 To share evaluation case studies
-Case studies
-Present web links, evaluation findings and gaps
1.3 To learn about evaluation context
-Research about evaluation and examples (GK)
-Review: return to mind map
Slide 6. Workshop 1 activities
Day 1 aims:
-Using the wiki: evaluationunawe.wikispaces.com
Slide 7. Workshop 1 activities
-What you already know / questions you have
General reference: Personal Meaning Mapping
Falk, J. H. (2003). Personal meaning mapping. In G. Caban, C. Scott, J. H. Falk & L. D. Dierking (Eds.), Museums and creativity: A study into the role of museums in design education. Sydney: Powerhouse Publishing.
Slide 8. Specific astronomy education reference: Lelliott, 2008
Data collection outside and inside the classroom: Personal Meaning Mapping
University of the Witwatersrand, South Africa
Nonkululeko, Grade 7, before a visit to an astronomy science centre:
•Listed nine planets together with some brief facts, e.g. Jupiter is the biggest planet and Mercury is the closest planet to the Sun.
•Referred to stars as being “a lighting thing” created by God, and that they are our “friends, family and negbour” (sic).
•Stars being at the galaxy and Milky Way.
•Stated that space consists of open space, containing planets, stars, galaxy and the Milky Way.
When probed about her PMM, she confirmed that “God created stars so that it can shine at night”. Although she knew the term galaxy, she was unable to explain its meaning or its relationship to the term Milky Way. She further referred to a spaceship and rocket, although she found difficulty in expressing herself here. She also appeared to have differing ideas on aliens: having said she doesn’t believe in them in the structured interview, she mentioned that some planets have them in the PMM.
Slide 9. Lelliott, A. D., Rollnick, M., & Pendlebury, S. (2005). Investigating learning about astronomy - a school visit to a science centre. Paper presented at the Proceedings of the 13th Annual SAARMSTE Conference, Windhoek, Namibia.
After her visit to the astronomy science centre:
•“Saw which bottle goes high and low” - a reference to the ‘Coke bottle rockets’ which students used in an activity.
•Additional planets to the nine named ones.
•Additional facts about the nine planets.
•Black spots on the Sun.
•Various features of Mars: water, land, and orbit.
•A description of the Moon landing and the time taken to get there.
•A star bigger than the Sun.
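Pre/post changes in a Personal Meaning Map like Nonkululeko's are often summarised along dimensions such as extent (how many distinct concepts appear). A minimal sketch of scoring extent - the concept lists are invented for illustration, loosely echoing the case above, not real data:

```python
# Hypothetical pre- and post-visit concept lists from one learner's
# Personal Meaning Map (illustrative entries, not actual study data).
pre_concepts = {"planets", "stars", "galaxy", "Milky Way", "spaceship"}
post_concepts = pre_concepts | {"rockets", "sunspots", "Mars", "Moon landing"}

# Extent: the count of distinct concepts recorded on the map.
extent_gain = len(post_concepts) - len(pre_concepts)

# Which concepts were newly added after the visit.
new_ideas = sorted(post_concepts - pre_concepts)

print(extent_gain)  # 4
print(new_ideas)
```

The same set comparison extends naturally to the other dimensions mentioned on slide 39 (breadth, depth, mastery) once concepts are grouped into categories.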
Slide 10. How can children respond to new experiences?
Draw
Talk
Play games
Answer questions
Slide 11. Educator impact on teachers; teacher impact on children’s learning
Wouter Schrier and Erik Arends: teacher training session in Leiden
Dumfries Primary School: teacher interview, Dumfries, after training session by Libby McKearney and Mark Bailey
Slide 12. 1.1 To understand evaluation context and key ideas
-Overview: key words and organisation (GK)
QUANTITY = NUMERICAL OUTPUT
•Randomised controlled tests, e.g. pre/post test - knowledge
Example: Two-Group Pretest-Posttest Comparison Study

              Pretest score | Posttest score
Treatment A | Apre          | Apost
Treatment B | Bpre          | Bpost

•Affective responses: Likert scale
•Closed questions in surveys
•Multiple choice, e.g. Sadler (1992), administered to 1400 school students. Result: Project STAR curriculum materials
Benefits? Disadvantages?
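The two-group pretest-posttest comparison can be sketched numerically. The group names and scores below are invented for illustration; a real study would also test whether the difference in gains is statistically significant:

```python
# Hypothetical pre/post knowledge scores for two treatment groups,
# one score pair per participant (invented data, not from the workshop).
treatment_a = {"pre": [4, 5, 3, 6], "post": [7, 8, 6, 9]}
treatment_b = {"pre": [4, 4, 5, 5], "post": [5, 5, 6, 6]}

def mean_gain(group):
    """Average per-participant gain: posttest score minus pretest score."""
    gains = [post - pre for pre, post in zip(group["pre"], group["post"])]
    return sum(gains) / len(gains)

print(mean_gain(treatment_a))  # 3.0
print(mean_gain(treatment_b))  # 1.0
```

Comparing gains rather than raw posttest scores controls for groups that start at different baseline knowledge levels.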
Slide 13. 1.1 To understand evaluation context and key ideas
-Overview: key words and organisation (GK)
TEXT / VISUAL OUTPUT
•Interviews
•Observations
•Drawings
•Interpretive
•Images
•Video
Benefits? Disadvantages?
Slides 14-18. 1.1 To understand evaluation context and key ideas
-Overview: key words and organisation (GK)
Slide 19. Definitions
•Quantitative
•Qualitative
•Front-End
•Formative
•Summative
Slide 20. Case study
1. Who were the participants?
a) Age
b) Number of participants
2. What was the activity? Website address:
a) Goals
b) Description
c) Time of day
d) Activity length
e) Who delivered it?
f) Format (e.g. resource, school session, festival)
g) How was it advertised?
3. Where did it take place?
•Location
•Country
4. How did you evaluate it?
•What did you want to know?
•Who were you evaluating it for?
•Which methods did you use?
•What did you find out?
•What were the challenges?
•How did you communicate what you found?
•What are the implications of the evaluation?
http://goo.gl/GyaeCc
Slide 21. Workshop 1 activities
1.2 To share evaluation case studies
-Case studies
Please show a weblink about your activity if possible.
•What did you find out?
•Were there gaps in what you were able to find out?
Slide 22. Context: Levels of evaluation
•International Policy
•National Policy
•Regional policy
•Network of (in)formal learning organisations
•Cluster of schools
•Informal learning organisation
•Curriculum
•Enrichment Programme
•E-Learning programme
•Resource
•Intervention
•Group of informal educators
•Group of teachers
•Group of parents
•Informal educator
•Teacher
•Parent
•Group of children
•Child
Slide 23. Globalisation and Policy Research in Education (Rizvi, 2009).
Slide 24. View of knowledge
When technology illuminates objective knowledge, the role of the educator can focus on subjective knowledge.
Slide 25. View of knowledge
Patton 1997: ‘You get what you measure’
Measurement needs to adapt to what educators are doing in the current context.
Slide 26. UNAWE evaluation framework
Slide 27. UNAWE evaluation framework: Domains of learning
Slide 28. UNAWE evaluation epistemology
‘Scientific realism’ (Robson, 2002) moves beyond objective, modernist positivism to acknowledge the importance of social and historical factors.
Chatterji, 2009: Extended-Term Mixed-Method (ETMM) approaches
1. Pragmatic choice of methods
2. Mixture of qualitative and quantitative data; triangulation
3. Synthesis of a common framework to organise complementary data
4. Does not attempt to generalise between contexts; applicability of results is specific to context.
Moves beyond randomised controlled tests.
Slide 29. Issues in global evaluation
•Why? Purpose
•Acceptable evidence
•Role of stakeholders
•Standards
•Key questions
•Methods
•Collaboration
•Evaluator role
‘There are different and conflicting schools of thought on how to do educational evaluation’ (Schwandt, 2009).
‘At the core of the IOCE vision is the belief that evaluation as a practice can best be strengthened by the collective and professional efforts of colleagues working together in organised ways’ (International Organisation for Co-operation in Evaluation, 2008).
Slide 30. Tensions in globalised era (Levin-Rosalis et al., 2009:191)

Criteria    | Ontological perspective: Structure   | Human Agency
Function    | Control, supervision, accountability | Learning, understanding
Goal        | Standardisation, universality        | Variance, difference, diversity and peculiarity
Frame       | Structural/macro perspective         | Diagnostic, i.e. pupil level
Focus       | Products, conceptual definitions     | Processes, local meanings
Benefit     | Sorting, accountability              | Strengthening, autonomy
Outcomes    | Knowledge/professionalism            | Strengthening/autonomy
Methodology | Scientific, quantitative, RCT        | Responsive, diversified
Inquiry     | Analytic                             | Holistic
Locus       | External                             | Internal
Slide 31. Context: Levels of evaluation
•International Policy
•National Policy
•Regional policy
•Network of (in)formal learning organisations
•Cluster of schools
•Informal learning organisation
•Curriculum
•Enrichment Programme
•E-Learning programme
•Resource
•Intervention
•Group of teachers
•Group of parents
•Teacher
•Parent
•Group of children
•Child
Structural perspective / Human agency perspective
Key idea: democracy in evaluation
Slides 32-33. 1.3 To learn about evaluation research and examples
-Research about evaluation and examples (GK)
Slide 34. 1.3 To learn about evaluation research and examples
-Research about evaluation and examples (GK)
Democracy in UNAWE evaluation
Slide 35. Tensions in globalised era (Levin-Rosalis et al., 2009:191)

Criteria    | Ontological perspective: Structure   | Human Agency
Function    | Control, supervision, accountability | Learning, understanding
Goal        | Standardisation, universality        | Variance, difference, diversity and peculiarity
Frame       | Structural/macro perspective         | Diagnostic, i.e. pupil level
Focus       | Products, conceptual definitions     | Processes, local meanings
Benefit     | Sorting, accountability              | Strengthening, autonomy
Outcomes    | Knowledge/professionalism            | Strengthening/autonomy
Methodology | Scientific, quantitative, RCT        | Responsive, diversified
Inquiry     | Analytic                             | Holistic
Locus       | External                             | Internal
Slide 36. 1.3 To learn about evaluation research and examples
-Research about evaluation and examples (GK)
Local human agency perspective
Integration with structural perspective
Categorisation for communication
Slide 37. UNAWE evaluation framework: Domains of learning
Slide 38. Evaluation research notes
Slide 39. Workshop 1 activities
-Review: return to mind map
What’s new? Try out? To share?
Workshop 1 aims:
1.1 To understand evaluation context
Front-end, formative and summative evaluation
Quantitative and qualitative data
1.2 To share evaluation case studies
1.3 To place existing evaluation experiences in wider contexts
Analysis: extent, breadth, depth, and mastery
Slide 40. Three workshops
Aims for participants:
1.1 To understand evaluation key ideas
1.2 To share evaluation case studies
1.3 To learn about evaluation context
Tomorrow:
2.1 To recap evaluation methods
2.2 To carry out (video) interviews
2.3 To consider data analysis
3.1 To review evaluation reports
3.2 To present demographic information using mapping software
3.3 To consider evaluation strategies
3.4 To present evaluation information
