Designing Educational Program (Curriculum) Evaluations: Key Considerations and Challenges

Dr. Dorothy Harnish, researcher and program director/evaluator in the field of education programs,
United States of America

Transcript

  • 1. DESIGNING EDUCATIONAL PROGRAM EVALUATIONS: KEY CONSIDERATIONS AND CHALLENGES
    IEFE 2013, Riyadh, Saudi Arabia, February 18-22, 2013
    Presenter: Dr. Dorothy Harnish, USA
  • 2. Purpose of the Evaluation
    To determine:
    - Extent of program implementation
    - Stakeholder satisfaction
    - Impact/outcomes of program
    - Improvements needed
    - Cost-benefit
    - Continue/discontinue, expand/revise
  • 3. Evaluation Questions
    - Is the program being implemented as it was designed/intended? Has each of the key elements of the program been put in place? If not, why not?
    - What changes have occurred in teaching methods, student behaviors, learning outcomes, or school processes? What evidence exists for this?
    - Are those implementing the program (e.g., teachers, administrators) sufficiently trained to use the new methods, materials, or equipment?
  • 4. Evaluation Questions
    - Are resources (e.g., materials, technology, time, staffing) sufficient to implement all parts of the program?
    - How satisfied are parents, students, teachers, and administrators with the program and results? What problems have been identified by stakeholder groups?
    - What changes are needed to improve the program?
    - Do the results justify continued expenditure of funds on this program or expansion to other sites?
  • 5. Information Sources
    - People
    - Products
    - Processes
    - Documents/Materials
    - Importance of "triangulation" through multiple sources of information
  • 6. Data Collection
    - Methods determined by evaluation purpose and questions (a simple question-to-method mapping is sketched after this slide)
    - Qualitative and quantitative methods:
      - On-site field observations
      - Individual interviews
      - Focus groups
      - Questionnaires
      - Document or program materials review
      - Student test or performance data
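
To make the triangulation point concrete, here is a minimal sketch in Python of a plan that ties each evaluation question to more than one data-collection method. The questions and method choices are hypothetical illustrations, not the presenter's own evaluation plan.

    # Hypothetical mapping of evaluation questions to multiple data-collection
    # methods, so that each question can be answered from more than one source.
    evaluation_plan = {
        "Is the program implemented as designed?": [
            "on-site field observations",
            "document or program materials review",
        ],
        "How satisfied are stakeholders with the program?": [
            "survey questionnaires",
            "focus groups",
            "individual interviews",
        ],
        "What changes have occurred in learning outcomes?": [
            "student test or performance data",
            "document review",
        ],
    }

    # Print the plan as a simple checklist for the evaluation team.
    for question, methods in evaluation_plan.items():
        print(question)
        for method in methods:
            print(f"  - {method}")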
  • 7. Method: Field Observation
    - First-hand, direct observation of the program at implementation sites
    - An outside evaluator's presence affects daily activities in the school; each visit captures only a "snapshot"
    - Checklists, observation schedules, rating scales, and protocols
  • 8. Method: Individual Interview
    - Information from key program participants with diverse experience and opinions on the program
    - Open-ended questions (how, what, when, where, why) address key evaluation areas
    - Interview protocol, note-taking, recordings
  • 9. Method: Focus Group
    - Small-group discussion by key stakeholders (8-10 persons)
    - Standardized procedures, a script, and a set of questions (protocol) used by a trained facilitator
    - Open-ended questions probe responses in key evaluation areas
    - Audio recordings, written transcripts
  • 10. Method: Survey Questionnaire
    - Quantifiable information from large numbers of diverse participants (satisfaction, impact/changes, problems, suggestions for improvement)
    - Online/web-based survey software reduces the cost of data collection, compilation, and reporting
    - Quality issues:
      - Questionnaire length
      - Item response options
      - Neutral, clear, focused wording of questions
      - Sampling concerns, representativeness of respondents
      - Response rate (see the sketch following this slide)
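
A minimal sketch of two of the survey-quality checks listed above, response rate and item-level summaries. All counts and Likert-scale responses here are hypothetical; none of the figures come from the presentation.

    from statistics import mean

    # Hypothetical invitation and completion counts for the questionnaire.
    invited = 400
    completed = 263

    # Response rate: completed questionnaires as a share of those invited.
    response_rate = completed / invited
    print(f"Response rate: {response_rate:.1%}")

    # Hypothetical item responses on a 1-5 satisfaction scale
    # (1 = very dissatisfied, 5 = very satisfied).
    item_responses = {
        "Satisfaction with program materials": [4, 5, 3, 4, 4, 2, 5],
        "Satisfaction with teacher training": [3, 3, 4, 2, 4, 3, 5],
    }
    for item, scores in item_responses.items():
        print(f"{item}: mean = {mean(scores):.2f} (n = {len(scores)})")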
  • 11. Method: Document Review
    - Examination of relevant program-related documents:
      - Program reports
      - Sample teaching/learning materials
      - Curriculum material, technology equipment
      - Training guides
      - Evidence of implementation activities and results
    - Checklists and rating scales allow systematic review
  • 12. Method: Testing/Assessment
    - Direct measures of student knowledge to document progress or outcomes:
      - Standardized, norm-referenced, criterion-referenced tests
      - Teacher-developed instruments measuring student achievement levels
      - Student assignments, projects, or performances rated by instructors
    - Triangulation and use of multiple, alternative measures of student learning
    - Requires specialized knowledge of test construction and analysis of data
  • 13. Conducting the Evaluation
    - Logistical considerations: timeline for activities, number/type of sites, participants
    - External, multiple trained evaluators
    - Pilot studies
    - Scheduling visits, accessing information
    - Time and resource commitments
    - Communications
  • 14. Data Analysis & Reporting
    - Experimental, quasi-experimental, longitudinal, and qualitative analyses
    - Pre-post analyses to assess growth and impact (see the sketch following this slide)
    - Comparison groups
    - Randomization to control extraneous variables
    - Generalization of findings
    - Difficulty of cause-effect conclusions
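
A minimal sketch of a pre-post comparison with a comparison group, as described above. The test scores are entirely hypothetical; a real evaluation would add significance testing and the randomization and controls the slide mentions before drawing cause-effect conclusions.

    from statistics import mean

    # Hypothetical pre- and post-program test scores for the same students.
    program_pre = [52, 60, 48, 70, 65, 58]
    program_post = [61, 72, 55, 78, 70, 66]

    # Hypothetical scores for a comparison group that did not receive the program.
    comparison_pre = [55, 62, 50, 68, 64, 59]
    comparison_post = [57, 63, 53, 69, 65, 60]

    # Mean gain (post minus pre) in each group.
    program_gain = mean(post - pre for pre, post in zip(program_pre, program_post))
    comparison_gain = mean(post - pre for pre, post in zip(comparison_pre, comparison_post))

    print(f"Mean gain, program group:    {program_gain:.1f}")
    print(f"Mean gain, comparison group: {comparison_gain:.1f}")
    # A larger gain in the program group is suggestive of impact, not proof of it.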
  • 15. Presentation of Findings
    - Regular contact with program personnel, interim reports or meetings
    - Written reports, summative findings
    - Simplified, graphic presentation of results
    - How the evaluation was carried out
    - Recommendations that address evaluation questions, based on key findings
  • 16. Challenges of Educational Evaluations
    - Resistance to change
    - Lack of access to information
    - Determining actual use of new program methods/materials in classrooms
    - Availability of student assessment data
    - Use of appropriate evaluation methods
    - Trust, confidentiality concerns
    - Data-based decision making