Evaluación ILE Introduction to the Course
Evaluación ILE Introduction to the Course Presentation Transcript

  • 1. EVALUACIÓN EN EL PROCESO DE ENSEÑANZA DEL INGLÉS COMO LENGUA EXTRANJERA (Evaluation in the Process of Teaching English as a Foreign Language). Prof. Rosynella Cardozo R., Prof. Jonathan Magdalena. Instituto Pedagógico de Caracas, Departamento de Idiomas Modernos, Cátedra de Lingüística, Código IIU115.
  • 2. CONTENT
    • 1.- Review of main concepts
    • a.- Generations in evaluation
    • b.- Evaluation and assessment
    • 2.- Test Characteristics
    • 3.- Principles of test design – test formats
    • 4.- Communicative testing
  • 3. CONTENT (cont.)
    • 5.- Test analysis and design
    • 6.- Informal assessment
    • 7.- Self assessment
    • 8.- Evaluation
  • 4. 1- A.- GENERATIONS IN EVALUATION
    • 1st Generation: evaluators measure participants
    • 2nd Generation: evaluators describe participants
    • 3rd Generation: evaluators judge participants
    • 4th Generation: evaluators negotiate with participants
    Guba, E. and Lincoln, Y. (1989). Fourth Generation Evaluation. Newbury Park: Sage Publications Inc.
  • 5. EVALUATION AND ASSESSMENT (diagram): assessment includes analysis of documents and appraisals; evaluation includes administrators’, counselors’, teachers’, and community members’ evaluations.
  • 6. 1- EVALUATION
    • Evaluation is a natural activity that consists of constantly making value judgments. However, evaluation is not usually carried out in a principled and systematic way.
    • The implications of evaluating in an educational context are more powerful than those of evaluating in a social setting. As a result, it is crucial to give careful thought to making explicit what is being evaluated and the criteria by which it is judged.
    • Therefore, evaluation (in the pedagogical context) refers to the act of making value judgments in a systematic way, using principled, well-defined criteria to determine the product of education.
  • 7. TYPES OF EVALUATION
    • Congruent: before the process begins; to predict results.
    • Formative: throughout the whole process; to reinforce or improve it.
    • Summative: at the end of (a stage of) the process; to quantify it through the use of grades.
  • 8. IMPORTANCE OF EVALUATION
    • To diagnose the needs of participants.
    • To determine how effective a process is (so as to improve it).
    • To orient or reorient a process.
    • To obtain feedback about classroom practices and progress.
    • To confirm the validity of all features in the educational context.
    • To determine and monitor students’ weaknesses or strengths.
    • To determine the program’s appropriateness.
    • To check on the strategies and the students’ response to them.
    • To make decisions.
  • 9. PURPOSES OF EVALUATION
    • Accountability: Summative. Determines whether there has been value for money and whether something has been effective or not. It informs the decision whether a program should be continued or discontinued. How? Analysis of statistical data. Who? Policy makers and resource providers.
    • Curriculum development: Formative. Involves information to be used as the basis of future planning and action, for the improvement and renewal of the curriculum. How? Responses to questionnaires, interviews, diaries. Who? Teachers and curriculum developers.
    • Teacher self-development: Formative. Raises awareness among teachers and other practitioners about what actually happens in the classroom. How? Self-assessment, awareness-raising activities. Who? Teachers.
    • Students’ outcomes: Formative. Checks on students’ behavior (non-linguistic factors) and performance (linguistic factors). How? Self- and peer-assessment and informal assessment. Who? Students.
  • 10. ASSESSMENT
    • Assessment refers to the collection of data to describe or better understand an issue. In general, the term assessment is more often used in relation to educational programs. For example, assessment is the "systematic collection, review and use of information about education programs undertaken for the purpose of improving learning and development" (Student Outcome Learning Assessment, 2004). In conclusion, assessment refers to the measurement of performance to determine whether the ends of teaching have been achieved, whereas evaluation refers to the judgments based on that information.
  • 11. TYPES OF ASSESSMENT
    • Formal Assessment (TESTING): tests such as exams, quizzes, workshops, projects, presentations, homework, etc.
    • Informal Assessment and Self-/Peer-Assessment: questionnaires, diaries, surveys, descriptions, etc.
  • 12. WHO SHOULD USE ASSESSMENT AND EVALUATION?
    • Policy makers: set standards, focus on goals, monitor the quality of education, formulate policies, direct resources (including personnel and money), and determine the effects of tests.
    • Administration: check whether schools/departments are meeting the goals of the University, judge the appropriateness of curricula and courses, identify program strengths and weaknesses, designate program priorities, assess alternatives, and plan and improve programs.
    • Teachers: refine the curriculum, perform individual diagnosis and prescription, monitor student progress, gauge how much knowledge students retain from current teaching methods, and provide feedback to students.
    • Researchers: check whether research is meeting the goals of the proposal (especially when funding relies on grant money that requires progress reports), decide how to improve the program, and find unexpected outcomes.
  • 13. 2.- TEST CHARACTERISTICS
    • Validity: a test is valid “if it measures accurately what it is intended to measure” (Hughes, A., 1989).
    • Reliability: a test is reliable if it measures consistently; results must be stable.
    • Practicality: aspects affecting time, money, effort, and resources.
    • Washback: the influence of tests on teaching and learning.
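The slide above defines reliability only in conceptual terms (consistent, stable results). As an illustrative aside that is not part of the original presentation, internal-consistency reliability is often estimated with a statistic such as Cronbach's alpha. The sketch below is a minimal Python example using a hypothetical matrix of item scores (rows = students, columns = test items); the data and names are made up for illustration.

    import numpy as np

    def cronbach_alpha(item_scores: np.ndarray) -> float:
        """Estimate internal-consistency reliability (Cronbach's alpha).

        item_scores: 2-D array with one row per test taker and one column per item.
        """
        k = item_scores.shape[1]                         # number of items
        item_vars = item_scores.var(axis=0, ddof=1)      # variance of each item across students
        total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of students' total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical scores for 5 students on a 4-item quiz (1 = correct, 0 = incorrect).
    scores = np.array([
        [1, 1, 1, 1],
        [1, 1, 0, 1],
        [0, 1, 0, 0],
        [1, 0, 1, 1],
        [0, 0, 0, 1],
    ])
    print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")

Values closer to 1 indicate that the items measure consistently; a toy sample this small is only for illustration, since a stable estimate requires many more test takers and items.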
  • 14. 3.- PRINCIPLES OF TEST DESIGN – TEST FORMATS
    • - Guidelines for item design
    • - Formats
    • - Sample items
  • 15. 4.- COMMUNICATIVE TESTING
    • Relevance
    • Contextualization
    • Meaningfulness
    • Authenticity
  • 16. 5.- TEST ANALYSIS AND DESIGN
    • Analysis of authentic tests from different contexts prior to test/item design