Evaluating Learners’ Achievement
 

This is the slide show content for a training session that I gave to undergraduate students in the Introduction to Instructional Design class.

Evaluating Learners’ Achievement: Presentation Transcript

  • Week 8 EVALUATING LEARNER ACHIEVEMENT
    • By the end of this class, LTLE385 students should be able to
      • Define evaluation in general and learner evaluation in particular.
      • Express the role of evaluation in the instructional design process.
      • List the different types of learner evaluation.
      • Describe the process of learner evaluation.
      • Develop appropriate test items.
    OBJECTIVES
    • http://www.polleverywhere.com/
    • Listen to the instructions.
    • Read the question.
    • Have your mobiles ready.
    • Get started by texting your answers to 148820.
    PRE-TEST
    • Form three groups.
    • Put the pieces together; the winning team will get a prize.
    ICEBREAKER
    • What is the purpose of evaluation after all?
    A QUESTION FOR YOU
  • EVALUATION AND THE BIG PICTURE
    • Evaluation:
    • The process for determining the success level of an individual or product based on data and then making decisions based on this success level.
    • Assessment:
    • The procedures or techniques used to obtain data about a learner or product.
    • Measurement:
    • The data collected, which is typically expressed quantitatively.
    EVALUATION, ASSESSMENT, AND MEASUREMENT
    • Determine the level of performance or achievement that an individual has attained as a result of instruction.
    • Begins by examining the instructional goal and objectives.
    • Decide on the desired change in learner knowledge, skill, or attitude.
    • Determine the criteria for judging the success.
    • Develop appropriate assessment activities.
    LEARNER EVALUATION
    • Pre-Instruction
      • Gather data about learners’ current abilities
      • Make adjustments to instruction based on learner knowledge and ability
    • During Instruction
      • Observation
      • Quick-Write or Random-Call
    • Post-Instruction
      • Summative Evaluation
    LEARNER EVALUATION TIMING
    • Validity:
    • A learner evaluation is considered valid when it helps determine whether the intended outcomes were met by the learner
      • Face validity
      • Content validity
    • Reliability:
    • The extent to which a learner evaluation will provide similar results when conducted on multiple occasions (a minimal worked sketch follows the Validity & Reliability heading below).
      • Criterion-Referenced
      • Norm-Referenced
    DEVELOPING A LEARNER EVALUATION
  • VALIDITY & RELIABILITY
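
A minimal sketch of these ideas in code, using made-up scores from two administrations of the same 10-point quiz and a hypothetical 80% mastery cutoff: the correlation between the two administrations gives a rough test-retest reliability estimate, a criterion-referenced reading compares each learner to the fixed cutoff, and a norm-referenced reading ranks each learner against the rest of the group.

from statistics import mean, pstdev

# Hypothetical scores (out of 10) from two administrations of the same quiz.
first_attempt = [7, 9, 5, 8, 10]
second_attempt = [6, 9, 6, 8, 9]

def pearson_r(xs, ys):
    # Pearson correlation between the two administrations: a rough
    # test-retest reliability estimate (closer to 1.0 = more consistent).
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

MASTERY_CUTOFF = 0.8  # hypothetical criterion: 80% of the available points

def criterion_referenced(score, max_points=10):
    # Criterion-referenced: compare the learner to a fixed standard,
    # regardless of how anyone else performed.
    return "mastery" if score / max_points >= MASTERY_CUTOFF else "non-mastery"

def norm_referenced(score, cohort):
    # Norm-referenced: rank the learner against the group
    # (here, the percentage of peers who scored lower).
    return 100 * sum(1 for s in cohort if s < score) / len(cohort)

print(f"Test-retest reliability (Pearson r): {pearson_r(first_attempt, second_attempt):.2f}")
for score in first_attempt:
    label = criterion_referenced(score)
    rank = norm_referenced(score, first_attempt)
    print(f"Score {score}/10 -> {label}, scored above {rank:.0f}% of the group")
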
    • Guidelines for developing and implementing knowledge tests
    • (p. 143)
    • Objective tests
      • True/False Items
      • Multiple Choice Items
      • Matching Items
    • Constructed-response tests
      • Short-Answer Items
      • Essay Items
    DEVELOPING KNOWLEDGE ASSESSMENT TESTS (PAPER-AND-PENCIL TESTS)
    • Guidelines for evaluating a skill (p.147-148)
    • Direct testing
    • Performance Ratings
    • Observations and Anecdotal Records
    • Portfolios
    • Rubrics (a minimal scoring sketch follows this slide)
    DEVELOPING SKILL ASSESSMENT TESTS (PERFORMANCE ASSESSMENT)
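
As a small illustration of the rubric and performance-rating ideas above, the following sketch scores one observed performance against a hypothetical three-criterion rubric; the criteria, point values, and observed ratings are invented for the example.

# Hypothetical rubric for one observed skill performance: each criterion
# maps to the maximum points available for it.
rubric = {
    "Follows the procedure in the correct order": 4,
    "Uses the equipment safely": 4,
    "Explains the result accurately": 2,
}

# Points the observer awarded for one learner (a simple performance rating).
observed = {
    "Follows the procedure in the correct order": 3,
    "Uses the equipment safely": 4,
    "Explains the result accurately": 1,
}

earned = sum(observed[criterion] for criterion in rubric)
possible = sum(rubric.values())

print(f"Performance score: {earned}/{possible} ({100 * earned / possible:.0f}%)")
for criterion, max_points in rubric.items():
    print(f"  {criterion}: {observed[criterion]}/{max_points}")
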
    • The most difficult outcome to determine because it examines the learner’s behavior
    • Observation and Anecdotal Records
    • Surveys and Questionnaires
    • Self-Reporting Inventories
    • Interviews
    ASSESSMENT TECHNIQUES TO DETERMINE A CHANGE IN ATTITUDE
    • What do you think the role of ID is?
    • Help create and execute a plan of action to design, develop, and implement an efficient and effective learner evaluation
      • Analyze the instructional goals and objectives
      • Determine the intended learning outcome(s)
      • Determine the appropriate type of learner evaluation needed
      • Design and develop the appropriate assessment techniques
      • Assist in the implementation
    THE INSTRUCTIONAL DESIGNER ROLE
    • Go back to your project’s goal and objective(s). If you haven’t developed them, go ahead and do so.
    • Think of ways to assess and evaluate the achievement of your objective(s).
    • Post your thoughts in your blog.
    • Use your Google account to create a survey to assess your learners’ achievement.
    • You can go back to your handouts and book for further assistance.
    ACTIVITY
  • HANDOUTS
    • Use simple sentences.
    • Avoid using absolute words such as “always”, “never”, and “all”.
    • Assess for one idea per item.
    • Avoid complex and ambiguous questions that create confusion or frustration.
    GUIDELINES FOR DEVELOPING TRUE/FALSE ITEMS
  • GUIDELINES FOR DEVELOPING MULTIPLE CHOICE ITEMS
    • Write a clear descriptive stem.
    • Keep the choices short and clear (stick to four alternatives)
    • Avoid the use of negatives in the stem.
    • Have only one correct answer.
    • Write the distractors to be plausible yet clearly wrong.
    • Avoid using “All of the above” or “None of the above” (a sample item illustrating these guidelines appears after the matching-item guidelines below).
    • List more options than the premise list.
    • Use a homogeneous list.
    • Place longer phrases in the left column.
    • Restrict the number of matches to 10 or fewer.
    GUIDELINES FOR DEVELOPING MATCHING ITEMS
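
Here is a hypothetical multiple-choice item, represented as data, that follows the guidelines above: a short descriptive stem, four alternatives, exactly one key, plausible but clearly wrong distractors, no negatives in the stem, and no “all/none of the above” option. The stem reuses the measurement definition from the earlier slide; the wording and the scoring helper are invented for illustration.

# A hypothetical multiple-choice item structured to follow the guidelines
# above: clear stem, four short options, exactly one key, plausible distractors.
item = {
    "stem": ("Which term refers to the data collected during an evaluation, "
             "typically expressed quantitatively?"),
    "options": {
        "A": "Measurement",   # key
        "B": "Assessment",    # plausible distractor (the technique, not the data)
        "C": "Evaluation",    # plausible distractor (the judgment, not the data)
        "D": "Objective",     # plausible distractor
    },
    "key": "A",
}

def score_response(item, response):
    # One point for the single correct answer, zero otherwise.
    return int(response.strip().upper() == item["key"])

print(item["stem"])
for letter, text in item["options"].items():
    print(f"  {letter}. {text}")
print("Response 'b' scores:", score_response(item, "b"))
print("Response 'A' scores:", score_response(item, "A"))
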
    • Statements should not be quoted directly from the text.
    • Answer blanks should be in the same place on the page.
    • The questions should measure accomplishment of objectives appropriately.
    • There should be only one blank in an item.
    GUIDELINES FOR DEVELOPING SHORT-ANSWER ITEMS
    • Be unobtrusive because people behave differently in the presence of others.
    • Be objective in your language. It should be clear, accurate, and formal.
    • Be scientific in your observations. They should be accurate, thorough, and complete.
    • Your records should include nothing but your observations.
    • Include an observation form.
    GUIDELINES FOR CONDUCTING OBSERVATION
    • Choose a setting with few distractions.
    • Explain the purpose of the interview.
    • Address terms of confidentiality.
    • Explain the format of the interview.
    • Indicate how long the interview usually takes.
    • Provide your contact information.
    • Ask if they have any questions or concerns.
    • Record the interview or take notes.
    GUIDELINES FOR CONDUCTING INTERVIEWS
    • Pay attention to survey length and make it interesting.
    • Use legible questions.
    • Use question types that are quick to answer.
    • Avoid leading questions.
    • Adopt the same definitions throughout.
    • Avoid negatives and double negatives.
    • Avoid double-barreled questions.
    GUIDELINES FOR DEVELOPING SURVEY AND QUESTIONNAIRE QUESTIONS