Evaluating a Master Trainer by Sajjad Ahmad Awan, PhD Research Scholar, TE Planning
Slide notes
  • Example: driving a car; drawing a blood gas.
  • Evaluation is the next step on from monitoring – in evaluation, we use the information we have collected during monitoring. We can use it to answer these questions. The key thing about evaluation is that we review the information we have collected and use it to adapt our training if necessary. Evaluation should lead to action, even if that action is that everything is going very well and we don’t need to change anything! It’s unlikely that everything will be perfect – there are always things we can learn from.
  • Authoritative: ensuring that the group addresses a topic, re-routing the discussion, pointing out what needs to be done; providing knowledge or information, pointing to the connections between the issues, summarizing; challenging by direct questioning, disagreeing with or correcting or critically evaluating the group’s statements. Facilitative: arousing laughter, giving group members permission to release such emotions as anger, embarrassment, irritation or confusion; drawing out opinions, knowledge or abilities, aiding participant interaction, enabling learning by self-insight; approving, reinforcing and affirming.
  • Ask the question to the group, agree on a definition that everyone understands and accepts. What: letting people know how they are performing / confirming / recognizing / behaviour modification / give and receive. Why: recognize success and affirm good performance / reward and encourage good performance / improve and correct performance / facilitate learning and change.
  • Many people say that M&E is important, but why? When working on any project, it is important to step back sometimes to think about how things are going, what is working and, more importantly, what is not working. We talked earlier about assessing needs. But needs change, and we need to be aware of that when we evaluate the effectiveness of training workshops. They may be very effective when delivered to groups of people who are not familiar with e-resources at all, but you cannot offer the same training to the same people year after year, as they are hopefully developing different skills. If you regularly review (or evaluate) your work, you can develop new materials and learn from your experiences all the time. As well as you learning, there are likely to be some lessons that can be shared with other interested people. These people might be library directors, institutional heads, or project coordinators/funders. As well as wanting to know whether and how the workshops are succeeding, sharing lessons about the challenges is also important. If no-one shares their lessons, people will keep repeating the same mistakes and finding the same challenges. We all need to provide evidence to someone about the work we are doing.
  • Perhaps the most difficult thing to decide is what to monitor and evaluate. Based on needs analysis, you will have worked out what the training needs are (refer back to the earlier session on needs analysis). From this needs analysis, you will have written learning objectives (refer back to the session on learning objectives, using concrete examples from that session). You need to think about the things you want to happen as a result of the training. These things (or outcomes) can be big or small. Some things can be measured more easily than others. For example, you probably want to monitor how many people attend the workshops you deliver. This is relatively easy to monitor, but does it give you any information about how effective the training was? No. This is quantitative data, which uses numbers/quantities. It is much more difficult to measure how effective our training is – and we tend to use qualitative data to look at this. If we think about outcomes, some may seem more likely than others. It’s therefore important to prioritise them. One way of doing this involves thinking about what we expect to happen, what we would like to happen, and what we would love to happen, in an ideal world, as a result of the training we’re delivering.
  • Here are some of the things I see as outcomes of this workshop. (Talk about each one.) So you can see that I’m thinking of different ‘levels’ of outcomes. What is the overall aim of the workshops you are going to deliver? (Ask for responses from participants, or refer back to earlier sessions if this has been discussed.) Under this overall aim, there may be many things that you would expect to see as a result of the workshop. In small groups, I will give you a worksheet with some questions to think about. These questions ask you to consider what you would expect, like and love to see happen after the training you deliver. You will have 15 minutes to think about and discuss these questions in your group. Write some notes on the worksheet so that we can then discuss together some of the ideas you come up with. (Hand out worksheet 1 and monitor activity closely, providing prompts/advice where necessary. Increase time allowed for this activity if necessary.)
  • As a participant, you may feel that we only monitor activities like workshops at the end. (Click, to reveal ‘answers’ to heading question.) Monitoring and evaluation is a continuous process. It doesn’t really have a beginning and an end as it’s cyclical (like the picture on the slide). We usually begin with a needs assessment (and we have looked at this part already). Once we have established what the needs are, we can plan and design our workshop. When we actually deliver the workshop, we can see what works well and what doesn’t. If there is a co-facilitator, it can be useful to ask them to make notes of things that work particularly well or not, and the facilitator should take time to reflect on the workshop at the end of the day (discussing these reflections with the co-facilitator is useful, as people are often overly critical of themselves). As well as reflection by the facilitator, the participants also need an opportunity to provide feedback – we usually do this at the end of an event by asking everyone to complete an evaluation form. A further step in the cycle is seeing the effect of the training later on, e.g. weeks or months later. At the end of a workshop, participants often leave feeling full of enthusiasm about what they’ll do differently. It’s important to see if this enthusiasm translates into concrete actions. If nothing changes, has the training been effective? And more importantly, what changes can be made to make it more effective and lead to changes in people’s behaviour?
  • OJT: transfer of training is maximized; but it is often brief and poorly structured, and co-workers may resent doing it. Vestibule training: promotes practice; but relatively few people can be trained at one time. Job rotation: acquaints workers with many jobs, gives the opportunity to learn by doing, gives the organization flexibility during worker shortages, and provides a variety of experiences and challenges; but if workers are on piece rate they may be unwilling to rotate out of a lucrative job, and it violates the principle of assigning people jobs that match their talents and interests. Apprenticeship: lasts a predetermined amount of time (Mass Gen).
  • History – events that take place between measurements in an experiment and are not related to the independent variable. Maturation – natural changes in participants over time (e.g., growing wiser or stronger, as well as tired or bored). Testing – problems with repeated measurement (e.g., practice effects). Instrumentation – changes in the measuring instrument over time (e.g., using different surveys to avoid problems with repeated measurement). Statistical regression – occurs when participants are selected on the basis of an extreme score (e.g., selecting people for a training program, or going to the doctor for depression when symptoms are at their worst). Selection – when participants in one group differ initially from participants in another group. Mortality – occurs when participants drop out of a study, especially at different rates (dull toy study example). Selection-maturation – one group of participants changes faster than the other group for reasons not related to the independent variable (e.g., girls developing verbal abilities faster than boys). Diffusion of treatment – participants in experimental and control groups may communicate with each other, reducing the differences between groups (e.g., telling the control group about the training).
  • Examples of icebreaker activities include: Put people into pairs and give them five minutes to find out about one another; when the time is up, each person introduces his or her partner to the rest of the group. You can specify what the introduction must cover, for example: name, job role, what the person hopes to gain from the course, and something interesting about the person that is non-work related. Ask everyone in turn to say their (preferred) name, why they were called this name, and how they feel about it. Ask participants to share their views on one thing they like and one thing they dislike. Ask everyone to draw on a piece of paper a cartoon ‘face’ showing how they feel about the topic being covered. Everyone is invited to write the most important thing they already know about the topic being covered on pieces of paper that are then stuck on the wall (this is particularly useful so that you don’t tell students things they already know).
  • Time: keep your eye on the clock. If you’re running late, tell the participants. Ask them if they would like to continue for a little longer on the current activity, or if they would like to move on. If the workshop is going too fast, pause and allow discussion of the subject in greater depth, or have some interesting fallback topics available. Broken projector: give the group a short discussion activity, or move on to a prepared task while you send for a technician. If it can’t be fixed, do the presentation asking participants to look at your prepared handouts. Slow/no web connection: call for a technician. Point out that the workbooks are self-explanatory and that the exercises can still be completed when students leave the workshop (if they have a web connection). Ask participants how they might handle this problem if it happened with their own students. If you have participants who have experience of the websites you are covering, get them to describe the content and their experience; if not, you will have to describe it yourself. Keep a sense of humour! Difficult participants: act calmly. If the person has concerns, raise them head-on: “Would you like to share your thoughts with the group?” Sometimes people are difficult because they feel their views aren’t being acknowledged. If the person is asking too many questions, check to see if the rest of the group is interested in the subject. If the group seems annoyed because time is being wasted, ask if the group would prefer moving on, with the questioner being answered during a break.
  • There is a huge amount of information available about monitoring and evaluating – this presentation is just to give you some basic ideas about it. This is an important step in the process of planning and delivering training events. Many think that this is a step that happens at the end of a workshop, that it’s an added-on bit at the end. It’s actually something that continues throughout the whole process of training (we’ll look at how it fits in a little later). It’s important to monitor and evaluate the training we deliver, but it’s only important if the information is then used. We’ll look together at what we mean by ‘monitoring’ and ‘evaluating’ and why we should monitor and evaluate; and then look more closely at when and how we should monitor and evaluate.
  • We monitor activities by collecting information about them. For example, we may monitor how many people attend the workshops we deliver. We might monitor if the training changes how they feel about a topic. Monitoring also implies that we plan to collect this information and that we collect it regularly. Perhaps this is monthly, or quarterly (every 3 months), or every year. The information that we collect can be about anything that is relevant to our activity, such as how many workshops we deliver, or if there is a change in policy on access to electronic resources at the institution where we work. The most important thing is that we collect information that will be useful to us and will help us to see if our training is effective and is meeting the aims we identified in our needs analysis.
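As an illustration of what a planned, regular collection of quantitative monitoring data could look like, here is a minimal sketch in Python. The workshop records, field names and figures are invented for the example; they are not taken from the project.

```python
from collections import defaultdict
from datetime import date

# Hypothetical monitoring log: one record per workshop delivered.
workshops = [
    {"date": date(2013, 1, 15), "topic": "e-resources basics", "attendees": 18},
    {"date": date(2013, 2, 20), "topic": "advanced searching", "attendees": 12},
    {"date": date(2013, 4, 10), "topic": "e-resources basics", "attendees": 25},
]

# Summarise attendance per quarter so the information is reviewed regularly.
per_quarter = defaultdict(int)
for w in workshops:
    quarter = (w["date"].year, (w["date"].month - 1) // 3 + 1)
    per_quarter[quarter] += w["attendees"]

for (year, q), total in sorted(per_quarter.items()):
    print(f"{year} Q{q}: {total} participants")
```

A simple tally like this answers the easy quantitative questions (how many workshops, how many attendees); the qualitative questions about effectiveness still need the evaluation methods discussed below.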
  • Many organisations use evaluation forms to monitor their training activities. There is a large variety of forms available on the Internet, for example. However, it’s important to remember what the aims of your workshop are, and how you are going to measure them (as best you can). The questions should be relevant to your workshop (so don’t just use a form from the Internet without adapting it!), and should generate answers that will be useful to you as you review them. I have found 4 evaluation forms. I will give copies of each evaluation form to every small group, together with some things to consider. Look at each evaluation form as a group and think of the advantages and disadvantages of each form. Write the advantages on the pink post-it notes and the disadvantages on the yellow post-it notes. The four forms are on the desks at the back of the room – stick the advantages and disadvantages onto the relevant form. While doing this activity as a group, also take some time to think which of these forms you prefer. Put a green post-it note (one per person) next to the form you prefer. You have 20 minutes to do this activity. (Monitor closely and prompt with questions if groups are quiet. Give updates on time remaining every so often. Feedback: pick some post-its from each form and read aloud. Count votes to see which form is preferred. Ask people to comment on why they liked that one.)
  • Much of what we have talked about has been about monitoring activities. As we said earlier, evaluation is the bringing together of the information you have collected. It is important to review completed workshop evaluation forms, and if similar comments come up a lot, this could indicate that something needs to change (or that something is working very well!). In this project, evaluation can perhaps take place at different levels – locally, with your training partner; regionally, with other teams in your area (north, south or central); nationally, among all master trainers. It is important to share what you are learning with your colleagues in different places. Perhaps they are experiencing the same thing and don’t know what to do. Perhaps they have a solution to a difficult situation. While regular sharing is useful, it also needs to be proportionate – i.e. you shouldn’t be spending more time evaluating than actually doing the activity! The important thing with M&E is that it is used and leads to action.
  • This diagram shows how evaluation fits into the training cycle – it is not a process with a beginning and end, but rather it continues on and on. As mentioned earlier, there is a huge amount of information about M&E available and this presentation is just to outline some of the basics. I’ll give you a handout which includes some links to further reading, which you can refer to if you are interested.
  • The reason Kirkpatrick wanted to develop his Four-Level Model was to clarify the meaning and process of ‘evaluation’ in a training program. If there is no change in behavior, but there is a change in skills, knowledge, or attitudes, then using only part of the model (not all levels) is acceptable. If the purpose of the training program is to change behavior, then all four levels apply. Other authors on evaluation of training programs have proposed various strategies, but Kirkpatrick is given credit for developing and masterminding the Four-Level Model. Kirkpatrick aims the model at executives and middle management; however, it works well in most other training areas.
  • These are questions asked by HRD coordinators about training performance, the starting criteria and the expectations of the resulting training program. Business training operations need quantitative measures as well as qualitative measures. A happy medium between these two criteria is an ideal position from which to fully understand the training needs and to fulfil their development. Quantitative: the research methodology where the investigator’s “values, interpretations, feelings, and musings have no place in the positivist’s view of the scientific inquiry.” (Borg and Gall, 1989)
  • The end result of an evaluation is, hopefully, a positive outcome for both upper management and the program coordinators.
  • 1. If and when downsizing occurs, this statement will have more meaning than ever for some unlucky people. HRD departments are regarded by upper management as overhead, not as contributing directly to production.
  • 2. Pilot courses may be implemented to see whether the participants have the necessary knowledge, skills, or behavioral changes to make the program work. 3. Kirkpatrick uses eight factors for improving the effectiveness of a training program. These eight factors closely follow the Ten Factors of Developing a Training Program; this is a feedback statement spinning off from the Ten Factors.
  • The reactions of the participants must be positive for the program to survive, grow, and improve. Reactions reach back to bosses and subordinates alike; this word-of-mouth reaction can either make or break the program. Here ‘customer’ refers to the participants in the training program.
  • A training program must accomplish at least one of these three learning traits in order to be effective for a participant. The best-case scenario is an improvement in all three traits; however, according to Kirkpatrick, improvement in only one learning trait is all it takes for a training program to be effective.
  • Guidelines for measuring learning: 1. Use a control group along with an experimental group to provide a comparison analysis. 2. Have a pre-test and a post-test, then measure the difference. 3. Try to get an honest and true 100% response to any interviews, surveys, or tests. 4. The use of a test to measure participant learning is an effective evaluation for both participant and instructor alike. However, this is not a conclusive fact; there may be other factors involved. Results must be measured across the spectrum of the Ten Factors of Development.
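To make guidelines 1 and 2 concrete, here is a minimal sketch in Python of comparing the pre-test/post-test gain of a trained group against a control group. The scores and the helper name are invented for illustration; they are not part of Kirkpatrick's text.

```python
def mean_gain(pre, post):
    """Average post-test minus pre-test score for one group."""
    return sum(b - a for a, b in zip(pre, post)) / len(pre)

# Hypothetical scores for four people in each group.
experimental_gain = mean_gain(pre=[52, 48, 61, 57], post=[74, 70, 80, 76])
control_gain = mean_gain(pre=[50, 55, 60, 49], post=[53, 56, 61, 50])

# The difference between the two gains is the learning we can attribute to the
# training rather than to history, maturation or repeated testing.
print(f"Experimental group gain: {experimental_gain:.1f}")
print(f"Control group gain:      {control_gain:.1f}")
print(f"Gain attributable to training: {experimental_gain - control_gain:.1f}")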
  • Level 3 asks the question “What changes in behavior occurred because people attended the training?” This level is a more difficult evaluation than Levels 1 and 2.
  • The employee must want to make the change. The training must provide the what and the how. The employee must return to a work environment that allows and/or encourages the change. There should be rewards: intrinsic (inner feelings of pride and achievement) and extrinsic (such as pay increases or praise).
  • The employee may: like the new behavior and continue using it; not like the new behavior and return to doing things the “old way”; or like the change, but be restrained by outside forces that prevent them from continuing to use it.
  • With Reaction and Learning, evaluation should be immediate. But evaluating change in Behavior involves some decision-making.
  • Use a control group only if applicable. Be aware that this task can be very difficult and maybe even impossible. Allow time for behavioral changes. This could be immediate, as in the case of diversity training, or it can take longer, such as using training for administration of performance appraisals. For some programs 2–3 months is appropriate; for others, 6 months is more realistic. Evaluate before and after, if time and budgets allow. Conduct interviews and surveys. Decide who is qualified for questioning and, of those qualified, whose answers would be most reliable, who is available, and whether any of the choices should not be used. Attempt to get a 100% response. Repeat the evaluation; not all employees will make the changes at the same time. Consider cost vs. benefit. This cost can be internal staff time or an outside expert hired to do the evaluation. The greater the possible benefits, the greater the number of dollars that can be justified. If the program will be repeated, the evaluation can be used for future program improvements.
  • Many of these questions do not get answered. Why? First, trainers don’t know how to measure results in comparison to the cost of the training. Secondly, the results may not be clear proof that the training caused the positive results, unless there is a direct relationship between the training and the results (e.g., sales training and resulting sales dollars).
  • Use a control group, again if applicable, to prove the training caused the change. Allow time for results, different for different programs and for each individual. Measure before and after. This is easier than measuring behavior because figures are usually available – hard data, such as production numbers or absenteeism. Repeat the measurement; you must decide when and how often to evaluate. Consider cost vs. benefit. Here, the amount of money spent on evaluation should be determined by the cost of training, the potential results to be achieved, and how often the training will be repeated. And last, be happy with evidence of training success, because you may not get proof!
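As a rough illustration of "measure before and after" and "consider cost vs. benefit" for hard data, here is a minimal sketch in Python. The absenteeism figures, head count and cost assumptions are invented for the example.

```python
# Hypothetical before/after hard data for a results (Level 4) evaluation.
absenteeism_before = 9.5      # average days lost per employee per year, before training
absenteeism_after = 7.0       # the same measure some months after training
employees = 120
cost_per_absent_day = 180.0   # assumed cost to the organisation per day lost
training_cost = 15000.0       # total cost of the training programme

days_saved = (absenteeism_before - absenteeism_after) * employees
benefit = days_saved * cost_per_absent_day

print(f"Days saved per year: {days_saved:.0f}")
print(f"Estimated benefit:   {benefit:,.0f}")
print(f"Training cost:       {training_cost:,.0f}")
print("Evidence of success" if benefit > training_cost else "Benefit does not yet cover the cost")
```

Even a simple comparison like this gives evidence, not proof, that the training caused the change; a control group and repeated measurement strengthen the case.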

Transcript

  • 1. Presentation SAJJAD AHMAD AWAN PhD RESEARCH SCHOLAR TE PLANNING DTSC KSB
  • 2. Imagine a day when there is no need for heavy manuals and stationary information.
  • 3. Imagine that problems and questions are answered in seconds, and maybe even by colleagues in faraway locations.
  • 4. Imagine a day when you obtain information and learn new things exactly where the need arises.
  • 5. In this session we will discuss • HOW TO EVALUATE MT • ROLE OF MT • TYPES OF STUDENTS AND ROLE OF MT • HOW TO BE GOOD MT • CHALLENGES OF ICT AND GLOBALIZATION TO MT • MT AND EFFECTIVE LEARNING • TIPS FOR SUCCESSFUL TRAINERS
  • 6. Phases of Skill Acquisition: Acquiring Declarative Knowledge → Knowledge Compilation → Procedural Knowledge
  • 7. Trainer's 5 Golden Rules: “Good trainers practice their craft not for the money or because they have to, but because they truly enjoy it and because they want to. Good trainers couldn't imagine doing anything else.” “Great trainers neither mock nor underestimate their students, ever.”
  • 8. What is evaluation? • Using information collected during monitoring • Helps to answer: - How well are we doing? - Are we doing the right things? - What difference are we making? • Analyse/assess key issues → action
  • 9. FORMAL DEFINITION OF TRAINING • Training is a structured process that provides participants with the knowledge and skills needed to perform a task, and the desire to use them.
  • 10. THREE MAIN PILLARS TO ASSESS AN MT
  • 11. PERSONALITY • Appearance & dressing • Confidence • Communication & Presentation skills
  • 12. CONTENT • Well-versed with the subject & level of confidence • Delivery has relevance to the topic • Sequence in delivery
  • 13. METHODOLOGY • Objectives of the session shared • Learning outcomes were achieved • Attention given to all trainees • Encouraged participants to raise questions • Increased motivational level of participants
  • 14. METHODOLOGY • Activity-based training / shared examples in session • Time properly managed • Summarized & concluded the session properly • Used A.V. aids (PPT, board, etc.)
  • 15. TRAINER CHARACTERISTIC GAME • Distribute one set of “characteristic cards” and one “game board” to each team. • Have stewards explain how the game is played: • Designate one team member as the “team leader” • Designate one team member as the “signal person” • Read each “characteristic card” and place it under the proper heading on the “game board” • Each team leader makes sure there is consensus on placement • When finished, the “signal person” announces completion. • When all teams finish, distribute the “Trainer Characteristics” hand-out • Display the Game Key as an overhead, and discuss results as a group.
  • 16. GENERAL CHARACTERISTIC OF A GOOD TRAINER • Exhibits Professionalism • Has Good Communication Skills • Can Relate to the Group • Is well Organized • Has Positive Personality Traits
  • 17. PROFESSIONALISM • Serves as a Role Model • Demonstrates Mature Behavior • Exhibits Confidence • Enthused about Training. Considers training as an opportunity to develop the skills of others.
  • 18. COMMUNICATION SKILLS • Sets Objectives • Clearly Explains Concepts • Demonstrates Tasks and Procedures • Creates a Supportive Learning Environment • Listens Actively and Sensitively. Understands that actions often speak louder than words.
  • 19. RELATES TO THE GROUP • Exhibits Rapport • Friendly and Congenial towards Everyone • Encourages Questions and Discussion • Skilled at Conflict Resolution. Has the composure to lead and control a group without being overbearing.
  • 20. ORGANIZED • Can Balance Multiple Responsibilities • Manages Time Effectively • Develops Detailed Plans in Advance • Prepares for Alternatives or Mishaps. Realizes that time is a valuable resource for everyone.
  • 21. GOOD TRAINER PERSONALITY TRAITS • Patient • Flexible • Empathizes with Others • Nurturing • Creative • Committed • Team Player. Exhibits behavior that supports the transfer of knowledge to others.
  • 22. PERSONAL SUPPORT • What is an obvious qualification for a good trainer? • The ability to do a job/task well, and superior technical knowledge or skills.
  • 23. PERSONAL SUPPORT • Is technical expertise enough? • Although knowledge and experience provide an “edge”, they alone do not make you a “good trainer”. The ability to communicate effectively, and a willingness to constantly improve, are more important.
  • 24. PERSONAL SUPPORT • How can leaders/trainers grow and improve? • Personal support outside formal training courses is one way of building self-confidence and establishing relationships. Basically this comes in the form of helping friends and seeking advice from others.
  • 25. THE GOOD TRAINER/LEADER EQUATION • Technical Proficiency • + • Personal Support • = • An Effective Trainer/Leader One of your primary objectives should be to help a friend.
  • 26. RESPONSIBILITIES OF TRAINERS • AS TRAINERS WE SHOULD STRIVE TO: • Keep our skills sharp, our knowledge current, and our hearts and minds always open; • Provide personal support whenever it is needed to help develop effective Scout leaders; • Build lasting friendships and weld them together with the spark that is created by the fun of Scouting; • Utilize personal support to build a strong team of Scout leaders who enjoy working together.
  • 27. Five Heads To Be a Good MT • Respect • Accept • Be prepared • BE • Know
  • 28. Training Skills • Roles of a Trainer • What a Trainer Should Do Well • Feedback & Evaluation
  • 29. Your Roles as a Trainer
  • 30. Planning Role • designs the learning experience
  • 31. Expert Role • transmits information
  • 32. Instructor • directs the learning situation
  • 33. Facilitator • Helps the group to get to an agreed endpoint and helps learning take place
  • 34. Resource Person • Provides materials & information
  • 35. Model Role • Models or influences behavior & values
  • 36. Co-Learner • learns alongside the trainee
  • 37. What a Trainer Should Do Well • Understands basic teaching methods and applies this knowledge • Communicating • Facilitating • Presenting (separate sessions)
  • 38. Communication • “Communication is an exchange, not just a give, as all parties must participate to complete the exchange.”
  • 39. The Interpersonal Gap Model: A’s private intentions pass through filters and are transformed into A’s observable actions, which pass through filters and are transformed into B’s private interpretations.
  • 40. What are YOUR filters? Age, birth order, gender, marital status, religion, organizational role, education, work background, income, family norms, ethnicity, physical abilities, values, academic background.
  • 41. Exercise
  • 42. Why do we listen badly? • Lack of interest • Criticising speaker’s delivery • Boring subject, prejudices • Too long • 100 things to do • Hunger, or some other discomfort • Distractions/noise
  • 43. Your Communication Style To use your communication style better, or to adapt it to different audiences, understand your style and its impact
  • 44. Who is a Facilitator? • A person who helps a group to work together in a collaborative way, by focusing on the process of how the group members work together • Helps the group to get to an agreed endpoint and helps learning take place (both for the group and
  • 45. Styles Available to a Facilitator • Authoritative: Directing, Informing, Confronting • Facilitative: Releasing tension, Eliciting, Supporting
  • 46. Questioning • Facilitator uses questions to help a group identify, explore, clarify and develop their understanding, and also help them decide what to do
  • 47. Understanding Group Dynamics Johari’s Window
  • 48. Johari Window
  • 49. Increasing the Open Area through Feedback
  • 50. Initial stage: OPEN, BLIND, HIDDEN, UNKNOWN (Figure 1: Small Green Window Pane)
  • 51. Application in leadership • To expand leadership (the green area) you have the Red and Yellow Pills to offer • The Red Pill is disclosure and the Yellow Pill is willingness to take in feedback • Leaders who do not disclose and do not take feedback do not make very effective leaders.
  • 52. Improved stage: the OPEN pane grows by asking for feedback and by disclosing and telling about self in public, shrinking the BLIND, HIDDEN and UNKNOWN panes (Figure 2: Large Green Window Pane)
  • 53. What Type of Trainee are You?
  • 54. The Monopolizer • Takes up all the time with their own issues, making it difficult for others to participate • Interrupts, fails to listen and generally dominates discussions
  • 55. The Complainer • Continually finds fault with everything • Is not a problem solver, but a problem seeker
  • 56. The Silent One • Reluctant to participate
  • 57. The Hostile One • Makes confrontational remarks • Attacks (verbal) other participants or the facilitator
  • 58. The Negative One • Dwells on complications, problems • Avoids finding solutions or positive points
  • 59. The Dominator • Thinks they have all the answers and wants to control the discussion • Thinks they are superior to everyone else.
  • 60. The Whisperer-Conspirator • Has private conversations while the facilitator or others are speaking.
  • 61. The Clown • Uses humour to distract or put down others
  • 62. The Prisoner • Unhappy • Restless • In the session against their will
  • 63. Feedback and Evaluation
  • 64. What is it? Why do we do it? Feedback
  • 65. (Formal) Evaluation
  • 66. Results-Based Learning: Inputs → Activities → Outputs → Outcomes → Impact (Efficiency, Effectiveness)
  • 67. Levels of Results • Level I – Reaction: What is the participant’s response to the training? (Output) • Level II – Learning: What did the participant learn? (Output) • Level III – Behaviour Change: Did the participant’s learning affect their behaviour? (Outcomes) • Level IV – Organizational Performance: Did the participant’s behaviour changes affect the organization? (Outcomes) • Level V – Return on Investment (Impact)
  • 68. Measures • Level I – Reaction: satisfaction or happiness • Level II – Learning: knowledge or skills acquired • Level III – Behaviour Change: transfer of learning to the workplace • Level IV – Organizational Performance: transfer or impact on society
  • 69. Evaluation Tools • Daily Feedback • Session Feedback • End-of-Course Evaluation • Post-course Evaluation • Research
  • 70. SIAP’s Evaluation Framework (diagram involving the Governing Council, United Nations SIAP, participants and sending national statistical organizations; elements include: education strategy, directives for training organization, program of courses, program evaluation, strategy evaluation, strategic plan, course plan, diagnostic test, topics/lessons, exams/tests, grading sheet, pre-course expectation, after-course evaluation, evaluation forms, course evaluation, alumni survey, clients survey, course info request)
  • 71. Why monitor and evaluate? • Learning and development - What happened and why? - What is working? What is not working? - Adapt to changing needs • Accountability - Evidence of successes or challenges for sharing with stakeholders
  • 72. What should we monitor and evaluate? • Consider the aim/s of the project • Prioritise outcomes (things that will happen because of the training) • Quantitative and qualitative data
  • 73. Prioritising outcomes • I’d expect to see… • I’d like to see… • I’d love to see… (examples: all participants deliver at least three workshops themselves; regular discussion between all master trainers on the challenges they face and possible solutions; 50% of participants training their colleagues to train others)
  • 74. When should we monitor and evaluate? • Training needs assessment • During the event • Immediately after the event • Some weeks/months later
  • 75. Session Objectives: a. To define MT evaluation b. To discuss the purpose of MT evaluation c. To identify different types of MT evaluation d. To review and critique MT evaluation tools
  • 76. RESPECT
  • 77. RULE # 1: Respect yourself and your colleagues • Respect your – Ability – Confidence – Designation – Trust within yourself • Respect Colleagues – Be cooperative – Be helpful – Show trustworthiness
  • 78. RULE # 2: Respect students and parents • Be humble • Be very polite • Give confidence • Give proper and equal attention to all • Be supportive • Win their trust • Be cooperative
  • 79. RULE # 3: Respect rules and regulations • Rules and regulations are the base of strong systems FOLLOW THEM!! • Make class rules and regulations • Follow and teach students to follow
  • 80. RULE # 4: Respect opinion of others • Do not follow a “you are wrong, I am right” approach • Use an “I agree, but…” approach
  • 81. ACCEPT
  • 82. RULE # 5: Accept advice and work on it • Always take advice • Work on the advice • Advice helps in brainstorming
  • 83. RULE # 6: Accept students for what they are • Never underestimate students • Accept them and help them improve • Some students are slow learners; help them by concentrating on them
  • 84. RULE # 7: Accept colleagues as equals • A friendly and cooperative workplace atmosphere always helps produce better results • There must be equality in values even if seniority prevails
  • 85. RULE # 8: Accept your responsibilities and carry them through • A responsible leader creates responsible public • Accept all your responsibilities and teach pupil to handle responsibilities
  • 86. RULE # 9: Accept diversity (variety) in your students and learn from them • The more minds there are, the greater the variety of ideas • Handle the diversity and use it for better learning
  • 87. BE PREPARED
  • 88. RULE # 10: Be prepared to improve yourself in your subject • Have a complete grip on your subject • Confidence is all about knowing • Know your subject as well as you can
  • 89. RULE # 11: Be prepared to be a lifelong learner • Everyone is a student • There is no age limit for learning • Hadith: “Grasp knowledge from cradle to grave”.
  • 90. RULE # 12: Be prepared to carry out your responsibility • Taking the responsibility is one thing; CARRYING OUT your responsibility is another • Make the students do the same
  • 91. RULE # 13: Be prepared to share and cooperate • Sharing and cooperation gives rise to – Broadening of ideas – Growth of knowledge – Better working environment
  • 92. RULE # 14: Be prepared to inform and be informed • A teacher must be fully informed about everything around him (in the school and in the outer world of IT) • Better-informed teachers have a positive impact on students
  • 93. RULE # 15: Be prepared to give constructive feedback • Give the students honest and accurate feedback (response, opinion) • It helps pupils to improve • Parents can also take advantage of the feedback
  • 94. BE
  • 95. RULE # 16: Be compassionate (kindhearted) to your students and colleagues • One of the most important things is being compassionate • Good behavior gives rise to better behavior • Compassionate people are loved and liked
  • 96. RULE # 17: Be sincere (honest) with yourself and others • “Honesty is the best policy” • It is the key to success in every walk of life • Sincerity helps build better relations
  • 97. RULE # 18: Be sociable (friendly) with the students and colleagues • A very strict environment can limit the learning process and make the class less lively • Being serious with colleagues all the time limits bonding • MAINTAIN DISCIPLINE!!
  • 98. RULE # 19: Be encouraging at all times
  • 99. RULE # 20: Be courteous (polite) at all times • People love to talk to courteous beings • It is the way of the Holy Prophet (P.B.U.H.) • Your attitude in class will determine the attitudes of your pupils (students)
  • 100. RULE # 21: Be generous (kind) with your praise (admiration) • Praise your students whenever necessary • Admiration leads to increased morale of the students
  • 101. RULE # 22: Be imaginative (creative) with your students • Go deep into the oceans of imagination, and take your students with you • Imagination leads to broadening of minds
  • 102. RULE # 23: Be creative in all your work • Creativity adds to the beauty of work • It helps enhance the interest of the students in the class • It brings sophistication (style) in what a person does
  • 103. RULE # 24: Be innovative in your classroom • Innovation brings in newer and unique ideas • Innovation helps fight resistance to change • Change is necessary, therefore INNOVATE!
  • 104. KNOW
  • 105. RULE # 25: Know that you have responsibility towards students, parents, and school • Being a teacher means being VALUED • With greater value come greater RESPONSIBILITIES • Students, parents and the school are the responsibility of a teacher
  • 106. RULE # 26: Know that you are accountable for all that you say and do • With greater responsibility comes greater ACCOUNTABILITY (answerability) • A teacher is responsible for all he says and does • BE VERY, VERY CAREFUL!!
  • 107. RULE # 27: Know that you are part of a team • As a teacher in a school you are a part of a team • Work as a team • Fulfill the responsibilities of a team mate • Respect your team mates
  • 108. Considerations in Training Design • Designing a learning environment – Learning principles – Trainee characteristics – Instructional techniques
  • 109. Important Trainee Characteristics • Trainee readiness – Trainability tests • Have prospective trainees perform a sample of tasks that reflect KSAs needed for the job • Trainee motivation – Arousal, persistence, and direction – Factors related to high motivation • Self-efficacy • Locus of Control • Commitment to Career
  • 110. Instructional Techniques • Traditional Approaches – Classroom Instruction • Lecture and Discussion • Case Study • Role Playing – Self-Directed Learning • Readings, Workbooks, Correspondence Courses • Programmed Instruction – Simulated/Real Work Settings • Vestibule training • Apprentice training • On-the-job training • Job Rotation/Cross Training
  • 111. New Training Technologies • Distance Learning • CD-Rom and Interactive Multimedia • Web-based Instruction • Intelligent Tutoring Systems • Virtual Reality Training
  • 112. Kirkpatrick’s Evaluation Criteria • Level 1 – Reaction – Did trainees like the training and feel it was useful? • Level 2 – Learning – Did trainees learn the material stated in the objectives? • Level 3 – Behavioral – Are trainees using what was learned back on the job? • Level 4 – Results – Are the benefits greater than the costs?
  • 113. Assessing Training Outcomes • Goal is to identify training as “cause” of changes in on-the-job behavior or organizational results. • Experimental designs help researchers to link training to results • There are a number of reasons (threats) why it is difficult to determine impact of training on results – The Wisdom Pill
  • 114. Experimental Design • Controlling potential confounds – Goal of experiment is to “rule out” alternate explanations of what affected dependent variable • Confounds are threats to internal validity • Can be controlled through appropriate experimental design and procedures
  • 115. Internal Validity • Confounds Controlled by Experimental Design 1. History 2. Maturation 3. Testing 4. Instrumentation 5. Statistical Regression 6. Selection 7. Mortality 8. Selection-Maturation • Confounds NOT controlled by Experimental Design 1. Diffusion of Treatment 2. Compensatory Equalization 3. Compensatory Rivalry
  • 116. Pre-experimental Designs: Posttest-only with no Control Group (Training → Posttest) • Disadvantages – Controls none of the threats to internal or external validity – Basically worthless • Advantages – Can potentially provide information for speculation about training effectiveness
  • 117. Pre-experimental Designs: Pre-Post with no Control Group (Pretest → Training → Posttest) • Cannot rule out any threats to internal or external validity – Except possibly mortality • Advantages – Can determine if change occurred – May be able to understand mortality
  • 118. Experimental Designs: Posttest-Only Control Group Design (random assignment; Experimental group: Training → Posttest; Control group: Posttest; compare group differences)
  • 119. Experimental Designs: Pre-Post with Control Group (Experimental group: Pretest → Training → Posttest; Control group: Pretest → Posttest; compare group differences at pretest and at posttest)
  • 120. Experimental Designs: Solomon Four Group Design (Group 1: Pretest → Training → Posttest; Group 2: Pretest → No Training → Posttest; Group 3: Training → Posttest; Group 4: No Training → Posttest)
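A minimal sketch in Python of how the Solomon four-group comparison can be read; the post-test means below are invented for illustration and are not from the slides.

```python
# Hypothetical post-test means for the four Solomon groups.
posttest_means = {
    "group1_pretested_trained": 78.0,
    "group2_pretested_control": 64.0,
    "group3_unpretested_trained": 75.0,
    "group4_unpretested_control": 62.0,
}

# Training effect estimated with and without a pre-test.
effect_with_pretest = (posttest_means["group1_pretested_trained"]
                       - posttest_means["group2_pretested_control"])
effect_without_pretest = (posttest_means["group3_unpretested_trained"]
                          - posttest_means["group4_unpretested_control"])

print(f"Training effect among pre-tested groups:    {effect_with_pretest:.1f}")
print(f"Training effect among un-pre-tested groups: {effect_without_pretest:.1f}")
# If the two effects are similar, the pre-test itself did not create the difference
# (no testing confound); if they diverge, taking the pre-test affected the outcome.
```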
  • 121. Key Dates for Group Project • April 30th –Training Objectives Due • May 12th – Evaluation Materials Due • May 14th and 19th – Training Delivered • June 9th – Group Report Due
  • 122. Training Skills • Roles of a Trainer • What a Trainer Should Do Well • Feedback & Evaluation
  • 123. Our Roles as a Trainer
  • 124. Tips for Trainers
  • 125. Tips for successful training • Prepare beforehand • Check the venue • Facilitate learning • Introduce training and participants • Handle questions and discussion • Troubleshoot • Keep participants focused • Ask open questions • Summarize and evaluate • Make improvements for future training
  • 126. Preparation • Do background reading and get hands-on experience • Don’t have to be expert; OK to say “I don’t know” and research/ask AGORA • Read presentation notes and annotate for yourself • Remember your own workshop experiences: what did and didn’t work?
  • 127. Preparation • Do the computer exercises and identify any problems • Get list of attendees and information on their skill levels if possible • Get contact details for venue and organizers if off-site • Print out handouts and workbooks
  • 128. Check the Venue • Arrive early • Know support staff and their contact information and learn the layout if new venue • Set up and check computers and other equipment • Practice exercises again • Get computer log ins and bookmark web resources • Organize materials
  • 129. Facilitator’s Role • To “create conditions in which learning can naturally take place” • Encourage “active learning”: student discussion and cooperative, hands-on activities • Minimize passive listening and note taking • Be responsive to needs and interests of group
  • 130. Facilitator’s Role • Don’t sacrifice comprehension for coverage of all material • Build rapport and find out background and interests of participants • Provide short and varied activities • Check for signs of engagement and comprehension (eye contact, posture, facial expressions)
  • 131. Facilitator’s Role • Don’t talk to/read from screen • Make eye contact and try for conversational style • Encourage, listen and positively respond to participants’ comments, questions and feedback • Listen to discussions but don’t interrupt; remember comments and questions for group discussion
  • 132. Getting Started • Introduce yourself • Tell participants what will be covered and what they will gain • Explain the timetable and the activities • Point out the location of facilities (food, bathrooms, etc.) • Find out what people already know and what they are interested in learning • Make them feel at ease
  • 133. Getting Started • During introduction, allow participants to get to know one another (and you to know them) through planned activity (“icebreaker”) • Increases comfort level with collaboration • Examples
  • 134. Questions and Discussion • Use people’s names when addressing them • Tell people when you want them to ask questions (during or at end of presentation) • Explain that questions increase learning for whole group • Be enthusiastic and encouraging to all responses
  • 135. When Things Go Wrong • Running over time • Broken projector • Slow/no web connection • Difficult participants • Lack of understanding between you and participants
  • 136. Keeping Focus • Listen to groups • Clarify questions for individuals or the group • If there is unrelated discussion or web browsing, ask how participants are doing and what conclusions they’ve reached • If questions are off-topic, save them for breaks or after the workshop • Assistant facilitators can help
  • 137. Closure and evaluation • Conclude activities with summary • Provide overall picture • Ask open-ended questions instead of “Do you understand?” • Ask participants to reflect on their learning • Be positive about achievements • Hand out feedback forms
  • 138. Changes for next time • Reflect on problems and successes • Look for trends in feedback • Make notes on changes to be made to slides, exercises, handouts • Make changes immediately, before you forget or run out of time • Ask about anything you did not really understand
  • 139. Monitoring and Evaluating Training
  • 140. What is monitoring? • Collecting information about your project • Planned, organised and regular collection • Information about activities, services, users, outside factors
  • 141. Using evaluation forms • Widely used; great variety • Relevant questions; useful answers • Comparing different forms - 4 forms to look at - Advantages (pink) - Disadvantages (yellow) - Choose a favourite (green)
  • 142. Bringing information together • Collected information must be used • Regular review/evaluation of data • Locally/regionally/nationally • Discussion of challenges and solutions • Lead to action
  • 143. Kirkpatrick
  • 144. The Four Levels • Reaction • Learning • Behavior • Results
  • 145. All about Kirkpatrick • In 1959, Kirkpatrick wrote four articles describing the four levels for evaluating training programs. He was working on his dissertation for a Ph.D. when he came up with the idea of defining evaluation. Evaluation, according to Kirkpatrick, seems to have multiple meanings to training and development professionals. Some think evaluation is a change in behavior, or the determination of the final results.
  • 146. All about Kirkpatrick (continued) • Kirkpatrick says they are all right, and yet all wrong. All four levels are important in understanding the basic concepts in training. There are exceptions, however.
  • 147. Kirkpatrick: Evaluating Training Programs • “What is quality training?” • “How do you measure it?” • “How do you improve it?”
  • 148. Evaluating “The reason for evaluating is to determine the effectiveness of a training program.” (Kirkpatrick, 1994, pg. 3)
  • 149. Reasons for Evaluating Kirkpatrick gives three reasons ‘why’ there is a need to evaluate training: 1.“To justify the existence of the training department by showing how it contributes to the
  • 150. Reasons for Evaluating 2. “To decide whether to continue or discontinue training programs.” 3. “To gain information on how to improve future training programs.” (Kirkpatrick, 1994, pg. 18)
  • 151. The Four Levels • Reaction • Learning • Behavior • Results
  • 152. Reaction: the measuring of the reaction of the participants in the training program; it is “a measure of customer satisfaction.” (Kirkpatrick, 1994, pg. 21)
  • 153. Learning: the change in the participants’ attitudes, an increase in knowledge, or greater skills received, as a result of participation in the program.
  • 154. Learning The measuring of learning in any training program is the determination of at least one of these measuring parameters: • Did the attitudes change positively? • Is the knowledge acquired related and helpful to the task? • Is the skill acquired related and helpful to the task?
  • 155. Behavior Level 3 attempts to evaluate how much transfer of knowledge, skills, and attitude occurs after the training.
  • 156. The four conditions Kirkpatrick identifies for changes to occur: • Desire to change • Knowledge of what to do and how to do it • Work in the right climate • Reward for (positive)
  • 157. When all conditions are met, the employee must: • Realize an opportunity to use the behavioral changes. • Make the decision to use the behavioral changes. • Decide whether or not to continue using the behavioral changes.
  • 158. When evaluating change in behavior, decide: • When to evaluate • How often to evaluate • How to evaluate
  • 159. Guidelines for evaluating behavior: • Use a control group • Allow time for change to occur • Evaluate before and after • Survey/interview observers • Get 100% response or sampling • Repeat evaluation, as appropriate • Consider cost versus benefits
  • 160. Evaluation Questions: • Increased production? • Improved quality? • Decreased costs? • Improved safety numbers? • Increased sales? • Reduced turnover? • Higher profits?
  • 161. Guidelines for evaluating results: • Use a control group. • Allow time for results to be achieved. • Measure before and after the program. • Repeat the measurements, as needed. • Consider cost versus benefits. • Be satisfied with evidence if proof is not possible.
  • 162. Good Luck with Presentation