  • 1. A PROCESS: How to Use a Systems Approach for eLearning eValuation Based Upon Using Enabling Objectives

    Developed and Presented by Jim (Mo) Moshinskie, PhD, CPT1, Baylor University
    Contact: ♦ 254-756-7535

    Onsite Workshops Available from Dr. Mo (Powered by Vuepoint Learning System):
     E-Learning Made E-Z – an immersion seminar for trainers and subject matter experts on how to plan, design, develop, and evaluate eLearning.
     eValuation – how to build eLearning using enabling objectives to design and evaluate defendable results.

    1 CPT = Certified Performance Technologist, a professional certification from the International Society for Performance Improvement (ISPI). See my professional vita at

    © eLearning eValuation Based Upon Objectives – Jim Moshinskie, PhD, CPT – Page 1 of 10 - 4/30/2010
  • 2. Before the Course Development Begins

    Each stage below lists Procedures (steps to follow in the order shown) and Principles (tips to consider as appropriate).

    1.1 DETERMINE THE CONTENT

    Procedures:
    1. Determine exactly the right content to be taught based upon a thorough front-end needs analysis (FEA).
    2. Link it directly and clearly to measurable business needs.
    3. Discover how management defines success and get their approval and support of the eLearning content.
    4. Form the desired performance into a process (with discrete independent stages)2. Do not worry about the content within each stage right now; just confirm the process.
       Process (example): Start  Stage 1  Stage 2  Stage 3  End
    5. Gather all the information that will become the content for the course.
    6. Once all the content has been identified and approved, separate and mark the need-to-know material from the nice-to-know material.3
       Need to know: the minimum knowledge needed to perform the process exemplarily.
       Nice to know: extraneous information that can be used for reference and “tell me more” screens.

    Principles:
    • Involve management, the SME4, the ID5, and the Community of Practice (COP) in selecting the needed topic, using Business Analysis and Performance Analysis techniques.
    • Consider also involving: the Legal department, Information Technology (IT), Product Development, HRD, Marketing, Sales, and International.
    • Establish that the expectations of everyone are clearly delineated.
    • Confront any problems early.
    • When gathering content, look for brochures, reports, sales information, manuals, job aids, slides, pictures, lectures, artwork, books, scientific data, research results, web searches, etc.
    • Clearly mark the information found within each source as “need” or “nice” to aid the instructional designers.

    2 Cognitive psychologists report that the number of stages ideally should be kept to 7 ± 2.
    3 Nice-to-know content can be presented in hyperlinks to libraries, glossaries, museums, and electronic documents. The need-to-know content is the minimum an employee needs to know and be able to perform for exemplary job performance.
    4 SME = Subject Matter Expert(s)
    5 ID = Instructional Designer
  • 3. 1.2 WRITE THE TERMINAL OBJECTIVE (COURSE MISSION)

    Procedures:
    1. Write the terminal objective for the entire course. This terminal objective will appear within the course introductory screens.
    2. Get stakeholder approval.

    Principles:
    • The terminal objective is written in two or three succinct sentences.
    • Example: “At the end of this course, the learner will be able to follow the five-stage Star Selling Process to open and close a car sale successfully.”

    1.3 FORM CONTENT INTO A PROCESS

    Procedures:
    1. Form each separate stage within the performance process into short (10-15 screens) stand-alone modules.6 A generic model of this course would look something like this:
       Module 1 – Course introduction; course terminal objective (see 1.2 above); menu of topics (modules) covered in the course; how to navigate the interface.
       Module 2 – A very brief, high-level overview of the entire process (an ideal job for a FLASH animation with sound and interactivity); pre-test (with questions linked to specific objectives and modules so VLS can build a customized roadmap); stress why this content is important.
       Modules 3-5 – The teaching modules in this generic course:
         Module 3 – Teach Stage 1 using the ROPES model.
         Module 4 – Teach Stage 2 using the ROPES model.
         Module 5 – Teach Stage 3 using the ROPES model.
         (Adjust the number of teaching modules according to the number of stages in the process being taught. Remember, 7 ± 2 is ideal.)
       Module 6 – Build authentic scenarios that allow the learners to perform what they have learned and receive prompt and legitimate feedback. HINT: Start with easy scenarios, then move to harder ones.
       Module 7 – Summarize the process again; give the post-assessment; print a completion certificate.
    2. Design each teaching module to teach one stage specifically, using the ROPES instructional design model.7 In the example above, modules 3-5 are the teaching modules. Each of these modules could be developed as a learning object and perhaps be used in other courses.

    Principles:
    • Use the Zoom Principle, in which you first teach the overall process and then “zoom” in and teach what to do in each stage of the process.
    • This gives the learners a useful “big picture” of the desired performance and shows how it breaks down into convenient, easy-to-remember stages.

    6 Research by Dr. Moshinskie has shown that users spend an average of 1 minute per screen when teaching soft skills. This tends to be much shorter when teaching hard skills, such as technical courses on how to use software.
    7 ROPES = Review & Relate, Overview, Present and Practice, Exercise, and Summarize. How to develop interactive instruction using the ROPES model is taught in the one-day workshop available from Dr. Moshinskie entitled “eLearning Made E-Z.” For details, see:, or call 254-756-7535.
  • 4. 1.4 DETERMINE PROCEDURES AND PRINCIPLES

    Procedures:
    1. Divide the content of each teaching module into sets of procedures (exact steps done in a certain order every time) and principles (supporting guidelines derived from best practices and lessons learned).
       Procedures = the actual steps; Principles = tips and guidelines.
       A sample process: Start  Stage 1  Stage 2  Stage 3  End, with each stage carrying its own procedures and principles.
       Note that this is the same method we are using in this paper: the first column numbers the stages, the second column names the stage and shows the sets of procedures to perform, and the third column lists principles to consider.
    2. Determine how to present any prerequisite support information that the learners will need so they can understand the procedures and principles better. (This support information includes the prerequisite concepts8 and facts9.)

    Principles:
    • Input from the SME, exemplary workers, research, and best practices can be used to devise the needed procedures (the steps) and principles (the tips) for each stage.
    • Consider presenting concepts and facts as nice-to-know information in the glossary, library, or mini-lessons, or hyperlinked to detailed reference screens.
    • VLS provides a searchable library environment for this support information.

    1.5 WRITE THE ENABLING OBJECTIVES10

    Procedures:
    1. Write a short enabling objective for each key procedure or principle identified in step 1.4.
    2. Assure that each enabling objective is measurable.
    3. Combine the enabling objectives into a single list (see Exhibit 1).

    Principles:
    • Example of enabling objectives: “At the end of module 2, you will be able to: greet customers appropriately; ask the correct discovery questions; determine their specific needs; overcome any objections; and close the sale.” (The enabling objectives are displayed on the screen as a bulleted list, each starting with a measurable verb.)
    • Consider making each module so that it can exist as a separate, reusable learning object.11
    • Note: VLS Content Creator allows you to link learning objects together into courses. In this manner, any time you update the original learning object, you automatically update it wherever it appears in any course.

    8 Concept = a class of things with common attributes, e.g., chair, dog, house, software, hamburger, sales.
    9 Fact = specific information, e.g., La-Z-Boy Rocker, German Shepherd, McDonald House, Excel, Big Mac, TRUST Selling.
    10 Based upon Kemp, J., Morrison, G., & Ross, S. Designing Effective Instruction. Upper Saddle River: Merrill. See pages 78-79.
    11 In this method, 1 LO = 1 LO (1 learning objective = 1 learning object).
  • 5. During the Actual Course Development

    2.1 CLASSIFY THE ENABLING OBJECTIVES (see Exhibit 1)

    Procedures:
    1. Classify all of the enabling objectives listed in item 1.5 according to importance:
       C = considered a critical, must-do objective.
       Q = needs a question in the pre- and post-assessments.
    2. Sequence the enabling objectives within each module. Hint: First teach the procedures, then the principles; first teach any concepts, then the specific facts.

    Principles:
    • Involve the SME, C of P, and ID.
    • We suggest you first teach the procedures required to complete each stage, because this sets the big picture in the learner’s mind, and then present the principles with tips to perform the procedures better.

    2.2 WRITE THE ASSESSMENT QUESTIONS

    Procedures:
    1. Write the pre-assessment questions using a variety of VLS templates. Write at least one question for every critical enabling objective.
    2. Link the questions to specific modules within the course so the VLS software can automatically build a unique, customized roadmap consisting only of those modules in which the learner missed questions.

    Principles:
    • Use good test-writing procedures.
    • Since you will collect statistics on the scores, you will need the right number of questions in the pre- and post-assessments. The rule of thumb is 18-20 questions.
    • Consult an educational statistical analysis book to confirm your particular corporate needs.

    2.3 DESIGN EACH MODULE

    Procedures:
    1. Design and build each module using ROPES. (Note: The VLS Content Creator tool provides a quick method to electronically combine storyboarding and module creation simultaneously. This can be done using the VLS wizard or by manually inserting modules and screens.)
    2. Build the “REVIEW & RELATE” screens to include the review and relate information.
    3. Build the “OVERVIEW” screen, displaying the key enabling objectives.

    Principles:
    • In VLS, the course tree showing modules and screens conveniently appears on the left side of the Content Creator authoring screen. The order of screens can be easily changed, and modules and screens can be renamed as well.
    • Use eye candy appropriately, such as animated movies made with Macromedia FLASH, to get the learners’ attention during the opening screens.12

    12 To see some interactive examples of actual award-winning corporate eLearning courses developed by Dr. Moshinskie and his students at Baylor University, please see:
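The customized roadmap described in step 2.2 amounts to a lookup from missed pre-test questions to the modules that teach them. A minimal Python sketch of the idea — the question numbers and module names below are invented for illustration, and the real linkage is configured inside the VLS software rather than written as code:

```python
# Hypothetical linkage of pre-test questions to the teaching modules
# that cover them (step 2.2). In VLS this mapping is configured in the
# authoring tool; here it is just a dictionary for illustration.
QUESTION_TO_MODULE = {
    1: "Module 3",  # Stage 1 questions
    2: "Module 3",
    3: "Module 4",  # Stage 2 questions
    4: "Module 4",
    5: "Module 5",  # Stage 3 questions
}

def build_roadmap(missed_questions):
    """Return the ordered list of modules the learner must take:
    only those modules whose questions were missed on the pre-test."""
    modules = {QUESTION_TO_MODULE[q] for q in missed_questions}
    return sorted(modules)

# A learner who missed questions 2 and 5 is routed to Modules 3 and 5 only.
print(build_roadmap([2, 5]))  # ['Module 3', 'Module 5']
```

A learner who misses nothing gets an empty roadmap and can skip straight to the post-assessment, which is exactly the point of linking every question to a module.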
  • 6. 2.4 DEVELOP THE “PRESENT” SCREENS

    Procedures:
    1. Build the “P” (present and practice) screens.
    2. Design the screens to teach the enabling objectives in a manner that clearly demonstrates linkage to the specific enabling objective. Use VLS templates to vary the screens.
    3. Include meaningful interactivity on as many screens as possible, using a variety of VLS templates.
    4. Stress the “how to” of the desired performance.

    Principles:
    • Title screens so the linkage to specific enabling objectives is clear.
    • A good rule of thumb when teaching a procedure or principle is to entitle the screen “How to…”.
    • Research shows that the more entertaining the course is, the better the results.13

    2.5 DEVELOP THE “EXERCISE” SCREENS

    Procedures:
    1. Build the “E” (engage and exercise) screens using VLS interaction templates, FLASH templates, hot spots, rollovers, and FLASH movies.
    2. Incorporate authentic case studies, scenarios, and/or simulations that allow learners to apply the critical enabling objectives to realistic job situations. Interview the SME, DW, COP, or exemplary employees, or read reports, to devise authentic scenarios, simulations, and case studies.
    3. Give legitimate feedback.

    Principles:
    • Case study = a detailed presentation of an actual situation followed by a chance for the learner to make decisions on how to handle it.
    • Scenario = a short situation in which the learner chooses correct behaviors.
    • Simulation14 = a graphic representation of a setting (e.g., a sales office) in which learners interact with characters and/or objects on the screen (e.g., an angry customer or a ringing telephone). May be accented with audio.

    2.6 WRITE THE “SUMMARY” SCREEN

    Procedures:
    1. Write the “SUMMARY” screen – restate the critical information from the critical enabling objectives of the course.
    2. Design the course so that when learners finish, they are directed to the automatic Course Evaluation and can evaluate the course immediately.

    Principles:
    • Be brief in this summary.
    • VLS provides a built-in evaluation tool that allows learners to view Level 1 questions about the course. The learners simply click their response to each question.
    • VLS also allows the addition of open-ended questions, e.g., “How can we make this course better for your specific job needs?”

    2.7 PREPARE THE POST-ASSESSMENT

    Procedures:
    1. Write the post-assessment – write a reliable, valid question for each critical enabling objective.15 You can use the same pre-test as the post-test if you wish. Determine the passing score.
    2. Decide what to do with learners who fail:
       A) Re-take the entire course, or
       B) Direct them to another personalized roadmap that VLS builds based upon post-assessment results.
    3. Publish the draft course to the server.
    4. Field test your course, allowing the learners to use the VLS Comment feature to give suggestions and changes directly to the design team.
    5. Revise as necessary.
    6. Get key stakeholder signoff.

    Principles:
    • Using the VLS Comment feature within the VLS interface, learners can type their suggestions, changes, and ideas and submit them directly to the design team.
    • The design team can access and print these Comments from the VLS Administrator. Each Comment automatically includes who sent the message, when, and what screen is involved.

    13 Hassett, J., “How to Improve WBT: What Users Want,” session printed at ISPI Conference, Dallas, June 2002.
    14 Prensky, M. (2001). Digital Game-Based Learning. New York: McGraw-Hill.
    15 For more accurate statistical analysis, consider using 20 or more questions in the test.
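The fail-handling decision in step 2.7 (retake the whole course, or follow a personalized roadmap) can be sketched as a small routing function. The passing score and module names below are placeholders, not values taken from VLS:

```python
# Hypothetical post-assessment routing (step 2.7). PASSING_SCORE is a
# placeholder; each organization sets its own.
PASSING_SCORE = 80

def route_learner(score, missed_modules, policy="roadmap"):
    """Route a learner after the post-assessment:
    pass -> certificate; fail -> either option A (full retake)
    or option B (personalized roadmap of missed modules)."""
    if score >= PASSING_SCORE:
        return "print completion certificate"
    if policy == "retake":           # option A
        return "re-take the entire course"
    return "roadmap: " + ", ".join(missed_modules)  # option B

print(route_learner(65, ["Module 4", "Module 6"]))
# roadmap: Module 4, Module 6
```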
  • 7. Following the Course and During Any Blended Solutions

    3.1 ANALYZE THE LEVEL 1 DATA

    Procedures:
    1. Collect data for a pre-determined time period.
    2. Determine the range16, mean, and standard deviation (SD) on the Kirkpatrick Level 1 evaluations (how well the learners enjoyed the course).
    3. Evaluate the results. Standard deviations below .5 indicate agreement; those above 1.0 indicate disagreement.

       Mean   SD     Possible Indication
       High   Low    Strength of training
       Low    Low    Weakness of training
       High   High   Tendency toward positive perception, but not everyone agrees
       Low    High   Tendency toward negative perception, but not everyone agrees

    4. Use judgment for statements that fall in the middle (e.g., M = 3, SD = .75).

    Principles:
    • VLS software automatically collects Level 1 evaluation scores.
    • The chart shows possible indications from mean and standard deviation scores.
    • As you become more experienced in interpreting results, you can fine-tune this chart to your specific topics and courses.

    3.2 ANALYZE THE LEVEL 2 DATA

    Procedures:
    1. Compare the pre- and post-assessment scores statistically. (Consider using the paired t-test17 to compare the results of the two assessments. Your goal is to determine whether there was a significant difference between the two sets of scores.)
    2. Collect and analyze the means and standard deviations18 for the pre- and post-assessment scores.

    Principles:
    • VLS automatically collects pre- and post-assessment scores. This information is available by user, by region, by department, etc.
    • VLS can print a certificate for learners who pass the course.

    16 Range = lowest score subtracted from the highest score; Mean = average of all scores.
    17 The t-test is a statistical test that determines whether the difference is due to chance. (If the result is below .05, there is a greater likelihood that the results were due to the training; if above, a higher possibility of chance.)
    18 Standard deviation reflects how consistent or inconsistent a group’s perceptions are. (If the standard deviation is low, the data are grouped closely around the mean value; if high, they are widespread.)
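The Level 1 and Level 2 analyses in steps 3.1 and 3.2 can be sketched with the Python standard library. The high/low cutoffs follow the chart in 3.1 and footnote 21 (mean ≥ 4.0 counts as high; SD < .5 signals agreement, SD > 1.0 disagreement); the sample scores are invented:

```python
import math
import statistics

def level1_indication(scores):
    """Classify Level 1 (reaction) scores per the mean/SD chart in 3.1.
    Cutoffs: mean >= 4.0 is 'high' (footnote 21); SD < .5 = agreement,
    SD > 1.0 = disagreement; anything between is a judgment call."""
    m = statistics.mean(scores)
    sd = statistics.stdev(scores)
    high_mean = m >= 4.0
    if sd < 0.5:
        return "strength of training" if high_mean else "weakness of training"
    if sd > 1.0:
        return ("positive perception, but not everyone agrees" if high_mean
                else "negative perception, but not everyone agrees")
    return "middle ground - use judgment"

def paired_t(pre, post):
    """Paired t statistic for Level 2 pre/post scores (step 3.2):
    mean of the per-learner differences over its standard error.
    Compare the result against a t table at n - 1 degrees of freedom."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

print(level1_indication([5, 5, 5, 5, 4]))  # strength of training
```

Only the t statistic is computed here; turning it into the p-value mentioned in footnote 17 requires a t distribution table or a statistics package such as SciPy's `ttest_rel`.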
  • 8. 3.3 CONDUCT THE BLENDED WORKSHOP

    Procedures:
    1. Conduct an after-course blended workshop so learners can apply the enabling objectives in mock situations made as authentic as possible.
    2. Evaluate the learners19 on their application of the critical enabling objectives during the blended workshop based upon a five-point Likert scale (1 = strongly disagree to 5 = strongly agree).
    3. Calculate a mean and standard deviation on the scores for the learners and on the workshop, and analyze the results.

    Principles:
    • The ROPES model can be used to develop the agenda for the blended workshop.
    • Using a check sheet can help the workshop coaches evaluate the participants and collect data on performance.
    • Input from SMEs, exemplary employees, line managers, etc. can help you devise realistic workshop exercises.
    • The blended experience could occur online using the VLS chat, push, and comment features.

    3.4 ALLOW THE LEARNERS TO SELF-ASSESS

    Procedures:
    1. Ask learners to evaluate their confidence levels in performing the critical enabling objectives using a five-point Likert scale (1 = strongly unconfident to 5 = strongly confident).
    2. Include open-ended questions such as: “Which performance areas do you need specific help on now?”

    Principles:
    • Examples: I can change a tire. I can close a sale. I can manage my time.
    • Compare these scores to the workshop evaluations (item 3.3 above) and to later performance scores at the actual worksite (item 3.7).

    3.5 QUERY MANAGERS

    Procedures:
    1. Select a pre-determined number of employees randomly20 and query their immediate line managers on the employees’ implementation of the critical enabling objectives in their actual workplaces.
    2. Base this query on a five-point Likert scale as the supervisor or coach evaluates each critical objective (1 = strongly disagree to 5 = strongly agree).
    3. Calculate means and standard deviations as described above, and analyze the results.
    4. Ask a series of open-ended questions, e.g., “How could we train your staff better on this topic?”

    Principles:
    • Quantitative evaluation – numbers (means of 4.0 and above are high; those of 2.5 and below are low).21
    • Qualitative evaluation – count the number of positive and negative statements; look for trends and similarities.
    • Hint: In qualitative analysis, consider throwing out the one best comment and the one worst comment before evaluating.

    3.6 CORRELATE ANY POSSIBLE RESULTS

    Procedures:
    1. Tie critical enabling objectives to actual customer surveys and correlate the results (consider using the Pearson r22 correlation coefficient).
    2. Consider this when evaluating how well the course objective was achieved (see step 1.2), and predict future success.
    3. Consider correlating scores to learning styles if this information is collected.

    Principles:
    • Consider 1-month, 6-month, and 1-year follow-ups.

    19 How many learners do you evaluate in the blended workshop? If the number of participants is small, perhaps evaluate all of them. If the audience is large, consider randomly selecting a pre-determined number of employees and evaluating them. Another possible approach is not to evaluate all of the enabling objectives but to select only several key objectives. Yet another approach is to give a single, overall score for the participant’s performance during the blended workshop.
    20 Randomization – To select randomly, try using any dollar bill. Use the serial number to select team members: the first number determines which screen to choose the team member from in Vuepoint tracking, and the second number determines which employee on the report.
    21 Over time, you can determine what is high and low for your particular learners and topics.
    22 The Pearson correlation coefficient shows the strength of the relationship between two variables. It ranges from –1 to +1. A small or zero correlation hints that the variables are not related, while a value closer to 1 hints at a relationship (e.g., when one goes up, so does the other, and vice versa).
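The Pearson r correlation suggested in step 3.6 is straightforward to compute by hand. A standard-library Python sketch with invented sample data (real use would pair each learner's post-assessment score with a customer-survey result):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists,
    ranging from -1 to +1 (see footnote 22)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Perfectly aligned data gives r = 1.0; as the chapter notes, values
# near 0 hint the two measures are unrelated.
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # 1.0
```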
  • 9. 3.7 ANALYZE THE ON-THE-JOB PERFORMANCE

    Procedures:
    1. Randomly select a pre-determined number of learners and evaluate the results from a quality walk or shop.
    2. Base this evaluation on a five-point Likert scale (1 = strongly disagree to 5 = strongly agree).
    3. Calculate a mean and standard deviation.

    Principles:
    • Use someone like a team coach or project manager as the evaluator.
    • Use a specific checklist to avoid interrater errors.

    3.8 COMPARE TO OTHER COURSES

    Procedures:
    1. Compare the statistics of this course to other eLearning courses conducted by your company.
    2. Compare the statistics of this course to any national statistics that may be available.

    Principles:
    • Visit
    • Visit
    • Read eLearning magazines and journals.
    • Attend conferences.
    • Network with other performance technologists.

    3.9 MAKE THE FINAL EXECUTIVE REPORT

    Procedures:
    1. Meet with the team to review the results, addressing such issues as:
       - Were any assessment questions too easy or too hard?
       - Were any topics too easy or too hard?
       - How were the results across the different departments, regions, or countries?
       - What was the course completion rate?
       - What did learners like and dislike the most?
       - Do you see any emerging trends or patterns?
       - What was the time on task?
       - Which modules did the learners spend the most or least time taking?
    2. Devise a written report that succinctly describes the eValuation results.
    3. Share the results with the key stakeholders.
    4. Share the results with the learners – they have the right to know.
    5. Discuss any areas where the results were unexpected, both high and low.
    6. Revise the course, lessons, performance objectives, and questions as necessary.
    7. Share best practices (knowledge management).
    8. Toot your own horn as loud and often as you can.

    Principles:
    • VLS tracks the results by question and by topic, department, region, etc., as you set it up in VLS Administrator.
    • Always be alert to variables other than your interventions that may have skewed the results, e.g., the economy, weather, etc.
    • Date stamp courses so content stays fresh.
    • Determine whether the results can be used to forecast future performance.
    • Be honest.
  • 10. Exhibit 1: Sample Enabling Objectives

    Course Title: HOW TO CLOSE A SALE

    No.   Enabling Objective                                                     Level23     C?24   Q25
    Module 1
    1.1   Describe the meaning of a selling process                              Principle   √      5
    1.2   List the stages in the Selling Process                                 Procedure
    1.3   Explain why we use the Selling Process                                 Principle
    1.4   Describe the bridge to agreement and its function                      Principle   √      6
    Module 2
    2.1   Identify some ways of showing hostmanship                              Facts
    2.2   List and describe the ways to Build Customer Trust                     Facts       √      14, 15
    Module 3 – Needs
    3.1   Explain why people buy                                                 Principle   √      11
    3.2   Describe how to recognize needs                                        Principle   √      2
    3.3   List and describe the 3 types of probes                                Facts       √      3
    3.4   List and describe the types of Follow-up Probes                        Facts       √      4, 10
    Module 4 – Motivations
    4.1   Contrast Active and Passive buyers                                     Facts
    4.2   Describe how to determine where a buyer is on the active/passive scale Procedure
    4.3   Identify each of the emotional needs                                   Facts       √      7
    Module 5 – Demonstrate
    5.1   Contrast Physical and Emotional involvements                           Facts
    5.2   Explain how to get the prospect physically involved                    Facts       √      9
    5.3   Explain how to get the prospect emotionally involved                   Facts       √      8
    Module 6 – Close
    6.1   Define a Logical Conclusion Close                                      Facts
    6.2   Describe why people don’t buy                                          Facts       √      14
    6.3   Describe what an objection is                                          Procedure   √      13
    6.4   List and describe the stages of the Evolution of a Sales Consultant    Procedure   √      1
    6.5   Show how to reach the Selling Close                                    Procedure   √      12

    23 Level = Is the content a procedure (steps), principle (tips), concept (a grouping), or fact (specific information)? This is an optional column that helps instructional designers during the design process.
    24 C = Considered a critical enabling objective if this box is checked.
    25 Q = The pre- and post-assessment question(s) that address that specific enabling objective.
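Since every critical (C) enabling objective in Exhibit 1 must map to at least one assessment question (Q), that linkage can be checked mechanically before the course ships. A sketch using a small, hand-picked subset of the exhibit's rows:

```python
# A subset of Exhibit 1 as plain records: each enabling objective carries
# its level, whether it is critical (C), and its linked question(s) (Q).
objectives = [
    {"id": "1.1", "level": "principle", "critical": True,  "questions": [5]},
    {"id": "1.2", "level": "procedure", "critical": False, "questions": []},
    {"id": "3.4", "level": "fact",      "critical": True,  "questions": [4, 10]},
]

def unassessed_critical(objs):
    """Return the IDs of critical objectives that have no linked
    pre-/post-assessment question - each is a gap in the eValuation."""
    return [o["id"] for o in objs if o["critical"] and not o["questions"]]

print(unassessed_critical(objectives))  # []
```

An empty result means every critical objective is defensible with data; any ID in the list marks an objective the assessments cannot evaluate.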