Ray Magnan - Instructional Design Basics: The ADDIE Model

Ray Magnan presented Instructional Design Basics: The ADDIE Model at the Pubsnet DocTrain Conference in 2005.

Transcript

  • 1. Instructional Design Basics - The ADDIE Model. Pubsnet Documentation and Training Conference, October 17, 2005. Ray Magnan, M.Ed., Education Technology Consultant. raymagnan@yahoo.com www.geocities.com/raymagnan Copyright © 2005 Ray Magnan
  • 2. Agenda • Discuss principles of Instructional Design (ID) – ADDIE Model • Review a sample course
  • 3. Goals • Familiarize you with the following aspects of Instructional Design – Key concepts – Terms – Processes – Issues • Help you ask the right questions about your projects
  • 4. My Background • Design education solutions for the software industry – Instructor-led training (ILT), self-paced, or blended. – E-Learning: Primarily deployed over the web. • Primary focus: – Software for the healthcare industry. • Business intelligence applications and others • Typically medium- to large-scale projects • Audiences: – End users, analysts, software developers, installation consultants, support, sales, and marketing.
  • 5. And you are? • Involved with training? – Most of the time – Part time – Not currently • Industry: – Software – Healthcare – Hardware – Retail – Biotech – Insurance – Manufacturing – Hotel/Restaurant – Transportation – Other
  • 6. ADDIE Model
  • 7. What is ADDIE? • Widely used methodology for developing new training programs. • Advantages: Flexible and scalable. • Provides a step-by-step system for: – Evaluation of students’ needs – Design and development of materials – Evaluation of training effectiveness
  • 8. ADDIE Model Phases • Analyze • Design • Develop • Implement • Evaluate
  • 9. Enhanced ADDIE Model • Analyze – Initial analysis – Create project charter • Design – Course outline – User interface (UI) – Prototype. Test technology and UI. • Develop – Create materials – Review, edit, QA • Implement – Roll out to trainers and students. • Evaluate – Was training effective? Alongside all phases: • Track Time Spent – Use for future projects – Prove Return on Investment (ROI) • Ongoing Maintenance – Accommodate new features or versions.
  • 10. Roles • Project stakeholders – Group/person requesting the training – Others with a stake in the success of the project • Management – Training department leadership – Project sponsor – Project manager • Resources – Subject matter experts (SMEs)
  • 11. Roles (continued) • Team – Technical architect – Instructional designer – Assessment and evaluation consultant – Technical writers – Trainers – Editor (templates/standards) – Graphic designer – User interface (UI) designer
  • 12. Analysis (ADDIE)
  • 13. Analysis: Training Request • Request from senior management or other departments. – New product – Enhancements to existing product – Problems with existing product or process • Stakeholders: Have a vested interest in the success of the training.
  • 14. Analysis: Business Analysis • Does this require training? – Are there other alternatives? • Job aids • Existing resources • Other higher-priority training projects? – Sales pipeline • Budget for training? • Delivery date? • Other issues?
  • 15. Analysis: Info Resources • Stakeholders • Subject matter experts (SMEs) • Product managers • Development • Existing documents – Functional and technical specifications – Test guides • Sales and marketing – Sales pipeline. Business benefit.
  • 16. Analysis: Needs Assessment • Interview a cross-section of target audiences and management – What do they need to know to do their jobs? – Past issues – Potential issues
  • 17. Analysis: Audience Analysis • Who are the audiences for the training? – Size of audience. Location. • Is there any overlap in the required skills? – Could suggest a modular approach. • Examples – Data entry clerk – Department supervisor – Financial analyst – Support and installation – Senior management (Reporting)
  • 18. Analysis: Task Analysis • What are the tasks associated with the subject? • What do people need to be successful in their jobs?
  • 19. Analysis: Performance Gaps • What are the current skill levels of the audiences? • What do they need to know to be successful? • What are the gaps? • Are the audiences: – Experienced – Novices – New
  • 20. Analysis: Media Analysis • Which delivery methods should you use? • What technology is available? – Development and delivery • Be careful about using new authoring or development software in a major project. • Consider the learning curve of the new technology. • Will it speed up or slow down development?
  • 21. Analysis: Media Analysis • Budget considerations – Consider time, money, personnel, and other resources. – Balance costs with quality of learning. For example: • Simulations and interactivity are resource intensive. • Use them where they give you the best return on investment (ROI), or the “most bang for your buck.”
  • 22. Analysis: Media Analysis • Sample alternatives: – Instructor-led training: Classroom or distance education – Self-paced learning: Online or paper-based – Reference materials: Online or paper-based
  • 23. Analysis: Blended Learning • Matches the most appropriate delivery method with the learning objectives. • Example: Course with multiple components: – Online self-paced learning: • Prerequisites and core content. – Collaborative learning: • Conference calls, email, instant messaging. – Instructor-led training: • To cover more complex topics. – Online resources for future reference after training
  • 24. Analysis: Common Terms: E-Learning • CBT or WBT – Computer-Based Training / Web-Based Training • EPSS – Electronic Performance Support System – Example: Online help or other resources • VAC – Virtual Asynchronous Classroom – Not real time. Ex: WebCT • VSC – Virtual Synchronous Classroom – Real time. Ex: Centra or Sametime.
  • 25. Analysis: Common Terms: E-Learning • Learning Management Systems (LMS) – Overall infrastructure for managing training: lists of available classes; student tracking (personal learning path: which classes they took, scores) • Learning Content Management Systems (LCMS) – Management of learning material and assets: Word docs, PDFs, graphics, web pages, movies, simulations, etc.
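To make the LMS/LCMS split concrete, here is a minimal sketch of the two kinds of records; it is not from the original deck, and every class and field name is a hypothetical illustration:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class EnrollmentRecord:
    """LMS-side record: one student's progress through one class."""
    student_id: str
    course_id: str
    completed: bool = False
    score: Optional[float] = None   # assessment score, if reported

@dataclass
class ContentAsset:
    """LCMS-side record: one piece of learning material."""
    asset_id: str
    kind: str                       # e.g. "pdf", "web page", "movie", "simulation"
    path: str
    used_in_courses: List[str] = field(default_factory=list)
```

The LMS answers "who took what, and how did they score?"; the LCMS answers "what materials exist, and where are they used?"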
  • 26. Analysis: Results of Analysis • Create an initial proposal – Summarize your analysis – Identify the audiences and tasks – Include a high-level outline of training modules – Proposed delivery methods – Proposed technology • Provide alternate proposals – Plan A: Comprehensive. 6 months of development. – Plan B: Less comprehensive. 3 months of development.
  • 27. Analysis: Project Charter (Proposal) • Defines the scope of the project – Incorporate feedback from the initial analysis – Get approval and buy-in from stakeholders – Project plan with milestone dates – Identify the sponsor, stakeholders, and others involved • SMEs, reviewers, technical and other resources • Define responsibilities and time availability for input and feedback – Communication strategy for this group – Rollout plan for training – Plan for handling scope creep and RAID • Risks, assumptions, issues, and dependencies
  • 28. Design (ADDIE)
  • 29. Design: Design Phase • Using information from the analysis phase: – Outline modules and lessons. – Create detailed instructional objectives for all modules and lessons. – Create sequence of learning. – Determine the delivery method for each portion of the training. – Establish look and feel of user interface. – Design assessment strategy to match the objectives.
  • 30. Design: Types of Knowledge Skills • Ordered from less structured (top) to more structured (base): – Evaluation: Apply judgement – Synthesis: Put together elements – Analysis: Break down into elements – Application: Apply abstract information – Comprehension: Fully understand concepts – Knowledge: Recall facts and information • Adapted from B. Bloom & D. Krathwohl, Taxonomy of Educational Objectives, Handbook 1: Cognitive Domain. London: Longman, 1984.
  • 31. Design: Instructional Objectives • Create detailed instructional objectives for all modules and lessons. – Be specific about what learners should be able to accomplish after completing a section. – For example: • State the benefits of using this tool • Install the software on a computer • Develop and run a report
  • 32. Design: Sample Training Grid Cross-Reference (shown as a grid graphic in the original slide) • 3 audiences • 10 sets of training materials • Includes: – Recommended courses – Recommended sequence
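As a rough illustration only (the deck's actual grid is a slide graphic, and the audience and course names below are invented), such a cross-reference can be modeled as a mapping from audience to an ordered course sequence:

```python
# Hypothetical audience -> recommended course sequence cross-reference.
# All names below are invented for illustration.
training_grid = {
    "Data entry clerk":  ["Product Basics", "Data Entry", "Troubleshooting"],
    "Financial analyst": ["Product Basics", "Reporting", "Advanced Analysis"],
    "Support engineer":  ["Product Basics", "Installation", "Troubleshooting"],
}

for audience, sequence in training_grid.items():
    print(f"{audience}: {' -> '.join(sequence)}")
```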
  • 33. Design: User Interface • Goal is a consistent UI – Looks more professional – Easier for students to navigate through the material • Examples: – Templates: paper and online – User interface for web delivery • Navigation • Online quizzes • Simulations
  • 34. Design: Prototype • Very useful to create a prototype or pilot – Especially if using new technology • Validate with stakeholders and SMEs • Receive feedback early in the process • Examples of possible issues: – Problems with plug-ins for the web browser. – Users who travel may prefer to download material rather than connect to the network. – Security issues. • Make necessary corrections
  • 35. Development (ADDIE)
  • 36. Development: Project Plan • Use a project planning methodology for large-scale projects. – Keep on track with milestone dates. – Provides a clear view of interdependencies.
  • 37. Development: Content Production • Incorporate feedback from the prototype • Create the module and lesson content for the selected delivery methods. • Create storyboards for development of simulations. • Include appropriate exercises, interactions, and activities to enhance learning. • If appropriate, plan group activities.
  • 38. Development: Assessments • Develop assessments that tie into the learning objectives. • Could include: – Pre-assessment – Quizzes during training – Post-assessment
  • 39. Development: Review • Subject matter expert (SME) review – Completeness and accuracy – Due date for feedback – Allow time to incorporate their feedback • Editor – Formatting and standards – Corporate branding • Quality assurance (QA) – Test interface and links in technology solutions
  • 40. Development: Related Media • Prepare any related media for learning. • For example, set up data and the environment for training on a software application.
  • 41. Implementation (ADDIE)
  • 42. Implementation: Implementation Phase • Roll out training – May be done with the launch of a new product. • Announce availability of training – Course catalog listings. – Email, newsletter. – Target employees that management wants to take the training. • Mandatory training – Announcement should come from senior-level management.
  • 43. Implementation: Implementation Phase • Produce materials – Copies of paper materials – Upload files for online training • Deliver training – Hand over to trainers (ILT) – Instructions on how to work with the materials – Monitor the initial training session
  • 44. Evaluation (ADDIE)
  • 45. Evaluation: Types of Evaluation • Evaluate the students – Did they find the class useful? – Did they learn the material? • Evaluate the class itself – Was it effective? – Did it accomplish the objectives?
  • 46. Evaluation: Why Evaluate Training? • Ensure that training is effective – Materials – Instructors – Facilities – Delivery methods • Prove return on training investment (ROI) • Reality check – Did the training work as planned? – Was your analysis correct?
  • 47. Evaluation: Why Test Students? • Evaluate training effectiveness • Use to reinforce training – Learners may not read a summary thoroughly, but you can present the same information in a test format, which forces them to read it. • Can boost confidence of learners – New employees – Using new skills
  • 48. Evaluation: Evaluating Training Programs • Donald Kirkpatrick: 4 levels of evaluation – Level 1 - Reaction/Satisfaction: Did they like it? • ASTD study (1994): 75% of US companies use this. – Level 2 - Learning: Did they learn it? • ASTD: 41% of US companies. – Level 3 - Transfer/Application: Did they apply it to their job? (Difficult to measure.) – Level 4 - Business Results: Did it make a business difference? (Difficult to measure.)
  • 49. Evaluation: Level 1: Did they like it? • AKA “smile sheets.” • Questions about the course: – Were objectives met? – Would you recommend it to others? – Will you be able to apply it to your job? – Course length – Classroom conditions
  • 50. Evaluation: Level 1: Did they like it? • Questions about the instructor: – Knowledgeable – Answered questions – Paced class appropriately – Encouraged interaction – Provided real-life examples
  • 51. Evaluation: Level 1: Did they like it? • Questions about the course materials: – Detailed enough – Matched topics covered – Provided real-life examples – Contained practice exercises
  • 52. Evaluation: Level 1: Did they like it? • Questions about distance education: – Online materials well organized – Variety of learning activities – Activities for sharing information with others – Technical difficulties – Would they take another distance ed class?
  • 53. Evaluation: Level 2 - Did they learn it? • Option: Learner self-assessments – Learners assess their own learning and ability to apply it to the job – Difficult to validate due to subjectivity
  • 54. Evaluation: Level 2 - Did they learn it? • Option: Testing – Simulations or final projects • Great testing tools, but potentially expensive because of the time involved. – Hands-on exams • May be necessary for certain types of skills: delivering presentations, driving a car, sailing a boat. – Written tests • Most commonly used method • Cost-effective and relatively simple to create.
  • 55. Evaluation: Standardized Forms • Important to develop and use a standardized form for each delivery method. • Makes it easier to compare results and measure them over time. • Easier for learners to consistently fill out for multiple classes.
  • 56. Evaluation: Assess Knowledge Skills • Matching knowledge skills to assessment methods: – Evaluation (apply judgement): Assess, critique, measure – Synthesis (put together elements): Design, formulate, predict – Analysis (break down into elements): Conclude, contrast, infer – Application (apply abstract information): Compute, interpret, use – Comprehension (understand concepts): Classify, explain, summarize – Knowledge (recall facts and information): Identify, list, match
  • 57. Evaluation: Guidelines for Evaluation • Use questions that measure the appropriate learning level. • Questions should map to learning objectives. • Questions should mimic the resources available back on the job. • Phrase the questions to match the lowest potential reading level of the learners. • Pilot the test.
  • 58. Evaluation: Checklist for Questions • Contain important content, not trivia • Offer plausible choices • Give clear directions on responding • List choices in a logical order • Use clear and concise wording • Use wording at the appropriate reading level • Map to the learning objectives • Relate to the learner’s work environment
  • 59. Evaluation: Suggested Strategy • Pre-assessment – Measures baseline knowledge. Do not report scores. • Quizzes – Reinforce learning. Provide immediate feedback. – Prepare students for the post-assessment. Do not report scores. • Assignments and exercises – Measure success of training. Report scores. • Post-assessment – Measures overall success of training. Report scores.
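A minimal sketch of this scoring policy, encoding which assessment types have their scores reported (my own illustration; only the four assessment types and their report/do-not-report rules come from the slide):

```python
# Which assessment types have their scores reported, per the strategy above.
REPORT_SCORES = {
    "pre-assessment": False,    # measures baseline only
    "quiz": False,              # reinforcement and immediate feedback only
    "assignment": True,         # measures success of training
    "post-assessment": True,    # measures overall success of training
}

def reported_score(kind: str, score: float):
    """Return the score if this assessment kind is reported, else None."""
    return score if REPORT_SCORES[kind] else None

print(reported_score("quiz", 85.0))             # None: reinforcement only
print(reported_score("post-assessment", 85.0))  # 85.0: reported
```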
  • 60. Evaluation: Credentialing Issues • Potential human resources issues in: – Promotions, raises, bonuses, terminations • For test results to be defensible in court, you must prove both: – Validity: The test measures the specific skills defined in the learning objectives. – Reliability: The test performs consistently.
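The slides stop at the definitions, but as one concrete way to check that a test "performs consistently", here is a minimal sketch of Cronbach's alpha, a common internal-consistency statistic (my addition; it is not named in the original slides). Scores form a test-takers × items matrix:

```python
import statistics

def cronbach_alpha(scores):
    """Cronbach's alpha for a score matrix: rows = test-takers, columns = items."""
    k = len(scores[0])                                     # number of items
    item_vars = [statistics.variance(col) for col in zip(*scores)]
    total_var = statistics.variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Example: 4 test-takers, 3 items each scored 0 or 1.
print(cronbach_alpha([[1, 1, 1], [1, 0, 1], [0, 0, 1], [0, 0, 0]]))  # 0.75
```

Values near 1.0 suggest the items measure the same skill consistently; low values suggest an unreliable test.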
  • 61. Time Tracking
  • 62. Time Tracking • Track the time that is spent on each ADDIE phase of the project. • Use for cost/benefit analysis of this and other projects. • Useful for planning future projects. • Compare: – Costs of course development – Numbers of students trained – Results of evaluations
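As a simple illustration of the cost/benefit arithmetic this enables (my own sketch; the phase hours, labor rate, and head count below are invented), tallying hours per ADDIE phase yields a development cost that can be compared across projects:

```python
# Hypothetical time log: hours spent per ADDIE phase on one project.
hours = {"Analyze": 40, "Design": 60, "Develop": 200, "Implement": 30, "Evaluate": 20}
HOURLY_RATE = 75.0        # invented loaded labor rate
students_trained = 120    # invented head count

total_hours = sum(hours.values())
dev_cost = total_hours * HOURLY_RATE
print(f"Total development time: {total_hours} h, cost ${dev_cost:,.0f}")
print(f"Cost per student trained: ${dev_cost / students_trained:,.2f}")
```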
  • 63. Resources for More Information
  • 64. Resources: Organizations • Society for Technical Communication – www.stc.org • American Society for Training & Development – www.astd.org • New England Learning Association (NELA) – www.nelearning.org • eLearning Guild – www.elearningguild.com
  • 65. Resources: Organizations • Chief Learning Officer magazine – www.clomedia.com • Learning Circuits – www.learningcircuits.org/ • Brandon Hall – www.brandon-hall.com • Masie Center – www.masie.com
  • 66. Resources: Books • Michael Allen – Guide to E-Learning • Walter Dick & Lou Carey – The Systematic Design of Instruction • Margaret Driscoll – Web-Based Training • Edward R. Tufte – The Visual Display of Quantitative Information • William Horton – Designing Web-Based Training • Jan White – Graphic Design for the Electronic Age
  • 67. Questions?
